Danial Aneeq Ahmed
AI Cloud Data Engineer with hands-on experience building scalable ETL pipelines on AWS. Skilled in Python, SQL, Snowflake, and Docker, with a strong foundation in data modeling and cloud architecture. Experienced in designing efficient data systems, with ongoing exploration of AI concepts including LLMs, agentic workflows, and building automations.
STACK SUMMARY
01 // ABOUT ME
WHO I AM
Cloud Data Engineer | SQL | AWS | Snowflake

I am a Computer Science student with a strong interest in Data Engineering and cloud-based data platforms. I have hands-on knowledge of SQL, database design principles, and performance-optimization techniques, along with practical exposure to AWS services and Snowflake for modern data warehousing. I enjoy working with data pipelines, understanding how large-scale systems process information, and exploring best practices for scalable, reliable architectures. My learning approach is project-driven: I continuously build, experiment, and refine my skills to bridge the gap between theory and real-world implementation. Currently, I am focused on strengthening my expertise in cloud data engineering, ETL workflows, and data warehouse optimization, with the goal of designing efficient, secure, and high-performance data solutions.
Danial Aneeq Ahmed Qureshi
Cloud Data Engineer
02 // SKILLS
TECHNICALITIES
CLOUD & INFRASTRUCTURE
LANGUAGES & TOOLS
Snowflake
PROFICIENCY LEGEND
Expert (85–100%) · Proficient (70–84%) · Familiar (50–69%)
03 // PROJECTS
DEPLOYED SYSTEMS
AUTOMATED AWS-BASED STOCK PRICE ETL PIPELINE
Implemented a scalable, event-driven stock price data ingestion pipeline using Amazon SQS, AWS Lambda, and CloudWatch for scheduled triggering and reliable processing.
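The consumer side of such a pipeline can be sketched as a Lambda handler draining an SQS batch; the message shape (`symbol`/`price` fields) and handler name here are illustrative assumptions, not the deployed code:

```python
import json

def handler(event, context):
    """AWS Lambda entry point: parse stock-price messages from an SQS batch.

    `event["Records"]` is the standard SQS-to-Lambda batch payload; each
    record's `body` is assumed to be a JSON object with `symbol` and `price`.
    """
    prices = {}
    for record in event.get("Records", []):
        msg = json.loads(record["body"])
        prices[msg["symbol"]] = float(msg["price"])
    # Downstream, these rows would be written to storage (e.g. S3 or Snowflake).
    return {"processed": len(prices), "prices": prices}

# Local smoke test with a fake SQS event
if __name__ == "__main__":
    event = {"Records": [{"body": json.dumps({"symbol": "AAPL", "price": 191.2})}]}
    print(handler(event, None))
```

In the real pipeline, CloudWatch (EventBridge) schedules a producer that enqueues quotes, and SQS gives the Lambda retry and dead-letter semantics for free.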
DOCKER & SQL CONTAINERIZATION
Containerized a full data ingestion workflow with Docker and PostgreSQL, using Docker Compose to orchestrate communication between the ingestion script, database, and pgAdmin services. Executed SQL queries inside the containerized PostgreSQL instance for efficient data loading, joins, and aggregations directly within the Docker environment.
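The kind of join-and-aggregate SQL run inside the container can be sketched as follows; `sqlite3` stands in for the containerized PostgreSQL so the snippet is self-contained, and the table and column names are illustrative, not the project's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the Dockerized PostgreSQL
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE trips (id INTEGER, zone_id INTEGER, fare REAL);
    CREATE TABLE zones (zone_id INTEGER, name TEXT);
    INSERT INTO trips VALUES (1, 10, 12.5), (2, 10, 7.5), (3, 20, 30.0);
    INSERT INTO zones VALUES (10, 'Downtown'), (20, 'Airport');
""")

# Join ingested rows to a lookup table and aggregate per zone.
rows = cur.execute("""
    SELECT z.name, COUNT(*) AS trips, SUM(t.fare) AS revenue
    FROM trips t JOIN zones z ON z.zone_id = t.zone_id
    GROUP BY z.name ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Airport', 1, 30.0), ('Downtown', 2, 20.0)]
```

With Compose, the same query would run against the `postgres` service over the Compose network, with pgAdmin attached for inspection.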
SCD DATA WAREHOUSING WITH SNOWFLAKE
Showcases SCD Type 1 (overwrite) and Type 2 (history tracking) implementations in Snowflake: Streams and Tasks for CDC automation, Snowpipe for loading from S3, MERGE statements, Python/Faker data generation, Docker Compose setups, and NiFi orchestration, complete with SQL scripts and step-by-step docs for building real-time, scalable data pipelines.
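The Type 2 pattern can be sketched as a templated Snowflake MERGE built in Python; the table, stream, and column names below are hypothetical placeholders, not the repo's actual schema:

```python
def scd2_merge_sql(target: str, stream: str, key: str, cols: list[str]) -> str:
    """Build an SCD Type 2 MERGE: close out changed rows, insert unseen keys."""
    change = " OR ".join(f"t.{c} <> s.{c}" for c in cols)
    col_list = ", ".join(cols)
    src_cols = ", ".join(f"s.{c}" for c in cols)
    return f"""
MERGE INTO {target} t
USING {stream} s
  ON t.{key} = s.{key} AND t.is_current = TRUE
WHEN MATCHED AND ({change}) THEN
  UPDATE SET t.is_current = FALSE, t.end_date = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
  INSERT ({key}, {col_list}, start_date, end_date, is_current)
  VALUES (s.{key}, {src_cols}, CURRENT_TIMESTAMP(), NULL, TRUE);
""".strip()

sql = scd2_merge_sql("dim_customer", "customer_stream", "customer_id",
                     ["name", "city"])
print(sql)
```

A full Type 2 flow also re-inserts the new version of each changed row, typically via a second INSERT or a unioned MERGE source; this sketch shows only the close-out and new-key steps. In the project, a Task would run this MERGE on a schedule against the Stream's change set.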
AGENTIC TODO APPLICATION
Built a smart, AI-powered Todo app with a FastAPI backend, Python with the OpenAI Agents SDK for autonomous task planning and execution, and a Next.js frontend, enabling natural-language task creation, intelligent prioritization, reminders, and agent-driven automation of to-dos.
ETL WITH SNOWPIPE
Designed and implemented a robust ETL pipeline using Python for data extraction/transformation, Snowpipe for automated loading from cloud storage (S3), and Snowflake for scalable, high-performance data warehousing with Streams + Tasks for incremental processing and CDC.
04 // CERTIFICATIONS
VERIFIED CREDENTIALS
ASSOCIATE SQL DEVELOPER
DATACAMP
Feb 2026
Hands-On Essentials: Data Warehousing Workshop
SNOWFLAKE
Dec 2025
Hands-On Essentials: Collaboration, Marketplace & Cost Estimation Workshop
SNOWFLAKE
Jan 2026
Hands-On Essentials: Data Application Builders Workshop
SNOWFLAKE
Jan 2025
Hands-On Essentials: Data Lake Workshop
SNOWFLAKE
Jun 2026
Hands-On Essentials: Data Engineering Workshop
SNOWFLAKE
Jan 2026
05 // EXPERIENCE