Job Description: Senior Data Engineer (Snowflake Specialist)
Position Title: Senior Data Engineer
Location: Bangalore / Pune
Experience: 8–12 years
Employment Type: Full-time
Role Overview
We are seeking a highly skilled Senior Data Engineer with strong expertise in Snowflake and cloud-based data engineering. The role requires deep hands-on experience with Snowflake’s core features, including data warehousing, performance optimization, data modeling, Snowpipe, Streams, Tasks, Role-Based Access Control (RBAC), and cost optimization. The ideal candidate will also be proficient in Python, SQL, and Airflow, with exposure to AI/ML and GenAI capabilities. You will play a key role in designing and maintaining high-performance Snowflake environments and scalable data pipelines.
Responsibilities
- Design, develop, and optimize scalable data pipelines using Python, SQL, and Snowflake
- Build complex ELT/ETL workflows with strong Snowflake integration (Snowpipe, Streams, Tasks)
- Design and implement Snowflake-based data models including dimensional models, data marts, and lakehouse patterns
- Optimize Snowflake compute (warehouses), storage usage, and overall cost efficiency
- Implement advanced Snowflake features (clustering keys, micro-partitioning, query tuning)
- Manage Snowflake RBAC, data governance, and secure data sharing
- Orchestrate workflows using Apache Airflow for automated, reliable data processing (a minimal DAG sketch follows this list)
- Collaborate with Data Architects to design scalable Snowflake-driven data platforms on AWS/Azure/GCP
- Ensure strong data quality, governance, lineage, and compliance within Snowflake and pipelines
- Troubleshoot Snowflake performance issues and resolve data reliability problems
- Support AI/ML and GenAI-driven data requirements through high-quality, production-ready datasets
- Conduct PoCs and evaluate new Snowflake capabilities such as Snowflake Native Apps, Snowpark, and Snowflake Marketplace integrations
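To give a flavor of the orchestration work described above, here is a minimal sketch of an Airflow DAG that runs a daily Snowflake MERGE from a staging table. It is illustrative only: the connection id `snowflake_default`, the table and DAG names, and the schedule are assumptions, and it presumes Airflow 2.x with the `apache-airflow-providers-snowflake` package installed.

```python
# Minimal sketch: daily ELT step that merges staged rows into a
# reporting table in Snowflake. Table names, connection id, and
# schedule are illustrative assumptions, not part of this posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

MERGE_SQL = """
MERGE INTO analytics.orders AS tgt
USING staging.orders AS src
  ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET tgt.status = src.status
WHEN NOT MATCHED THEN INSERT (order_id, status)
  VALUES (src.order_id, src.status);
"""

with DAG(
    dag_id="snowflake_daily_elt",      # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    merge_orders = SnowflakeOperator(
        task_id="merge_orders",
        snowflake_conn_id="snowflake_default",  # assumed Airflow connection
        sql=MERGE_SQL,
    )
```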
Required Qualifications
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field
- 8–12 years of experience in data engineering, including at least 4 years focused on Snowflake
- Strong programming experience with Python
- Deep SQL expertise, including optimization for analytical workloads
- Hands-on experience with Snowflake (warehouse management, performance tuning, ELT pipelines, security/RBAC)
- Experience with Snowflake features: Snowpipe, Streams, Tasks, Clustering Keys, Secure Views, Zero-Copy Cloning (the Streams/Tasks change-capture pattern is sketched after this list)
- Working knowledge of data warehousing, lakehouse concepts, and ETL/ELT best practices
- Experience with Apache Airflow for workflow orchestration
- Strong cloud experience (AWS, Azure, or GCP) and understanding of related services (S3/ADLS/GCS)
- Strong analytical and problem-solving skills with ability to work in cross-functional teams
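To make the Streams/Tasks requirement concrete, the sketch below issues the standard Snowflake change-capture DDL through the snowflake-connector-python library: a Stream records changes on a raw table, and a scheduled Task applies them downstream. All object names, the warehouse, and the schedule are placeholders, and credentials are read from environment variables for brevity.

```python
# Sketch of the Snowflake Streams + Tasks change-capture pattern via
# snowflake-connector-python. Object names, warehouse, and schedule
# are illustrative placeholders.
import os

import snowflake.connector

DDL_STATEMENTS = [
    # Stream records inserts/updates/deletes on the raw table.
    "CREATE STREAM IF NOT EXISTS raw.orders_stream ON TABLE raw.orders",
    # Task wakes every 5 minutes and applies pending changes,
    # skipping runs when the stream is empty.
    """
    CREATE TASK IF NOT EXISTS raw.apply_orders
      WAREHOUSE = transform_wh
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
    AS
      INSERT INTO analytics.orders
      SELECT order_id, status FROM raw.orders_stream
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK raw.apply_orders RESUME",
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
try:
    cur = conn.cursor()
    for stmt in DDL_STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```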
Good to Have (Preferred Skills)
- Experience with dbt (data transformations and modeling)
- Exposure to Snowpark or Snowflake Native Apps
- Experience with CI/CD (Git, GitHub Actions, Azure DevOps, Jenkins, etc.)
- Knowledge of Terraform or infrastructure-as-code tools
- Familiarity with AI/ML or GenAI data pipelines
- Experience with data governance, cataloging, and metadata management
- Snowflake certification(s) highly preferred