FIND_THE_RIGHTJOB.
Pune, India
We are looking for an immediate joiner for the position below.
Data Engineer (ADF, SQL, Python)
Experience – 5 to 10 years
Working Hours – 4 AM to 1 PM IST
Remote
Interview Process:
1st Round – Virtual
2nd Round – Face to Face (Pune)
Primary Skills – Azure Data Factory (ADF), SQL, DBT, Python, Snowflake.
Primary Role
We are seeking a skilled Senior Azure DBT Developer to join our data engineering team. The ideal candidate will have solid experience developing ETL/ELT pipelines with DBT on Snowflake, strong SQL skills, and hands-on expertise working with Snowflake and DBT on the Azure cloud platform. This role involves designing, building, and maintaining scalable data transformation workflows and data models to support analytics and business intelligence.
Key Responsibilities:
• Design, develop, and maintain data transformation pipelines using DBT to build modular, reusable, and scalable data models on Snowflake.
• Develop and optimize SQL queries and procedures for data loading, transformation, and analysis in Snowflake.
• Load and manage data efficiently in Snowflake from various sources, ensuring data quality and integrity.
• Analyse and profile data using SQL to support business requirements and troubleshooting.
• Collaborate with data engineers, analysts, and business stakeholders to understand data needs and translate them into technical solutions.
• Implement best practices for DBT project structure, version control (Git), testing, and documentation.
• Work on Azure cloud platform, leveraging its integration capabilities with Snowflake.
• Participate in code reviews, unit testing, and deployment processes to ensure high-quality deliverables.
• Troubleshoot and optimize data pipelines for performance and cost-effectiveness.
Desired Skills
Qualification
• Bachelor’s degree in Science, Engineering, or a related discipline.
Work Experience
• 5 to 10 years of experience in data engineering or development roles, with at least 4 years of hands-on experience in SQL and 3 years with DBT.
• Experience developing ETL/ELT pipelines and working with data warehouse concepts.
• Strong proficiency in SQL, including complex query writing, data analysis, and performance tuning.
• Proven experience loading and transforming data in Snowflake.
• Hands-on experience working on Azure cloud platform and integrating Snowflake with Azure services.
• Familiarity with DBT Core features such as models, macros, tests, hooks, and modular project structure.
• Good understanding of data modeling concepts and dimensional modeling (star/snowflake schemas).
• Experience with version control systems like Git and CI/CD workflows is a plus.
• Strong analytical, problem-solving, and communication skills.
© 2025 Qureos. All rights reserved.