Job Description:

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of dbt (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.
Good-to-Have Skills:
- Experience with DataStage, Netezza, Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.
Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and dbt.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Provide technical leadership and mentorship to junior data engineers.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.
Qualifications:
- Bachelor’s degree in Computer Science, Data Engineering, or a related field.
- 8+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.
The base salary for this role ranges from $110,000 to $140,000 per year, depending on skills and experience.