We are looking for a highly skilled Senior Azure Data Engineer with extensive experience in building and managing modern data pipelines using the Azure ecosystem. The ideal candidate will have strong hands-on expertise in Azure Data Factory, Databricks, Synapse Analytics, and Python/PySpark.
Payroll: HTC Global
Location: Remote
Key Responsibilities:
- Design and implement end-to-end data pipelines on the Azure platform.
- Work with Azure Data Factory, Databricks, Synapse, and other Azure services.
- Write efficient Python and PySpark code for data transformation.
- Implement data ingestion from diverse sources including APIs, SFTP, and messaging platforms.
- Optimize performance of ETL/ELT jobs and manage medallion architecture (bronze/silver/gold layers).
- Design and implement data models and data warehouse architecture.
- Collaborate with BI teams to enable data access via Power BI and other tools.
- Mentor junior team members and promote best practices.
- Ensure security, reliability, and performance of data infrastructure.
- Participate in Agile/Scrum ceremonies and contribute to sprint planning and delivery.
Required Skills:
- 8+ years of total IT experience, with 5+ years in Azure data engineering.
- Expertise in ADF, Azure Databricks, Synapse, ADLS Gen2.
- 3+ years of hands-on experience with Python/PySpark.
- Solid SQL skills and data modeling expertise.
- Working knowledge of Power BI, APIs, and SFTP.
- Understanding of Agile delivery methodologies.
- Experience with CI/CD pipelines and source control (Git/Azure DevOps).
- Strong analytical and communication skills.
Preferred:
- DP-200 / DP-201 / DP-203 certification.
- Experience with Cosmos DB, Event Hub, or other Azure messaging services.
Job Types: Full-time, Permanent