Freelancing Opportunity
Job Title: Python Data Engineer
Experience: 5+ Years
Job Summary:
We are looking for an experienced Python Data Engineer with deep expertise in Azure Data Factory (ADF), ETL processes, and Azure Functions. The ideal candidate will correct and enhance ADF pipelines and integrate Azure Functions effectively within ADF workflows to deliver optimized, scalable, and reliable data solutions.
Key Responsibilities:
- Design, build, and enhance ETL workflows in Azure Data Factory (ADF) for seamless data ingestion, transformation, and loading.
- Develop and maintain Python scripts for complex data transformations, data quality checks, and automation tasks.
- Integrate and optimize Azure Functions within ADF pipelines for advanced data processing and orchestration.
- Identify and resolve ADF pipeline issues, performance bottlenecks, and deployment challenges.
- Troubleshoot and fix storage-utilization and performance issues in Azure Functions and data pipelines.
- Work with Azure Storage, Data Lake, Key Vault, Synapse Analytics, and other Azure services as part of end-to-end data workflows.
- Ensure high performance, reliability, and reusability of all ADF components and Python-based utilities.
- Collaborate with architects, analysts, and DevOps teams for version control, CI/CD, and production support.
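The responsibilities above include Python scripts for data quality checks. As a purely hypothetical illustration of the kind of utility this role involves (the function names and validation rules below are assumptions, not part of any existing codebase), a minimal batch quality check might look like:

```python
# Hypothetical sketch of a row-level data quality check.
# Field names ("id", "amount") and rules are illustrative assumptions.

def check_row(row: dict) -> list:
    """Return a list of quality issues found in a single record."""
    issues = []
    if not row.get("id"):
        issues.append("missing id")
    amount = row.get("amount")
    if amount is not None and amount < 0:
        issues.append("negative amount")
    return issues


def check_batch(rows: list) -> dict:
    """Summarize quality issues across a batch of records."""
    report = {"total": len(rows), "bad": 0, "issues": []}
    for index, row in enumerate(rows):
        problems = check_row(row)
        if problems:
            report["bad"] += 1
            report["issues"].append((index, problems))
    return report
```

In an ADF context, a script like this might run inside an Azure Function invoked by a pipeline activity, with the pipeline failing or branching when the reported bad-record count exceeds a threshold.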
Required Skills & Experience:
- 5+ years of hands-on experience in Data Engineering or ETL development.
- Strong expertise in Azure Data Factory (ADF) — building, debugging, and enhancing data pipelines.
- Proven proficiency in Python programming, including complex data manipulation, exception handling, and performance tuning.
- Experience integrating and invoking Azure Functions within ADF pipelines.
- Solid understanding of ETL concepts, data flow orchestration, and dependency management.
- Working knowledge of Azure services — Blob Storage, Data Lake, Key Vault, and Synapse Analytics.
- Good SQL skills for querying, validation, and performance optimization.
- Familiarity with Azure DevOps / Git for version control and CI/CD.
Preferred Qualifications:
- Experience with Databricks, PySpark, or ADF Mapping Data Flows.
- Understanding of event-driven data architectures and serverless design patterns.
- Excellent analytical, debugging, and problem-solving skills.
- Strong communication and teamwork abilities in a collaborative environment.
Job Types: Part-time, Freelance
Contract length: 12 months
Pay: ₹30,000.00 - ₹40,000.00 per month
Experience:
- total work: 6 years (Required)
Shift availability:
- Night Shift (Required)
- Overnight Shift (Required)
Work Location: Remote