Job Description: Senior Python/DAG Developer
Job Title: Senior Python/DAG Developer
Location: Remote
Experience Level: 7+ years
Job Type: Contract
About the Role
We are seeking a highly experienced Senior Python/DAG Developer to join our data engineering team. In this role, you will design, develop, and maintain complex data pipelines and workflows. The ideal candidate has a deep understanding of data orchestration principles, extensive Python experience, and a proven track record of building robust, scalable Directed Acyclic Graphs (DAGs) with tools such as Apache Airflow. You will be a key player in building the next generation of our data infrastructure, ensuring data is processed efficiently and reliably across the organization.
Key Responsibilities
- Design & Development: Architect, build, and maintain efficient and scalable data pipelines using Python and DAG-based orchestration tools (e.g., Apache Airflow, Dagster, Prefect); a minimal sketch follows this list.
- Orchestration: Develop, schedule, and monitor complex data workflows, ensuring timely and accurate data delivery for business intelligence, analytics, and machine learning initiatives.
- Optimization: Identify performance bottlenecks and refactor data pipelines to improve efficiency, reliability, and cost-effectiveness.
- Collaboration: Work closely with data scientists, analysts, and other engineers to understand data requirements and deliver solutions that meet business needs.
- Code Quality: Uphold and promote best practices in coding, including code reviews, documentation, and automated testing to ensure long-term maintainability.
- Troubleshooting: Diagnose and resolve issues within data pipelines and orchestration systems, responding to incidents and minimizing downtime.
- Mentorship: Act as a subject matter expert and mentor junior developers, sharing knowledge of best practices in Python and data engineering.
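For illustration, here is a minimal sketch of the kind of DAG-based pipeline this role involves, written with Apache Airflow's TaskFlow API. It assumes Airflow 2.4 or later; the DAG name, schedule, and task bodies are placeholders, not a prescribed implementation.

```python
from datetime import datetime

from airflow.decorators import dag, task


# Illustrative daily ETL pipeline; all task bodies are stubs.
@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[int]:
        # Pull raw records from a source system (stubbed here).
        return [1, 2, 3]

    @task
    def transform(records: list[int]) -> list[int]:
        # Apply a simple transformation; real pipelines would do more.
        return [r * 10 for r in records]

    @task
    def load(records: list[int]) -> None:
        # Write the transformed records to a target store (stubbed here).
        print(f"loading {len(records)} records")

    # Return values flow between tasks via XCom, defining the DAG edges.
    load(transform(extract()))


example_etl()
```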
Required Qualifications
- Experience: 7+ years in software development, with a strong focus on Python for data engineering/ETL.
- Python: Expert-level proficiency, with a track record of writing clean, production-ready code.
- DAGs & Orchestration: 3–5 years of experience with Apache Airflow or similar platforms; strong understanding of core Airflow concepts (operators, sensors, hooks, XComs; see the sketch after this list).
- Database Skills: Strong SQL and relational database expertise (PostgreSQL, MySQL); knowledge of NoSQL databases and data warehouses (Snowflake, BigQuery) is a plus.
- Cloud Platforms: Hands-on experience with AWS, GCP, or Azure and their data services (e.g., S3, Cloud Storage, EMR, Dataproc).
- Data Formats: Familiarity with Parquet, Avro, and JSON, and with common data transformation techniques.
- Version Control: Strong knowledge of Git and collaborative workflows.
- Problem-Solving: Excellent analytical and troubleshooting skills.
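To illustrate the Airflow concepts named above, here is a minimal sketch in the classic operator style. It assumes Airflow 2.4 or later and the default `fs_default` filesystem connection; the DAG id, file path, and pushed value are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.sensors.filesystem import FileSensor


def count_lines(ti):
    # Push a value to XCom so a downstream task can consume it.
    ti.xcom_push(key="line_count", value=42)


def report(ti):
    # Pull the value the upstream task pushed to XCom.
    count = ti.xcom_pull(task_ids="count_lines", key="line_count")
    print(f"line count: {count}")


with DAG(
    dag_id="sensor_xcom_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Sensor: block until the input file appears.
    wait_for_file = FileSensor(task_id="wait_for_file", filepath="/tmp/input.csv")
    # Operators: run the Python callables defined above.
    count = PythonOperator(task_id="count_lines", python_callable=count_lines)
    summarize = PythonOperator(task_id="report", python_callable=report)

    wait_for_file >> count >> summarize
```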
Preferred Qualifications
- Experience with streaming data technologies (Kafka, Spark Streaming, Flink).
- Knowledge of Docker and Kubernetes.
- Experience with CI/CD pipelines for data engineering.
- Familiarity with data governance and security.
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Pay: ₹55.82–₹67.23 per hour