
Data Engineer – Databricks (Azure) | Astronomer (Airflow)

Location: Remote

Employment Type: Long-term Contract (initial 6 months, extendable)

About the Role

We are seeking a highly skilled Data Engineer with strong expertise in the Azure data ecosystem and Astronomer (Airflow) to build and orchestrate large-scale data solutions. This role is primarily focused on Astronomer-based Airflow development, with day-to-day work centered on DAG design, development, scheduling, monitoring, and operationalization of pipelines running within the Azure ecosystem (ADLS, ADF, Databricks, Synapse, etc.).
You will work closely with stakeholders to implement reliable orchestration patterns, ensure pipeline stability, and enable scalable, production-grade data workflows.

Key Responsibilities

  • Develop and maintain Airflow DAGs in Astronomer for Azure-based data pipelines (dependencies, retries, SLAs, alerting, backfills); a minimal sketch follows this list.
  • Orchestrate data processing workloads across Azure services (Databricks, ADLS, ADF, Synapse, etc.).
  • Build and optimize ETL/ELT pipelines using Databricks (PySpark) and SQL or PL/SQL where required.
  • Implement robust monitoring and operational practices (logging, metrics, alerts, failure handling, runbooks).
  • Optimize pipeline performance and cost (efficient orchestration patterns, right-sizing compute, reducing reruns).
  • Ensure data quality, security, and compliance across all solutions.
  • Collaborate with cross-functional teams to gather requirements and deliver production-ready workflows.
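
As a concrete reference for the first responsibility above, here is a minimal DAG sketch. It assumes Airflow 2.x with the apache-airflow-providers-databricks package installed; the DAG name, connection ID, Databricks job ID, and alert callback are illustrative placeholders, not production configuration.

  # Minimal Astronomer/Airflow DAG sketch: retries, an SLA, a failure alert,
  # and a Databricks job trigger. All names and IDs are placeholders.
  from datetime import datetime, timedelta

  from airflow import DAG
  from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

  def notify_on_failure(context):
      # Placeholder alert hook: a production DAG would page Slack/PagerDuty here.
      print(f"Task {context['task_instance'].task_id} failed: {context.get('exception')}")

  default_args = {
      "owner": "data-engineering",
      "retries": 3,                            # retry transient failures automatically
      "retry_delay": timedelta(minutes=5),
      "sla": timedelta(hours=2),               # flag runs that exceed the expected window
      "on_failure_callback": notify_on_failure,
  }

  with DAG(
      dag_id="adls_to_delta_daily",            # hypothetical pipeline name
      start_date=datetime(2025, 1, 1),
      schedule="@daily",
      catchup=False,                           # enable selectively when backfilling
      default_args=default_args,
  ) as dag:
      # Trigger an existing Databricks (PySpark) job by its job ID.
      run_transform = DatabricksRunNowOperator(
          task_id="run_databricks_transform",
          databricks_conn_id="databricks_default",
          job_id=123,                          # placeholder Databricks job ID
      )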

Required Skills & Experience

  • 5+ years of experience in Data Engineering (or relevant field).
  • Strong hands-on experience with Apache Airflow is essential (DAG development, sensors/operators, scheduling, troubleshooting).
  • Experience working with Astronomer is preferred (deployments, Airflow best practices, CI/CD for DAGs).
  • Strong hands-on experience in PySpark (Python) for data processing.
  • Proficiency in SQL and/or PL/SQL for transformation and analysis.
  • Experience with Databricks (jobs, workflows, clusters, Delta Lake).
  • Proven experience in Azure and its data services (ADLS, ADF, Databricks, Synapse, etc.).
  • Understanding of lakehouse concepts, incremental processing, and performance tuning (a minimal sketch follows this list).
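
As an example of the lakehouse and incremental-processing pattern named above, here is a minimal PySpark sketch of a Delta Lake upsert. It assumes execution on a Databricks cluster where the spark session is predefined; the storage path, table name, and join key are hypothetical.

  # Minimal incremental-upsert sketch into Delta Lake. Assumes a Databricks
  # cluster where the `spark` session is predefined; names are placeholders.
  from delta.tables import DeltaTable
  from pyspark.sql import functions as F

  # New records landed in ADLS since the last run (path is a placeholder).
  updates = (
      spark.read.format("parquet")
      .load("abfss://raw@examplelake.dfs.core.windows.net/orders/")
      .filter(F.col("ingest_date") == F.current_date())
  )

  target = DeltaTable.forName(spark, "analytics.orders")

  # MERGE keeps the target consistent without full rewrites: matched rows are
  # updated in place, unmatched rows are inserted.
  (
      target.alias("t")
      .merge(updates.alias("s"), "t.order_id = s.order_id")
      .whenMatchedUpdateAll()
      .whenNotMatchedInsertAll()
      .execute()
  )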

Preferred Qualifications

  • Experience with Unity Catalog and Databricks governance patterns.
  • CI/CD experience for Airflow + data workloads (Git, Azure DevOps, GitHub Actions).
  • Monitoring/observability experience (Airflow logs/metrics, Azure Monitor, Log Analytics, etc.).
  • Strong communication and problem-solving skills.

