Qureos


Data Engineer (Snowflake+DBT+Airflow)

India

Position: Data Engineer (Snowflake + DBT + Airflow)
Location: Pune, Ahmedabad
Experience: 5 years
Working Mode: Hybrid
Skills: Snowflake, Apache Airflow, Terraform, DBT, Git, SQL, Spark, Python, Data Warehousing, CI/CD Pipelines

Key Responsibilities:

  • Design, implement, and optimize data pipelines and workflows using Apache Airflow
  • Develop incremental and full-load strategies with monitoring, retries, and logging
  • Build scalable data models and transformations in dbt, ensuring modularity, documentation, and test coverage
  • Develop and maintain data warehouses in Snowflake
  • Ensure data quality, integrity, and reliability through validation frameworks and automated testing
  • Tune performance through clustering keys, warehouse scaling, materialized views, and query optimization
  • Monitor job performance and resolve data pipeline issues proactively
  • Build and maintain data quality frameworks (null checks, type checks, threshold alerts)
  • Partner with data analysts, scientists, and business stakeholders to translate reporting and analytics requirements into technical specifications
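The data quality frameworks mentioned above (null checks, type checks, threshold alerts) can be sketched in plain Python. This is a minimal illustration of the idea, not any particular library's API; the function names, the sample rows, and the 10% null threshold are all illustrative assumptions.

```python
# Sketch of simple data-quality checks on a batch of rows.
# All names and thresholds here are illustrative, not a real framework.

def null_check(rows, column):
    """Return the fraction of rows where `column` is None."""
    if not rows:
        return 0.0
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls / len(rows)

def type_check(rows, column, expected_type):
    """Return True if every non-null value in `column` has `expected_type`."""
    return all(
        isinstance(r[column], expected_type)
        for r in rows
        if r.get(column) is not None
    )

def threshold_alert(metric, limit):
    """Return an alert message when `metric` exceeds `limit`, else None."""
    if metric > limit:
        return f"ALERT: metric {metric:.2%} exceeds limit {limit:.2%}"
    return None

# Hypothetical sample batch: one of three rows has a null amount.
rows = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
    {"order_id": 3, "amount": 7.5},
]

null_rate = null_check(rows, "amount")
types_ok = type_check(rows, "amount", float)
alert = threshold_alert(null_rate, limit=0.10)  # fires: ~33% > 10%
```

In practice these checks would run as a task inside the Airflow pipeline (or as dbt tests), with the alert routed to monitoring rather than returned as a string.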

Qualifications:
  • Snowflake (data modeling, performance tuning, access control, external tables, streams & tasks)
  • Apache Airflow (DAG design, task dependencies, dynamic tasks, error handling)
  • dbt (Data Build Tool) (modular SQL development, Jinja templating, testing, documentation)
  • Proficiency in SQL, Spark, and Python
  • Experience building data pipelines on cloud platforms like AWS, GCP, or Azure
  • Strong knowledge of data warehousing concepts and ELT best practices
  • Familiarity with version control systems (e.g., Git) and CI/CD practices
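The incremental vs. full-load strategies listed under Key Responsibilities come down to tracking a watermark and loading only newer rows. Below is a minimal sketch using an in-memory sqlite3 table as a stand-in for a warehouse; the table name, columns, and timestamps are illustrative assumptions.

```python
# Sketch of full-load vs. watermark-based incremental load.
# sqlite3 stands in for the warehouse; names are illustrative.
import sqlite3

def full_load(conn, rows):
    """Replace the target table with the full source extract."""
    conn.execute("DROP TABLE IF EXISTS orders")
    conn.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

def incremental_load(conn, rows, watermark):
    """Insert only rows newer than the last processed watermark."""
    new_rows = [r for r in rows if r[1] > watermark]
    conn.executemany("INSERT INTO orders VALUES (?, ?)", new_rows)
    # Advance the watermark to the newest timestamp we loaded.
    return max((r[1] for r in new_rows), default=watermark)

conn = sqlite3.connect(":memory:")
full_load(conn, [(1, "2025-01-01"), (2, "2025-01-02")])

# Next extract overlaps with what was already loaded; only id=3 is new.
wm = incremental_load(
    conn,
    [(2, "2025-01-02"), (3, "2025-01-03")],
    watermark="2025-01-02",
)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

A production version would persist the watermark between runs (e.g. in a metadata table) and wrap each load in the retries, logging, and monitoring an orchestrator like Airflow provides.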

© 2025 Qureos. All rights reserved.