Position - Data Engineer (Snowflake + DBT + Airflow)
Location - Pune, Ahmedabad
Experience - 5 years
Working Mode - Hybrid
Skills - Snowflake, Apache Airflow, Terraform, DBT, Git, SQL, Spark, Python, Data Warehousing, CI/CD Pipelines
- Design, implement, and optimize data pipelines and workflows using Apache Airflow
- Develop incremental and full-load strategies with monitoring, retries, and logging (see the Airflow sketch after this list)
- Build scalable data models and transformations in dbt, ensuring modularity, documentation, and testing
- Develop and maintain data warehouses in Snowflake
- Ensure data quality, integrity, and reliability through validation frameworks and automated checks
- Tune performance through clustering keys, warehouse scaling, materialized views, and query optimization
- Monitor job performance and resolve data pipeline issues proactively
- Build and maintain data quality frameworks (null checks, type checks, threshold alerts).
- Partner with data analysts, scientists, and business stakeholders to translate reporting and analytics requirements into technical specifications.
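A minimal sketch of the kind of pipeline described above, assuming Airflow 2.x: an incremental-load task followed by a null-check data quality task, with retries, logging, and an error-handling callback. The DAG name, threshold, and table logic are illustrative placeholders, not the employer's actual pipeline.

import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)

def notify_on_failure(context):
    # Error-handling hook; in practice this would alert via email, Slack, or paging.
    log.error("Task %s failed", context["task_instance"].task_id)

def incremental_load(**context):
    # Placeholder incremental load keyed on the logical date of the run.
    log.info("Loading records for %s", context["ds"])

def null_check(**context):
    # Placeholder data quality check: fail the task if nulls exceed a threshold.
    null_ratio = 0.0  # would be computed against the warehouse in a real pipeline
    if null_ratio > 0.01:
        raise ValueError(f"Null ratio {null_ratio:.2%} exceeds the 1% threshold")
    log.info("Null check passed")

with DAG(
    dag_id="orders_incremental",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",            # 'schedule_interval' on Airflow versions before 2.4
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,
    },
) as dag:
    load = PythonOperator(task_id="incremental_load", python_callable=incremental_load)
    check = PythonOperator(task_id="null_check", python_callable=null_check)
    load >> check  # task dependency: the quality check runs after the load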
Required Skills:
- Snowflake (data modeling, performance tuning, access control, external tables, streams & tasks); a short tuning sketch follows this list
- Airflow (DAG design, task dependencies, dynamic tasks, error handling)
- dbt (Data Build Tool) (modular SQL development, Jinja templating, testing, documentation)
- Experience building data pipelines on cloud platforms like AWS, GCP, or Azure
- Knowledge of data warehousing concepts and ELT best practices
- Experience with version control systems (e.g., Git) and CI/CD practices
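For the Snowflake performance-tuning expertise above, a minimal illustrative sketch using the snowflake-connector-python package to apply a clustering key, resize a warehouse, and create a materialized view (the same levers named in the responsibilities). Account, credential, warehouse, and table names are hypothetical placeholders.

import snowflake.connector

# Connect with placeholder credentials (hypothetical values, not a real account).
conn = snowflake.connector.connect(
    account="xy12345",
    user="etl_user",
    password="********",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)
cur = conn.cursor()
try:
    # Clustering key so Snowflake can prune micro-partitions on common filter columns.
    cur.execute("ALTER TABLE fact_orders CLUSTER BY (order_date, customer_id)")
    # Temporarily scale the warehouse up for a heavy backfill or transformation.
    cur.execute("ALTER WAREHOUSE TRANSFORM_WH SET WAREHOUSE_SIZE = 'LARGE'")
    # Materialized view to pre-aggregate a frequently queried daily rollup.
    cur.execute(
        "CREATE MATERIALIZED VIEW IF NOT EXISTS daily_revenue AS "
        "SELECT order_date, SUM(amount) AS revenue FROM fact_orders GROUP BY order_date"
    )
finally:
    cur.close()
    conn.close()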