Responsibilities:
- Design, build, and maintain data pipelines (ETL/ELT)
- Ingest, transform, and integrate data from various sources
- Optimize data storage (data lakes, data warehouses)
- Ensure data quality, consistency, and governance
- Collaborate with analytics and data science teams on datasets
- Monitor data infrastructure with logging and alerting

Requirements:
- 6+ years in data engineering or related roles
- Proficiency in SQL and in Python, Scala, or Java
- Experience with ETL/ELT tools (Airflow, Spark, NiFi, etc.)
- Familiarity with cloud data platforms (AWS, GCP, Azure)
- Big data technologies (Hadoop, Kafka, Spark) a plus
- Data modeling, partitioning, and performance tuning