Senior AWS Data Lakehouse Engineer

India

Job Location: Remote
Job Experience: 10-20 Years
Model of Work: Remote
Technologies: AWS
Functional Area: Software Development

Job Summary:

Job Title: Senior AWS Data Lakehouse Specialist
Job Location: Madurai
Working Mode: Remote, with quarterly travel to Madurai for team meetings if required.
Experience: 10+ years, working as an individual contributor.
Working Hours: 2 PM to 11 PM IST
Notice Period: Immediate joiners only.
Interview Process: 2-3 rounds of interviews with Tech Mango, plus one round of client interview.

Requirement Summary:
The ideal candidate will be proficient in Python, PySpark, and AWS Glue, with a strong understanding of Data Lakehouse architecture, especially the medallion model. You will play a key role in designing, developing, and optimizing data pipelines, ensuring data quality, and implementing infrastructure as code using Terraform.
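
As a rough illustration of the medallion flow this summary describes, here is a minimal PySpark sketch of a bronze-to-silver hop. The bucket, paths, and column names are hypothetical placeholders, not details from this posting.

    # Minimal bronze -> silver medallion step; all names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

    # Bronze layer: raw events landed as-is, schema inferred on read.
    bronze = spark.read.json("s3://example-lake/bronze/events/")

    # Silver layer: deduplicated, validated, with a derived partition key.
    silver = (
        bronze
        .dropDuplicates(["event_id"])                     # drop replayed events
        .filter(F.col("event_ts").isNotNull())            # discard malformed rows
        .withColumn("event_date", F.to_date("event_ts"))  # partition column
    )

    # Partitioning by date lets downstream queries prune irrelevant files.
    (silver.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-lake/silver/events/"))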

Key Responsibilities:

Design and implement scalable ETL/ELT pipelines using AWS Glue and PySpark (see the sketch after this list)
Develop and maintain data workflows using AWS Glue DataBrew and AWS Data Quality services
Architect and manage Data Lakehouse solutions following the medallion architecture (Bronze, Silver, Gold layers)
Optimize data lake performance (Parquet formats, partitioning strategies, DPU tuning)
Implement S3 data encryption and security best practices
Automate infrastructure provisioning using Terraform (IaC)
Collaborate with data analysts, scientists, and business stakeholders to deliver clean, reliable data
Integrate and manage DBT workflows for data transformation and modeling (preferred)
Monitor, troubleshoot, and enhance data pipeline reliability and performance
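
To make the first responsibility concrete, here is a minimal skeleton of an AWS Glue PySpark job that reads through the Glue Data Catalog and writes partitioned Parquet. The database, table, column, and path names are hypothetical; worker count and DPU settings would live on the job definition, not in this script.

    # Hedged AWS Glue job skeleton; all resource names are placeholders.
    import sys
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read via the Glue Data Catalog so schemas are tracked centrally.
    dyf = glue_context.create_dynamic_frame.from_catalog(
        database="example_db", table_name="raw_orders"
    )

    # Drop to a Spark DataFrame for ordinary PySpark transforms.
    df = dyf.toDF().dropDuplicates(["order_id"])

    # Partitioned Parquet is the usual lever for scan pruning downstream.
    df.write.mode("append").partitionBy("order_date").parquet(
        "s3://example-lake/silver/orders/"
    )

    job.commit()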

Required Skills & Qualifications:
5+ years of relevant experience in data engineering with a focus on AWS
Strong proficiency in SQL, Python, and PySpark
Hands-on experience with AWS Glue ETL, Glue DataBrew, and AWS Data Quality
Proven expertise in building and managing Data Lakehouse architectures using medallion layering
Deep understanding of Parquet file formats, partitioning, and DPU configuration for performance tuning
Experience with S3 encryption and data security protocols (a sketch follows this list)
Solid grasp of Infrastructure as Code using Terraform
Familiarity with DBT for data transformation and modeling (preferred)
Strong problem-solving skills and ability to work independently and collaboratively
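
The posting expects this kind of configuration to be codified in Terraform; purely for illustration, and keeping to one language, here is the equivalent default-encryption setting applied with boto3. The bucket name and key alias are placeholders.

    # Enforce default server-side encryption on a lake bucket (boto3 sketch).
    # In practice the posting calls for this to be managed via Terraform.
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_encryption(
        Bucket="example-lake",
        ServerSideEncryptionConfiguration={
            "Rules": [
                {
                    "ApplyServerSideEncryptionByDefault": {
                        "SSEAlgorithm": "aws:kms",
                        "KMSMasterKeyID": "alias/example-lake-key",
                    },
                    "BucketKeyEnabled": True,  # cuts per-object KMS calls
                }
            ]
        },
    )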

Preferred Qualifications:
AWS certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect)
Experience with CI/CD pipelines and DevOps practices
Exposure to data governance and cataloging tools (e.g., AWS Glue Catalog, Lake Formation; see the sketch below)
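
For the cataloging point, a brief boto3 sketch of pulling table metadata from the Glue Data Catalog; the database and table names are hypothetical.

    # Inspect a Glue Data Catalog table (boto3 sketch; names are placeholders).
    import boto3

    glue = boto3.client("glue")
    table = glue.get_table(DatabaseName="example_db", Name="silver_orders")

    storage = table["Table"]["StorageDescriptor"]
    print(storage["Location"])                      # S3 path backing the table
    print([c["Name"] for c in storage["Columns"]])  # column names and order
    print([p["Name"] for p in table["Table"].get("PartitionKeys", [])])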
