Qureos


AWS Data Engineer 6+ years


Urgent opening for AWS Data Engineers (Remote)

Experience – 6+ years

Work timings – 1:00 p.m. to 10:00 p.m. (Mon-Fri)

Contract duration – 3 months (can be extended)

Mandatory: AWS Data Engineering; AWS services (AWS Glue, S3, Redshift, EMR, Lambda, Step Functions, Kinesis, Athena, and IAM); Python, PySpark, and Apache Spark; data modelling; on-prem/cloud data warehousing; DevOps

Job Description

We are seeking an experienced AWS Data Engineer with 6+ years of experience and a strong understanding of large, complex, and multi-dimensional datasets. The ideal candidate will design, develop, and maintain scalable data pipelines and transformation frameworks using AWS native tools and modern data engineering technologies.

The role requires hands-on experience in AWS Data Engineering services and strong data modelling expertise. Exposure to Veeva API integration will be a plus (not mandatory).

Responsibilities:

· Design, develop, and optimize data ingestion, transformation, and storage pipelines on AWS.

· Manage and process large-scale structured, semi-structured, and unstructured datasets efficiently.

· Build and maintain ETL/ELT workflows using AWS native tools such as Glue, Lambda, EMR, and Step Functions.

· Design and implement scalable data architectures leveraging Python, PySpark, and Apache Spark.

· Develop and maintain data models and ensure alignment with business and analytical requirements.

· Work closely with stakeholders, data scientists, and business analysts to ensure data availability, reliability, and quality.

· Handle on-premises and cloud data warehouse databases and optimize performance.

· Stay updated with emerging trends and technologies in data engineering, analytics, and cloud computing.
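The pipeline responsibilities above would, in practice, be implemented with PySpark on Glue or EMR; as a library-free sketch of the same extract-transform-load shape (all function and field names here are illustrative, not from the posting):

```python
# Minimal ETL sketch in plain Python: extract raw records, transform
# (cast types, drop malformed rows), and load into a target store. In a
# real AWS pipeline these three stages would run as a Glue/PySpark job
# reading from S3 and writing to Redshift or a partitioned S3 table.

def extract(raw_rows):
    """Parse raw CSV-like strings into dicts (the 'extract' stage)."""
    return [dict(zip(("id", "amount"), row.split(","))) for row in raw_rows]

def transform(records):
    """Cast types and drop malformed rows (the 'transform' stage)."""
    out = []
    for r in records:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # a real job would route these to a dead-letter location
    return out

def load(records, target):
    """Append cleaned records to the target store (the 'load' stage)."""
    target.extend(records)
    return len(records)

warehouse = []  # stands in for Redshift or a partitioned S3 table
raw = ["1,10.5", "2,oops", "3,7.25"]
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2 valid rows loaded; the malformed row is dropped
```

This is only the skeleton of the pattern; the role itself calls for the distributed equivalents (Spark DataFrames, Glue jobs, Step Functions orchestration).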

Requirements:

Mandatory: Proven hands-on experience with AWS Data Engineering stack, including but not limited to:

· AWS Glue, S3, Redshift, EMR, Lambda, Step Functions, Kinesis, Athena, and IAM.

· Proficiency in Python, PySpark, and Apache Spark for data transformation and processing.

· Strong understanding of data modelling principles and ability to design and maintain conceptual, logical, and physical data models.

· Experience working with one or more modern data platforms: Snowflake, Dataiku, or Alteryx (good to have, not mandatory).

· Familiarity with on-prem/cloud data warehouse systems and migration strategies.

· Solid understanding of ETL design patterns, data governance, and best practices in data quality and security.

· Knowledge of DevOps for Data Engineering – CI/CD pipelines, Infrastructure as Code (IaC) using Terraform/CloudFormation (good to have, not mandatory).
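Several of the AWS services listed above (S3, Glue, Athena) conventionally share data through Hive-style partitioned S3 prefixes, which let query engines prune partitions instead of scanning whole tables. A small helper sketch (the bucket and table names are made up for illustration):

```python
# Sketch of Hive-style partition prefixes as used by S3 data lakes
# catalogued in Glue and queried via Athena: key=value path segments
# encode the partition columns directly in the S3 prefix.

def partition_prefix(bucket, table, **partitions):
    """Build an s3:// prefix like s3://bucket/table/year=2025/month=01/."""
    segments = "/".join(f"{k}={v}" for k, v in partitions.items())
    return f"s3://{bucket}/{table}/{segments}/"

# Hypothetical bucket/table names, purely for illustration:
path = partition_prefix("analytics-lake", "orders", year="2025", month="01")
print(path)  # s3://analytics-lake/orders/year=2025/month=01/
```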

Job Types: Full-time, Permanent

Pay: ₹1,000,000.00 - ₹1,300,000.00 per year

Work Location: Remote


© 2025 Qureos. All rights reserved.