AWS Senior Data Engineer, 6 to 7 Years (Chennai)

India

Job Description:

We are seeking an experienced AWS Data Engineer with a strong background in data processing, cloud technologies, and distributed computing. In this role, you will design, develop, and maintain scalable data pipelines, integrate various data sources, and enable efficient data processing workflows using Python, PySpark, SQL, and AWS cloud services. The ideal candidate will also have exposure to Terraform and familiarity with Snowflake and DBT.

Responsibilities:

Data Pipeline Development: Design, implement, and maintain scalable ETL pipelines using AWS Glue, PySpark, and Python to ingest, process, and transform data from multiple sources.

AWS Cloud Solutions: Utilize AWS services such as Lambda, S3, Athena, Glue, DynamoDB, and Step Functions to build and manage data workflows, ensuring robust and efficient data processing.

Data Transformation and Storage: Develop solutions for data transformation and ensure proper storage and retrieval using AWS S3, DynamoDB, and other related services.

Terraform Integration: Leverage Terraform to manage and provision AWS infrastructure as code, ensuring a streamlined deployment process.

Data Modeling and SQL Development: Write, optimize, and maintain complex SQL queries for data extraction, transformation, and loading (ETL) processes. Ensure data quality, integrity, and accuracy across transformations.

Collaboration and Communication: Work closely with data engineers, architects, analysts, and other stakeholders to understand data requirements and translate them into scalable data solutions.

Performance Optimization: Analyze and optimize PySpark and SQL scripts for better performance, ensuring smooth and efficient data processing at scale.

Documentation: Create and maintain detailed documentation of data engineering workflows, codebase, and infrastructure configurations.

Troubleshooting: Diagnose and resolve data-related issues, bottlenecks, and performance challenges in a timely manner.

Continuous Improvement: Stay current with emerging data technologies, tools, and best practices to continuously enhance data solutions and processes.

Required Skills and Qualifications:

Proficiency in Python and PySpark: Strong experience writing robust Python code and using PySpark for distributed processing and transformation of large datasets.

AWS Data Engineering Expertise: Hands-on experience with AWS Glue, Lambda, S3, Athena, DynamoDB, and Step Functions to build and manage data workflows.

SQL Mastery: Deep knowledge of SQL for complex query writing, database design, and performance optimization.

Terraform Experience: Hands-on familiarity with Terraform for managing AWS infrastructure as code.

Cloud Data Architecture: Experience designing and managing data solutions on AWS cloud platforms.

Problem-Solving Skills: Strong troubleshooting skills for data pipeline issues, optimization challenges, and system bottlenecks.

Preferred Qualifications:

Experience with Snowflake: Familiarity with Snowflake for data warehousing and integration with AWS.

DBT Knowledge: Experience with DBT for data transformations, testing, and modeling is a strong plus.

Collaboration and Communication: Strong interpersonal and collaboration skills for working in cross-functional teams.

Version Control: Proficiency with version control systems like Git for managing code changes, reviews, and collaboration.

Job Type: Contractual / Temporary

Pay: ₹1,600,000 - ₹1,800,000 per year

Work Location: In person
