Qureos


Job Requirements

Hires in: Not specified
Employment Type: Not specified
Company Location: Not specified
Salary: Not specified

Role: AWS Data Engineer

Required Technical Skill Set: PySpark, Python, ETL/Data Warehouse, AWS Glue, Step Functions, EMR, Redshift, Amazon S3


Required Skills & Qualifications:


1. Expertise in PySpark and Python.
2. Strong proficiency in AWS services: Glue, Redshift, S3, Lambda, EMR, Kinesis.
3. Hands-on experience with ETL tools and data pipeline orchestration.
4. Proficiency in Python or Scala for data processing.
5. Knowledge of SQL and NoSQL databases.
6. Familiarity with data modeling and data warehousing concepts.
7. Experience with CI/CD pipelines.
8. Understanding of security best practices for data in AWS.
9. Good hands-on experience with Python, NumPy, and pandas.
10. Experience building ETL/Data Warehouse transformation processes.
11. Experience working with structured and unstructured data.
12. Experience developing Big Data and non-Big Data cloud-based enterprise solutions in PySpark, Spark SQL, and related frameworks/libraries.
13. Experience developing scalable, reusable, self-service frameworks for data ingestion and processing.


Location: Chennai
Job Function: IT Infrastructure Services
Role: Developer
Job Id: 381899
Desired Skills: AWS Admin | Python | Spark

Desired Candidate Profile

Qualifications: Bachelor of Engineering
