Job Requirements


Experience Required: 4 to 11 years

Location: Bangalore/Hyderabad/Noida (2 days work from office)

Must-have skills: Databricks, PySpark, AWS

Joining time: 20 days

Hiring mode: Full-time

About the Role:

We are looking for a highly skilled Data Engineer with strong expertise in AWS, Databricks, PySpark, and Airflow to join our growing Data Engineering team. The ideal candidate will be responsible for designing, building, and optimizing scalable data pipelines and solutions that enable advanced analytics, machine learning, and business intelligence across the organization.

Key Responsibilities:

Design, develop, and maintain scalable ETL/ELT pipelines using Databricks, PySpark, and Airflow.

Build and optimize data models and data lakes/warehouses on AWS.

Implement best practices for data quality, data governance, and performance optimization.

Collaborate with cross-functional teams (data scientists, analysts, product, and business teams) to deliver data-driven solutions.

Ensure reliability, scalability, and efficiency of data workflows through automation and monitoring.

Troubleshoot complex data engineering issues and optimize processing performance.

Required Skills & Qualifications

Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field.

4+ years of hands-on experience in Data Engineering.

Strong expertise in PySpark, Databricks, and Airflow for large-scale data processing and orchestration.

Solid experience with AWS services such as S3, Glue, Redshift, EMR, Lambda, and IAM.

Strong knowledge of SQL and performance tuning.

Experience with CI/CD pipelines, Git, and containerization (Docker/Kubernetes) is a plus.

Strong problem-solving and communication skills, and the ability to work in a fast-paced environment.

Job Type: Full-time

Pay: ₹2,200,000.00 - ₹3,500,000.00 per year

Work Location: In person
