Job Code: ITMP1411-01
Designation: Software Engineer (Data Engineer)
Location: Ghansoli, Mumbai
Nature: Work from office only
Budget: ₹17 LPA CTC (including 7% variable)
Experience: 3 to 5 Years
Job Description
We are seeking an experienced Data Engineer with 3+ years of hands-on expertise in building robust data pipelines, working with big data technologies, and managing cloud-based data infrastructure. The ideal candidate will play a key role in designing, developing, and optimizing data workflows to ensure reliable and scalable data operations across the organization.
Key Responsibilities
· Build and maintain scalable ETL/ELT data pipelines to support analytics, reporting, and machine learning workloads.
· Write clean, efficient, and maintainable Python or Scala code for data ingestion, transformation, and processing tasks.
· Use SQL extensively to query, manipulate, and optimize large datasets.
· Work with big data technologies such as Apache Spark, Airflow, Kafka, or similar tools for distributed processing and workflow orchestration.
· Design, implement, and manage data infrastructure on cloud platforms including AWS, Google Cloud Platform (GCP), or Microsoft Azure.
· Collaborate with data scientists, analysts, and cross-functional teams to ensure smooth data availability and quality.
· Monitor pipeline performance, troubleshoot issues, and optimize workflows for efficiency and reliability.
· Ensure best practices in data security, quality, and governance across all data engineering processes.
Required Qualifications
· 3+ years of experience as a Data Engineer or in a similar data-focused engineering role.
· Strong programming skills in Python or Scala.
· Excellent proficiency in SQL with experience working on large-scale datasets.
· Hands-on experience with Spark, Airflow, Kafka, or equivalent big data and orchestration tools.
· Practical exposure to cloud services such as AWS, GCP, or Azure.
· Solid understanding of ETL/ELT principles, data warehousing concepts, and distributed computing.
· Strong problem-solving abilities and attention to detail.
· Effective communication and collaboration skills to work in a team-oriented environment.
Job Type: Full-time
Pay: Up to ₹1,700,000.00 per year
Application Question(s):
Please provide responses to the screening questions.
How many years of experience do you have in each of the following skills:
- Programming skills in Python or Scala:
- SQL with experience working on large-scale datasets:
- Hands-on experience with Spark, Airflow, Kafka, or equivalent big data and orchestration tools:
- Practical exposure to cloud services such as AWS, GCP, or Azure:
- ETL/ELT principles, data warehousing concepts, and distributed computing:
Current Location:
Are you willing to work from the Ghansoli office (purely work-from-office)?:
Notice Period:
Total Experience in Years:
Nationality:
Mobile No.:
Email ID:
Upon offer issuance, how soon can you join?:
Do you have a Passport (mandatory)?:
Current CTC/monthly pay:
Expected CTC/monthly pay:
Work Location: In person