Data Engineer
Type: Permanent / Contractual (C2H)
Experience: 8+ Years (Minimum 5 years of relevant experience in Databricks)
Location: Noida / Hyderabad
Work Mode: Onsite (5 Days a Week)
Notice Period: Immediate
Position Overview
We are seeking a highly skilled Data Engineer to design and implement scalable data platforms on Databricks. The ideal candidate will have deep expertise in data architecture, pipeline development, and integration of diverse data sources, including SQL Server, MongoDB, and InfluxDB. The role requires proficiency in both real-time and batch data processing, with a strong foundation in cloud data solutions (preferably Azure). This position offers the opportunity to work on advanced analytics, machine learning enablement, and enterprise-scale data solutions that drive business insights and innovation.
Key Responsibilities
Design, build, and maintain a robust data platform on Databricks.
Develop scalable ETL/ELT pipelines to ingest data from multiple sources (SQL Server, MongoDB, InfluxDB) into Databricks Delta Lake.
Implement both real-time and batch data ingestion strategies using Kafka, Azure Event Hubs, or equivalent tools.
Optimize data storage and processing for performance, scalability, and cost efficiency.
Build and maintain data models supporting BI, analytics, and machine learning use cases.
Collaborate closely with Data Scientists, Analysts, and Product Teams to define and deliver data requirements.
Ensure data quality, security, and governance across all pipelines and data repositories.
Conduct performance tuning, monitoring, and troubleshooting to ensure reliability of data workflows.
Required Skills & Qualifications
Proven hands-on experience in Databricks, including Delta Lake, Spark, PySpark, and SQL.
Strong understanding of data integration across heterogeneous systems: SQL Server, MongoDB, and InfluxDB.
Expertise in ETL/ELT pipeline development and workflow orchestration using tools like Apache Airflow, Azure Data Factory, or similar.
Proficiency in data modeling, data warehousing, and performance optimization techniques.
Experience in real-time data streaming using Kafka, Azure Event Hubs, or related technologies.
Advanced programming skills in Python and SQL.
Working knowledge of Azure Cloud and its data services.
Experience with Change Data Capture (CDC) techniques for incremental data processing.
Excellent problem-solving, debugging, and analytical skills.
Job Type: Full-time
Pay: ₹2,000,000.00 - ₹2,400,000.00 per year
Work Location: In person
© 2025 Qureos. All rights reserved.