
Experience: 8+ Years (Minimum 5 years of relevant experience in Databricks)

Location: Hyderabad

Work Mode: Onsite (5 Days a Week)

Notice Period: Immediate

Position Overview

We are seeking a highly skilled Data Engineer to design and implement scalable data platforms leveraging Databricks. The ideal candidate will have deep expertise in data architecture, pipeline development, and integration of diverse data sources including SQL Server, MongoDB, and InfluxDB. The role requires proficiency in both real-time and batch data processing, with a strong foundation in cloud data solutions (preferably Azure). This position offers the opportunity to work on advanced analytics, machine learning enablement, and enterprise-scale data solutions that drive business insights and innovation.

Key Responsibilities

- Design, build, and maintain a robust data platform on Databricks.
- Develop scalable ETL/ELT pipelines to ingest data from multiple sources (SQL Server, MongoDB, InfluxDB) into Databricks Delta Lake.
- Implement both real-time and batch data ingestion strategies using Kafka, Azure Event Hubs, or equivalent tools.
- Optimize data storage and processing for performance, scalability, and cost efficiency.
- Build and maintain data models supporting BI, analytics, and machine learning use cases.
- Collaborate closely with Data Scientists, Analysts, and Product Teams to define and deliver data requirements.
- Ensure data quality, security, and governance across all pipelines and data repositories.
- Conduct performance tuning, monitoring, and troubleshooting to ensure reliability of data workflows.

Required Skills & Qualifications

- Proven hands-on experience in Databricks, including Delta Lake, Spark, PySpark, and SQL.
- Strong understanding of data integration across heterogeneous systems: SQL Server, MongoDB, and InfluxDB.
- Expertise in ETL/ELT pipeline development and workflow orchestration using tools like Apache Airflow, Azure Data Factory, or similar.
- Proficiency in data modeling, data warehousing, and performance optimization techniques.
- Experience in real-time data streaming using Kafka, Azure Event Hubs, or related technologies.
- Advanced programming skills in Python and SQL.
- Working knowledge of Azure Cloud and its data services.
- Experience with Change Data Capture (CDC) techniques for incremental data processing.
- Excellent problem-solving, debugging, and analytical skills.

Job Type: Full-time

Pay: ₹1,500,000.00 - ₹2,500,000.00 per year

Work Location: In person
