Qureos


Data Engineer (PySpark | Databricks)

Job Requirements

Hires in: Not specified
Employment Type: Not specified
Company Location: Not specified
Salary: Not specified

We are looking for a highly skilled Data Engineer with expertise in PySpark and Databricks to design, build, and optimize scalable data pipelines for processing massive datasets.

IMMEDIATE JOINERS PREFERRED

Responsibilities

- Build & Optimize Pipelines: Develop high-throughput ETL workflows using PySpark on Databricks (see the sketch after this list).

- Data Architecture & Engineering: Work on distributed computing solutions, optimize Spark jobs, and build efficient data models.

- Performance & Cost Optimization: Fine-tune Spark configurations, optimize Databricks clusters, and reduce compute/storage costs.

- Collaboration: Work closely with Data Scientists, Analysts, and DevOps teams to ensure data reliability.

- ETL & Data Warehousing: Implement scalable ETL processes for structured & unstructured data.

- Monitoring & Automation: Implement logging, monitoring, and alerting mechanisms for data pipeline health and fault tolerance.
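
For illustration only, here is a minimal PySpark ETL sketch of the kind of pipeline described above. The source path, column names, and target table are hypothetical placeholders, not details from this posting:

```python
# Minimal PySpark ETL sketch (hypothetical paths, columns, and table names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
spark = SparkSession.builder.appName("events-etl").getOrCreate()

# Extract: read raw JSON events from cloud storage (hypothetical bucket/path).
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: drop malformed rows, normalize timestamps, derive a date column.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Load: write to a Delta table (assumes an `analytics` schema exists),
# partitioned by date so downstream queries can prune partitions.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("event_date")
      .saveAsTable("analytics.events_clean"))
```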

Qualifications

- Bachelor's/Master's in Computer Science, Data Engineering, or a related field.

- 3+ years of experience in Big Data Engineering with PySpark & Databricks.

- Strong understanding of distributed computing, partitioning, and Spark internals (RDDs, DAGs, Catalyst Optimizer); a brief illustration follows this list.

- Expertise in SQL & NoSQL databases for structured & semi-structured data.

- Proficiency in Python and Linux-based environments.

- Experience with Kafka (or other streaming platforms) and healthcare data (optional).

- Familiarity with AWS/GCP/Azure and storage solutions (S3, Delta Lake, HDFS).
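
As a small, hedged illustration of the partitioning and Catalyst points above (the data and partition count are synthetic, chosen only for the demo):

```python
# Sketch: inspecting partitioning and the Catalyst-optimized plan on synthetic data.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("internals-demo").getOrCreate()

# One million rows with a synthetic bucketing key.
df = spark.range(1_000_000).withColumn("bucket", F.col("id") % 16)

# Repartition by the grouping key so the shuffle lines up with downstream work.
by_bucket = df.repartition(16, "bucket")
print(by_bucket.rdd.getNumPartitions())  # -> 16

# Catalyst pushes the filter below the aggregate where it can;
# explain(True) prints the parsed, analyzed, optimized, and physical plans.
agg = by_bucket.filter(F.col("id") > 10).groupBy("bucket").count()
agg.explain(True)
```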

Tech Stack

- Python | PySpark | Databricks

- SQL | Data Warehousing | Delta Lake

- Linux | Shell Scripting

- Kafka (optional; a streaming ingestion sketch follows this list)

- Cloud Platforms (AWS/GCP/Azure)
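
To show how Kafka and Delta Lake fit together in this stack, here is a hedged Structured Streaming sketch; the broker address, topic, and paths are hypothetical, and the Kafka connector is assumed available (it ships with the Databricks runtime):

```python
# Sketch: streaming ingestion from Kafka into a Delta table (hypothetical endpoints).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

# Subscribe to a hypothetical `events` topic on a hypothetical broker.
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load())

# Kafka values arrive as bytes; cast the payload to a string for parsing downstream.
parsed = stream.select(F.col("value").cast("string").alias("payload"),
                       F.col("timestamp"))

# Append to a Delta path; the checkpoint location makes the stream fault tolerant.
(parsed.writeStream.format("delta")
       .option("checkpointLocation", "/tmp/checkpoints/events")
       .outputMode("append")
       .start("/tmp/delta/events"))
```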

Join us and build the future of data engineering!

Job Types: Full-time, Permanent, Contractual / Temporary

Work Location: In person
