Senior Data Engineer

Senior Data Engineer – GCP & Databricks

Contract Duration: 1 Year (extendable based on performance)
Experience: 6–9 Years
Location: Remote (India)

Key Skills Required

Must Have:

  • GCP (Google Cloud Platform) – Level 3 proficiency
  • Databricks – Level 3 proficiency
  • Python, PySpark
  • Oracle SQL

Good to Have:

  • BigQuery, Cloud Composer
  • Docker, Kubernetes, CI/CD pipelines
  • RESTful Services, API design & development
  • Configuration management tools (Ansible, Salt, etc.)

Job Description

We are looking for a Senior Data Engineer with 6–9 years of experience to design, build, and optimize data pipelines and platforms using Databricks and Google Cloud Platform services. The ideal candidate has strong experience in Python, PySpark, and SQL, and can deliver scalable, secure, and efficient data solutions in a cloud environment.

Responsibilities

  • Design and develop data pipelines using Databricks, GCP, and related technologies.
  • Work extensively with BigQuery, Cloud Composer, and PySpark for ETL and data processing.
  • Write optimized SQL queries (Oracle SQL preferred) for data transformation and analysis.
  • Ensure scalability, security, and reliability across distributed systems.
  • Collaborate with cross-functional teams including data scientists, analysts, and architects.
  • Implement CI/CD pipelines and containerization (Docker/Kubernetes).
  • Develop REST APIs and automate configuration management.
  • Stay updated on emerging technologies and apply them effectively.

Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.
  • 6–9 years of overall experience with a strong focus on data engineering and cloud technologies.
  • Proven expertise in Databricks and GCP (BigQuery, Composer).
  • Strong coding skills in Python and PySpark.
  • Hands-on experience with CI/CD, Docker, and Kubernetes preferred.
  • Excellent problem-solving and analytical skills.

Job Types: Full-time, Permanent

Pay: ₹1,501,672.41 - ₹2,845,896.21 per year

Benefits:

  • Health insurance
  • Provident Fund
  • Work from home

Application Question(s):

  • Do you have experience as a Data Engineer working with GCP & Databricks?
  • Please list your skills.

Experience:

  • Data engineering: 6 years (Required)

Work Location: Remote