Qureos

Data Engineering Lead

Job Requirements

  • Hires in: Not specified
  • Employment Type: Not specified
  • Company Location: Not specified
  • Salary: Not specified

Data Engineering Lead Description

We are seeking a talented Data Engineering Lead to design, develop, and govern our data architecture and pipelines, with an immediate start preferred. The ideal candidate is collaborative, organised, able to think outside the box, and ready to pursue new opportunities. Most importantly, this role is for someone who is passionate about making a difference through healthcare.

We have a good budget for this position.

Key Responsibilities

  • Lead, mentor, and manage a team of data engineers, fostering a culture of technical excellence and continuous improvement.
  • Act as a hands-on technical expert, designing and implementing scalable and robust data architectures for data warehousing and enterprise data platforms.
  • Communicate effectively with stakeholders, translating complex business needs into technical requirements and solutions.
  • Define and enforce data engineering best practices, including coding standards, documentation, and quality assurance.
  • Develop and optimize data pipelines and ETL/ELT/CDC (Change Data Capture) workflows using tools such as Fivetran and Cloud Composer.
  • Collaborate with data scientists, product managers, and business stakeholders to define data requirements and create logical and physical data models.
  • Manage and administer various database systems, including BigQuery, SAP HANA, and PostgreSQL.
  • Ensure data quality, integrity, and security across all data platforms and pipelines.
  • Work with our AI/ML teams to design data serving layers and feature stores that support Vertex AI workloads.
  • Design and develop reporting frameworks and data marts to support business intelligence needs.
  • Define and implement data governance, master data management, and data cataloging strategies.
  • Contribute to the full data lifecycle: requirements gathering, architecture, data modeling, development, testing, and deployment.
  • Troubleshoot and resolve data platform issues to ensure high availability and optimal performance.
  • Document technical designs, data lineage, and architecture for cross-functional reference.

Required Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, Data Science, or a related field.
  • Proven experience in a leadership or senior role, mentoring and guiding junior engineers.
  • Exceptional communication and interpersonal skills, with the ability to articulate complex technical concepts to diverse audiences.
  • Proficiency in one or more backend languages/frameworks, with a strong preference for Python or Go.
  • Experience with building RESTful APIs and designing microservices for data delivery.
  • Solid grasp of data modeling fundamentals, including Kimball and Inmon methodologies.
  • Proficiency in writing complex SQL queries and experience with SQL and NoSQL databases.
  • Familiarity with data warehousing concepts and best practices, including CDC.
  • Strong version-control habits (Git) and experience with CI/CD pipelines.
  • Excellent problem-solving and collaboration skills.
  • Passion for continuous learning and adapting to emerging data technologies.

Preferred Qualifications

  • Hands-on experience designing and deploying production-grade data warehouses.
  • Deep experience with Google Cloud Platform (GCP), including:
      • BigQuery for large-scale analytical workloads.
      • Cloud Composer for orchestrating complex data pipelines.
      • Vertex AI for AI/ML model serving and feature stores.
  • Experience with other cloud providers (AWS, Azure) and their data services.
  • Working knowledge of data governance frameworks, master data management, and data cataloging tools.
  • Experience with data ingestion tools like Fivetran.
  • Business-intelligence expertise in building dashboards and reports with Power BI or Tableau.
  • Familiarity with other data technologies such as SAP HANA.
  • Understanding of MLOps concepts and their application to data pipelines.
  • Contributions to open-source data projects or technical blogging/presentations.

Application Process

  • If your resume is shortlisted, you will be invited to take an online AI-based assessment.
  • Candidates who pass this test will move on to the next stage: an interview call with the end client.
  • Candidates who clear all rounds will receive an offer based on the initial discussion from the first call.
