Qureos

Senior Data Engineer (Onshore Lead)

We are seeking a highly skilled Senior Data Engineer (Onshore Lead) to drive the design, development, and delivery of scalable data pipelines within a modern cloud-based data platform. This role requires strong technical expertise, leadership capabilities, and the ability to collaborate closely with cross-functional teams to translate business requirements into robust engineering solutions.

Roles and Responsibilities

1. Data Engineering & Pipeline Development

Design and develop scalable data workflows using DBT with Spark SQL.

Orchestrate and deploy pipelines using Argo Workflows on Kubernetes.

Build and optimize batch and incremental data pipelines for large-scale datasets.

Ensure high data quality, reliability, and performance across pipelines.
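The batch-and-incremental pattern referenced above can be sketched in plain Python. This is a minimal illustration of watermark-based incremental loading, assuming a simple `updated_at` column; the table name, columns, and sqlite3 backing store are illustrative assumptions, and in this role the pattern would instead be expressed as DBT incremental models over Spark SQL.

```python
import sqlite3

def load_incremental(conn, high_watermark):
    """Fetch only rows newer than the last processed watermark."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM claims WHERE updated_at > ?",
        (high_watermark,),
    ).fetchall()
    # Advance the watermark to the newest row seen in this batch.
    new_watermark = max((r[2] for r in rows), default=high_watermark)
    return rows, new_watermark

# Illustrative in-memory source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER, amount REAL, updated_at TEXT)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(1, 100.0, "2024-01-01"), (2, 250.0, "2024-02-01"), (3, 75.0, "2024-03-01")],
)

# First run processes everything; a second run sees only rows newer than the watermark.
batch1, wm = load_incremental(conn, "")
batch2, wm2 = load_incremental(conn, wm)
```

The key design point is that each run reads only rows past the stored watermark, so reprocessing cost stays proportional to new data rather than to the whole table.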

2. Application Development (Healthcare Focus)

Develop robust, reusable, and scalable Python applications using OOP principles.

Model and process complex healthcare datasets, including claims, episodes of care, member/provider data, and other clinical entities.

Implement data transformations and business logic aligned with healthcare standards.

3. Stakeholder Collaboration & Requirements
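The OOP-style modelling of clinical entities described above might look like the following minimal sketch. The entity names, fields, and the `total_cost` rule are illustrative assumptions, not the employer's actual schema or business logic.

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    """A single healthcare claim tied to a member."""
    claim_id: str
    member_id: str
    amount: float

@dataclass
class EpisodeOfCare:
    """Groups related claims for one member into a single care episode."""
    episode_id: str
    member_id: str
    claims: list = field(default_factory=list)

    def total_cost(self) -> float:
        # Business logic lives on the entity, keeping transformations reusable.
        return sum(c.amount for c in self.claims)

episode = EpisodeOfCare("ep-1", "m-42")
episode.claims.append(Claim("c-1", "m-42", 120.0))
episode.claims.append(Claim("c-2", "m-42", 80.0))
```

Keeping transformations as methods on typed entities, rather than ad hoc dictionary manipulation, is one common way to meet the "robust, reusable" requirement.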

Collaborate with onshore stakeholders, product owners, and cross-platform teams to gather and refine requirements.

Translate business needs into clear technical specifications and design documents for offshore teams.

Drive architecture discussions and ensure best practices in implementation.

Act as a bridge between business, onshore, and offshore teams.

4. Delivery & Offshore Team Management

Lead and manage offshore teams for timely and quality delivery.

Define sprint goals, track progress, and ensure adherence to timelines.

5. Stakeholder Communication & Reporting

Provide weekly status updates to customers and leadership via dashboards and reports.

Highlight risks, dependencies, and mitigation plans proactively.

Ensure transparency in delivery and operational metrics.

6. Production Support & Reliability

Provide L2/L3 support for production data pipelines.

Troubleshoot and resolve pipeline failures, performance issues, and outages.

Implement monitoring, alerting, and incident management best practices.

7. Platform Understanding & Solutioning

Develop a deep understanding of the enterprise data platform architecture.

Propose scalable and efficient solutions aligned with business requirements.

Contribute to platform evolution, optimization, and standardization.

Experience: 10+ years

Location: USA (Remote)

Duration: Contract role

Educational Qualifications:

Engineering degree (BE/ME/BTech/MTech/BSc/MSc).

Technical certifications in multiple technologies are desirable.

Skills:

Mandatory skills

DBT (Data Build Tool) with Spark SQL

Apache Spark / PySpark

Python (advanced): OOP, modular design, performance optimization

Kubernetes (K8s): deployment, scaling, troubleshooting

Argo Workflows (or similar orchestration tools)

AWS cloud & infrastructure: S3, Glue, Athena; EMR / EKS / K8s / Argo Workflows; Lambda (nice to have); IAM & security best practices

Docker: containerization and image management

Data engineering ecosystem: Data Lake / Lakehouse frameworks (Hudi / Iceberg / Delta preferred)

Workflow orchestration tools (Argo; Airflow exposure helpful)

CI/CD tools (GitHub Actions / Jenkins)

Data modelling & warehousing concepts

Additional skills

Unix / shell scripting

Strong SQL skills (complex queries, performance tuning)

Data governance and data quality frameworks

Monitoring tools (Prometheus, Grafana, etc.; nice to have)

Domain expertise (preferred)

Healthcare data models: claims processing, episodes of care, member & provider analytics

Understanding of healthcare data standards (HIPAA; FHIR a plus)

Soft skills & leadership

Strong communication and stakeholder management skills

Ability to translate complex requirements into actionable plans

Proven experience managing distributed (onshore-offshore) teams

Problem-solving mindset with ownership and accountability

Nice to have

Experience with Spark-Scala / Snowflake / GenAI / Agentic AI

Exposure to multi-cloud architectures (AWS + GCP/Azure)

Knowledge of data mesh / lakehouse architecture

Experience in performance tuning at scale

Key outcomes expected

High-quality, scalable data pipelines delivered on time

Strong alignment between business requirements and technical implementation

Efficient coordination between onshore and offshore teams

Stable and reliable production data platform

