Full Time Role
Salary: $150k-$210k/yr
Required Skills & Experience
- 3–5 years of professional experience designing and operating production data pipelines at scale.
- Containerization & Orchestration: Expertise with Docker, Kubernetes, and Helm.
- Workflow Management: Hands-on experience building DAG-based pipelines in Apache Airflow.
- Programming: Strong proficiency in Python for data engineering tasks.
- Distributed Frameworks: Practical experience with Dask or Apache Spark for large-scale data processing.
- Cloud Fundamentals: Familiarity with deploying and managing services in a cloud environment.
- GCP Proficiency: Hands-on experience with Google Cloud services (e.g., Pub/Sub, BigQuery, Cloud Storage, GKE); equivalent experience with other public cloud providers is acceptable.
- ML Pipelines: Exposure to deploying cross-cluster model-training workflows using Ray or similar frameworks.
- Infrastructure as Code: Familiarity with Terraform for deployment.
- Security & Compliance: Knowledge of data governance, encryption, and role-based access control.
Nice to Have Skills & Experience
• Experience with Go programming language.
• Familiarity with acceleration frameworks such as RAPIDS or Spark.
• Knowledge of cloud platforms (AWS, GCP, Azure).
• Experience with data version control and MLOps practices.
Job Description
We’re seeking a highly skilled Data Engineer to design, build, and maintain production-grade data pipelines that process and transform terabytes of data. In this role, you’ll collaborate closely with data scientists and software engineers to ensure that our data infrastructure is scalable, reliable, and cost-effective.