
Lead Cloud Data Platform Engineer (contract)

Title: Lead Cloud Data Platform Engineer

Location: 401 Las Colinas Blvd W, Bldg A, Irving, TX

Alternate Locations: Charlotte, NC or Minneapolis, MN

Duration: 12 months

Work Engagement: W2
Work Schedule: 3 days in office/2 days remote

Benefits on offer for this contract position: Health Insurance, Life Insurance, 401(k), and Voluntary Benefits


Summary:

The Cybersecurity team is looking for a Lead Cloud Data Platform Engineer to help build, deploy, and operate reliable, scalable data pipelines across our on-prem and cloud data platforms. The role focuses on productionizing data engineering work by bringing DevOps best practices (automation, testing, observability, CI/CD) to our analytics pipelines. It bridges data engineering and production operations: continuously improving production data quality, meeting data SLAs/OLAs through observability and automation enhancements, and guiding data engineers on best practices for optimizing data workflows.

You will work closely with data engineers, analysts, and cloud/platform engineering teams to apply DevOps best practices, keep our data pipelines optimized and controls robust, enhance operational data quality, reliability, resiliency, and security, and make data flows auditable and easy to operate.

Responsibilities:

  • Help the business implement data quality (DQ) rules as necessary, recommend additional NFRs based on operational trends, and recommend data pipeline optimization strategies.

  • Ensure business and technical DQ are managed and continuously improved, recommending data cleansing, validation, and profiling to remediate issues.

  • Define data contract terms and agreements with upstream and downstream consumers, and monitor data alignment with contract terms and conditions.

  • Optimize workflow orchestration with Apache Airflow / Cloud Composer, including scheduling, dependency management, retries, and self-healing

  • Ensure data processing meets defined SLAs

  • Participate in incident response, root cause analysis, and post-incident reviews. Serve as a key escalation point for domain data publishing issues.

  • Implement and maintain CI/CD processes for data pipelines and orchestration workflows

  • Support hybrid data architectures including Cloudera Hadoop and Google Cloud Platform

  • Produce and maintain operational metrics to support controls to meet internal audit requirements

  • Produce and maintain runbooks and operational documentation in partnership with data engineering and L1/L2/L3 support teams

  • Implement and operationalize modern AI-enabled data capabilities on Google Cloud to ingest, transform, and distribute data for a variety of big data apps

  • Leverage AI/agentic frameworks to automate data management, governance, and data consumption capabilities: data pipelines, data quality, metadata, data compliance, etc.

  • Work within a matrix organization with principal engineers, product managers, and data engineers to roadmap, plan, and deliver key data capabilities based on priority
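The posting does not prescribe tooling for the DQ-rule and profiling work above, but a minimal sketch in plain Python illustrates the idea of rule-driven validation whose failure counts can feed operational-trend recommendations. All names here (`DQRule`, `profile`, the example fields) are hypothetical, not part of any stated stack:

```python
# Illustrative data-quality (DQ) rule checker in plain Python.
# In practice these checks would run inside a pipeline framework
# (e.g. as Airflow tasks or Spark jobs); names are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class DQRule:
    name: str
    check: Callable[[dict], bool]  # returns True when a record passes

def profile(records: list[dict], rules: list[DQRule]) -> dict[str, int]:
    """Count failures per rule so trends can drive NFR recommendations."""
    failures = {rule.name: 0 for rule in rules}
    for rec in records:
        for rule in rules:
            if not rule.check(rec):
                failures[rule.name] += 1
    return failures

# Example rules: a completeness check and a simple validity constraint.
rules = [
    DQRule("event_id_present", lambda r: r.get("event_id") is not None),
    DQRule("severity_in_range", lambda r: r.get("severity", -1) in range(0, 11)),
]

records = [
    {"event_id": "a1", "severity": 5},
    {"event_id": None, "severity": 12},
]

report = profile(records, rules)
print(report)  # {'event_id_present': 1, 'severity_in_range': 1}
```

A per-rule failure report like this is the kind of artifact that supports the operational metrics and audit-control responsibilities listed below.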

Qualifications:

  • Applicants must be authorized to work for ANY employer in the U.S. This position is not eligible for visa sponsorship.

  • Demonstrated, recent hands-on experience with:

    • LangChain, LangGraph/ADK

    • Agentic frameworks, RAG, GraphRAG

    • Model Context Protocol (MCP) for building agent‑based data capabilities

  • Strong experience in data engineering, including:

    • Cloud data platforms

    • Creating and supporting Spark-based ingestion and distributed processing

  • Proven experience with Data Lakehouse architectural patterns and tools, including:

    • Python, PySpark, Kafka, Airflow

    • Google Cloud Storage, BigQuery, Dataproc, Cloud Composer

  • Hands-on background building and supporting streaming data pipelines using Kafka, Flink, and Spark Streaming

  • Experience using AI to auto‑generate data engineering code, context engineering, and prompt engineering (preferred)

  • Strong background with cloud‑based data lakes/warehouses and automated data pipelines (preferred)

  • Public cloud certifications such as:

    • GCP Professional Data Engineer

    • Azure Data Engineer

    • AWS Data Analytics Specialty

  • Experience with web‑based UI development (React, Node.js) (preferred)


© 2026 Qureos. All rights reserved.