Job Details
Hires in: Not specified
Employment Type: Not specified
Company Location: Not specified
Salary: Not specified
We are looking for an experienced Data Engineer Lead to design, build, and optimize scalable data platforms and pipelines. In this leadership role, you will take full ownership of data engineering initiatives, ensure high-quality data delivery, and work closely with cross-functional teams to drive data strategy and innovation across the organization.
You will lead end-to-end data engineering efforts—from ingestion and transformation to orchestration and infrastructure—ensuring reliability, performance, and scalability of the data ecosystem.
Key Responsibilities
Lead the design, development, and maintenance of scalable, secure, and automated data pipelines (batch and streaming).
Build, optimize, and manage data lakes / data warehouses on platforms such as Redshift, Snowflake, BigQuery, or Delta Lake.
Develop and deploy data workflows using Airflow, DBT, Spark, Kafka, and other modern data engineering tools.
Implement and optimize ETL/ELT processes, ensuring data quality, governance, lineage, and observability.
Work closely with analytics, product, and engineering teams to define data strategy, architecture, and KPIs.
Lead and mentor junior and mid-level data engineers, ensuring adherence to best practices and coding standards.
Build and maintain CI/CD pipelines for data engineering processes using tools like GitHub Actions, Jenkins, or similar.
Deploy and manage container environments using Docker and Kubernetes.
Ensure performance optimization of queries, storage, pipelines, and cloud infrastructure.
Drive innovation by evaluating new tools, technologies, and architectural improvements.
Requirements
Strong expertise in data processing and workflow automation.
Expert-level proficiency in Python and SQL.
Hands-on experience with Cloud Platforms (AWS / GCP / Azure) for data engineering services.
Proven experience building and managing pipelines using Airflow, DBT, Spark, Kafka, etc.
Strong understanding of modern data warehousing technologies: Redshift, Snowflake, BigQuery, and Delta Lake.
Experience with CI/CD tools (GitHub Actions, Jenkins, Bitbucket Pipelines, etc.).
Strong working knowledge of Docker, Kubernetes, and containerized deployments.
Knowledge of data governance, lineage, metadata management, and observability frameworks.
Experience with distributed systems and performance tuning.
Familiarity with version control (Git), DevOps practices, and cloud cost optimization.
Strong leadership and mentoring capabilities.
Excellent communication and stakeholder management skills.
Analytical mindset with a strong problem-solving approach.
Ability to work in a fast-paced, data-driven environment.