Qureos


Senior Data Engineer

JOB_REQUIREMENTS

Hires in: Not specified

Employment Type: Not specified

Company Location: Not specified

Salary: Not specified

Be the change. Join the world’s most visionary developer.


Red Sea Global (RSG) is showing that there is a better way to positively shape the places we live, work and travel.


We are purpose-driven and committed to people and planet. Our transformative programs are a driving force to achieving Vision 2030, as well as leading the world towards regenerative tourism.


Join RSG and be part of the positive change for Saudi Arabia and the world.


Job Purpose:

  • The Senior Data Engineer is responsible for designing, building, and maintaining scalable, secure, and high-performance data pipelines and infrastructure that power the organization’s analytics, artificial intelligence, and mission-critical decision-making systems.
  • The role ensures that data is ingested, transformed, stored, and made available in a reliable, timely, and Personal Data Protection Law (PDPL)-compliant manner, enabling AI-ready datasets aligned with Saudi Vision 2030 and the National Data Strategy.


Job Responsibilities:


Architect and build end-to-end data pipelines:

  • Design, develop, and maintain scalable ETL/ELT workflows using tools such as Apache Spark, dbt, Airflow, Kafka, and Informatica IDC to ingest and process structured, semi-structured, and unstructured data from internal and external sources.
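The extract-transform-load pattern this responsibility describes can be sketched in plain Python. This is a minimal illustration with hypothetical guest data; a production pipeline would run this logic in Spark or dbt models orchestrated by Airflow, not in bare functions.

```python
# Minimal ETL sketch. The records and field names below are invented
# for illustration only.

def extract():
    # Stand-in for reading from a source system (API, file, Kafka topic).
    return [
        {"guest_id": "G1", "spend": "120.50", "resort": "shura island"},
        {"guest_id": "G2", "spend": "80.00", "resort": "UMMAHAT"},
    ]

def transform(records):
    # Normalise types and casing so downstream consumers get consistent data.
    return [
        {"guest_id": r["guest_id"], "spend": float(r["spend"]),
         "resort": r["resort"].title()}
        for r in records
    ]

def load(records, target):
    # Idempotent upsert keyed on guest_id into an in-memory "warehouse".
    for r in records:
        target[r["guest_id"]] = r
    return target

warehouse = {}
load(transform(extract()), warehouse)
```

Keeping each stage a pure function makes the pipeline easy to unit-test and to re-run idempotently after a failure.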


Ensure PDPL and data privacy compliance:

  • Implement data masking, tokenization, anonymization, and consent-aware processing; conduct Data Protection Impact Assessments (DPIAs) for new pipelines; enforce row-level security and audit logging in compliance with the Personal Data Protection Law (PDPL) and SDAIA guidelines.
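Two of the techniques named above, deterministic tokenization and masking, can be sketched with the standard library. The key and fields here are hypothetical; a real system would hold the key in a vault or delegate to a dedicated tokenization service.

```python
import hashlib
import hmac

SECRET_KEY = b"example-only-key"  # assumption: never hard-code keys in production

def tokenize(value: str) -> str:
    # Deterministic, non-reversible keyed hash: the same input always yields
    # the same token, so joins across tables still work without exposing PII.
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def mask_email(email: str) -> str:
    # Keep the first character and the domain; hide the rest of the local part.
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"

record = {"national_id": "1234567890", "email": "guest@example.com"}
safe = {
    "national_id": tokenize(record["national_id"]),
    "email": mask_email(record["email"]),
}
```

Tokenization preserves joinability for analytics while masking is appropriate where the value is display-only.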


Build and manage cloud-native data platforms:

  • Lead the implementation and optimization of enterprise data lakes/lakehouses (e.g., Delta Lake, Azure Synapse, AWS Lake Formation) and data warehouses (Snowflake) on AWS, Azure, or Google Cloud, ensuring high availability, disaster recovery, and cost governance.


Automate infrastructure and deployments:

  • Develop Infrastructure-as-Code (Terraform, CloudFormation) and CI/CD pipelines (GitHub Actions, Azure DevOps, Jenkins) to enable one-click deployments, automated testing of data quality, and zero-downtime schema migrations.


Guarantee data quality and observability:

  • Define and enforce data quality rules, monitoring, and alerting using Great Expectations, Monte Carlo, Soda, or custom frameworks; implement end-to-end lineage tracking and real-time anomaly detection.
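The "custom frameworks" option mentioned above can be as simple as named rule functions run against each batch. This sketch uses invented booking data; Great Expectations, Monte Carlo, and Soda each provide richer (and differing) APIs for the same idea.

```python
# Minimal custom data-quality rule runner. Column names and the sample
# batch are hypothetical.

def check_not_null(rows, column):
    # Fail if any row has a missing value in the column.
    return all(r.get(column) is not None for r in rows)

def check_in_range(rows, column, lo, hi):
    # Fail if any value falls outside the expected range.
    return all(lo <= r[column] <= hi for r in rows)

def run_checks(rows, checks):
    # Evaluate each named check; in production a failure would raise an
    # alert and block downstream tasks rather than just being collected.
    return {name: fn(rows) for name, fn in checks.items()}

batch = [{"booking_id": 1, "nights": 3}, {"booking_id": 2, "nights": 0}]
results = run_checks(batch, {
    "booking_id_not_null": lambda r: check_not_null(r, "booking_id"),
    "nights_positive": lambda r: check_in_range(r, "nights", 1, 365),
})
```

Wiring the check results into the orchestrator (e.g., failing an Airflow task) is what turns rules like these into enforced quality gates.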


Optimize performance and cost:

  • Continuously tune data extraction jobs, partition strategies, and storage formats; reduce cloud spend by 20–30% annually through rightsizing, spot instances, and auto-scaling policies.


Collaborate with data consumers and stakeholders:

  • Partner with data scientists, analysts, and business units to translate requirements into reliable data products; publish datasets to the enterprise data catalog and maintain clear documentation and SLAs.


Lead technical governance and best practices:

  • Chair the Data Engineering Chapter; establish and enforce coding standards, peer review processes, and architectural decisions; mentor mid-level and junior engineers.


Support AI-readiness and national initiatives:

  • Deliver “AI-ready” datasets that meet SDAIA and Vision 2030 quality benchmarks (golden datasets, metadata completeness ≥95%, freshness SLAs); enable federated analytics across government entities where required.


Assist with incident response:

  • Assist in root-cause analysis and remediation of pipeline failures.


Contribute to strategic roadmaps:

  • Provide input into the 3–5 year data platform strategy, technology evaluations, and annual budgeting for tools and cloud resources.


Problem Solving and Innovation:

  • Collaborate with cross-functional teams to identify and address business challenges using data-driven approaches.
  • Develop innovative solutions and recommendations based on data-driven insights.


Stakeholder Engagement:

  • Effectively communicate findings and recommendations to stakeholders at all levels, tailoring the message to the audience's needs.
  • Build strong relationships with internal and external stakeholders to foster collaboration and support.


Job Requirements:


Qualifications & Experience:

  • 6+ years of prior relevant experience in data analysis or a similar IT field, with significant experience in leadership and strategic roles.
  • Proven ability to manage large teams and drive organizational change.
  • A bachelor's or master's degree in a quantitative field such as data science, computer science, statistics, mathematics, or engineering is preferred.


Skills:


Technical Skills:

  • Python (expert level) – including type hints, packaging, unit/integration testing (pytest), and building reusable libraries
  • SQL (expert level) – complex analytical queries, CTEs, window functions, performance tuning, and query optimization
  • Informatica – building and optimizing large-scale batch and streaming jobs on managed platforms (Snowflake)
  • Kafka (Confluent) – designing and operating real-time event-driven pipelines
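The CTE and window-function skills listed above can be illustrated with a query against an in-memory SQLite database (available in Python's standard library; the table and data are invented for this example).

```python
import sqlite3

# Made-up booking data to demonstrate a CTE plus a window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE bookings (resort TEXT, guest TEXT, spend REAL);
    INSERT INTO bookings VALUES
        ('Shura', 'A', 500), ('Shura', 'B', 300), ('Ummahat', 'C', 700);
""")

# Rank guests by spend within each resort.
rows = conn.execute("""
    WITH ranked AS (
        SELECT resort, guest, spend,
               RANK() OVER (PARTITION BY resort ORDER BY spend DESC) AS rnk
        FROM bookings
    )
    SELECT resort, guest, rnk FROM ranked ORDER BY resort, rnk
""").fetchall()
```

`PARTITION BY` restarts the ranking per resort, which a plain `GROUP BY` cannot express without self-joins.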


At least one major cloud platform at expert level:

  • AWS (S3, Glue, EMR, Lambda, Athena, Lake Formation) or
  • Azure (ADLS Gen2, Azure Data Factory, Synapse Analytics, Databricks) or
  • Google Cloud (BigQuery, Dataflow, Cloud Storage, Pub/Sub)
  • Data orchestration with Apache Airflow (writing production-grade DAGs, custom operators, task dependencies) or Azure Data Factory
  • DBT (data build tool) – modeling, testing, documentation, incremental models, Jinja templating
  • Infrastructure as Code – Terraform or CloudFormation for provisioning data infrastructure
  • CI/CD for data pipelines – GitHub Actions, GitLab CI
  • Version control with Git (branching strategies, code reviews, GitFlow)
  • Linux/Unix fundamentals and shell scripting for automation and troubleshooting


Soft Skills:

  • Problem-solving and analytical thinking.
  • Strong communication and interpersonal skills.
  • Ability to work independently and as part of a team.
  • Curiosity and a passion for learning new technologies.
  • Business acumen and understanding of data-driven decision-making.
  • Strategic thinking with a deep understanding of industry trends and best practices.


Job Context:

  • The Senior Data Engineer will be based in Riyadh, Saudi Arabia, with a requirement to work from any of the office locations occupied by RSG and its subsidiaries.
  • In addition, the Senior Data Engineer will be expected to work, from time to time and possibly for extended periods, from any project site.
  • The Senior Data Engineer is part of a team of varying grades and different roles that delivers services to other areas of RSG, its subsidiaries, and customers as required.


Job Financial Dimensions:

  • This refers to the financial responsibilities and impacts associated with the role, including budget management, cost control, revenue generation, financial reporting, and resource allocation.
  • These dimensions highlight the role's influence on the organization's financial health, ensuring that the employee contributes to fiscal responsibility and the achievement of financial targets.


For more information about Red Sea Global, visit:

🌐 Website : https://www.redseaglobal.com/en

🔗 LinkedIn : https://www.linkedin.com/company/red-sea-global/

▶️ YouTube : https://www.youtube.com/channel/UCMo1fSbA3iOhvvC8OP0IYNA

🐦 Twitter: @TheRedSeaGlobal

© 2026 Qureos. All rights reserved.