Qureos


SSE - Data Engineering

About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences and Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers wherever they are in their data monetization journey.
Our Values
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.

Equal Opportunity Statement
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, or national origin. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/

Job Description
We are looking for a highly skilled Data Engineer to spearhead our data modernization initiatives. As a Lead, you will be responsible for designing and executing complex data migrations and building scalable, secure data pipelines within the Google Cloud Platform (GCP) ecosystem.
The ideal candidate is a hands-on technical expert who can bridge the gap between architectural design and implementation, ensuring our data infrastructure is robust, cost-effective, and secure.

Key Responsibilities
  • Data Migration Leadership: Lead end-to-end data migration strategies from legacy systems (on-prem or other clouds) to GCP, ensuring zero data loss and minimal downtime.
  • Pipeline Development: Design, develop, and maintain complex ETL/ELT pipelines using Cloud Data Fusion for low-code integration and Cloud Composer (Airflow) for sophisticated orchestration.
  • Warehouse Management: Optimize BigQuery architecture for performance and cost, implementing best practices for partitioning, clustering, and materialized views.
  • Data Transformation: Own the transformation layer using Dataform to manage SQL-based data modeling with version control and testing.
  • Security & Governance: Implement enterprise-grade security using IAM roles, Service Accounts, and Secret Manager.
  • Infrastructure Collaboration: Work closely with Cloud Architects to configure GCP Networking (VPC, Subnets, Firewalls, and Private Service Connect) for secure data transit.
  • Liaison with BI team: Act as the primary technical liaison to the BI team, conducting detailed logic walkthroughs to explain how source data is transformed into final reporting metrics.
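To make the Warehouse Management expectation concrete, here is a minimal sketch of the kind of partitioning and clustering optimization referred to above. The project, dataset, table, and column names are all illustrative assumptions, not part of the role's actual schema.

```python
# Hypothetical sketch: composing DDL for a partitioned, clustered BigQuery
# table. All identifiers (sales_gold, order_ts, etc.) are invented examples.

def build_sales_ddl(project: str, dataset: str) -> str:
    """Compose a CREATE TABLE statement that partitions by event date
    and clusters by the columns most often used in WHERE filters."""
    return f"""
    CREATE TABLE `{project}.{dataset}.sales_gold`
    PARTITION BY DATE(order_ts)       -- prunes scans to the dates queried
    CLUSTER BY customer_id, region    -- co-locates rows for common filters
    AS
    SELECT order_id, customer_id, region, order_ts, amount
    FROM `{project}.{dataset}.sales_silver`;
    """

ddl = build_sales_ddl("my-project", "analytics")
```

Partitioning limits how much data each query scans (the main cost lever in BigQuery), while clustering orders rows within each partition so selective filters read fewer blocks.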

Qualifications
  • Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
  • 5+ years in Data Engineering, with at least 3 years leading GCP-based projects.
  • Strong expertise in SQL (preferably BigQuery SQL), especially joins, aggregations, and complex transformations to support the transition from Silver to Gold layers.
  • Proven experience implementing data quality checks and complex business logic in BigQuery.
  • Proficiency in Python programming for data manipulation and automation.
  • Hands-on experience with Google Cloud Platform (GCP) and its data services.
  • Solid understanding of data warehousing concepts and ETL/ELT methodologies.
  • Experience with Dataform or DBT for data transformation and modeling.
  • Experience with workflow management tools such as Apache Airflow.
  • Basic understanding of data visualization tools, in particular Power BI, to better understand reporting needs and downstream usage.
  • Ability to work with well-documented transformation logic and follow best practices in data modeling.
  • Experience in optimizing query performance and cost on GCP.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.
  • Ability to work independently and as part of a team.
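As a sketch of the data-quality checks mentioned above, the snippet below expresses two common assertions (not-null and uniqueness) in plain Python over an in-memory "Silver" rowset. In practice these would run as Dataform assertions or BigQuery SQL; every name here is an invented example.

```python
# Hypothetical Silver-layer rows; the bad records are deliberate.
silver_rows = [
    {"order_id": 1, "customer_id": "C1", "amount": 120.0},
    {"order_id": 2, "customer_id": "C2", "amount": 75.5},
    {"order_id": 3, "customer_id": None, "amount": 40.0},  # fails null check
    {"order_id": 3, "customer_id": "C3", "amount": 40.0},  # duplicate key
]

def check_not_null(rows, column):
    """Return rows where `column` is NULL -- an empty list means the check passes."""
    return [r for r in rows if r[column] is None]

def check_unique(rows, key):
    """Return key values that appear more than once -- empty means unique."""
    seen, dupes = set(), set()
    for r in rows:
        k = r[key]
        (dupes if k in seen else seen).add(k)
    return sorted(dupes)

null_failures = check_not_null(silver_rows, "customer_id")
dupe_failures = check_unique(silver_rows, "order_id")
# null_failures holds the one NULL-customer row; dupe_failures == [3]
```

Gating a Silver-to-Gold promotion on checks like these (fail the pipeline when either list is non-empty) is the usual pattern, whatever tool executes them.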

Required Technical Skills
  • Cloud Platform: Expert-level proficiency in Google Cloud Platform (GCP).
  • Data Warehousing: Advanced BigQuery (SQL, optimization, and administration).
  • Orchestration: Hands-on experience with Cloud Composer (Apache Airflow).
  • ETL/ELT Tools: Proficiency in Dataform and Cloud Data Fusion.
  • Languages: Expert in SQL (complex joins, CTEs, window functions); solid proficiency in Python for scripting and Airflow DAGs.
  • Security & Ops: Deep understanding of IAM, Service Accounts, and Secret Manager.
  • Networking: Working understanding of GCP Networking (VPC, Cloud SQL Auth Proxy, etc.).
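The SQL skills listed above (CTEs and window functions) can be sketched as follows; the query runs against an in-memory SQLite database purely so the example is self-contained, and BigQuery SQL uses the same constructs. The table and data are invented.

```python
import sqlite3

# Toy orders table; schema and values are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (customer_id TEXT, order_ts TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('C1', '2024-01-01', 100.0),
        ('C1', '2024-01-05', 250.0),
        ('C2', '2024-01-03', 80.0);
""")

# CTE + window function: keep each customer's most recent order.
rows = con.execute("""
    WITH ranked AS (
        SELECT customer_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY order_ts DESC
               ) AS rn
        FROM orders
    )
    SELECT customer_id, amount FROM ranked WHERE rn = 1
    ORDER BY customer_id
""").fetchall()
# rows -> [('C1', 250.0), ('C2', 80.0)]
```

The deduplicate-by-row-number pattern shown here is a staple of Silver-to-Gold transformations, where the latest record per business key wins.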

Qualifications & Soft Skills
  • Migration Track Record: Proven experience moving large-scale production datasets across environments.
  • Leadership: Experience mentoring junior engineers and leading technical sprints.
  • Communication: Ability to explain complex technical trade-offs to non-technical stakeholders.
  • Problem-Solving: A proactive approach to identifying bottlenecks in the data lifecycle.
  • Must have worked with US/Europe-based clients in an onsite/offshore delivery model.

Preferred Qualifications
  • Google Cloud Professional Data Engineer certification.

© 2026 Qureos. All rights reserved.