Qureos


Data Engineer (Snowflake)

JOB_REQUIREMENTS

  • Hires in: Not specified
  • Employment Type: Not specified
  • Company Location: Not specified
  • Salary: Not specified

Job Description

Key Responsibilities:

  • Design, develop, and maintain data pipelines and ETL/ELT workflows using GCP-native tools and services.
  • Build and optimize data warehouses using Snowflake.
  • Write complex and efficient SQL queries for data transformation, analysis, and reporting.
  • Collaborate with analysts, data scientists, and business stakeholders to understand data needs and deliver reliable solutions.
  • Implement data governance, security, and monitoring best practices across GCP projects.
  • Tune queries and optimize performance of large-scale datasets.
  • Automate workflows using Cloud Composer (Airflow) or similar orchestration tools.

Required Skills & Qualifications:

  • 3+ years of experience in a data engineering or data platform role.
  • Strong hands-on experience with Snowflake data warehousing.
  • Expert-level skills in SQL — able to write optimized, scalable, and complex queries.
  • Experience with data modeling (star/snowflake schema), partitioning, clustering, and performance tuning in a data warehouse.
  • Familiarity with modern ELT tools such as dbt, Fivetran, or Cloud Data Fusion.
  • Experience with Python or a similar scripting language for data engineering tasks.
  • Understanding of data governance and privacy practices.
  • Familiarity with Google Cloud Platform services, especially BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Composer.

© 2025 Qureos. All rights reserved.