
Responsibilities
- End-to-End Solutions: You will design and develop end-to-end solutions for processing large data volumes, spanning from the initial business problem to scalable and optimized ETL/ELT pipelines.
- Tech Stack: You will work with technologies such as Databricks, Snowflake, dbt, and Microsoft Fabric, using Spark (Python/Scala) and SQL, with orchestration via Airflow, Data Factory, or similar tools.
- Advanced Architectures: You will help define and implement advanced architectures under the Data Lakehouse and Data Warehouse paradigms, supporting diverse data processing patterns including batch, near-real-time, and streaming.
- Data Integration: You will develop data ingestion and transformation pipelines from diverse sources (APIs, files, databases, streaming) to build analytical data models supporting business use cases.
- Platform Foundation: You will collaborate on the implementation of frameworks and engines that standardize the core functions of a data platform: Orchestration, Ingestion, Transformation, Quality, Security, Testing, Deployment, and Observability, among others.
- Collaboration: You will collaborate with multidisciplinary teams and stakeholders, translating complex requirements into efficient technical solutions.
- Continuous Learning: You will stay up to date with the latest technological trends, especially in Data & Analytics, through continuous training and the exploration of new tools and innovations.
- Career Path: You will take charge of your professional development together with your managers. SDG Group will give you the opportunity to build a career path oriented toward becoming a Data Architect in a world-class organization.
Requirements
- Education: Degree in Computer Engineering or Computer Science; a Master's degree in a data-related field is a plus.
- Languages: English at C1 level or higher.
- Experience: 2-3 years of experience in Data Lake and/or Data Warehouse projects.
- ETL/ELT: Experience in data integration with Spark (Python or Scala) and/or SQL, and in dimensional data modeling.
- Cloud: Experience working in cloud environments (AWS, GCP, or Azure) and platforms (Databricks, Snowflake, Microsoft Fabric).
- Code Management & DataOps: Solid understanding of Git, CI/CD, pipeline monitoring, data quality control, and data versioning.
- Consulting Background: Previous experience in consulting is a plus.