Data Engineer
Location: Remote
Experience: 10+ Years
Work Timings: UK Shift
Job Overview:
We are seeking a highly experienced Data Engineer with a strong background in Microsoft
Fabric and Google Cloud Platform (GCP) to design, implement, and optimize large-scale data
solutions. The ideal candidate will have extensive hands-on expertise in building robust data
pipelines, managing complex data architectures, and enabling advanced analytics through
modern data engineering practices.
This role requires a professional with deep technical understanding, analytical thinking, and the
ability to work independently in a fast-paced environment aligned with UK business hours.
Key Responsibilities:
• Design, develop, and manage scalable data pipelines and workflows using Microsoft Fabric and GCP.
• Build and maintain dataflows (Gen2), notebooks (Python and Spark SQL), and Direct Lake models to support analytics and business intelligence.
• Implement and manage CI/CD processes within Fabric to ensure streamlined deployments and environment consistency.
• Configure and monitor data security policies within Fabric to meet compliance and data governance standards.
• Develop optimized DAX queries and data models for performance and usability.
• Build and manage data ingestion pipelines from various sources to BigQuery, leveraging Pub/Sub and Cloud Functions (see the sketch after this list).
• Collaborate with cross-functional teams, including analysts, data scientists, and architects, to deliver integrated, high-quality data solutions.
• Optimize data storage, processing, and access for performance, scalability, and cost efficiency.
• Troubleshoot, monitor, and maintain data solutions to ensure reliability and data integrity.
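For illustration only (not part of the posting itself): a minimal Python sketch of the event-driven ingestion pattern named above, in which a Cloud Function is triggered by a Pub/Sub message and streams the decoded record into BigQuery. The project, dataset, and table names are hypothetical placeholders.

import base64
import json

import functions_framework
from google.cloud import bigquery

BQ_TABLE = "my-project.analytics.events"  # hypothetical fully qualified table name

bq_client = bigquery.Client()

@functions_framework.cloud_event
def ingest_event(cloud_event):
    """Decode one Pub/Sub message and stream it into BigQuery."""
    # Pub/Sub delivers the payload base64-encoded inside the CloudEvent body.
    payload = base64.b64decode(cloud_event.data["message"]["data"])
    row = json.loads(payload)
    # insert_rows_json returns a list of per-row errors; empty means success.
    errors = bq_client.insert_rows_json(BQ_TABLE, [row])
    if errors:
        # Raising surfaces the failure (and triggers a redelivery if retries
        # are enabled on the function) instead of silently dropping the event.
        raise RuntimeError(f"BigQuery insert failed: {errors}")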
Required Skills & Experience:
Microsoft Fabric Expertise:
• Proven hands-on experience with Fabric pipelines, Dataflow Gen2, and Notebooks.
• Strong proficiency in Python and Spark SQL for data transformation and analytics (see the sketch after this list).
• Experience with CI/CD pipelines within Fabric and related DevOps practices.
• Deep understanding of data security configurations and permissions in Fabric.
• Experience building Direct Lake models and implementing DAX for analytics.
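As a purely illustrative sketch, with hypothetical table names not taken from this posting: a Fabric-style notebook cell combining PySpark and Spark SQL to shape a Lakehouse table that a Direct Lake model could then consume.

from pyspark.sql import functions as F

# `spark` is the session Fabric provides automatically inside a notebook.
raw = spark.read.table("raw_sales")  # hypothetical Lakehouse table

daily = (
    raw.filter(F.col("amount") > 0)
       .groupBy(F.to_date("order_ts").alias("order_date"))
       .agg(F.sum("amount").alias("total_amount"),
            F.countDistinct("customer_id").alias("customers"))
)

# Direct Lake models read Delta tables straight from OneLake, so the result
# is written back as a Delta table rather than imported into the model.
daily.write.mode("overwrite").format("delta").saveAsTable("gold_daily_sales")

# The same result set is queryable with Spark SQL:
spark.sql(
    "SELECT * FROM gold_daily_sales ORDER BY total_amount DESC LIMIT 10"
).show()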
GCP Expertise:
• Practical experience in BigQuery, including schema design, performance tuning, and query optimization (see the sketch after this list).
• Hands-on experience with Cloud Functions for event-driven data processing.
• Expertise in Pub/Sub for data streaming and integration workflows.
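Again for illustration only, with placeholder names: the kind of schema design that underpins BigQuery performance tuning, creating a day-partitioned table clustered on a common filter key so typical queries scan less data.

from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.orders", schema=schema)  # placeholder name
# Partitioning by day bounds each query to the dates it actually needs...
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
# ...and clustering sorts data within each partition by a frequent filter column.
table.clustering_fields = ["customer_id"]
client.create_table(table, exists_ok=True)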
General Skills:
• Strong understanding of data modeling, ETL/ELT frameworks, and data governance principles.
• Ability to design and maintain high-performance, scalable cloud data architectures.
• Proficiency in version control (Git) and automated deployment practices.
• Excellent analytical and problem-solving skills with strong attention to detail.
• Strong communication skills and the ability to work with distributed, cross-functional teams.
Preferred Qualifications:
• Experience with Power BI and integration of Fabric data models for analytics.
• Knowledge of modern data lake architectures and data virtualization.
• Familiarity with machine learning data preparation pipelines and frameworks.
Additional Details:
• Contract Duration: Long-term, extendable based on performance and project requirements.
• Work Mode: Fully remote (UK working hours).
• Engagement Type: Individual contributor role within a collaborative global team.
Job Type: Full-time
Pay: ₹2,000,000.00 - ₹2,400,000.00 per year
Experience: 10+ years
Work Location: Remote