We are seeking an experienced Fabric Data Engineering Architect to design, implement, and manage scalable data solutions using Microsoft Fabric.
Responsibilities:
- Design and architect end-to-end data solutions using Microsoft Fabric (Lakehouse, Data Warehouse, Data Factory, Real-Time Analytics).
- Lead the development of scalable, high-performance data pipelines for batch and real-time processing.
- Define data architecture standards, governance, security, and best practices across the organization.
- Collaborate with business stakeholders, data scientists, and engineering teams to translate business requirements into technical solutions.
- Implement and optimize data models (dimensional and normalized) for analytics and reporting.
- Ensure data quality, reliability, and performance tuning across data platforms.
- Drive integration with external systems, APIs, and third-party data sources.
- Mentor and guide data engineering teams on Fabric and modern data engineering practices.
Requirements:
- 10–15+ years of experience in Data Engineering / Data Architecture.
- Strong hands-on experience with Microsoft Fabric components: Lakehouse, Data Factory (pipelines), Synapse Data Engineering / Data Warehouse, and Power BI integration.
- Expertise in SQL, Python, and Spark (PySpark preferred).
- Experience with data modeling techniques (star schema, snowflake schema).
- Strong understanding of ETL/ELT frameworks and data pipeline orchestration.
- Experience with cloud platforms (Azure preferred).
- Knowledge of data governance, security, and compliance practices.
- Experience with Azure Data Services (ADLS, Azure Synapse, Databricks).