
Senior Data Engineer (Remote)


Title: Senior Data Engineer

Location: Pakistan

Role Type: Full-time

Time Zone: Eastern Standard Time (EST)

Responsibilities:

· Design, implement, and maintain data platforms and pipelines leveraging Microsoft Fabric, Azure Data Factory (ADF), SQL Server, and SSIS.

· Develop and optimize ETL/ELT workflows for structured and unstructured data using ADF, Fabric Dataflows, and PySpark.

· Integrate data from multiple sources (APIs, relational, and non-relational databases) into Lakehouse and SQL Server environments.

· Implement data transformation and cleansing logic using Python, PySpark, and Fabric notebooks.

· Collaborate with Data Science and BI teams to prepare datasets for analytics, dashboards, and machine learning models.

· Design and enforce data modeling best practices, including dimensional modeling for data warehouses.

· Automate workflows using Power Automate for notifications, approvals, and orchestration of data refresh processes.

· Monitor and improve data pipeline performance, ensuring scalability and reliability.

· Manage metadata and data cataloging within Microsoft Fabric for discoverability and governance.

· Develop incremental and real-time data ingestion strategies for large datasets.

· Ensure compliance with data security and governance policies across all environments.

· Provide technical guidance and mentorship to junior data engineers.

Requirements

· Bachelor’s degree in Computer Science, Data Engineering, or a related field.

· 5+ years of experience in data engineering with strong expertise in:

  · Microsoft Fabric (Lakehouse, Dataflows, Pipelines)

  · Azure Data Factory (ADF) and SSIS

  · SQL Server (T-SQL, stored procedures, performance tuning)

  · Python and PySpark for data processing

· Hands-on experience with ETL/ELT design, data warehousing, and dimensional modeling.

· Familiarity with Power Automate for workflow automation.

· Strong understanding of data governance, metadata management, and security best practices.

· Experience with incremental loads, CDC (Change Data Capture), and real-time streaming.

· Ability to troubleshoot and optimize data pipelines for performance and cost efficiency.

· Excellent communication skills for cross-functional collaboration.

· Knowledge of Azure services (Key Vault, Storage, Synapse) and Fabric integration.

· Self-starter with a proactive approach to problem-solving and continuous improvement.

Job Type: Full-time

Pay: Rs500,000.00 - Rs550,000.00 per month

Application Question(s):

  • Do you have at least 5 years of hands-on experience in Data Engineering?
  • Have you worked extensively with Microsoft Fabric (Lakehouse, Dataflows, Pipelines)?
  • Do you have strong experience with Azure Data Factory (ADF) and SSIS?
  • Are you proficient in SQL Server, including T-SQL, stored procedures, and performance tuning?
  • Do you have hands-on experience using Python and PySpark for data processing?
  • Have you designed and implemented ETL/ELT pipelines, including data warehousing and dimensional modeling?
  • Do you have experience working with incremental loads, CDC (Change Data Capture), or real-time data streaming?
  • Do you have experience integrating data from multiple sources (APIs, relational, non-relational) into Lakehouse or SQL Server environments?

Education:

  • Bachelor's (Preferred)

Location:

  • Pakistan (Required)

Work Location: Remote
