Senior Data Engineer
A2 Business Consulting | Hussain Industries Group
Cloud Data Platforms, ETL/ELT Architecture & Advanced Integration
Monthly Take-Home (After Tax): PKR 350,000–500,000 | Contract: 3-Month Renewable | Schedule: Mon–Fri, 9 AM–5 PM EST | Fully Remote
Company Overview
A2 Business Consulting, part of the Hussain Industries Group, delivers advanced data engineering and cloud platform solutions to US-based enterprise clients. We build scalable data infrastructure using modern Microsoft Azure and Fabric technologies to help organizations centralize, transform, and optimize their data for analytics and operations. Our team is fully remote, with a culture of collaboration, accountability, and technical excellence.
Position Summary
We are seeking a Senior Data Engineer to lead the design, development, and maintenance of enterprise-grade data platforms. This is a hands-on technical leadership role with deep focus on ETL/ELT pipeline architecture, advanced data modeling, complex multi-source integrations, and cloud infrastructure management. The ideal candidate brings 5+ years of data engineering experience, strong Python and SQL skills, and proven expertise in Microsoft Azure and Microsoft Fabric. You will work directly with US-based technical and business leadership to drive data platform strategy.
This is a data engineering and technology role. Candidates from accounting or finance administration backgrounds are not suited for this position.
Key Responsibilities
Data Pipeline Architecture & Development
- Architect and lead development of complex, scalable ETL/ELT pipelines using Azure Data Factory, Microsoft Fabric Pipelines, Dataflows Gen2, and Azure Functions
- Design and implement incremental load strategies including CDC (Change Data Capture), watermark-based loads, and partition-aware refresh patterns
- Build robust error handling frameworks, retry logic, dead-letter queues, and automated alerting for pipeline failures
- Optimize pipeline performance, execution costs, and scheduling across dev, staging, and production environments
- Mentor junior engineers on pipeline design patterns and best practices
Advanced Data Integration
- Architect integrations with complex enterprise sources: Microsoft Dynamics 365 (CRM and Business Central), Salesforce, SAP, NetSuite, REST/SOAP APIs, Azure SQL, Cosmos DB, Snowflake, relational databases, flat files, and streaming sources (Event Hubs, Kafka)
- Design and implement custom Python-based data connectors with OAuth/API key authentication, rate limiting, and pagination
- Lead integration projects end-to-end from requirements gathering through deployment
Data Modeling & Architecture
- Design enterprise dimensional data models: star schema, snowflake schema, data vault
- Build subject-area data marts for business domains (operations, finance, sales, claims, clinical data)
- Architect Lakehouse and Data Warehouse solutions in Microsoft Fabric
- Build semantic models, measures, and relationships in Microsoft Fabric and Power BI
- Implement SCD Types 1, 2, and 3; manage historical tracking and surrogate keys
- Optimize partitioning strategies, indexing, and columnar storage for query performance
Python Development
- Write production-quality Python scripts for data processing, transformation, and automation
- Apply Pandas, NumPy, PySpark, requests, SQLAlchemy, pyodbc, and openpyxl in production workflows
- Build custom ETL frameworks, validation engines, and reusable utility libraries
- Implement unit tests, data contract validations, and automated quality assertions
SQL & Database Development
- Write highly optimized SQL including complex CTEs, window functions, recursive queries, and dynamic SQL
- Build and maintain stored procedures, views, user-defined functions, and indexed views
- Lead database performance tuning: query plan analysis, index strategies, statistics management
Data Quality & Governance
- Design and implement enterprise data quality frameworks: profiling, validation rules, anomaly detection
- Build data quality monitoring dashboards with automated alerting
- Maintain data lineage documentation, data dictionaries, and metadata catalogs
- Enforce RBAC, column-level security, and row-level security in Fabric and Azure
Cloud Infrastructure & DevOps
- Deploy and manage Azure resources: Data Factory, Storage Accounts, Azure SQL, Key Vault, Function Apps
- Implement Infrastructure as Code using ARM Templates or Bicep
- Build and maintain CI/CD pipelines using Azure DevOps or GitHub Actions
- Maintain Git version control practices, branching strategies, and code review standards
Required Qualifications
- 5+ years of hands-on experience in Data Engineering, ETL/ELT Development, or Data Platform roles
- Expert-level experience with Azure Data Factory, Azure SQL, Azure Synapse, Azure Blob/ADLS Gen2, Azure Functions
- Strong Microsoft Fabric experience: Lakehouse, Warehouse, Pipelines, Dataflows Gen2, Semantic Modeling
- Advanced SQL: CTEs, window functions, query optimization, database design
- Python proficiency: Pandas, NumPy, requests, SQLAlchemy, PySpark
- Strong understanding of dimensional modeling, data warehousing, and ETL/ELT design patterns
- Experience with REST API integrations, OAuth flows, and web services
- Version control with Git/GitHub/Azure DevOps; Agile/Scrum experience
Preferred Qualifications
- Microsoft Certifications: DP-203 (Azure Data Engineer Associate), DP-600 (Fabric Analytics Engineer), AZ-900
- Power BI: data modeling, DAX, report development, deployment pipelines
- Experience with dbt, Apache Airflow, or Databricks
- Docker, containerization, and event-driven architectures (Event Hubs, Service Bus)
- ERP integrations: Dynamics 365, SAP, NetSuite; CRM: Salesforce
- US Healthcare data knowledge: HIPAA compliance, HL7/FHIR, HCPCS/CPT coding, DME billing, wound care claims — strong advantage
- Financial analytics exposure: revenue cycle data, AR aging, financial reporting pipelines — a plus, but this is not a role for finance or accounting professionals
- Based in Pakistan preferred for cross-timezone collaboration with US clients
How to Apply
Email: techarchitectahussain@gmail.com | CC: hiring@salesteq.us
Include: resume/CV, a GitHub or portfolio link (pipelines, SQL/Python scripts, data modeling, Azure/Fabric work), and a cover letter describing your ADF and Fabric experience, the most complex integration you have built, and your approach to data quality.
Job Type: Full-time
Pay: PKR 350,000–500,000 per month
Work Location: Remote