Data Engineer
Cloud Data Platforms & ETL/ELT Development
Company Overview
A2 Business Consulting, part of the Hussain Industries Group led by Anthony Choi & Aoun Hussain, delivers advanced data engineering and cloud platform solutions to US-based enterprise clients. We build scalable data infrastructure using modern Microsoft Azure and Fabric technologies to help organizations centralize, transform, and optimize their data for analytics and operations.
Our team operates fully remotely, with a culture of collaboration, accountability, and technical excellence.
Position Summary
We are seeking a Data Engineer with deep expertise in Microsoft Azure, Microsoft Fabric, SQL, and Python to design and build enterprise-grade data platforms.
This role focuses on building scalable ETL/ELT pipelines, developing robust data models, integrating diverse data sources, and maintaining high-quality data infrastructure. The ideal candidate has strong technical skills in data engineering fundamentals, cloud architecture, and automation.
Key Responsibilities
Data Pipeline Development
- Design, build, and maintain scalable ETL/ELT pipelines using:
- Azure Data Factory (data integration, orchestration)
- Microsoft Fabric Pipelines and Dataflows Gen2
- Azure Functions (event-driven data processing)
- Implement incremental data loads, CDC (Change Data Capture), and full refresh patterns
- Build automated workflows for recurring data ingestion, transformation, and processing
- Optimize pipeline performance, cost, and reliability
- Implement error handling, retry logic, and pipeline monitoring
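To illustrate the kind of error handling and retry logic this role involves, here is a minimal Python sketch of retrying a transient pipeline failure with exponential backoff; `flaky_extract` is a simulated task, not a real connector:

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=1.0):
    """Run a pipeline task, retrying transient failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: surface the failure to monitoring
            time.sleep(base_delay * 2 ** (attempt - 1))  # back off: 1s, 2s, 4s, ...

# Simulated extract step that fails twice before succeeding.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return "extracted"
```

In managed services such as Azure Data Factory, comparable retry policies are configured on the activity itself rather than hand-coded.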
Data Integration & Sources
- Connect and integrate data from:
- Relational databases (SQL Server, PostgreSQL, MySQL)
- Cloud databases (Azure SQL, Cosmos DB, Snowflake)
- ERP/CRM systems (Dynamics 365, Salesforce, SAP, NetSuite)
- REST APIs and web services
- Flat files (CSV, JSON, Parquet, Excel)
- Streaming sources (Event Hubs, Kafka)
- Build data connectors and custom integrations using Python and REST APIs
- Implement API authentication, rate limiting, and pagination handling
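Pagination handling of the sort described above can be sketched generically: follow next-page tokens until the source reports no further pages. The `fake_page` function below is a stand-in for a real REST endpoint, and the `next_page_token` field name is an assumption for illustration:

```python
def fetch_all_pages(fetch_page):
    """Follow next-page tokens until the API reports no further pages."""
    records, token = [], None
    while True:
        payload = fetch_page(token)
        records.extend(payload["items"])
        token = payload.get("next_page_token")
        if not token:
            return records

# Simulated paged endpoint standing in for a real REST source.
DATA = list(range(5))
def fake_page(token):
    start = int(token or 0)
    nxt = start + 2
    return {"items": DATA[start:nxt],
            "next_page_token": str(nxt) if nxt < len(DATA) else None}
```

Real APIs vary (offset/limit, cursor, or `Link` headers), but the loop structure is the same.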
Data Modeling & Architecture
- Design and build dimensional data models (star schema, snowflake schema)
- Develop subject-area data marts for business domains (sales, operations, finance, etc.)
- Build semantic models in Microsoft Fabric and Power BI
- Create fact tables, dimension tables, and bridge tables
- Implement slowly changing dimensions (SCD Types 1, 2, and 3)
- Design Lakehouse and Data Warehouse architectures in Microsoft Fabric
- Optimize data partitioning, indexing, and query performance
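The SCD Type 2 pattern mentioned above can be sketched in a few lines: when a tracked attribute changes, the current dimension row is expired and a new versioned row is appended. This is a simplified in-memory illustration (in practice this is typically a SQL `MERGE` or a Fabric/Spark job); the column names are assumptions:

```python
from datetime import date

def apply_scd2(dimension, incoming, effective_date):
    """Expire the open row for a changed key and append the new version."""
    for row in dimension:
        if row["key"] == incoming["key"] and row["end_date"] is None:
            if row["attrs"] == incoming["attrs"]:
                return dimension  # no attribute change: keep the row open
            row["end_date"] = effective_date  # close out the old version
    dimension.append({"key": incoming["key"], "attrs": dict(incoming["attrs"]),
                      "start_date": effective_date, "end_date": None})
    return dimension
```

Type 1 (overwrite) and Type 3 (previous-value column) follow the same comparison step but differ in how the old value is retained.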
Data Quality & Governance
- Implement data validation rules and quality checks
- Build data profiling and anomaly detection processes
- Create data quality dashboards and monitoring alerts
- Document data lineage, data dictionaries, and metadata
- Implement data catalog and governance practices
- Ensure data security and access control (RBAC)
SQL & Database Development
- Write complex SQL queries for data transformation and analysis
- Build and optimize stored procedures, views, and functions
- Perform database performance tuning and query optimization
- Manage table schemas, constraints, and relationships
- Implement indexing strategies for query performance
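As a small example of the window-function SQL this role calls for, the query below keeps only the latest record per customer using `ROW_NUMBER()`, a common staging-table dedup pattern. It runs here against an in-memory SQLite table purely for illustration; the table and column names are invented:

```python
import sqlite3

# In-memory stand-in for a staging table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging(customer_id INT, loaded_at TEXT, amount REAL);
INSERT INTO staging VALUES
  (1, '2025-01-01', 10.0),
  (1, '2025-01-02', 12.5),
  (2, '2025-01-01', 7.0);
""")

# Keep only the most recently loaded row per customer.
latest = conn.execute("""
SELECT customer_id, amount FROM (
  SELECT *, ROW_NUMBER() OVER (
    PARTITION BY customer_id ORDER BY loaded_at DESC) AS rn
  FROM staging
) WHERE rn = 1
ORDER BY customer_id
""").fetchall()
```

The same pattern works in Azure SQL, Synapse, and the Fabric Warehouse with T-SQL syntax.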
Python Development
- Develop Python scripts for data processing and automation
- Use libraries such as pandas, NumPy, requests, SQLAlchemy, and PySpark
- Build custom data transformation functions
- Implement data validation and cleansing logic
- Automate data workflows and scheduling
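A minimal sketch of the validation and cleansing logic described above: split incoming records into clean rows and rejects with a recorded reason, so bad rows can be quarantined rather than silently dropped. The required-field names are placeholders:

```python
def validate_rows(rows, required=("id", "email")):
    """Split incoming records into clean rows and rejects with reasons."""
    clean, rejects = [], []
    for row in rows:
        missing = [field for field in required if not row.get(field)]
        if missing:
            rejects.append({"row": row, "missing": missing})
        else:
            clean.append(row)
    return clean, rejects
```

In a production pipeline the rejects would land in a quarantine table feeding the data quality dashboards mentioned earlier.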
Cloud Infrastructure & DevOps
- Deploy and manage Azure resources (Data Factory, Storage Accounts, SQL Databases, Key Vault)
- Implement Infrastructure as Code using ARM Templates or Bicep
- Use Git for version control and collaboration
- Build CI/CD pipelines for data platform deployments
- Monitor pipeline execution, performance, and costs
Required Qualifications
- 3-5+ years in Data Engineering, ETL/ELT Development, or Data Integration roles
- Deep hands-on experience with:
- Microsoft Azure:
- Azure Data Factory (pipeline development, triggers, orchestration)
- Azure SQL Database / Azure Synapse Analytics
- Azure Blob Storage / Data Lake Storage Gen2
- Azure Functions (Python or C#)
- Microsoft Fabric:
- Lakehouse / Warehouse
- Pipelines and Dataflows Gen2
- Data modeling in Fabric
- SQL:
- Advanced SQL (joins, CTEs, window functions, aggregations)
- Query optimization and performance tuning
- Database design and normalization
- Python:
- Data manipulation (Pandas, NumPy)
- API integration (requests library)
- Data processing automation
- Strong understanding of:
- ETL/ELT design patterns and best practices
- Dimensional modeling (star schema, fact/dimension tables)
- Data warehousing concepts
- Data quality and validation techniques
- Experience with:
- Working with APIs and REST services
- Version control (Git, GitHub, Azure DevOps)
- Agile development workflows
- Strong problem-solving skills and attention to detail
- Good communication skills for collaborating with technical and business teams
Preferred Qualifications
- Certifications:
- Microsoft Certified: Azure Data Engineer Associate (DP-203)
- Microsoft Certified: Fabric Analytics Engineer Associate (DP-600)
- Microsoft Certified: Azure Fundamentals (AZ-900)
- Experience with:
- Power BI (data modeling, DAX, connecting to data sources)
- PySpark / Databricks for big data processing
- Azure Synapse Analytics or Snowflake
- dbt (data build tool) for transformation workflows
- Apache Airflow or other orchestration tools
- Docker and containerization
- Event-driven architectures (Azure Event Hubs, Service Bus)
- Experience integrating with:
- ERP systems (Dynamics 365, SAP, Oracle, NetSuite)
- CRM systems (Salesforce, HubSpot, Dynamics CRM)
- Accounting systems (QuickBooks, Xero)
- Experience in Agile or Scrum workflows
- Based in Pakistan or the Middle East region (preferred due to cross-time-zone collaboration with US clients)
Contract Details
- Contract Type: 3-month renewable
- Schedule: Monday–Friday, 9:00 AM – 5:00 PM EST
- Work Model: Fully remote
What We Offer
- High-impact work building enterprise data platforms and pipelines
- Opportunity to work with Microsoft Fabric and the latest Azure Data technologies
- Direct exposure to US-based business and technology leadership
- Continuous learning and growth in cloud data engineering and modern data architecture
- A supportive, high-performance culture
- Competitive compensation aligned with experience
How to Apply
Please submit the following to:
Application Materials:
- Updated resume/CV
- Portfolio or GitHub profile showcasing:
- Data pipelines you've built
- SQL scripts or Python code samples
- Data modeling projects
- Azure/cloud infrastructure work
- Brief cover letter explaining:
- Your experience with Azure Data Factory and Fabric
- Most complex data integration challenge you've solved
- Your approach to data quality and pipeline reliability
SalesTeq.us is an equal opportunity employer. We value technical excellence, collaboration, and continuous improvement.
Job Type: Full-time
Pay: Rs350,000.00 - Rs450,000.00 per month
Work Location: Remote