FIND_THE_RIGHTJOB.
Lahore, Pakistan
About the Role:
We are looking for a QA Engineer with strong AWS Data Warehouse (DWH) expertise to ensure data quality, accuracy, and reliability across our cloud data platform. You will work closely with Data Engineers, BI Developers, and Business Analysts to design and execute test strategies for ETL pipelines, data transformations, and reporting systems built on AWS services such as Redshift, Glue, Athena, S3, and DMS.
Key Responsibilities:
Design, develop, and execute test plans, test cases, and test scripts for data warehouse and ETL processes.
Validate data extraction, transformation, and loading (ETL) processes between source systems and target AWS DWH (Redshift, S3, Athena).
Perform data quality, reconciliation, and validation testing (counts, duplicates, schema, nulls, transformations).
Test CDC (Change Data Capture) pipelines built with AWS DMS and Glue.
Write complex SQL queries in Redshift/Athena to validate data integrity and business rules.
Test partitioning, performance, and query optimization in Redshift and Athena.
Collaborate with developers and DevOps teams to integrate QA into CI/CD pipelines.
Create and maintain test automation frameworks for data validation (e.g., using Python, Pytest, Great Expectations, or dbt tests); a minimal illustrative sketch follows this list.
Document defects clearly, work with the engineering team to resolve issues, and re-test fixes.
Ensure compliance with data governance, lineage, and security policies.
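To illustrate the kind of data-validation automation described above (counts, duplicates, nulls), the sketch below shows a minimal Pytest module that reconciles a staging extract against a Redshift target. It is an illustration only, not part of the role's requirements: the schemas, table names (staging.orders, dwh.fact_orders), and environment variables are hypothetical placeholders, and Redshift is reached via psycopg2 over its PostgreSQL-compatible protocol.

```python
# Minimal sketch of automated reconciliation checks between a staged source
# extract and a Redshift target table. All connection details, schemas, and
# table names below are hypothetical placeholders.
import os

import psycopg2  # Redshift exposes a PostgreSQL-compatible protocol
import pytest


@pytest.fixture(scope="module")
def redshift_conn():
    """Open one Redshift connection per test module and close it afterwards."""
    conn = psycopg2.connect(
        host=os.environ["REDSHIFT_HOST"],
        port=5439,
        dbname=os.environ["REDSHIFT_DB"],
        user=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    yield conn
    conn.close()


def scalar(conn, sql):
    """Execute a query that returns a single value and return that value."""
    with conn.cursor() as cur:
        cur.execute(sql)
        return cur.fetchone()[0]


def test_row_counts_match(redshift_conn):
    """Rows staged from the source should equal rows loaded into the target."""
    source = scalar(redshift_conn, "SELECT COUNT(*) FROM staging.orders")
    target = scalar(redshift_conn, "SELECT COUNT(*) FROM dwh.fact_orders")
    assert source == target, f"Count mismatch: staging={source}, target={target}"


def test_business_key_has_no_duplicates(redshift_conn):
    """The business key must be unique after the ETL load."""
    dupes = scalar(
        redshift_conn,
        """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM dwh.fact_orders
            GROUP BY order_id HAVING COUNT(*) > 1
        ) dup_keys
        """,
    )
    assert dupes == 0, f"{dupes} duplicate order_id values found"


def test_mandatory_columns_not_null(redshift_conn):
    """Mandatory columns must not contain NULLs after transformation."""
    nulls = scalar(
        redshift_conn,
        "SELECT COUNT(*) FROM dwh.fact_orders "
        "WHERE order_id IS NULL OR order_date IS NULL",
    )
    assert nulls == 0, f"{nulls} rows with NULL mandatory columns"
```

In practice, checks like these would run from a CI/CD job (Jenkins, CodePipeline, or GitHub Actions) after each ETL load, with failures logged and tracked as defects.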
Required Skills & Experience:
3–6 years of experience in Data Warehouse / ETL QA.
Hands-on experience with AWS DWH services: Redshift, Glue, Athena, S3, DMS, Glue Catalog.
Strong SQL skills for data validation (window functions, aggregations, joins); an illustrative window-function check follows this list.
Experience testing ETL pipelines (batch and incremental loads).
Familiarity with data lake concepts, partitioning, and schema evolution.
Experience with Python or Shell scripting for automation.
Knowledge of QA best practices in data engineering projects (functional, regression, performance testing).
Experience with Git / CI-CD pipelines (Jenkins, CodePipeline, GitHub Actions, etc.).
Excellent problem-solving and communication skills.
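As a rough illustration of the window-function SQL mentioned above, the sketch below checks that each business key resolves to exactly one current row after an incremental or CDC load. The dwh.dim_customer table, its columns, and the is_current flag are assumptions made for the example only.

```python
# Illustrative window-function check for incremental / CDC loads: after changes
# from AWS DMS are applied, each business key should resolve to exactly one
# "current" row. Table and column names are hypothetical placeholders; pass any
# DB-API connection to Redshift (e.g., psycopg2).
DUPLICATE_CURRENT_ROWS_SQL = """
SELECT customer_id, cnt
FROM (
    SELECT
        customer_id,
        COUNT(*)     OVER (PARTITION BY customer_id) AS cnt,
        ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY updated_at DESC) AS rn
    FROM dwh.dim_customer
    WHERE is_current = TRUE
) ranked
WHERE rn = 1 AND cnt > 1;
"""


def assert_single_current_row_per_key(conn):
    """Fail if any business key has more than one current row."""
    with conn.cursor() as cur:
        cur.execute(DUPLICATE_CURRENT_ROWS_SQL)
        offenders = cur.fetchall()
    assert not offenders, f"Keys with multiple current rows: {offenders[:10]}"
```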
Good to Have (Plus Skills):
Experience with Snowflake, BigQuery, or Azure Synapse (multi-cloud exposure).
Knowledge of BI tools (Tableau, Power BI, QuickSight) for report validation.
Experience with data testing frameworks like Great Expectations, dbt, or Pytest.
Familiarity with Redshift performance tuning (distribution keys, sort keys, compression).
Exposure to agile methodology and tools like JIRA, Confluence.
Education:
Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
Job Type: Full-time
Work Location: In person