Qureos

QA Engineer – AWS Data Warehouse (ETL/DWH Testing)

Lahore, Pakistan

About the Role:

We are looking for a QA Engineer with strong AWS Data Warehouse (DWH) expertise to ensure data quality, accuracy, and reliability across our cloud data platform. You will work closely with Data Engineers, BI Developers, and Business Analysts to design and execute test strategies for ETL pipelines, data transformations, and reporting systems built on AWS services such as Redshift, Glue, Athena, S3, and DMS.

Key Responsibilities:

Design, develop, and execute test plans, test cases, and test scripts for data warehouse and ETL processes.

Validate data extraction, transformation, and loading (ETL) processes between source systems and target AWS DWH (Redshift, S3, Athena).

Perform data quality, reconciliation, and validation testing (counts, duplicates, schema, nulls, transformations).

Test CDC (Change Data Capture) pipelines built with AWS DMS and Glue.

Write complex SQL queries in Redshift/Athena to validate data integrity and business rules.

Test partitioning, performance, and query optimization in Redshift and Athena.

Collaborate with developers and DevOps teams to integrate QA into CI/CD pipelines.

Create and maintain test automation frameworks for data validation (e.g., Python, Pytest, Great Expectations, or dbt tests).

Document defects clearly, work with the engineering team to resolve issues, and re-test fixes.

Ensure compliance with data governance, lineage, and security policies.
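The reconciliation and validation checks listed above (counts, duplicates, nulls) are typically automated as SQL assertions driven from Pytest. A minimal sketch of what such a test might look like — the table and column names (src_orders, tgt_orders, order_id) are hypothetical, and SQLite stands in for the real source and Redshift connections:

```python
# Sketch of automated ETL reconciliation checks in Pytest style.
# Table/column names are hypothetical; in a real suite the connections
# would point at the source system and the target AWS DWH (Redshift).
import sqlite3

def scalar(conn, sql):
    """Run a scalar SQL query and return its single value."""
    return conn.execute(sql).fetchone()[0]

def test_orders_reconciliation():
    conn = sqlite3.connect(":memory:")  # stand-in for real connections
    conn.executescript("""
        CREATE TABLE src_orders (order_id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5);
    """)
    # Row-count check: source and target must match after the load.
    assert (scalar(conn, "SELECT COUNT(*) FROM src_orders")
            == scalar(conn, "SELECT COUNT(*) FROM tgt_orders"))
    # Duplicate check: the business key must be unique in the target.
    assert scalar(conn, """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM tgt_orders
            GROUP BY order_id HAVING COUNT(*) > 1)
    """) == 0
    # Null check: mandatory columns must be populated.
    assert scalar(conn,
        "SELECT COUNT(*) FROM tgt_orders WHERE order_id IS NULL") == 0
```

Checks of this shape slot directly into a CI/CD pipeline as an ordinary Pytest run after each load.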

Required Skills & Experience:

3–6 years of experience in Data Warehouse / ETL QA.

Hands-on experience with AWS DWH services: Redshift, Glue (including the Glue Data Catalog), Athena, S3, and DMS.

Strong SQL skills for data validation (window functions, aggregations, joins).

Experience testing ETL pipelines (batch and incremental loads).

Familiarity with data lake concepts, partitioning, and schema evolution.

Experience with Python or Shell scripting for automation.

Knowledge of QA best practices in data engineering projects (functional, regression, performance testing).

Experience with Git and CI/CD pipelines (Jenkins, AWS CodePipeline, GitHub Actions, etc.).

Excellent problem-solving and communication skills.
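Window functions of the kind named above are the usual tool for CDC/deduplication checks: assert that each business key retains only its latest version. A hedged sketch — the table tgt_customers and its columns are hypothetical, and SQLite stands in for Redshift/Athena, both of which support ROW_NUMBER:

```python
# Window-function check for a CDC-style target table: count rows that are
# NOT the newest version of their customer_id. After deduplication this
# count should be zero. Table/column names are hypothetical.
import sqlite3

STALE_ROWS_SQL = """
SELECT COUNT(*)
FROM (
    SELECT ROW_NUMBER() OVER (
               PARTITION BY customer_id
               ORDER BY updated_at DESC) AS rn
    FROM tgt_customers
) AS ranked
WHERE rn > 1
"""

def stale_row_count(conn):
    """Rows older than the newest record per customer_id."""
    return conn.execute(STALE_ROWS_SQL).fetchone()[0]

# Demo with an in-memory database: customer 1 has two versions loaded.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tgt_customers (customer_id INTEGER, updated_at TEXT)")
conn.executemany("INSERT INTO tgt_customers VALUES (?, ?)",
                 [(1, "2024-01-01"), (1, "2024-02-01"), (2, "2024-01-15")])
print(stale_row_count(conn))  # prints 1: customer 1's older version remains
```

The same query pattern runs unchanged against Redshift or Athena via their respective drivers.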

Good to Have (Plus Skills):

Experience with Snowflake, BigQuery, or Azure Synapse (multi-cloud exposure).

Knowledge of BI tools (Tableau, Power BI, QuickSight) for report validation.

Experience with data testing frameworks like Great Expectations, dbt, or Pytest.

Familiarity with Redshift performance tuning (distribution keys, sort keys, compression).

Exposure to Agile methodology and tools such as JIRA and Confluence.

Education

Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field.

Job Type: Full-time

Ability to commute/relocate:

  • Lahore: Reliably commute or plan to relocate before starting work (Required)

Application Question(s):

  • As an SQA, how do you design test cases to validate data quality and transformations in an AWS Data Warehouse environment (e.g., Redshift, Glue, Athena)?
  • When you identify a defect in an ETL pipeline or reporting system, how do you document it and communicate it to developers for resolution?
  • Have you automated any data validation or testing processes (using Python, Pytest, or Great Expectations)? Can you share an example?
  • Do you have strong English communication skills?

Location:

  • Lahore (Required)

Work Location: In person

© 2025 Qureos. All rights reserved.