Qureos


Quality Engineering Lead (Test Lead)

JOB_REQUIREMENTS

Hires in: Not specified
Employment Type: Not specified
Company Location: Not specified
Salary: Not specified

Project Role: Quality Engineering Lead (Test Lead)
Project Role Description: Leads a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Applies business and functional knowledge to develop end-to-end testing strategies through the use of quality processes and methodologies. Applies testing methodologies, principles, and processes to define and implement key metrics to manage and assess the testing process, including test execution and defect resolution.
Must-have skills: Data Warehouse ETL Testing
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: We are looking for an experienced Data and AI ETL Tester to validate end-to-end data engineering solutions built on Databricks using Python/PySpark. The ideal candidate will be responsible for testing complex data transformation pipelines, ensuring data quality across the data warehouse and aggregated layers, validating reporting outputs, and contributing to the automation of testing processes. This role will also focus on testing metadata-driven frameworks and Agentic AI solutions designed for automated data quality management, metadata validation, and intelligent test generation.

Roles & Responsibilities:

ETL and Data Pipeline Testing
- Validate ETL/ELT processes developed in Databricks (Python/PySpark) for correctness, performance, and scalability.
- Conduct data validation, reconciliation, and regression testing across staging, curated, and aggregated layers.
- Ensure consistency of data transformations between the data lake, warehouse, and reporting layers.
- Perform schema validation, business rule verification, and data lineage testing.

Data Warehouse and Reporting Validation
- Validate data aggregation logic and ensure data accuracy in analytical and reporting layers.
- Cross-check BI reports/dashboards (e.g., Power BI, Tableau) against source and transformed data.
- Test data models, measures, and KPIs to ensure reporting accuracy.

Data Quality and Metadata Testing
- Execute tests for data quality dimensions (accuracy, completeness, consistency, validity, timeliness, uniqueness).
- Validate metadata-driven frameworks that dynamically manage ETL logic and schema evolution.
- Test AI-driven or Agentic AI data quality frameworks, validating automation, intelligent rule generation, anomaly detection, and self-healing data flows.

Test Automation and Framework Development
- Develop and maintain automated data testing frameworks using Python, PySpark, or Databricks notebooks.
- Integrate automated tests into CI/CD pipelines (e.g., Azure DevOps, Jenkins, GitHub Actions).
- Implement reusable test scripts for data validation, metadata verification, and data quality monitoring.
- Contribute to test strategy, test planning, and execution for continuous integration and testing.

Collaboration and Documentation
- Work closely with data engineers, data scientists, BI developers, and AI solution architects to ensure data accuracy and reliability.
- Create and maintain test plans, test cases, test data, and defect logs.
- Support root cause analysis and provide actionable insights to improve data quality and testing efficiency.

Professional & Technical Skills:
- 7+ years of experience in ETL or data testing, with a strong understanding of data engineering principles.
- Experience in test automation using Python/PySpark and integrating with CI/CD pipelines.
- Strong expertise in data validation, data reconciliation, and testing large-scale data pipelines.
- Experience testing data warehouse aggregation layers and reporting outputs.
- Solid understanding of metadata-driven frameworks and data quality validation concepts.
- Familiarity with Agentic AI or AI-driven data quality automation tools.
- Hands-on experience with Databricks, Python, and PySpark.
- Strong SQL skills for data verification and troubleshooting.
- Working knowledge of data lakehouse architectures, Delta Lake, and data governance tools.
- Excellent analytical, communication, and problem-solving skills.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Large Language Models.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
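To illustrate the data reconciliation testing the role calls for, here is a minimal sketch in plain Python. In practice this would run as PySpark on Databricks; the function and dataset names (reconcile, staging, curated) are invented for the example.

```python
# Illustrative sketch only: compares two pipeline layers on row count and key
# coverage, the core of a reconciliation check between staging and curated data.

def reconcile(source_rows, target_rows, key):
    """Report count differences and keys dropped or invented between layers."""
    source_keys = {row[key] for row in source_rows}
    target_keys = {row[key] for row in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(source_keys - target_keys),
        "unexpected_in_target": sorted(target_keys - source_keys),
    }

# Hypothetical staging and curated layers; record id 2 is lost in between.
staging = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 5.5}, {"id": 3, "amt": 7.25}]
curated = [{"id": 1, "amt": 10.0}, {"id": 3, "amt": 7.25}]

report = reconcile(staging, curated, key="id")
print(report)  # flags id 2 as missing in the curated layer
```

A regression suite would run such checks for every layer boundary (staging to curated, curated to aggregated) on each pipeline run.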
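The data quality dimensions listed above (completeness, uniqueness, validity, and so on) are typically exercised as rule-based checks. A hypothetical sketch, with all column names and thresholds invented:

```python
# Illustrative rule-based data quality checks for three of the dimensions the
# posting lists: completeness, uniqueness, and validity.

def check_quality(rows, required, unique_key, valid_ranges):
    """Return (row_index, column, dimension) tuples for every violation found."""
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        # Completeness: required columns must be populated.
        for col in required:
            if row.get(col) in (None, ""):
                issues.append((i, col, "completeness"))
        # Uniqueness: the key column must not repeat.
        k = row.get(unique_key)
        if k in seen:
            issues.append((i, unique_key, "uniqueness"))
        seen.add(k)
        # Validity: numeric columns must fall inside their allowed range.
        for col, (lo, hi) in valid_ranges.items():
            v = row.get(col)
            if v is not None and not (lo <= v <= hi):
                issues.append((i, col, "validity"))
    return issues

rows = [
    {"id": 1, "age": 34},
    {"id": 1, "age": 151},   # duplicate id, out-of-range age
    {"id": 2, "age": None},  # missing age
]
issues = check_quality(rows, required=["id", "age"], unique_key="id",
                       valid_ranges={"age": (0, 120)})
```

A metadata-driven framework would load the rule parameters (required columns, key columns, valid ranges) from configuration rather than hard-coding them, which is what makes the checks reusable across tables.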
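The "strong SQL skills for data verification" requirement often amounts to diffing a reporting-layer aggregate against a recomputation from source. A self-contained sketch using stdlib sqlite3 (table and column names are invented; on Databricks this would be Spark SQL against Delta tables):

```python
# Verify an aggregated reporting table by recomputing it from source rows and
# diffing the two results in SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('EU', 100.0), ('EU', 50.0), ('US', 75.0);
    CREATE TABLE sales_agg (region TEXT, total REAL);
    INSERT INTO sales_agg VALUES ('EU', 150.0), ('US', 75.0);
""")

# Recompute the aggregate from source and keep only rows that disagree with
# the reporting layer (or are missing from it entirely).
mismatches = conn.execute("""
    SELECT s.region, s.total_src, a.total
    FROM (SELECT region, SUM(amount) AS total_src
          FROM sales GROUP BY region) s
    LEFT JOIN sales_agg a ON a.region = s.region
    WHERE a.total IS NULL OR ABS(a.total - s.total_src) > 1e-9
""").fetchall()

# An empty result means the aggregation layer matches its source.
assert mismatches == [], f"aggregation drift detected: {mismatches}"
```

The same pattern extends to cross-checking BI dashboard figures: export the dashboard's numbers, load them as a table, and join them against a recomputation from the transformed data.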



© 2025 Qureos. All rights reserved.