<Job Summary>
Join our dynamic healthcare data team as an ETL Testing Specialist, where you'll play a pivotal role in ensuring the integrity, accuracy, and security of complex healthcare data pipelines. Your expertise will drive the quality assurance of data integration processes, supporting critical decision-making and patient care initiatives. This role offers the opportunity to work with cutting-edge technologies in a fast-paced environment dedicated to transforming healthcare through data excellence.
<Responsibilities>
- Design, develop, and execute comprehensive ETL testing strategies to validate data accuracy, completeness, and consistency across large-scale healthcare data warehouses.
- Collaborate with data engineers, analysts, and stakeholders to understand data flows, source systems, and business requirements for robust testing coverage.
- Validate data transformations performed by ETL tools such as Informatica, Talend, or custom scripts using SQL, Python, and shell scripting (Bash).
- Perform detailed analysis of data discrepancies using SQL queries on platforms like Microsoft SQL Server, Oracle, Apache Hive, and Azure Data Lake.
- Develop automated test scripts and frameworks leveraging Spark, Hadoop, and Big Data technologies to streamline testing processes.
- Conduct API testing for RESTful services to ensure seamless integration with external systems and data sources.
- Document test cases, results, and defects, and collaborate in Agile environments to continuously improve testing methodologies and data quality standards.
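The source-to-target validation duties above can be sketched as a minimal reconciliation test. This is an illustrative sketch only: the `claims` table, column names, and in-memory SQLite connections are hypothetical stand-ins for the SQL Server, Oracle, or Hive systems named in this posting.

```python
import sqlite3

def reconcile(source_conn, target_conn, table, key):
    """Compare row counts and a simple checksum between source and target."""
    q_count = f"SELECT COUNT(*) FROM {table}"
    src_count = source_conn.execute(q_count).fetchone()[0]
    tgt_count = target_conn.execute(q_count).fetchone()[0]

    # Sum of key values as a cheap, order-independent checksum.
    # (Illustrative only; a production test would hash full rows.)
    q_sum = f"SELECT COALESCE(SUM({key}), 0) FROM {table}"
    src_sum = source_conn.execute(q_sum).fetchone()[0]
    tgt_sum = target_conn.execute(q_sum).fetchone()[0]

    return {
        "counts_match": src_count == tgt_count,
        "checksums_match": src_sum == tgt_sum,
        "source_rows": src_count,
        "target_rows": tgt_count,
    }

# Demo: in-memory databases standing in for source and target systems,
# with one row dropped by the (hypothetical) ETL load.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE claims (claim_id INTEGER, amount REAL)")
src.executemany("INSERT INTO claims VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
tgt.executemany("INSERT INTO claims VALUES (?, ?)",
                [(1, 10.0), (2, 20.0)])

result = reconcile(src, tgt, "claims", "claim_id")
print(result)
```

In practice the same pattern runs against the actual source and warehouse connections, with the checksum replaced by a full-row hash or column-level aggregates.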
<Qualifications>
- Proven experience in ETL testing within healthcare or similar regulated industries, with a strong understanding of healthcare data standards and data models.
- Proficiency in SQL for database validation across platforms including Microsoft SQL Server, Oracle, and Big Data environments like Hadoop or Spark.
- Hands-on experience with ETL tools such as Informatica or Talend for designing and validating complex workflows.
- Strong programming skills in Python and familiarity with shell scripting (Bash) for automation tasks.
- Knowledge of cloud platforms such as AWS or Azure Data Lake for scalable data storage and processing solutions.
- Experience working with big data technologies including Hadoop ecosystem components like Apache Hive and Spark.
- Familiarity with analytics tools such as Looker for reporting and visualization purposes.
- Understanding of Agile development methodologies to support iterative testing cycles.
- Excellent analysis skills for troubleshooting data issues and performing root cause analysis.
- Ability to design efficient database schemas and optimize queries for performance tuning in large-scale environments.

Embrace this opportunity to be at the forefront of healthcare innovation by ensuring the highest standards of data quality through rigorous ETL testing practices!
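The troubleshooting and root-cause-analysis skills listed above often start with basic column profiling. The sketch below is a hypothetical example, not part of the role's toolchain: the `mrn` column and sample records are invented to show null and duplicate detection on an extracted dataset.

```python
from collections import Counter

def profile_column(rows, column):
    """Basic data-quality profile: null count and duplicated values for one column."""
    values = [row.get(column) for row in rows]
    null_count = sum(1 for v in values if v is None)
    counts = Counter(v for v in values if v is not None)
    duplicates = sorted(v for v, n in counts.items() if n > 1)
    return {
        "total": len(values),
        "null_count": null_count,
        "duplicates": duplicates,
    }

# Hypothetical extract with one missing and one duplicated record identifier.
records = [
    {"mrn": "A100"},
    {"mrn": "A101"},
    {"mrn": None},
    {"mrn": "A100"},
]
report = profile_column(records, "mrn")
print(report)
```

A report like this narrows a discrepancy to specific keys, which can then be traced back through the transformation steps that produced them.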
Job Type: Contract
Pay: $26.82 - $52.30 per hour
Experience:
- Azure Databricks: 5 years (Required)
- Azure Data Factory: 5 years (Required)
- Python: 5 years (Required)
Work Location: In person