Data Quality Engineer
Role Overview
We are looking for a Data Quality Engineer to design and implement robust data quality frameworks. The ideal candidate will ensure data accuracy, consistency, and reliability across our data platforms.
Key Responsibilities
- Develop and maintain data quality checks and validation frameworks
- Collaborate with data engineers and analysts to identify and resolve data issues
- Monitor data pipelines for anomalies and inconsistencies
- Implement data profiling and cleansing processes
- Ensure adherence to data governance and compliance standards
- Automate data quality reporting and alerting mechanisms
- Contribute to continuous improvement of data quality practices
Technologies & Skills
- Databricks (PySpark, Delta Lake, SQL)
- Snowflake
- Azure Data Factory
- Informatica
- Teradata
- CI/CD (GitHub Actions, Azure DevOps)
- Cloud Platforms (Azure, AWS, GCP)
- Python
- SQL
- Data Governance
- Data Modeling
- Streaming (Kafka, Event Hub)
- Unity Catalog
Nice-to-Have Skills
- Experience with ML pipelines (MLflow, Feature Store)
- Knowledge of visualization tools (Power BI, Tableau, Looker)
- Exposure to graph databases (Neo4j) or RAG/LLM pipelines
- Containerization (Docker, Kubernetes)
- Basic front-end knowledge (React/Angular)
- Microservices (FastAPI/Flask)
Qualifications & Experience
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field
- 5–8 years of experience in data engineering, data quality, or related roles
- Strong problem-solving and analytical skills
- Clear communication and documentation abilities
- Experience working in cross-functional teams
About Us
At Codvo, we are committed to building scalable, future-ready data platforms that power business impact. We believe in a culture of innovation, collaboration, and growth, where engineers can experiment, learn, and thrive. Join us to be part of a team that solves complex data challenges with creativity and cutting-edge technology.