- Strong proficiency in Python and SQL for data manipulation and processing.
- Experience with data warehouse solutions such as Snowflake, BigQuery, and Databricks.
- Ability to design and implement efficient data models for data lakes and warehouses.
- Familiarity with CI/CD pipelines and automation tools to streamline data engineering workflows.
- Deep understanding of data warehousing and cloud architecture principles for building efficient, scalable data systems.
- Experience with Apache Airflow and/or AWS MWAA.
- Experience with Snowflake’s distinctive features, including its multi-cluster architecture and data sharing capabilities.
- Expertise in distributed processing frameworks such as Apache Spark or other big data technologies is a plus.
Job Function
IT INFRASTRUCTURE SERVICES
Desired Skills
AWS | Java | Machine Learning | Python | SQL | API Microservices | ETL Testing | Oracle GCC | Webservices
Desired Candidate Profile
Qualifications: BACHELOR OF ENGINEERING