Requirements:
- Strong hands-on experience with Databricks and Apache Airflow for data orchestration.
- Expertise in PySpark, SQL, and Python for large-scale data processing.
- Proven experience in building and maintaining scalable data pipelines in a cloud environment.
- Experience leading and mentoring a technical team.
- Strong problem-solving, communication, and collaboration skills.
Job Function
IT INFRASTRUCTURE SERVICES
Desired Skills
Python | Spark | SQL | Apache Airflow
Desired Candidate Profile
Qualifications: Bachelor of Engineering