Skill: Azure Databricks + PySpark
Experience: 4 to 12 years
Location: AIA Noida
Job Summary
We are seeking a highly skilled Sr. Developer with 9 to 10 years of experience to join our dynamic team. The ideal candidate will have expertise in Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. This role is hybrid, allowing for a flexible work environment. The Sr. Developer will play a crucial role in developing and optimizing data solutions that drive business insights and innovation.
Responsibilities
- Develop and implement robust data solutions using Databricks SQL to enhance data processing capabilities.
- Optimize Databricks Delta Lake for efficient data storage and retrieval, ensuring high performance and scalability.
- Design and manage Databricks Workflows to automate data pipelines and improve operational efficiency.
- Utilize PySpark to process large datasets, enabling advanced analytics and data-driven decision-making.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Ensure data quality and integrity by implementing best practices in data governance and validation.
- Provide technical guidance and support to junior developers, fostering a culture of continuous learning and improvement.
- Monitor and troubleshoot data solutions to ensure seamless operations and minimal downtime.
- Stay updated with the latest industry trends and technologies to drive innovation and maintain a competitive advantage.
- Contribute to the development of data architecture and strategy to align with organizational goals.
- Document technical specifications and processes to facilitate knowledge sharing and collaboration.
- Engage in code reviews to maintain high standards of code quality and performance.
- Support the deployment and integration of data solutions into existing systems, ensuring compatibility and functionality.
Qualifications
- Possess a strong understanding of Databricks SQL and its application in data processing.
- Demonstrate expertise in managing and optimizing Databricks Delta Lake for large-scale data solutions.
- Have experience in designing and implementing Databricks Workflows for automated data pipelines.
- Show proficiency in PySpark for processing and analyzing large datasets.
- Exhibit excellent problem-solving skills and the ability to work collaboratively in a team environment.
- Display strong communication skills to effectively convey technical concepts to non-technical stakeholders.
- Maintain a proactive approach to learning and adapting to new technologies and methodologies.
Certifications Required
Databricks Certified Data Engineer Associate or equivalent certification in data engineering.