Role: Senior Data Engineer (Scala Specialist)
Location: Chennai
Work Mode: Hybrid
Employment Type: Full-time, Permanent
Overview:
We are seeking an experienced Senior Data Engineer with deep expertise in Scala and big data technologies to contribute to a cutting-edge project. The role focuses on designing and implementing high-performance data pipelines and processing frameworks, primarily on cloud platforms such as Databricks. The ideal candidate combines technical proficiency with the leadership skills to drive the team’s success.
Responsibilities:
- Architect, develop, and maintain large-scale data pipelines using Scala, Apache Spark, and PySpark.
- Collaborate closely with data scientists, analysts, and engineering teams to deliver scalable data solutions.
- Enhance pipeline performance and ensure reliability on cloud-based platforms, with a focus on Databricks.
- Provide mentorship and technical guidance to junior engineers, promoting best practices in data engineering.
- Conduct code reviews to maintain code quality, consistency, and scalability.
- Troubleshoot and resolve issues related to data processing and system performance.
Essential Skills:
- Expert-level proficiency in Scala programming (mandatory).
- Extensive experience working with Apache Spark and Databricks environments.
- Strong skills in PySpark and Python scripting for data transformation tasks.
- Proven track record of building and supporting robust, large-scale data processing workflows.
- Solid understanding of distributed systems and parallel computing concepts.
- Commitment to writing clean, efficient, and maintainable code.
Preferred Qualifications:
- Experience with major cloud platforms such as Azure, AWS, or Google Cloud Platform (GCP).
- Familiarity with ELT/ETL orchestration tools and frameworks.
- Knowledge of CI/CD pipelines tailored for data engineering workflows.
- Prior experience working in Agile development environments.
Pay: ₹3,000,000 - ₹3,500,000 per year
Experience:
- Scala programming: 5 years (Required)
- Apache Spark: 3 years (Required)
- PySpark: 3 years (Required)
- Spark on Databricks: 3 years (Required)
- Total experience: 10 years (Required)
Work Location: In person