
Technical Lead-Data Engg


Country/Region: IN
Requisition ID: 31245
Location: INDIA - PUNE - BIRLASOFT OFFICE - HINJAWADI

Title: Technical Lead-Data Engg

Description:

Area(s) of responsibility

JD - AWS Data Platform Framework Development Engineer

Key Responsibilities
Design and develop reusable and scalable data processing frameworks and libraries for data ingestion, processing, and ETL pipelines on AWS alongside the platform development team.
Collaborate closely with framework developers, data engineers, architects, and analysts to standardize data pipelines and processing patterns.
Develop and enhance Debezium Kafka CDC pipeline frameworks to enable rapid instantiation of CDC data ingestion workflows.
Build and maintain AWS Glue PySpark job frameworks aligned with medallion architecture principles.
Implement and maintain ETL frameworks for loading data into Snowflake.
Develop Infrastructure as Code using Terraform and GitHub to automate provisioning and deployment of platform components.
Ensure platform reliability, scalability, and observability.
Contribute to improving development standards, code reviews, and best practices focused on framework and platform engineering.
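The responsibilities above center on reusable, scalable processing frameworks layered along medallion (bronze/silver/gold) lines. As an illustrative sketch only, not Birlasoft's actual framework, a minimal Python pattern for such a reusable pipeline might look like this (class and field names are hypothetical):

```python
from abc import ABC, abstractmethod

class MedallionPipeline(ABC):
    """Reusable skeleton: raw (bronze) -> cleaned (silver) -> curated (gold).
    Subclasses supply only the dataset-specific steps."""

    def run(self, raw_records):
        bronze = list(raw_records)  # land raw data unchanged (bronze layer)
        # silver layer: per-record cleaning; None means "drop this record"
        silver = [r for r in (self.clean(rec) for rec in bronze) if r is not None]
        return self.curate(silver)  # gold layer: business-level output

    @abstractmethod
    def clean(self, record):
        """Validate/standardize one record; return None to discard it."""

    @abstractmethod
    def curate(self, records):
        """Produce the gold-layer output from cleaned records."""

class OrdersPipeline(MedallionPipeline):
    """Hypothetical dataset-specific pipeline built on the shared skeleton."""

    def clean(self, record):
        if record.get("amount") is None:
            return None
        return {"order_id": record["id"], "amount": float(record["amount"])}

    def curate(self, records):
        return {"total_amount": sum(r["amount"] for r in records)}
```

In a real Glue deployment the list comprehensions would be PySpark DataFrame transformations, but the framework idea is the same: the base class owns the layering, and each new ingestion only implements the dataset-specific hooks.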
Required Skills & Experience
Master's degree in Software Engineering, Computer Science, or an equivalent field.
AWS certifications (Solutions Architect Associate, Developer Associate, Data Engineer Associate).
Strong software engineering background with expertise in Python, especially PySpark.
Thorough understanding of, and hands-on experience with, Kafka and Kafka Connect concepts.
Proven track record developing reusable frameworks or libraries focusing on scalability and maintainability.
Sound understanding and practical application of OOP principles (encapsulation, inheritance, polymorphism, abstraction) and SOLID design principles.
Hands-on experience with AWS services including Glue, ECS, S3, Kafka (including Debezium), and Snowflake.
Experience building and orchestrating data pipelines using Airflow or similar tools.
Proficient in Infrastructure as Code using Terraform.
Familiarity with CI/CD workflows using GitHub or similar platforms.
Strong problem-solving skills and ability to write clean, modular, and well-documented code.
Excellent communication skills and ability to work collaboratively in an international team of highly skilled IT engineers.
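The OOP and SOLID expectations above can be made concrete with a small, purely illustrative Python sketch (the class names are hypothetical, not part of any named platform): pipelines depend on an abstract sink rather than a concrete target (dependency inversion), and new targets are added without modifying callers (open/closed).

```python
from abc import ABC, abstractmethod

class Sink(ABC):
    """Abstraction: loading code depends on this interface, not on a
    concrete target such as Snowflake or S3."""

    @abstractmethod
    def write(self, rows):
        ...

class InMemorySink(Sink):
    """Encapsulation: the storage detail is private; callers only use
    write() and the read-only count property."""

    def __init__(self):
        self._rows = []

    def write(self, rows):
        self._rows.extend(rows)

    @property
    def count(self):
        return len(self._rows)

class LoggingSink(Sink):
    """Polymorphism: any Sink can wrap or stand in for another."""

    def __init__(self, inner: Sink):
        self.inner = inner
        self.written = 0

    def write(self, rows):
        self.written += len(rows)
        self.inner.write(rows)

def load(sink: Sink, rows):
    sink.write(rows)  # works unchanged with every Sink implementation
```

Swapping `InMemorySink` for a real warehouse sink would not touch `load` at all, which is the property that makes a framework reusable across pipelines.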
Preferred Qualifications
Experience with Iceberg or other open table formats in a data lakehouse environment.
Prior experience working on CDC (Change Data Capture) pipelines or Kafka streaming frameworks.
Experience with big data processing frameworks is considered a plus.
Understanding of medallion architecture and data lakehouse design patterns.
Multiple years of application development experience with an object-oriented programming language such as Java, C++, or C#.
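For the CDC experience mentioned above: Debezium change events carry an `op` field (`c` create, `u` update, `d` delete, `r` snapshot read) plus `before`/`after` row images. The helper below is a simplified, hypothetical sketch of applying such events to an in-memory table; real Debezium messages wrap these fields in a payload envelope with schema metadata, and this sketch assumes a primary-key column named `id`.

```python
def apply_cdc_event(state: dict, event: dict) -> dict:
    """Apply one simplified Debezium-style change event to an in-memory table
    keyed by primary key. Illustrative only."""
    op = event["op"]
    if op in ("c", "r", "u"):          # create, snapshot read, update
        row = event["after"]           # the post-change row image
        state[row["id"]] = row
    elif op == "d":                    # delete: only 'before' is populated
        state.pop(event["before"]["id"], None)
    return state
```

A CDC ingestion framework generalizes exactly this logic so that a new source table becomes a configuration entry rather than new code.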

© 2025 Qureos. All rights reserved.