Job Location: Kochi / Trivandrum
Experience
8+ Years
Job Purpose
- We are seeking an experienced Senior Data Engineer to lead the development of a scalable data ingestion framework that ensures high data quality and validation, and to design robust APIs for seamless data integration using AWS-based technologies.
Job Description / Duties and Responsibilities
- Architect, develop, and maintain an end-to-end data ingestion framework for extracting, transforming, and loading data from diverse sources
- Use AWS services (Glue, Lambda, EMR, ECS, EC2, Step Functions) to build scalable, resilient, automated data pipelines
- Develop and implement automated data quality checks, validation routines, and error-handling mechanisms
- Establish comprehensive monitoring, logging, and alerting systems for data quality issues
- Architect and develop secure, high-performance APIs for data services integration
- Create thorough API documentation and establish standards for security, versioning, and performance
- Work with business stakeholders, data scientists, and operations teams to understand requirements
- Participate in sprint planning, code reviews, and agile ceremonies
- Contribute to CI/CD pipeline development using GitLab
Job Specification / Skills and Competencies
- 5+ years of experience in data engineering, with a focus on analytical platform development
- Proficiency in Python and/or PySpark
- Strong SQL skills for ETL processes and large-scale data manipulation
- Extensive AWS experience (Glue, Lambda, Step Functions, S3)
- Familiarity with big data systems (AWS EMR, Apache Spark, Apache Iceberg)
- Database experience with DynamoDB, Aurora, Postgres, or Redshift
- Proven experience designing and implementing RESTful APIs
- Hands-on CI/CD pipeline experience (preferably GitLab)
- Agile development methodology experience
- Strong problem-solving abilities and attention to detail
- Excellent communication and interpersonal skills
- Ability to work independently and collaboratively
- Capacity to quickly learn and adapt to new technologies
- 8+ years of total experience
Any Additional Information/Specifics
- Bachelor’s/Master’s degree in Computer Science, Data Engineering, or a related field (preferred)
- Experience with additional AWS services (Kinesis, Firehose, SQS)
- Familiarity with data lakehouse architectures and modern data quality frameworks
- Experience with proactive data quality management in multi-cluster environments
- Adherence to Information Security Management policies and procedures