DE COE
Designation: Senior Analyst
Experience: 2 to 5 years
Location: Chennai, Tamil Nadu, India (CHN)
Job Description:
We are looking for a highly skilled and motivated Senior Analyst – Data Engineer with 2+ years of hands-on experience in AWS-based data solutions. The role involves designing, developing, and optimizing scalable data pipelines and architectures, along with data modelling, performance tuning, and mentoring junior engineers. This is a great opportunity to work in a cloud-native environment, solve complex data challenges, and grow into architectural and leadership roles.
Responsibilities:
1. Solution Architecture & Design
- Assist in designing and implementing scalable, secure, and cost-efficient data architectures on AWS.
- Work with senior architects and stakeholders to translate business needs into technical solutions.
- Integrate data from multiple sources into a centralized data platform using AWS-native services.
2. Data Modelling & Implementation
- Develop and maintain conceptual, logical, and physical data models.
- Build, test, and deploy ETL/ELT pipelines using AWS-native and open-source tools.
- Implement data governance, quality, and lineage frameworks across systems.
3. Performance Tuning & Optimization
- Optimize ETL pipelines, queries, and storage layers for scalability and cost-effectiveness.
- Leverage AWS services efficiently by fine-tuning compute, storage, and orchestration strategies.
- Set up monitoring, logging, and automated alerts to proactively identify performance bottlenecks.
4. Leadership & Mentoring
- Mentor junior engineers by providing technical guidance, conducting code reviews, and sharing best practices.
- Collaborate closely with cross-functional teams to ensure smooth delivery of data engineering projects.
- Support team leads and managers in strategic decision-making and architectural discussions.
Skills:
• 2 to 5 years in the Data Engineering domain.
• 2+ years of hands-on experience in AWS Data Engineering services, including:
  - Amazon S3, Glue, Athena, Redshift, Lambda, EMR, Kinesis, Step Functions
• Strong expertise in ETL/ELT pipeline design and implementation.
• Proficiency in SQL and at least one programming language (Python / PySpark).
• Solid understanding of data modelling, data warehousing, and schema design.
• Experience with cloud-based orchestration tools (AWS Glue, Step Functions, or Apache Airflow).
• Familiarity with version control (Git) and CI/CD practices for data pipelines.
• Exposure to data lakehouse architectures and tools like Databricks, Apache Iceberg, or Delta Lake.
• Experience working with real-time streaming frameworks like Kinesis or Kafka.
• Understanding of data governance, security, and compliance within AWS environments.
• Hands-on experience in Agile/Scrum delivery models.
• Interest in growing into architectural and leadership roles.