Job Requirements
Hires in: Not specified
Employment Type: Not specified
Company Location: Not specified
Salary: Not specified
Key Result Areas and Activities:
Technology Assessment and Design
Study the existing technology landscape, understand current data integration frameworks, and perform impact assessments for new requirements.
Should be able to design complex Big Data use cases using AWS services under the guidance of the Architect.
Should be able to assist the Architect in reasoning about various architecture choices in terms of cost, performance, and durability.
Should be able to suggest optimizations in existing designs.
Ensure optimal balance between cost and performance.
Documentation and Stakeholder Communication
Maintain project documentation and adhere to quality guidelines and schedules.
Work hand in hand with the Architect and PM to deliver the project successfully, and assist with estimation, scoping, and scheduling.
Articulate design decisions clearly to stakeholders.
Perform proofs of concept and document all observations before proposing a new solution.
Conduct design review sessions with other teams and identify areas for improvement.
Process Improvement and Automation
Suggest automation to improve existing processes.
Assist junior Data Engineers by providing expert advice or troubleshooting steps whenever required.
Keep suggesting new ways to improve team productivity.
Training and Knowledge Sharing
Create technology-focused training plans whenever required.
Deliver technology-focused training sessions to team members whenever required.
Conduct Expert Knowledge Sharing sessions with Client Stakeholders whenever required.
Assist in designing case study documents whenever required.
Must-Have:
In-depth knowledge of the following AWS services: S3, EC2, EMR, Athena, AWS Glue, Lambda
Experience with at least one MPP database: AWS Redshift, Snowflake, SingleStore
Proficiency in Big Data technologies: Apache Spark, Databricks
Must have strong programming skills in Python
Responsible for building data pipelines in AWS and Databricks
Experience with Big Data table formats, such as Delta Lake (open source)
Must have very strong SQL skills
Experience with orchestration tools like Apache Airflow (a minimal DAG sketch follows this list)
Expertise in developing ETL workflows with complex transformations such as SCD handling, deduplication, and aggregation (a minimal PySpark sketch follows this list)
Should be a quick, self-directed learner, ready to adapt to new AWS services or Big Data technologies as required
Strong understanding of data warehousing concepts
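As an illustration of the deduplication and aggregation transformations referred to above, here is a minimal PySpark sketch. The S3 paths and the column names (order_id, updated_at, order_ts, customer_id, amount) are hypothetical placeholders, not details taken from this posting.

# Minimal PySpark sketch: window-based deduplication followed by a daily aggregation.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Hypothetical source: raw order events landed in S3.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Deduplication: keep only the latest record per order_id.
latest_first = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
deduped = (
    orders
    .withColumn("rn", F.row_number().over(latest_first))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

# Aggregation: daily revenue and order count per customer.
daily_revenue = (
    deduped
    .groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Hypothetical curated-zone target.
daily_revenue.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_revenue/")

The same pattern extends naturally to SCD handling, for example by merging the deduplicated frame into a Delta Lake dimension table with a MERGE on Databricks.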
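Similarly, here is a minimal Apache Airflow sketch of the kind of orchestration mentioned above, assuming Airflow 2.4 or later; the DAG id, schedule, and task bodies are hypothetical placeholders.

# Minimal Airflow sketch: a three-step extract -> transform -> load DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw files (e.g. from S3) to be processed.
    print("extracting raw data")


def transform():
    # Placeholder: submit the Spark/Databricks job that runs the transformations.
    print("running transformations")


def load():
    # Placeholder: publish curated tables (e.g. to Redshift or a Delta Lake table).
    print("loading curated data")


with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load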
Good to Have:
Cloud Databases – Snowflake, AWS Aurora
Big Data – Hadoop, Hive
Associate Level or Professional Level AWS Certification
Qualifications:
Overall 5+ years of IT experience
5+ years of relevant experience in AWS-related projects
Bachelor’s degree in computer science, engineering, or a related field (master's degree is a plus)
Demonstrated continued learning through one or more technical certifications or related methods.
Experience: 5 to 7 years
Location: India