7 - 9 Years
1 Opening
Bangalore
AWS Data Engineer
Role Overview:
We are seeking a highly skilled AWS Data Engineer to join our Cloud, Data & AI practice and play a key role in delivering cutting-edge, cloud-native data solutions for clients across industries. In this role, you will design and implement scalable, secure, and resilient data architectures on AWS that enable actionable insights and operational efficiencies. As an integral member of multi-disciplinary teams, you will work closely with business leaders, data scientists, and technologists to solve complex challenges and accelerate our clients' digital transformation journeys.
Your expertise in AWS cloud technologies, data pipelines, and analytics will help us uphold our reputation for delivering data solutions that drive sustainable business value and competitive advantage.
Key Responsibilities:
- Design, develop, and deploy robust, scalable, real-time AWS-based data solutions aligned with high standards and client business objectives.
- Architect and maintain secure, compliant data lake and warehouse solutions utilizing AWS services such as S3, Redshift, EMR, Lambda, Kinesis, and more.
- Build efficient ETL/ELT workflows that support diverse, complex data sources and enable timely data availability for analytics and reporting.
- Lead the creation and optimization of event-driven, streaming data pipelines that support advanced analytics, AI/ML models, and decision-making processes.
- Partner with consultants, data scientists, BI engineers, and client teams to translate business problems into effective technical solutions.
- Ensure adherence to our rigorous data governance, security, and privacy policies throughout the solution lifecycle.
- Implement automation and DevOps best practices within data engineering pipelines to enhance reliability, scalability, and deployment velocity.
- Stay abreast of AWS innovations and industry trends to continually bring new opportunities to our data practice and clients.
Preferred Qualifications:
- Proven experience building enterprise-level data engineering solutions on AWS, supporting large-scale data initiatives.
- Proficiency in Python and hands-on experience with data orchestration tools (Apache Airflow, AWS Step Functions, or equivalent).
- Solid hands-on experience with distributed data frameworks and streaming technologies such as Apache Spark, Kafka, and Hive.
- Hands-on expertise with the AWS ecosystem, including S3, Lambda, Kinesis, Redshift, EMR, Glue, and SQS.
- Strong data modeling skills and experience designing ETL/ELT pipelines for batch and real-time ingestion.
- Familiarity with CI/CD pipelines, version control, and infrastructure-as-code tools (CloudFormation, Terraform).
- Exceptional analytical skills with a proactive, problem-solving approach and a strong customer orientation.
- Excellent communication and collaboration skills suited for a consulting environment involving clients and internal teams.
Bachelor's or higher degree in Computer Science, Information Systems, Engineering, or a related field; AWS certification(s) highly desirable.
Skills: AWS, Data Engineering, Python, Apache Spark, Kafka, Hive, Lambda, S3
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.