Key Skills: Python, Terraform, Iceberg, Lambda; experience in data implementation and medallion architecture
Job Location: Hyderabad, Pune, and Coimbatore
Mode of Work: Work From Office
Experience: 4-8 Years
About the job:
We are looking for a Senior Data Engineer who will be responsible for building AWS data pipelines to requirements. The ideal candidate has strong analytical, design, and problem-solving skills, can translate stakeholders' requirements into proposed solutions for customer review, and can discuss the pros and cons of different solution designs and optimization strategies.
Know your team:
At ValueMomentum Technology Solution Centers, we are passionate software engineering teams who thrive on tackling complex business challenges with innovative solutions catering to the P&C insurance value chain. We achieve this through a strong software engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Platforms, Infra/Cloud, Application, Data, Core, and Quality Assurance.
Join our team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms. You'll have the opportunity to work with some of the best minds serving insurance customers in the US, UK and Canadian markets.
Responsibilities:
- Provide technical and development support to clients to build and maintain data pipelines.
- Develop data mapping documents listing business and transformational rules.
- Develop, unit test, deploy, and maintain data pipelines.
- Design a storage layer for storing tabular, semi-structured, and unstructured data.
- Design pipelines for batch/real-time processing of large data volumes (a minimal sketch follows this list).
- Analyze source specifications and build data mapping documents.
- Identify and document applicable non-functional code sets and reference data across insurance domains.
- Understand profiling results and validate data quality rules.
- Utilize data analysis tools to construct and manipulate datasets to support analyses.
- Collaborate with and support Quality Assurance (QA) in building functional scenarios and validating results.
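To give a concrete flavor of the pipeline work above, here is a minimal batch-pipeline sketch in PySpark: raw JSON lands in a bronze S3 zone, basic data-quality rules are applied, and a cleaned, partitioned silver layer is written back to S3. All bucket names, paths, and column names are hypothetical, and the job assumes a Spark runtime with S3 access (e.g., AWS Glue or EMR).

# Minimal batch-pipeline sketch; bucket names, paths, and columns are
# hypothetical, and a Spark runtime with S3 access (Glue/EMR) is assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-batch-pipeline").getOrCreate()

# Bronze: raw, semi-structured JSON exactly as it lands from source systems.
raw = spark.read.json("s3://example-landing-zone/claims/")

# Silver: apply basic data-quality rules and standardize types.
cleaned = (
    raw.dropDuplicates(["claim_id"])
       .filter(F.col("claim_amount").isNotNull())
       .withColumn("claim_date", F.to_date("claim_date", "yyyy-MM-dd"))
)

# Persist the curated layer, partitioned for downstream Athena/Redshift queries.
(cleaned.write
    .mode("overwrite")
    .partitionBy("claim_date")
    .parquet("s3://example-curated-zone/claims/"))

In a full medallion architecture, a further gold layer would aggregate the silver tables for reporting, and a table format such as Iceberg can replace plain Parquet to add ACID semantics and schema evolution.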
Requirements:
- 4+ years of experience developing and maintaining modern ingestion pipelines using technologies such as AWS pipelines, Lambda, Spark, and Apache NiFi.
- Basic understanding of the MLOps lifecycle (data prep -> model training -> model deployment -> model inference -> model retraining).
- Able to design data pipelines for batch/real-time processing using Lambda, Step Functions, API Gateway, SNS, and S3 (see the sketch after this list).
- Must have experience in data modernization projects, moving data from legacy systems to cloud-based technology.
- Hands-on experience with AWS and its native components such as S3, Athena, Redshift, and Jupyter notebooks.
- Requirements gathering: active involvement in requirements discussions with project sponsors, definition of project scope and delivery timelines, and design and development.
- Strong in Spark (Scala and Python) pipelines, both ETL and streaming.
- Strong experience with metadata management tools such as AWS Glue.
- Strong experience coding in languages such as Java, Python, and PySpark.
- Good to have: AWS Developer certification.
- Good to have: experience with Postman for API testing and with Apache Airflow or similar schedulers.
- Experience working with cross-functional teams to meet strategic goals.
- Experience in high-volume data environments and with medallion architecture.
- Critical thinking and excellent verbal and written communication skills.
- Strong problem-solving and analytical abilities; able to work and deliver independently.
- Good knowledge of data warehousing concepts.
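As a rough illustration of the Lambda/S3/SNS pattern named above, the sketch below shows one event-driven pipeline step: a Lambda handler fires on an S3 "object created" event, runs a basic validation, and publishes a message to SNS for downstream consumers. The handler, bucket contents, and topic ARN are all hypothetical.

# Hypothetical Lambda handler: triggered by an S3 "object created" event,
# it validates the new object and notifies downstream consumers via SNS.
import json
import os

import boto3

sns = boto3.client("sns")
s3 = boto3.client("s3")

# Hypothetical topic ARN, supplied through the Lambda environment configuration.
TOPIC_ARN = os.environ["PIPELINE_TOPIC_ARN"]

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Basic validation: skip empty objects rather than propagating them.
        head = s3.head_object(Bucket=bucket, Key=key)
        if head["ContentLength"] == 0:
            continue

        # Fan out a small JSON message to downstream processors
        # (e.g., a Step Functions trigger or an SQS subscriber).
        sns.publish(
            TopicArn=TOPIC_ARN,
            Message=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"statusCode": 200}

Publishing a small pointer message rather than the object itself keeps the fan-out cheap; downstream consumers fetch the data from S3 on demand.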
Good To Have:
Candidates with the following additional skills on their resumes have an added advantage when their profiles are assessed:
- Experience with Lambda, Step Functions, API Gateway, SNS, S3 (unstructured data), DynamoDB (semi-structured data), Aurora PostgreSQL (tabular data), Amazon SageMaker, AWS CodeCommit/GitLab, AWS CodeBuild, AWS CodePipeline, AWS ECR, SQS, EventBridge, CloudWatch, Splunk, Boto3, and pytest (a brief test sketch follows this list).
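Because Boto3 and pytest both appear in this list, here is a brief sketch of how AWS-facing pipeline code is commonly unit-tested with botocore's response stubber, so no real AWS calls are made. The helper function and its values are hypothetical.

# Hypothetical unit test: botocore's Stubber fakes the AWS response so the
# pipeline logic can be exercised without real S3 access or credentials.
import boto3
from botocore.stub import Stubber

def object_is_empty(s3_client, bucket, key):
    """Return True when the S3 object has zero bytes (hypothetical helper)."""
    head = s3_client.head_object(Bucket=bucket, Key=key)
    return head["ContentLength"] == 0

def test_object_is_empty():
    s3 = boto3.client("s3", region_name="us-east-1")
    stubber = Stubber(s3)
    # Queue the canned head_object response the helper should receive.
    stubber.add_response(
        "head_object",
        {"ContentLength": 0},
        {"Bucket": "example-bucket", "Key": "claims/empty.json"},
    )
    with stubber:
        assert object_is_empty(s3, "example-bucket", "claims/empty.json")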
About the Company:
ValueMomentum is a leading solutions provider for the global property and casualty insurance industry. The company helps insurers stay ahead with sustained growth and high performance, enhancing stakeholder value and fostering resilient societies. Having served over 100 insurers, ValueMomentum is one of the largest services providers exclusively focused on the insurance industry.
Benefits:
We at ValueMomentum offer you a congenial environment in which to work and grow alongside experienced professionals. Some of the benefits available to you are:
- Competitive compensation package.
- Career Advancement: Individual Career Development, coaching and mentoring programs for professional and leadership skill development.
- Comprehensive training and certification programs.
- Performance Management: goal setting, continuous feedback, and year-end appraisals, with rewards and recognition for extraordinary performers.