
Job Description

About the role:

We’re looking for a skilled Data Engineer with deep Snowflake expertise to help modernize and scale our data platform. If you thrive in a fast-moving environment, can wrangle messy pipelines, and want to build the backbone of a cloud-first data strategy, this role is for you. You’ll work across legacy and modern systems to deliver reliable, high-quality data to customers.

Responsibilities:

  • Design, build, and maintain scalable and efficient data pipelines to support analytics, reporting, and operational use cases.
  • Collaborate closely with product owners, analysts, and data consumers to translate business requirements into reliable data solutions.
  • Develop and maintain data integration workflows across both cloud-native and on-premises systems.
  • Champion best practices in data architecture, modeling, and quality assurance to ensure accuracy and performance.
  • Participate in sprint planning, daily stand-ups, and retrospectives as an active member of a cross-functional agile team.
  • Identify and remediate technical debt across legacy pipelines and contribute to the modernization of the data platform.
  • Implement robust monitoring and alerting for pipeline health, data quality, and SLA adherence.
  • Write and maintain documentation for data flows, transformations, and system dependencies.
  • Contribute to code reviews and peer development to foster a collaborative and high-quality engineering culture.
  • Ensure adherence to security, privacy, and compliance standards in all data engineering practices.

Skills & Qualifications:

  • 5+ years of professional experience in data engineering, analytics engineering, or related fields
  • Bachelor’s degree in computer science or an equivalent field
  • Advanced SQL skills, including performance tuning and query optimization
  • Expertise in Snowflake, including data warehousing concepts, architecture, and best practices
  • Experience with modern data transformation tools (e.g., dbt)
  • Experience building and maintaining automated ETL/ELT pipelines with a focus on performance, scalability, and reliability
  • Proficiency with version control systems (e.g., Git), experience working within CI/CD pipelines, and familiarity with environments managed through infrastructure-as-code
  • Experience writing unit and integration tests for data pipelines
  • Familiarity with data modeling techniques (e.g., dimensional modeling, star/snowflake schemas)
  • Experience with legacy on-premises databases such as Microsoft SQL Server is preferred
  • Exposure to cloud platforms (e.g., AWS, Azure, GCP), cloud-native data tools, and data federation tools is a plus
  • Experience with SQL Server Reporting Services (SSRS) is beneficial

Duration: 6 months, contract-to-hire (80% likelihood of extension)

Availability: Immediate joiner / within 7 days

Location: Pune (Remote)

Job Type: Contractual / Temporary
Contract length: 6 months

Pay: ₹55,000.00 - ₹70,000.00 per month

Application Question(s):

  • How soon can you join? (Mention date)
  • What is your current CTC?
  • What is your expected CTC?

Experience:

  • Snowflake: 5 years (Required)
  • dbt: 5 years (Required)
  • ETL: 5 years (Required)
  • Python: 5 years (Required)
  • SQL: 5 years (Required)

Work Location: In person
