Role: Data Engineer
Project Duration: 6 months
Contractual Position
Responsibilities
Help build and evolve the core data platform, including pipelines and data warehouse layers
Design and maintain structured datasets to support reporting and analytics
Work with structured and semi-structured data across PostgreSQL, MySQL, and MongoDB
Develop and maintain ETL pipelines for both raw data and analytics-ready warehouse tables
Write and optimize complex queries to process billions of records efficiently
Perform data exploration and mining to uncover patterns, inconsistencies, or optimization opportunities
Write transformation and automation logic using Groovy or Python
Manage infrastructure across on-prem and AWS environments
Collaborate with engineers, analysts, and product teams to solve business problems with data
Define and enforce data validation, audit, and alerting mechanisms
Requirements
4–8 years of experience in data engineering or a related backend/data-heavy role
Strong SQL skills and deep understanding of query performance tuning
Proficient in Groovy or Python for scripting and automation
Hands-on experience with PostgreSQL, MySQL, and MongoDB at scale
Experience working with or building for a data warehouse (e.g., Redshift, Trino, PostgreSQL)
Comfortable working in cloud (AWS) and/or on-prem environments
Experience building and operating production data systems and pipelines
Strong problem-solving skills and a bias toward ownership and action
Ability to debug across systems and connect the dots in complex data flows
Nice to Have
Experience with Apache NiFi, Airflow, or custom orchestrators
Exposure to data mining, anomaly detection, or pattern recognition techniques
Familiarity with warehouse modeling concepts (fact/dim tables, reporting views)
Experience with tools like Trino, AWS Athena, Metabase, or similar query/BI platforms
Familiarity with observability practices (monitoring, alerting, logging)
Exposure to streaming systems (Kafka, Spark, Flink)
Experience supporting compliance or finance-focused data workflows
Comfort working in containerized setups (Docker/Kubernetes)
Job Type: Contractual / Temporary
Contract length: 6 months
Work Location: In person