Qureos

Find The Right Job.

Data Engineer - AWS

Hyderabad

About Us

We empower enterprises globally through intelligent, creative, and insightful services for data integration, data analytics and data visualization.
Hoonartek is a leader in enterprise transformation, data engineering and an acknowledged world-class Ab Initio delivery partner.
Using centuries of cumulative experience, research and leadership, we help our clients eliminate the complexities & risks of legacy modernization and safely deliver big data hubs, operational data integration, business intelligence, risk & compliance solutions and traditional data warehouses & marts.
At Hoonartek, we work to ensure that our customers, partners and employees all benefit from our unstinting commitment to delivery, quality and value. Hoonartek is increasingly the choice for customers seeking a trusted partner of vision, value and integrity.

How We Work

Define, Design and Deliver (D3) is our in-house delivery philosophy. It’s culled from agile and rapid methodologies and focused on ‘just enough design’. We embrace this philosophy in everything we do, leading to numerous client success stories and indeed to our own success.
We embrace change, empowering and trusting our people and building long and valuable relationships with our employees, our customers and our partners. We work flexibly, even adopting traditional/waterfall methods where circumstances demand it. At Hoonartek, the focus is always on delivery and value.

Job Description

We are seeking a hands-on Data Engineer to build, enhance, and operate data pipelines and data products on AWS. The candidate should be experienced in the following:

  • Build and enhance ELT/ETL pipelines to ingest, transform, and curate data from multiple sources into clean, trusted datasets.
  • Develop and maintain data processing jobs using Spark and Python, with a focus on performance and reliability.
  • Implement and monitor data quality checks, reconcile issues, and partner with analysts to resolve data defects.
  • Translate requirements into technical tasks and deliver well-documented solutions, including runbooks and operational playbooks.
  • Support day-to-day operations of data pipelines—monitoring, incident triage, root-cause analysis, and meeting SLA targets.
  • Develop workflows and scheduling using MWAA (Airflow) and AWS-native services.
  • Collaborate effectively across time zones with onsite stakeholders and the India engineering team through clear communication and proactive status updates.
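To illustrate the data-quality responsibility above, here is a minimal, hypothetical sketch in Python of the kind of checks a pipeline might run before publishing curated data. The function names and sample records are illustrative assumptions, not part of the role description or any Hoonartek codebase.

```python
# Illustrative data-quality checks of the kind described above.
# All names and sample data here are hypothetical.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Example: run two simple checks over curated records before publishing.
records = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 2, "amount": 5.0},
]

null_rows = check_not_null(records, "amount")  # rows with missing amounts
dupe_ids = check_unique(records, "id")         # duplicate primary keys
```

In a production pipeline these checks would typically run as a task in an orchestrated workflow (e.g., an Airflow DAG on MWAA), with failures routed to monitoring and incident triage as described above.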

Job Requirements

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related technical field (or equivalent practical experience).
  • 7+ years of hands-on data engineering experience building and supporting production data pipelines.
  • Strong SQL skills (design, tuning, troubleshooting) and experience working with relational and analytical data stores.
  • 3+ years of coding/scripting experience with Python (Java/Scala a plus).
  • 3+ years of experience on AWS (e.g., S3, Glue, EMR, Lambda, IAM, CloudWatch, MWAA/Airflow).
  • 2+ years of experience with infrastructure-as-code and CI/CD practices (Terraform preferred).
  • 2+ years of experience with Spark; streaming experience with Kafka is a plus.
  • Experience leveraging AI tools (e.g., GenAI assistants) to improve productivity, with an understanding of secure and responsible usage.

© 2026 Qureos. All rights reserved.