JOB REQUIREMENTS
Hires in: Not specified
Employment Type: Not specified
Company Location: Not specified
Salary: Not specified
Job Description:
JOB RESPONSIBILITIES:
- Architect and manage the Data Lake (Spark-based) platform.
- Design and govern:
  - Data ingestion (batch and streaming)
  - Transformation and enrichment pipelines
- Use Airflow (or equivalent) for orchestration and scheduling (see the DAG sketch after this list).
- Integrate data from:
  - DMS and enterprise applications
  - Event streams (Kafka)
  - External and partner systems
- Define data models and schemas optimized for:
  - Analytics & reporting
  - AI / ML use cases
  - Agentic workflows
- Ensure data quality, lineage, performance, and scalability.
- Work closely with Framework and Enterprise Architects to align platforms.
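To make the orchestration and ingestion responsibilities above concrete, here is a minimal sketch of a daily Airflow DAG that submits two Spark jobs in sequence. It assumes Airflow 2.x with the Apache Spark provider installed; the DAG id, connection id, and job paths are hypothetical placeholders, not part of the posting.

# Minimal sketch: a daily Airflow DAG submitting two Spark jobs in sequence.
# Assumes apache-airflow-providers-apache-spark is installed; all ids and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_batch_ingestion",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Land raw batch extracts (e.g. DMS exports) into the lake's raw zone.
    ingest = SparkSubmitOperator(
        task_id="ingest_raw_extracts",
        application="/opt/jobs/ingest_raw.py",        # hypothetical Spark job
        conn_id="spark_default",
    )

    # Transform and enrich the raw data into analytics-ready tables.
    enrich = SparkSubmitOperator(
        task_id="transform_and_enrich",
        application="/opt/jobs/transform_enrich.py",  # hypothetical Spark job
        conn_id="spark_default",
    )

    ingest >> enrich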
Required Skills & Experience
- 8–12 years in data engineering / data architecture.
- Spark-based data lakes (see the streaming ingestion sketch after this list)
- Airflow or similar orchestration tools
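The Spark data lake and Kafka event-stream items above can be illustrated with Spark Structured Streaming. Below is a minimal sketch, assuming PySpark with the spark-sql-kafka connector on the classpath; the broker address, topic name, and lake paths are hypothetical placeholders.

# Minimal sketch: stream Kafka events into the data lake as Parquet.
# Assumes the spark-sql-kafka package is available; broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_ingestion_sketch").getOrCreate()

# Read an event stream from Kafka (e.g. the event/partner topics named in the JD).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast the payload to string before enrichment.
parsed = events.select(col("key").cast("string"), col("value").cast("string"))

# Land the stream in the lake's raw zone with a checkpoint for restartable writes.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/lake/raw/events")                # hypothetical lake path
    .option("checkpointLocation", "/lake/_chk/events") # hypothetical checkpoint path
    .start()
)
query.awaitTermination()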
Job Types: Full-time, Contractual / Temporary
Contract length: 6 months
Pay: ₹905,950.87 - ₹2,446,378.97 per year
Work Location: In person