JOB RESPONSIBILITIES:
Architect and manage a Spark-based Data Lake platform.
Design and govern:
1. Data ingestion (batch and streaming)
2. Transformation and enrichment pipelines
Use Airflow (or an equivalent tool) for orchestration and scheduling.
Integrate data from:
1. DMS and enterprise applications
2. Event streams (Kafka)
3. External and partner systems
Define data models and schemas optimized for:
1. Analytics & reporting
2. AI / ML use cases
3. Agentic workflows
Ensure data quality, lineage, performance, and scalability.
Work closely with Framework and Enterprise Architects to align platforms.
Required Skills & Experience
8–12 years in data engineering / data architecture.
Strong hands-on expertise in:
1. Spark-based data lakes
2. Airflow or similar orchestration tools
3. Large-scale ETL / ELT pipelines
Experience handling enterprise-scale, multi-source data.
Job Type: Contractual / Temporary
Contract length: 6 months
Pay: ₹568,796.46 - ₹1,900,000.00 per year
Work Location: Remote
© 2026 Qureos. All rights reserved.