
Job Requirements
Key Responsibilities:
- Architect and lead the development of scalable, high-availability batch and streaming data pipelines.
- Design and implement cloud-native data solutions and ETL/ELT workflows on AWS.
- Ensure data reliability, performance, security, and cost optimization.
- Lead architectural reviews, define data standards, and drive best practices.
- Mentor engineers and collaborate with cross-functional teams (Product, Analytics, DevOps).
Must-Have Skills:
- Strong Python programming for data engineering, automation, and analytics.
- Cloud expertise: AWS (Glue, S3, Lambda, Batch, IAM, ECR, SNS).
- CI/CD, code reviews, and software development lifecycle (SDLC) best practices.
- Experience with data pipelines, ETL/ELT, and batch/streaming processing.
Nice-to-Have Skills:
- Real-time data replication and change data capture (CDC) using Oracle GoldenGate.
- Event-driven streaming with Apache Kafka.
- Observability, monitoring, and alerting for large-scale data systems.
- Serverless architecture and containerization experience.
Job Type: Full-time
Work Location: Remote