Job Requirements
Experience Required: 8+ Years
Roles & Responsibilities:
· Lead the end-to-end architecture and development of a modern enterprise-scale data platform from the ground up.
· Collaborate with cloud and security architects to ensure platform scalability, performance, and compliance.
· Architect, design, and implement batch and real-time streaming data infrastructures and workloads.
· Build and maintain data lakehouse architectures within the GCP ecosystem.
· Design and develop connector frameworks and reusable data ingestion pipelines to source data from both on-premises and cloud systems.
· Architect and implement metadata management, including data catalogs, lineage, quality, and observability frameworks.
· Design and develop data quality frameworks and governance processes to ensure reliability and accuracy.
· Develop microservices-based components using Kubernetes, Docker, and Cloud Run to abstract platform and infrastructure complexities.
· Design and optimize data storage, transformation, and querying performance for large-scale datasets while ensuring cost efficiency.
· Implement observability tooling (Grafana, Datadog) and DataOps best practices, including CI/CD and test automation.
· Collaborate with data scientists and analysts to define data models, schemas, and advanced analytics capabilities.
· Drive deployment, release management, and platform scalability initiatives.
· Stay ahead of emerging data engineering trends, tools, and best practices to continuously evolve the platform.
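The "connector frameworks and reusable data ingestion pipelines" responsibility above can be sketched in miniature. This is an illustrative Python sketch only; the names (`Connector`, `CONNECTORS`, `register_connector`, `run_ingestion`) are assumptions for the example, not part of any specific framework, and a real platform would land the extracted records in a lakehouse raw zone rather than return them.

```python
from abc import ABC, abstractmethod
from typing import Callable, Dict, Iterable, List

# Hypothetical connector registry: each source system (on-prem database,
# SaaS API, cloud bucket) registers a connector that yields records in a
# common shape, so the ingestion pipeline itself stays reusable.
CONNECTORS: Dict[str, "Connector"] = {}

class Connector(ABC):
    @abstractmethod
    def extract(self) -> Iterable[dict]:
        """Yield raw records from the source system."""

def register_connector(name: str) -> Callable[[type], type]:
    def decorator(cls: type) -> type:
        CONNECTORS[name] = cls()  # instantiate and register by name
        return cls
    return decorator

@register_connector("orders_csv")
class OrdersCsvConnector(Connector):
    """Stand-in for an on-prem file source; real code would read files."""
    def extract(self) -> Iterable[dict]:
        yield {"order_id": 1, "amount": 120.0}
        yield {"order_id": 2, "amount": 75.5}

def run_ingestion(source: str) -> List[dict]:
    # Reusable pipeline: look up the connector by name and pull records;
    # a real platform would then write them to the raw/bronze layer.
    return list(CONNECTORS[source].extract())

if __name__ == "__main__":
    print(run_ingestion("orders_csv"))
```

The registry-plus-decorator pattern is one common way to keep per-source logic isolated while the orchestration layer (e.g. an Airflow DAG) stays generic.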
Skills:
· 8+ years of proven experience in modern cloud data engineering and enterprise data platform architecture.
· Demonstrated success in architecting and delivering large-scale greenfield data platform projects.
· Deep expertise in Google Cloud Platform (GCP) and its ecosystem: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, and Airflow.
· Strong understanding of streaming technologies such as Kafka or Pub/Sub.
· Hands-on experience with microservices architectures using Kubernetes, Docker, and Cloud Run.
· Proven ability to design semantic layers and metadata-driven architectures.
· Expertise in data modeling, data architecture, and data governance principles.
· Experience with observability and monitoring tools (Grafana, Datadog).
· Strong understanding of DataOps principles, including automation, CI/CD, and testing for data pipelines.
· Experience architecting secure, scalable, and high-performance data solutions.
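The DataOps and data quality items above can be made concrete with a small rule-based check, of the kind a CI test suite might run against pipeline output. This is a minimal sketch under assumed names (`Rule`, `validate`); it is not a specific framework's API.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    """A named predicate applied to each output row."""
    name: str
    check: Callable[[dict], bool]

def validate(rows: List[dict], rules: List[Rule]) -> List[str]:
    """Return human-readable violations; an empty list means clean data."""
    violations = []
    for i, row in enumerate(rows):
        for rule in rules:
            if not rule.check(row):
                violations.append(f"row {i}: failed {rule.name}")
    return violations

# Example rules and rows (illustrative data only).
rules = [
    Rule("amount_non_negative", lambda r: r.get("amount", 0) >= 0),
    Rule("order_id_present", lambda r: "order_id" in r),
]
rows = [{"order_id": 1, "amount": 10.0}, {"amount": -5.0}]

print(validate(rows, rules))
# → ['row 1: failed amount_non_negative', 'row 1: failed order_id_present']
```

Running such checks as a gated CI step, and exporting the violation counts to a monitoring tool like Grafana or Datadog, is one way the testing and observability requirements above fit together.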
Job Type: Contractual / Temporary
Contract length: 3 months
Pay: ₹130,000.00 - ₹160,000.00 per month
Experience: 8+ years
Work Location: Remote