Qureos

Lead – Big Data and Cloud

Indore, Madhya Pradesh, India


Qualification:

We are seeking an experienced Technical Lead to design, develop, and lead Big Data and Cloud-based solutions. The ideal candidate will have strong expertise in distributed data processing, cloud-native architectures, and modern data technologies, along with proven leadership in guiding teams and delivering enterprise-grade solutions.


Key Responsibilities:

  • Lead end-to-end design and development of Big Data solutions using Spark, Kafka, and Python.
  • Architect and implement scalable data pipelines and real-time streaming applications (see the streaming sketch after this list).
  • Design and optimize data storage solutions using NoSQL and SQL databases.
  • Drive cloud-native development and deployment on GCP, including Cloud Storage and related services.
  • Implement workflow orchestration using Airflow for data pipelines.
  • Collaborate with cross-functional teams including DevOps, QA, and product stakeholders.
  • Ensure data security, scalability, and performance across platforms.
  • Mentor and guide development teams on best practices and coding standards.
  • Participate in architecture reviews and technical decision-making.
  • Implement CI/CD pipelines and ensure smooth deployments.
  • Deliver high-quality Big Data solutions within agreed timelines.
  • Maintain low-latency data processing and high system availability.
  • Optimize cloud resource utilization and cost efficiency.
  • Ensure compliance with security and governance standards.
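
As a rough illustration of the streaming work described above (not part of the role description itself), the sketch below shows a minimal Spark Structured Streaming job in Python that reads JSON events from a Kafka topic and writes them to Cloud Storage. The broker address, topic name, event schema, and bucket paths are assumed placeholders, and the job also needs the spark-sql-kafka connector available to Spark.

    # Illustrative only: a minimal Spark Structured Streaming job reading JSON
    # events from a Kafka topic and writing Parquet files to Cloud Storage.
    # Broker, topic, schema, and bucket names below are assumed placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    spark = SparkSession.builder.appName("events-stream").getOrCreate()

    # Hypothetical schema for the raw event payload.
    schema = StructType([
        StructField("event_id", StringType()),
        StructField("user_id", StringType()),
        StructField("event_time", TimestampType()),
    ])

    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker address
        .option("subscribe", "events")                      # assumed topic name
        .load()
    )

    # Decode the Kafka value bytes and parse the JSON payload into typed columns.
    events = (
        raw.select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Continuously write the parsed records to Cloud Storage with checkpointing.
    query = (
        events.writeStream.format("parquet")
        .option("path", "gs://example-bucket/events/")                     # assumed bucket
        .option("checkpointLocation", "gs://example-bucket/checkpoints/")  # assumed path
        .trigger(processingTime="1 minute")
        .start()
    )
    query.awaitTermination()

The checkpoint location is what lets the stream restart safely and keep exactly-once output to the file sink, which is the kind of low-latency, highly available processing the responsibilities call for.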

Required Skills & Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
  • 8 to 12 years of experience in Big Data and cloud-based solution development, with at least 3 years in a technical lead role.
  • Strong proficiency in Big Data technologies: Spark, Kafka, Hadoop.
  • Expertise in Python programming.
  • Hands-on experience with GCP cloud services including Cloud Storage.
  • Experience with Airflow for workflow orchestration (see the DAG sketch after this list).
  • Strong knowledge of SQL and NoSQL database design and optimization.
  • Familiarity with CI/CD pipelines, containerization (Docker), and Kubernetes.
  • Strong problem-solving, leadership, and communication skills.
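
To illustrate the Airflow requirement above, here is a minimal, hedged sketch of a daily DAG with two placeholder tasks. The DAG id, schedule, and task logic are assumptions for illustration; the schedule argument shown targets Airflow 2.4 and later (earlier releases use schedule_interval).

    # Illustrative only: a minimal Airflow DAG chaining two placeholder tasks
    # into a daily pipeline. DAG id, schedule, and task bodies are assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Placeholder extract step; a real task would pull data from a source system.
        print("extracting")


    def transform():
        # Placeholder transform step; a real task would trigger Spark or SQL jobs.
        print("transforming")


    with DAG(
        dag_id="example_daily_pipeline",   # assumed DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)

        # Run extract before transform.
        extract_task >> transform_task

In practice each task would call into Spark jobs or GCP services rather than printing, but the dependency chaining and scheduling shown here is the core orchestration pattern.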

Preferred Skills & Certifications:

  • GCP Professional Data Engineer or Cloud Architect certification.
  • Experience with data modeling and lakehouse architectures.
  • Knowledge of other cloud platforms (AWS, Azure).

Experience: 8 to 12 years

Job Reference Number: 13524

Skills Required: Big Data, Python, Data Engineering, Cloud

