Qureos



JOB_REQUIREMENTS

Hires in

Not specified

Employment Type

Not specified

Company Location

Not specified

Salary

Not specified

Please note this posting is to advertise potential job opportunities. This exact role may not be open today but could open in the near future. When you apply, a Cisco representative may contact you directly if a relevant position opens.

Meet the Team
The Webex Data Analytics Platform (WAP) engineering organization is responsible for developing the big data platform at Webex. The platform forms the base on which other teams, including the WAP team, build their analytics and data pipelines.

We are looking for a hands-on Data Engineer to join a high-impact data engineering team, with a focus on developing the data platform and working with the product team to build data and analytics pipelines.

Your Impact

  • Work with and enhance big-data-related open-source technologies.
  • Build and operate Kafka-based streaming applications, including ingestion, filtering, enrichment, and replication use cases.
  • Develop data processing jobs using Apache Spark and/or Apache Flink, following established patterns and platform guidelines.
  • Work with data lake technologies (Apache Iceberg) to manage large analytical datasets.
  • Tune jobs for performance, scalability, and cost efficiency.
  • Write clean, testable code and participate actively in code reviews.
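
As context for the responsibilities above, the filter-and-enrich stage of a streaming pipeline can be sketched in plain Python. The event schema, field names, and metadata lookup below are hypothetical examples, not part of the role description; in production this logic would run inside a Kafka consumer or a Flink/Spark streaming operator rather than over an in-memory list.

```python
# Minimal sketch of a filter + enrich stage for a streaming pipeline.
# The event schema and the org lookup table are hypothetical; a real
# deployment would embed this logic in a Kafka consumer or a
# Flink/Spark streaming job.

# Hypothetical static lookup used for enrichment (e.g. org metadata).
ORG_METADATA = {
    "org-1": {"region": "us-east", "tier": "enterprise"},
    "org-2": {"region": "eu-west", "tier": "free"},
}

def is_valid(event: dict) -> bool:
    """Filter: drop events missing required fields."""
    return "org_id" in event and "event_type" in event

def enrich(event: dict) -> dict:
    """Enrichment: attach known org metadata to the event."""
    meta = ORG_METADATA.get(event["org_id"], {})
    return {**event, **meta}

def process(events: list[dict]) -> list[dict]:
    """Apply the filter, then the enrichment, to a batch of events."""
    return [enrich(e) for e in events if is_valid(e)]

if __name__ == "__main__":
    raw = [
        {"org_id": "org-1", "event_type": "meeting_start"},
        {"event_type": "orphan"},  # dropped by the filter: no org_id
    ]
    print(process(raw))
```

The same filter/map shape carries over directly to Flink's `filter`/`map` operators or a Spark Structured Streaming transformation.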
Minimum Qualifications

  • 4 to 12 years of experience in data engineering and software development.
  • Ability to work within the open-source ecosystem.
  • Ability to write high-quality code in Java/Scala, Python, or equivalent languages.
  • Practical experience with Apache Spark and Apache Flink.
  • Practical experience with query engines like Trino/Presto.
  • Experience working with Kafka and scaling it for high-volume workloads.
  • Strong SQL skills and an understanding of data modeling for analytical workloads.
Preferred Qualifications

  • Hands-on experience with real-time analytics stores such as Apache Pinot or ClickHouse for low-latency analytical queries.
  • Familiarity with AI tools for debugging and development (Copilot, Cursor, Codex).
  • Familiarity with data lake table formats such as Apache Iceberg.
  • Familiarity with data governance tools like DataHub.
  • Experience working with tools like DolphinScheduler or Airflow.
  • Experience working with Lakehouse management systems like Apache Amoro.
Why Cisco?

At Cisco, we’re revolutionizing how data and infrastructure connect and protect organizations in the AI era – and beyond. We’ve been innovating fearlessly for 40 years to create solutions that power how humans and technology work together across the physical and digital worlds. These solutions provide customers with unparalleled security, visibility, and insights across the entire digital footprint.
Fueled by the depth and breadth of our technology, we experiment and create meaningful solutions. Add to that our worldwide network of doers and experts, and you’ll see that the opportunities to grow and build are limitless. We work as a team, collaborating with empathy to make really big things happen on a global scale. Because our solutions are everywhere, our impact is everywhere.
We are Cisco, and our power starts with you.


© 2025 Qureos. All rights reserved.