Vyro is redefining the future of digital creativity. We build cutting-edge content creation tools powered by Artificial Intelligence and Machine Learning, helping millions of creators, designers, and storytellers bring their imagination to life — effortlessly.
With a global user base of over 5 million active creators every month, Vyro’s 20+ AI-powered apps are transforming how people design, edit, and express themselves across images, videos, and beyond.
From intuitive AI photo editors to next-gen video creation platforms, our products are designed to make creativity accessible, fast, and limitless.
At Vyro, we’re a team of innovators, builders, and dreamers — known as Vyronauts — driven by passion, purpose, and the belief that technology should inspire creativity, not complicate it.
If you’re excited about shaping the next wave of AI-powered creativity, Vyro is the place to be.
Overview
We are seeking a highly experienced Senior Big Data Developer to design, build, and optimize large-scale data systems and backend infrastructure. The ideal candidate will have strong expertise in BigQuery, distributed data processing, and cloud-based data platforms, and will play a critical role in developing scalable data pipelines and backend services that support analytics, AI, and product systems.
This role requires deep technical expertise in handling large datasets, optimizing data workflows, and building reliable, production-grade data infrastructure.
Responsibilities
Design, build, and maintain scalable data pipelines and backend systems for processing large-scale datasets
Develop, optimize, and manage complex queries, datasets, and workflows in BigQuery and data warehouse systems
Build robust ETL/ELT pipelines to ingest, transform, and process data from multiple sources
Develop backend services and data processing systems using languages such as Go or Python
Ensure high performance, scalability, reliability, and cost efficiency of data systems
Integrate data systems with cloud services, APIs, and distributed systems
Monitor, troubleshoot, and optimize data pipelines and backend infrastructure
Collaborate with data scientists, analysts, and engineering teams to support analytics and AI-driven features
Design clean, efficient, and maintainable system architecture
Mentor junior engineers and contribute to technical leadership and architectural decisions
Requirements
5+ years of experience in Big Data, backend engineering, or data infrastructure roles
Strong hands-on experience with BigQuery or similar data warehouse technologies (Snowflake, Redshift, Databricks, etc.)
Strong proficiency in at least one programming language such as Go, Python, Node.js, or Java
Strong expertise in SQL and query performance optimization
Experience building and maintaining ETL/ELT pipelines
Experience working with large-scale datasets and distributed systems
Experience with cloud platforms such as Google Cloud Platform (preferred), AWS, or Azure
Strong understanding of data architecture, scalability, and performance optimization
Excellent problem-solving and debugging skills
Preferred Qualifications
Experience with GCP services such as BigQuery, Cloud Storage, Pub/Sub, and Cloud Run
Experience with Apache Spark, Kafka, or Airflow
Experience supporting analytics, AI, or machine learning systems
Experience with real-time and batch data processing systems
Experience with microservices architecture