We are seeking an experienced Big Data Developer (Advisor) with strong expertise in Python, Java, cloud-based data engineering, and modern Big Data ecosystems. The ideal candidate will have a solid background in building batch and real-time integration frameworks, optimizing large-scale distributed systems, and working with cloud-native services such as Cloud Run Jobs, Cloud Composer (Apache Airflow), and BigQuery.
Education:
- Graduate Engineer or equivalent qualification
- Minimum of 8–10 years of proven experience with a recognized global IT services or consulting company
Big Data Skills:
- 5+ years of hands-on experience implementing batch and real-time Big Data integration frameworks and/or applications in private or public cloud environments
- Preference for GCP
- Technologies include Python, Cloud Run Jobs, and Cloud Composer (Apache Airflow)
- Experience in debugging, identifying performance bottlenecks, and fine-tuning frameworks
- 2+ years of experience working in a Linux environment, with the ability to interface with the OS using system tools, scripting languages, and integration frameworks
- 2+ years of hands-on experience with modern object-oriented programming languages such as Java, Scala, or Python, with the ability to code in more than one of them
- 3+ years of hands-on experience applying schema design principles, best practices, and trade-offs across Big Data technologies
- Experience designing Big Data Lakehouse architectures, including five-layer architectures
Preferred Qualifications:
- Experience working in agile, distributed teams
- Familiarity with cloud-native CI/CD pipelines and DevOps practices
- Strong problem-solving, analytical thinking, and communication skills