Apple is where individual imaginations gather together, committing to the values that lead to great work. Every new product we build, service we create, or Apple Store experience we deliver is the result of us making each other’s ideas stronger. That happens because every one of us shares a belief that we can make something wonderful and share it with the world, changing lives for the better. It’s the diversity of our people and their thinking that inspires the innovation that runs through everything we do. When we bring everybody in, we can do the best work of our lives. Here, you’ll do more than join something - you’ll add something.

Apple Pay brought mobile payments to millions of customers, and it’s just the beginning. We are looking for engineers who enjoy both hands-on technical work and designing thoughtful, scalable services for Wallet and Apple Pay. Our team’s vision is to be the engine of intelligent transformation, leveraging a unified, reliable data platform to build and deploy innovative solutions that drive significant business impact and enable data-driven decision-making throughout the organization. We are seeking pragmatic AI Big Data Engineers to join our dynamic team to build and optimize data and analytics solutions, enable ML, and participate in generative AI initiatives that craft the future of Wallet and Apple Pay. You will collaborate with cross-functional teams across different time zones and deliver impactful, scalable data architectures.
Description
- Instrument APIs, user journeys, and interaction flows to systematically collect behavioral, transactional, and operational data, enabling robust analytics and insightful reporting.
- Build robust data architectures supporting large-scale Wallet, Payments & Commerce (WPC) products and applications.
- Optimize ETL workflows to enhance data processing efficiency and reliability.
- Develop tools and frameworks to optimize data processing performance.
- Ensure data quality and integrity across all data systems and platforms.
- Collaborate closely with a diverse set of partners to gather requirements, prioritize use cases, and ensure delivery of high-quality data products.
- Integrate data pipelines into the broader ML Operations (MLOps) process, including automating the data flow for feature engineering, model retraining, monitoring model performance in production, drift detection, and ensuring scalability.
- Construct and maintain data pipelines for Gen AI/RAG solutions, including processes for data extraction, chunking, embedding, and grounding to prepare data for models, and perform continuous quality and performance measurement (a minimal sketch of this flow follows this list).
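For context on the Gen AI/RAG bullet above, the preparation flow it describes (extract source text, split it into chunks, embed each chunk, and keep the result available for retrieval and grounding) can be sketched roughly as follows. This is a minimal, illustrative Python sketch only; the Chunk structure, the chunk_text/embed_chunks helpers, and the toy embedding function are hypothetical placeholders under assumed defaults, not the team's actual pipeline or tooling.

from dataclasses import dataclass
from typing import Callable, Iterable, List, Optional

@dataclass
class Chunk:
    doc_id: str                              # source document identifier
    text: str                                # chunk text to be embedded
    embedding: Optional[List[float]] = None  # filled in by embed_chunks

def chunk_text(doc_id: str, text: str, max_chars: int = 800, overlap: int = 100) -> List[Chunk]:
    """Split one document into overlapping fixed-size chunks (simplest possible strategy)."""
    chunks, start = [], 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(Chunk(doc_id=doc_id, text=text[start:end]))
        if end == len(text):
            break
        start = end - overlap  # overlap preserves context across chunk boundaries
    return chunks

def embed_chunks(chunks: Iterable[Chunk], embed: Callable[[str], List[float]]) -> List[Chunk]:
    """Attach an embedding vector to each chunk; `embed` stands in for whatever model or service is available."""
    prepared = []
    for chunk in chunks:
        chunk.embedding = embed(chunk.text)
        prepared.append(chunk)
    return prepared

if __name__ == "__main__":
    # Toy embedding function used only so the example runs end to end (hypothetical).
    fake_embed = lambda text: [float(len(text)), float(sum(map(ord, text)) % 997)]
    document = "Wallet supports passes, payment cards, and keys. " * 40
    prepared = embed_chunks(chunk_text("doc-1", document), fake_embed)
    print(f"prepared {len(prepared)} chunks; first vector = {prepared[0].embedding}")

In practice the chunks would be written to a vector store and refreshed as sources change, with quality and performance measured continuously, per the responsibility described above.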
Minimum Qualifications
- Bachelor’s or Master’s degree in Computer Science or a related technical field, or equivalent experience.
- 4+ years of experience designing, developing, and deploying data engineering solutions for analytics or ML/AI pipelines.
- Strong proficiency in SQL, Scala, Python, or Java, with hands-on experience in data pipeline tools (e.g., Apache Spark, Kafka, Airflow), CI/CD practices, and version control.
- Familiarity with cloud platforms (AWS, Azure, GCP) and data management and analytics tools like Snowflake, Databricks and Tableau.
- Strong understanding of data warehousing, data modeling (dimensional/star schemas), and metric standardization.
- Strong problem-solving skills and the ability to work in an agile environment.
Preferred Qualifications
- Ability to create technical specs and instrumentation specs, and to understand APIs, MSDs, etc.
- Expertise in building and refining large-scale data pipelines, as well as developing tools and frameworks for data platforms.
- Hands-on experience with big data technologies such as distributed querying (Trino), real-time analytics (OLAP), near-real-time (NRT) data processing, and decentralized data architecture (Data Mesh).
- Familiarity with data governance, security protocols, and compliance in financial data systems.
- Experience enabling ML pipelines, including automating data flows for feature engineering, model retraining, monitoring model performance in production, drift detection, and ensuring scalability.
- Familiarity with GenAI concepts such as Retrieval-Augmented Generation (RAG), Large Language Models (LLMs), prompt engineering, vector embeddings, and LLM fine-tuning.
- Works independently with minimal oversight, actively builds relationships, and contributes to a positive team environment.
- Demonstrates sound judgment, applies technical principles to complex projects, evaluates solutions, and proposes new ideas and process improvements.
- Seeks new opportunities for growth, demonstrates a thorough understanding of technical concepts, exercises independence in problem-solving, and delivers impactful results at the team level.
- Familiarity with fintech, the Wallet domain, digital commerce, etc.