We are seeking an experienced Senior GCP Data Engineer to join our team. The ideal candidate will bring strong expertise in designing and managing scalable data pipelines on Google Cloud Platform (GCP) while working closely with data scientists, analysts, and business stakeholders to deliver impactful solutions.
Key Responsibilities
- Collaborate with Data Science and Analytics teams to design and implement scalable data solutions.
- Build, maintain, and optimize ETL/ELT pipelines for cleaning, transforming, and aggregating large datasets.
- Leverage BigQuery, Dataproc, and Cloud Composer (Airflow) for data processing and workflow orchestration.
- Ensure high data quality, reliability, and availability across all data systems.
- Model front-end and back-end data sources to enable comprehensive user-flow analysis.
- Support predictive modeling and advanced analytics initiatives.
- Translate complex business requirements into efficient technical solutions.
Must-Have Skills
- Hands-on experience with BigQuery, Cloud Composer (Airflow), and Dataproc on GCP.
- Strong programming background in Python and PySpark.
- Advanced SQL skills with query optimization expertise.
- Solid understanding of ETL/ELT pipeline design and big data ecosystems (Spark, Hive, Hadoop).
Secondary Skills
- Knowledge of CI/CD pipelines in GCP.
- Experience with Linux/Shell scripting.
- Exposure to Core Java for integration tasks.
- Familiarity with data visualization tools (Tableau, R, etc.).
- Understanding of NoSQL databases (Bigtable, MongoDB).
- Exposure to other cloud platforms (AWS, Azure) is a plus.
Job Type: Contractual / Temporary
Contract length: 6 months
Pay: ₹126,000.00 per month
Work Location: In person