About CGI
Founded in 1976, CGI is among the world's largest independent IT and business consulting services firms. With 94,000 consultants and professionals globally, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI's reported revenue for Fiscal 2024 is CA$14.68 billion, and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.
Position Title: GCP Cloud Engineer / Developer
Job location: Bangalore/Hyderabad/Chennai/Pune/Mumbai
Work mode: Hybrid
Qualification: B.E/B.Tech/M.Tech/MCA
Required Experience: 5+ years of relevant experience in a GCP cloud engineering role
Notice Period: Immediate joiners only; candidates must be able to join on or before 17th October
Job Summary:
We are looking for a hands-on GCP Developer with strong experience in ETL design, BigQuery, and Python. The ideal candidate will build and maintain scalable data pipelines, perform data transformations using GCP-native tools, and optimize analytical workflows. Exposure to Azure cloud services is a plus. You will work closely with data architects and analysts to ensure seamless data integration and transformation across systems.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines using Dataflow, Dataform, or custom Python scripts.
- Manage data ingestion from multiple sources such as MySQL, APIs, and Cloud Storage into BigQuery.
- Implement incremental and CDC-based data loading strategies for large datasets (see the sketch after this list).
- Optimize BigQuery performance through effective partitioning, clustering, and query tuning.
- Implement and manage data snapshots, backups, and recovery mechanisms using BigQuery’s time-travel and GCS versioning.
- Work on data modeling and medallion architecture (Bronze, Silver, Gold layers) for structured data transformation.
- Use Python (and PySpark when needed) for advanced data transformations, automation, and orchestration.
- Integrate CI/CD pipelines for GCP resources using GitHub, Terraform, or Deployment Manager.
- Collaborate with cross-functional teams to develop data APIs using FastAPI or Django when needed.
- Maintain data quality, reliability, and observability across the ETL pipeline using logging, monitoring, and alerting tools.
- Ensure security and compliance of data within GCP services following organizational policies.
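For illustration, here is a minimal sketch of the kind of incremental, CDC-style load into BigQuery described above, using the google.cloud.bigquery client. The project, dataset, and table names, the merge keys, and the `op` change-flag column are all hypothetical, chosen only for the example.

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical tables: a staging table holding the latest CDC batch,
# merged into the analytical target. Names are illustrative only.
MERGE_SQL = """
MERGE `my_project.analytics.orders` AS target
USING `my_project.staging.orders_cdc` AS source
ON target.order_id = source.order_id
WHEN MATCHED AND source.op = 'D' THEN
  DELETE
WHEN MATCHED THEN
  UPDATE SET target.status = source.status,
             target.updated_at = source.updated_at
WHEN NOT MATCHED AND source.op != 'D' THEN
  INSERT (order_id, status, updated_at)
  VALUES (source.order_id, source.status, source.updated_at)
"""

def run_incremental_load() -> None:
    """Apply the latest CDC batch to the target table."""
    job = client.query(MERGE_SQL)  # submit the MERGE as a query job
    job.result()                   # block until the job finishes
    print(f"MERGE affected {job.num_dml_affected_rows} rows")

if __name__ == "__main__":
    run_incremental_load()
```

Partitioning the target on an update-date column and clustering on the merge key, per the tuning point above, keeps such merges efficient as tables grow into the TB range.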
Required Skills and Experience:
- 5–9 years of experience in data engineering or cloud data development.
- Strong hands-on expertise in Google Cloud Platform (GCP) including:
  - BigQuery (partitioning, clustering, snapshots, performance tuning)
  - Cloud Storage (GCS) (versioning, soft/hard delete concepts)
  - Dataflow / Dataform / Pub/Sub / Cloud Composer
  - Database Migration Service (DMS) or ETL migration tools
- Experience with ETL design and managing large-scale datasets (GBs to TBs+).
- Strong programming experience in Python, including:
  - Data processing with Pandas and PySpark
  - Use of decorators, generators (yield), and modular design (see the sketch after this list)
- Very good understanding of SQL, including advanced concepts such as window functions (RANK(), DENSE_RANK()) and aggregate operations.
- Knowledge of CI/CD pipelines, GitHub, and IaC tools like Terraform or Deployment Manager.
- Familiarity with Liquibase (for schema change management) and Dataform (for BigQuery SQL pipelines).
- Understanding of medallion architecture and data warehousing principles.
- Experience with API development frameworks like FastAPI or Django.
- Exposure to logging and monitoring (OpenTelemetry, ELK stack, or Cloud Logging).
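As a rough, self-contained illustration of the Python and SQL concepts called out above (decorators, generators with yield, and window functions), the sketch below uses an invented retry policy and a hypothetical table; neither comes from this posting.

```python
import functools
import time
from typing import Iterable, Iterator

def retry(times: int = 3, delay: float = 1.0):
    """Decorator: re-run a flaky callable a few times before giving up."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            for attempt in range(1, times + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == times:
                        raise
                    time.sleep(delay)
        return inner
    return wrap

def chunked(rows: Iterable[dict], size: int = 500) -> Iterator[list]:
    """Generator: yield fixed-size batches so large result sets stream lazily."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# A window-function query of the kind listed above; the table and columns
# are hypothetical. DENSE_RANK() ranks customers by spend within a region.
TOP_SPENDERS_SQL = """
SELECT region, customer_id, total_spend,
       DENSE_RANK() OVER (PARTITION BY region ORDER BY total_spend DESC) AS spend_rank
FROM `my_project.analytics.customer_totals`
QUALIFY spend_rank <= 10
"""
```

A loader decorated with @retry that iterates over chunked(...) batches is a common shape for moving API or MySQL extracts into BigQuery in bounded memory.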
Nice-to-Have Skills:
- Experience with Azure cloud services including Azure Data Factory, Synapse, or Blob Storage.
- Experience with streaming data ingestion (Pub/Sub, Kafka, Dataflow).
- Knowledge of machine learning pipelines using BigQuery ML or Vertex AI.
- Familiarity with Cloud Scheduler and Cloud Functions for workflow orchestration (see the sketch after this list).
- Understanding of data governance, metadata management, and Data Catalog.
- Front-end development knowledge is a plus.
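For the streaming and lightweight-orchestration items above, here is a minimal sketch of a Pub/Sub-triggered Cloud Function using the functions-framework library; the payload handling and any topic wiring are assumptions for the example, not details from this posting.

```python
# pip install functions-framework
import base64

import functions_framework

@functions_framework.cloud_event
def handle_message(cloud_event):
    """Entry point for a Pub/Sub-triggered Cloud Function.

    Pub/Sub delivers the payload base64-encoded inside the CloudEvent body.
    """
    message = cloud_event.data["message"]
    payload = base64.b64decode(message["data"]).decode("utf-8")
    # In a real pipeline the record would be validated here and written
    # onward, e.g. streamed into a BigQuery staging table.
    print(f"Received: {payload}")
```

Such a function is typically deployed with a --trigger-topic flag and paired with Cloud Scheduler when a cron-style kick-off is needed.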
Soft Skills:
- Strong problem-solving and analytical mindset.
- Excellent communication and collaboration skills.
- Ability to work in agile, fast-paced environments.
- Proactive approach to optimizing data performance and reliability.