Experience: 10–14 years
Locations: Noida, Mumbai, Pune, Bangalore, Chennai, Hyderabad
Mandatory Experience
- 10–14+ years in data engineering/architecture
- At least 3 years on GCP
- Experience in enterprise-scale cloud transformations & D&A platforms
- Strong analytical and problem-solving skills
- Good communication skills
Job Summary
The GCP Data Architect will lead the architecture, design, and implementation of scalable, secure, and high-performance data platforms on Google Cloud Platform. The role involves driving data strategy, designing warehouse/lakehouse systems, guiding engineering teams, and ensuring best-practice architecture for enterprise Data & Analytics programs.
Key Responsibilities
1. Architecture & Solution Design
- Define end-to-end data architecture for Data Lake, DWH, ELT/ETL pipelines, and analytics workloads
- Design and deliver integrations with third-party platforms through APIs
- Design scalable, secure, and cost-optimized solutions using GCP services:
BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Composer, Cloud SQL
- Lead architecture discussions with business & technical stakeholders
- Create POCs / MVPs as required before delivering a full-fledged solution
- Build high-level & low-level design documentation
2. Data Engineering Leadership
- Guide development teams on implementing data models, ingestion pipelines, and transformation layers (Bronze/Silver/Gold)
- Drive best practices for BigQuery optimization, partitioning, clustering & performance tuning
- Oversee code reviews, standards, and quality of deliverables
- Lead cloud modernization & migration projects
- Partner with business teams to define KPIs, DWH models and analytics requirements
- Support production pipelines, cost optimization and performance enhancements
- Ensure design alignment with long-term D&A roadmap
3. Data Governance, Security & Compliance
- Implement data governance frameworks, lineage, metadata management, and access controls
- Apply IAM best practices and security architecture guidelines
4. Collaboration & Client Engagement
- Work closely with Data Engineers, PMs, Product Owners, Analysts & Clients
- Provide input for RFPs, proposals, and solution architecture decks
Mandatory Skills / Experience
Technical Skills
- Strong expertise in BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud SQL, Cloud Storage, Composer
- Proficiency in SQL, Python, data modeling, and distributed data processing
- Hands-on experience designing data lakes, data warehouses, data marts
- Understanding of real-time and batch data processing patterns
Architectural Experience
- 10–14+ years of data engineering/architecture experience; at least 3 years on GCP
- Experience with enterprise-scale cloud transformations & D&A platforms
Soft Skills
- Strong analytical, problem-solving and communication skills
- Ability to work with multi-functional teams and lead technical conversations
Good-to-Have
- GCP Professional Cloud Architect / Data Engineer Certification
- Experience with Terraform / IaC, CI/CD, and DevOps
- Exposure to BI tools: Looker, Data Studio, Power BI
Job Type: Full-time
Pay: From ₹1,500,000.00 per year
Application Question(s):
- How many years of total experience do you have?
- How many years of experience do you have in BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud SQL, Cloud Storage, Composer?
- How many years of experience do you have in Data Architecture & Solution Design?
- Are you willing to work on a 6-month contract (extendable as per project needs)?
- In the last 6 months, have you applied to or interviewed with Birlasoft?
- What is your notice period / last working day (LWD)?
- What is your current and preferred location?
- What is your current CTC?
- What is your expected CTC?
Work Location: In person