Data Platform Lead (Snowflake + AWS + dbt + Financial Planning)
Location: Bangalore
Experience Required: 7–12+ years
Job Type: Full-Time
Department: Data Engineering / Technology / Platform Modernization
Role Overview
We are seeking an experienced Data Platform Lead with strong expertise in Snowflake, AWS, dbt, and end-to-end modern data platform engineering, combined with the ability to create financial projections, budgets, cost forecasts, and investment requirements for the platform programme.
This role blends technical leadership with financial planning: you will ensure successful delivery of the entire cloud data platform while also owning the budget for implementation, migration, scaling, and long-term sustainability.
You will drive architecture, migration, platform build-out, governance, and orchestration, and prepare financial models that justify costs, track spending, forecast usage, and support management decision-making.
Key Responsibilities
1. End-to-End Data Platform Architecture
- Architect and build a full modern data stack using AWS → Snowflake → dbt.
- Design raw, staging, core, and data mart layers following best practices.
- Establish Dev/Prod environments, CI/CD pipelines, and automated execution flows.
2. AWS Cloud Engineering
- Build cloud infrastructure including S3, IAM, Lambda, Step Functions, VPC, PrivateLink, KMS.
- Set up secure data ingestion pipelines and AWS-native services for automation.
3. Snowflake Platform Ownership
- Configure warehouses, schemas, RBAC, pipes, file formats, tasks, and governance structures.
- Monitor compute usage, storage, and credits; optimize costs and performance.
- Oversee migration and validation of all datasets into the Snowflake ecosystem.
4. dbt Modelling & Transformations
- Lead the development of dbt models (stage → core → mart).
- Configure dbt jobs, CI pipelines, tests, macros, and documentation.
- Implement best practices for performance, modularity, and maintainability.
5. Orchestration & Automation
- Build orchestration flows using AWS Step Functions, Airflow, or equivalent tools.
- Define execution chains, rerun logic, error notifications, and monitoring dashboards.
6. Data Migration & Repointing
- Lead complete migration of 20–30 existing pipelines, models, and dashboards.
- Align legacy models to new dbt architecture.
- Repoint BI dashboards (Power BI/Tableau/Looker) to the new data marts.
7. Governance, Security, and Compliance
- Establish RBAC, access roles, object-level security, and encryption standards.
- Implement audit logs, data retention policies, and cost monitoring systems.
- Maintain platform documentation, design diagrams, SOPs, and runbooks.
8. Financial Planning & Budget Ownership
Financial Responsibilities
- Build detailed financial models for platform costs (AWS, Snowflake, dbt, orchestration tools).
- Forecast monthly, quarterly, and annual costs, including compute, storage, credits, licensing, and resource usage.
- Prepare investment requirements, including:
- Resource allocation
- Migration costs
- Infrastructure scaling estimates
- Third-party tool subscriptions
- Vendor onboarding costs
- Identify cost-saving opportunities and present ROI calculations.
- Monitor actual vs. budgeted usage and report variances.
- Work closely with the CFO's office to finalize capex/opex budgets for the data platform.
Business & Management Collaboration
- Present budget plans, cost models, and financial projections to executive leadership.
- Provide financial justification for architectural choices.
- Support strategic decisions with cost-benefit analysis and scenario modelling.
9. Leadership & Collaboration
- Lead a cross-functional delivery team (Data Engineers, DevOps, BI, SMEs).
- Collaborate with Product, Engineering, DevOps, and Business teams.
- Mentor junior engineers and guide best practices for the data ecosystem.
Required Skills & Experience
Technical Expertise (Mandatory)
- Snowflake: deep experience with RBAC, tasks, warehouses, cost optimization
- AWS: S3, Lambda, IAM, VPC/PrivateLink, Step Functions, KMS
- dbt: model design, testing, workflow orchestration, CI/CD
- Python: ETL scripting, automation, data workflows
- CI/CD: GitHub Actions, automated deployments
- IaC: Terraform or AWS CDK (Python preferred)
Financial & Business Expertise (Mandatory)
- Strong understanding of cloud cost models (Snowflake credits, AWS services).
- Ability to create:
- Budget forecasts
- Cost breakdown sheets
- Investment requirement models
- Resource allocation plans
- ROI calculations
- Experience working with finance teams to validate budgets.
Experience
- 7–12+ years in Data Engineering or Cloud Platform roles.
- Proven track record delivering large-scale platform migrations.
- Experience with financial modelling for technical projects is a strong advantage.
- Background in EdTech, SaaS, or high-scale product companies preferred.
Soft Skills
- Strong analytical & architectural thinking
- Financial acumen & cost-awareness
- Leadership & stakeholder communication
- Problem solving & decision-making
- Documentation excellence
- Ownership mentality & delivery discipline
What We Offer
- Opportunity to architect and scale a flagship cloud data platform
- Hands-on work with modern data technologies (Snowflake, dbt, AWS)
- High visibility and decision-making authority
- Competitive compensation + performance rewards
- Fast career growth in a rapidly scaling EdTech ecosystem
Work Location: In person