Qureos

Databricks Architect

Job Requirements

Hires in: Not specified
Employment Type: Not specified
Company Location: Not specified
Salary: Not specified
Location: Pune

About Us

We empower enterprises globally through intelligent, creative, and insightful services for data integration, data analytics, and data visualization.
Hoonartek is a leader in enterprise transformation and data engineering, and an acknowledged world-class Ab Initio delivery partner.
Using centuries of cumulative experience, research, and leadership, we help our clients eliminate the complexities and risks of legacy modernization and safely deliver big data hubs, operational data integration, business intelligence, risk and compliance solutions, and traditional data warehouses and marts.
At Hoonartek, we work to ensure that our customers, partners, and employees all benefit from our unstinting commitment to delivery, quality, and value. Hoonartek is increasingly the choice for customers seeking a trusted partner of vision, value, and integrity.

How We Work

Define, Design and Deliver (D3) is our in-house delivery philosophy. It’s culled from agile and rapid methodologies and focused on ‘just enough design’. We embrace this philosophy in everything we do, leading to numerous client success stories and indeed to our own success.
We embrace change, empowering and trusting our people and building long and valuable relationships with our employees, our customers and our partners. We work flexibly, even adopting traditional/waterfall methods where circumstances demand it. At Hoonartek, the focus is always on delivery and value.

Job Description


Job Title

Senior Databricks Architect (Banking & Financial Services)

Location

Pune

About the Role

We’re seeking a seasoned Databricks Architect to lead the design and delivery of a modern Lakehouse platform for a large banking environment. You’ll define end‑to‑end data architecture, champion ELT pipelines on Databricks, and ensure secure, governed, and high‑performance data products that support critical use cases across regulatory reporting, risk, fraud/AML, customer analytics, and real‑time decisioning. This role blends deep hands‑on engineering with architectural leadership and stakeholder engagement.

Experience

12–18 years in data engineering/architecture, with 5+ years architecting solutions on Databricks/Spark and leading enterprise data platforms in regulated industries.

Key Responsibilities

1) Lakehouse & Data Architecture

  • Design the enterprise Databricks Lakehouse architecture (Delta Lake, Unity Catalog, MLflow) ensuring scalability, performance, security, and data integrity for banking workloads.
  • Define standards and best practices for data modeling, storage layers (Bronze/Silver/Gold), and access patterns for analytics, reporting, and ML.
  • Develop logical/physical data models, data flow diagrams, and reference architectures aligned to regulatory use cases (Basel/BCBS 239, IFRS 9, AML).

2) ELT Pipeline Strategy & Development

  • Lead the design and development of ELT pipelines: ingest diverse sources (core banking, cards, trade, payments, CRM), load into Delta Lake, and transform using Spark SQL/Python within Databricks.
  • Implement medallion architecture with robust orchestration (Databricks Workflows, Azure Data Factory triggers) and CI/CD for notebooks/jobs.
  • Optimize pipelines for cost, performance, reliability (cluster sizing, Photon, partitioning, Z‑Order, caching) and minimize downtime during batch and streaming loads.
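The medallion (Bronze/Silver/Gold) flow described above can be sketched as follows. This is a minimal, framework-free illustration of the layering logic only: a real Databricks pipeline would implement each layer as a Delta Lake table transformed with Spark SQL or PySpark, and every field name and rule below is illustrative rather than taken from any actual system.

```python
# Bronze: raw ingested records, kept as-is (duplicates and bad rows included).
bronze = [
    {"txn_id": "T1", "account": "A-100", "amount": "250.00", "ts": 1},
    {"txn_id": "T1", "account": "A-100", "amount": "250.00", "ts": 2},  # late duplicate
    {"txn_id": "T2", "account": "A-200", "amount": None,     "ts": 3},  # bad row
    {"txn_id": "T3", "account": "A-100", "amount": "75.50",  "ts": 4},
]

def to_silver(rows):
    """Cleanse (drop rows with no amount), normalize types, and
    deduplicate on txn_id, keeping the latest record by timestamp."""
    latest = {}
    for r in rows:
        if r["amount"] is None:
            continue  # cleansing rule: amount is mandatory
        r = {**r, "amount": float(r["amount"])}  # normalization
        if r["txn_id"] not in latest or r["ts"] > latest[r["txn_id"]]["ts"]:
            latest[r["txn_id"]] = r  # dedup: keep the latest version
    return list(latest.values())

def to_gold(rows):
    """Aggregate cleansed rows into a per-account summary mart."""
    totals = {}
    for r in rows:
        totals[r["account"]] = totals.get(r["account"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'A-100': 325.5}
```

The same shape carries over to Spark: the Silver step becomes a window/`MERGE` deduplication over Delta tables, and the Gold step becomes a grouped aggregation materialized for reporting.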

3) Data Integration & Transformation

  • Integrate structured and unstructured data across on‑prem and cloud (Azure/AWS/GCP) into Delta Lake; enable real‑time and batch processing via Structured Streaming.
  • Define transformation rules for data cleansing, normalization, deduplication, and enrichment, aligned to banking product hierarchies and customer/transaction entities.

4) Data Quality, Governance & Security

  • Establish data quality SLAs, validation checks, and observability across the Bronze–Gold layers.
  • Implement Unity Catalog‑based governance: fine‑grained access controls, row/column‑level security, PII masking, tokenization; ensure compliance with RBI, GDPR, PCI‑DSS, and internal risk policies.
  • Partner with the data governance function to formalize metadata, lineage, retention, and data ownership; support BCBS 239 principles (accuracy, completeness, timeliness).
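As a rough illustration of the column-level masking and tokenization mentioned above: in Unity Catalog these would typically be implemented as masking functions attached to column policies, but the hypothetical plain-Python stand-ins below show the two rule shapes involved.

```python
import hashlib

def mask_pan(card_number: str) -> str:
    """Display-masking rule: reveal only the last four digits of a
    card number (PCI-DSS-style presentation masking)."""
    return "*" * (len(card_number) - 4) + card_number[-4:]

def tokenize(value: str, salt: str = "demo-salt") -> str:
    """Deterministic tokenization: replace a PII value with a stable,
    non-reversible token so joins still work across curated tables.
    In practice the salt would come from a secret store, never source code."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

print(mask_pan("4111111111111111"))  # ************1111
```

Masking preserves human readability for operations staff, while tokenization preserves joinability for analytics without exposing the underlying identifier.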

5) Stakeholder Collaboration & Enablement

  • Work closely with Risk, Compliance, Finance, Fraud/AML, Analytics teams to translate business needs into scalable data products and curated marts.
  • Partner with IT/Security on network, identity, secrets management, and platform hardening; guide BI developers and data scientists in best‑practice Databricks usage.

Required Skills & Competencies

Technical

  • Databricks & Spark: Databricks SQL, notebooks (Python/SQL/Scala), Spark optimizations, Delta Lake, Unity Catalog, MLflow.
  • ELT & Orchestration: Databricks Workflows, Azure Data Factory/Synapse pipelines, job clusters, CI/CD (Git integration).
  • Cloud Platforms: Azure (preferred), with experience across AWS/GCP; secure storage (ADLS/S3/GCS), Key Vault/Secrets, VNETs/Private Links.
  • Data Warehousing & Modeling: Dimensional modeling, star/snowflake schemas; conformed dimensions; semantic layers for regulatory and financial reporting.
  • Streaming & Real‑time: Kafka/Event Hubs, Structured Streaming; CDC; near‑real‑time risk/fraud signals.
  • Security & Compliance: Data privacy, PII handling, masking, encryption at rest/in transit; policy‑based access; audit readiness (RBI/GDPR/PCI‑DSS).

Functional (Banking)

  • Regulatory reporting (Basel/BCBS 239, IFRS 9), credit & market risk data, AML/KYC, transaction monitoring, customer 360, product profitability, liquidity and stress testing.

Soft Skills

  • Strong communication—translating complex technical concepts into business terms; stakeholder management across risk, compliance, and technology.
  • Analytical/problem‑solving; ownership mindset; mentoring engineers and analysts.

Qualifications

  • BE/B.Tech in Computer Science (or related field). Hands‑on expertise with T‑SQL/SQL, PL/SQL, and modern data platforms.

Preferred Certifications

  • Databricks Certified Data Engineer Professional / Lakehouse Fundamentals.
  • Microsoft Azure: DP‑203 (Data Engineering), AZ‑305 (Architect).
  • Security/Compliance: ISO 27001 awareness, PCI‑DSS familiarity.
  • Apache Spark Developer certifications.

© 2026 Qureos. All rights reserved.