Infinitive has been named “Best Small Firms to Work For” by Consulting Magazine eight times, and has also been named a Washington Post Top Workplace, Washington Business Journal Best Places to Work, and Virginia Business Best Places to Work.
Role Overview
This architect will define and shape a unified platform service that enables scalable, governed, and cost-efficient data access across the bank. The ideal candidate will influence enterprise design standards and technical adoption by making Databricks Unity Catalog the effortless, observable, and default foundation for data integration, governance, and analytics across all business domains.
Key Responsibilities

Platform Vision & Architecture
- Define and champion the end-to-end architecture for the bank’s Databricks-based data platform, ensuring scalability, security, cost efficiency, and ease of adoption.
- Design a self-service platform layer that leverages Databricks Unity Catalog to deliver seamless data discovery, access, and observability across all environments.
- Establish architectural patterns and reference implementations that encourage enterprise-wide reuse and standardization.
Unity Catalog Strategy & Enablement
- Lead the design and implementation of Databricks Unity Catalog as the central governance plane, defining catalog hierarchies, fine-grained access controls, and cross-environment lineage.
- Evaluate and implement metadata, RBAC/ABAC, and data masking capabilities to meet regulatory and compliance requirements (e.g., GLBA, GDPR, HIPAA).
- Define the template architecture that allows Unity Catalog to operate as a scalable and cost-effective shared service across lines of business.
Scalability, Cost, and Observability
- Engineer platform capabilities that provide deep visibility into compute, storage, and catalog operations through integrated observability, monitoring, and FinOps practices.
- Develop resource optimization strategies to balance performance and cost while maintaining compliance and SLAs.
- Establish metrics, dashboards, and alerts to ensure the platform scales predictably under enterprise workloads.
API and Integration Design
- Architect streamlined RESTful/GraphQL APIs for secure, governed data access and metadata integration.
- Ensure interoperability with enterprise systems, APIs, and external data consumers using modern, consistent, and documented integration patterns.
Data Modeling & Pipeline Strategy
- Guide teams in building Lakehouse-aligned data models that maximize reuse and governance.
- Oversee the design of ETL/ELT architectures (Spark, PySpark, SQL) that integrate seamlessly with Unity Catalog for lineage and access tracking.
Collaboration & Influence
- Partner with engineering, data science, and risk teams to align platform design with business outcomes and regulatory expectations.
- Influence architecture steering committees and platform engineering groups to adopt the Databricks foundation as a managed, enterprise-wide service.
- Promote a culture of easy adoption through clear design patterns, documentation, and working sessions.
Technical Leadership & Mentorship
- Mentor engineers and architects on Databricks, Unity Catalog, and best practices for cost, scale, and observability.
- Contribute to internal architecture communities and upskill teams across multiple domains.
Required Skills & Qualifications
- Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Experience: 8+ years in data architecture or platform engineering, including experience designing enterprise-scale, distributed data environments.
- Databricks Expertise: Deep hands-on knowledge of Databricks, Delta Lake, Apache Spark, and Lakehouse principles.
- Unity Catalog Mastery: Demonstrated success architecting and operationalizing Databricks Unity Catalog for enterprise governance, metadata management, and access control.
- Programming & Data: Advanced proficiency in Python (PySpark) and SQL; experience with cloud data platforms (AWS, Azure, or GCP).
- API Engineering: Strong background in API architecture (REST, GraphQL, OpenAPI) and in applying best-in-class security and observability practices.
- Governance Knowledge: Expert-level understanding of data governance frameworks, data quality management, and regulatory compliance.
- Soft Skills: Outstanding communication and influence skills, with the ability to advocate for design principles across executive, technical, and risk audiences.
Preferred Qualifications
- Experience deploying Databricks and cloud infrastructure using Terraform or other IaC frameworks.
- Familiarity with MLflow and model governance integration.
- Relevant certifications (Databricks Certified Data Engineer, AWS/Azure/GCP Architect).
- Experience with real-time data streaming technologies (Kafka, Structured Streaming).