About the Role
We are seeking a highly skilled Senior Data Architect / Application Engineer to join our team. In this role, you will be responsible for designing, implementing, and maintaining data platforms and application infrastructure. You will play a key role in driving innovative data solutions while ensuring platform reliability, security, and performance.
Location: New York City, NY (Hybrid – 3 days onsite)
Interview Process: In-person interview required in NYC.
Work Authorization: Candidates must work on a W2 basis; C2C is permitted only if the candidate is a US Citizen or Green Card holder operating through their own corporation.
Key Responsibilities
- Lead architecture and technical design discussions using industry best practices and modern technologies.
- Support production operations and resolve complex issues within the Credit Risk application platform.
- Design and implement batch and ad-hoc data pipelines using Medallion Lakehouse architecture on modern cloud platforms, primarily Databricks.
- Build and maintain data ingestion pipelines from upstream systems into object storage (e.g., S3, ADLS) using formats like Parquet, including partitioning, Z-ordering, and schema evolution.
- Integrate with external XVA / risk engines and develop orchestration logic for long-running computations.
- Model and optimize risk metrics such as EPE and PFE for efficient querying and analytics.
- Ensure platform reliability, security, observability, and auditability, including IAM roles, authentication mechanisms, and encryption.
- Contribute to API design for internal and external consumers, including versioning, documentation, error handling, and SLAs.
Required Qualifications
General
- 12–15 years of experience as an application developer or data engineer.
- Strong communication and collaboration skills.
- Experience working in Agile environments.
- Ability to design and document technical architectures and system designs.
- Proactive and self-driven team player with strong analytical skills.
Technical Skills (Mandatory)
- Python (expert level), including PySpark / Spark for data engineering.
- Azure Databricks, with experience implementing Medallion Lakehouse architecture.
- Strong SQL expertise, including joins, unions, stored procedures, and query optimization.
- REST API development using frameworks such as Django, Flask, or FastAPI.
- Experience with CI/CD pipelines using Git, Jenkins, and Azure DevOps.
- Experience building data ingestion and transformation pipelines.
- Cloud platform certification, such as AWS Certified Cloud Practitioner or equivalent.
Domain Requirement
- Strong experience in Credit Risk and Counterparty Risk within the financial services or capital markets domain.
Preferred Qualifications
- Advanced degree in Finance, Computer Science, or a related field.
- Experience with risk modeling and financial analytics.
- Knowledge of deployment, operational support, and monitoring tools.
- Exposure to technical architecture design and system documentation.