Data Engineer
Experience - 6+ Years
Summary
We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in data engineering, with a strong focus on Databricks, Python, and SQL. In this role, you will play a crucial part in designing, developing, and maintaining our data infrastructure to support a variety of business needs.
Key Responsibilities
- Develop and implement efficient data pipelines and ETL processes to migrate and manage client, investment, and accounting data in Databricks.
- Work closely with the investment management team to understand data structures and business requirements, ensuring data accuracy and quality.
- Monitor and troubleshoot data pipelines, ensuring high availability and reliability of data systems.
- Optimize database performance by designing scalable and cost-effective solutions.
Candidate Profile
Preferred Experience & Skills:
- 6+ years of experience in data engineering.
- Proficiency in Apache Spark and the Databricks platform, including schema design, data partitioning, and query optimization.
- Exposure to Azure.
- Proven experience in setting up and orchestrating Databricks pipelines, including automated workflows and job scheduling.
- Strong hands-on knowledge of Databricks Unity Catalog and Hive Metastore, and the ability to build and manage Databricks dashboards for data visualization and monitoring.
- Working experience with Azure Functions for serverless data processing and event-driven workflows.
- Proficiency in implementing the Medallion Architecture (Bronze, Silver, and Gold layers) for scalable, structured data ingestion, transformation, and delivery.
- Experience building end-to-end data pipelines in Azure and Databricks environments.
- Good to have: familiarity with Box for data integration and Databricks Asset Bundles (DAB) for managing and deploying reusable code artifacts.
- Advanced SQL, data modeling skills, and data warehousing concepts tailored to investment management data (e.g., transaction, accounting, portfolio, and reference data).
- Experience with ETL/ELT tools such as SnapLogic and with programming languages such as Python.
- Familiarity with data governance frameworks and security protocols.
- Excellent problem-solving skills and attention to detail.