- Design, build and maintain enterprise data models and data assets on Microsoft Azure to support analytics, reporting, ML/AI and data products.
- Translate business requirements into robust logical and physical data models, ensuring alignment with data architecture, governance, security and performance standards.
- Enable reusable, well-documented data structures that accelerate insight delivery and reduce technical debt.
Key responsibilities
- Develop conceptual, logical and physical data models for transactional, analytical and semantic layers (including star/snowflake schemas, data vault or other patterns as appropriate).
- Define entities, attributes, relationships, keys, constraints and data types in line with client standards.
- Drive consistent modelling across projects to support integration, reuse and scale.
- Implement models using Azure services as applicable: Azure SQL Database / Azure Synapse Analytics (including dedicated SQL pools) / Azure Data Lake Storage Gen2 / Azure Databricks / Azure Cosmos DB.
- Work with engineers to implement ELT/ETL pipelines (Azure Data Factory / Synapse Pipelines / Databricks notebooks) that load and transform data into modelled structures.
- Engage with business stakeholders, data engineers, data scientists, analytics teams and product owners to gather and validate data requirements.
- Partner with enterprise architecture, cloud engineering and security teams to ensure solutions meet standards and constraints.
- Ensure data modelling meets data privacy, classification and security requirements; support access control design (RBAC, row-level security).
- Recommend partitioning, indexing, materialised views / cached layers and compute sizing to balance performance and cost.
Required qualifications & experience
- Degree in Computer Science, Information Systems, Data Science, Engineering or a related discipline (or equivalent practical experience).
- Proven hands-on experience designing and implementing data models for analytics and reporting on Microsoft Azure in enterprise environments.
- Experience with one or more of: Azure Synapse Analytics, Azure SQL Database, Azure Data Lake Storage Gen2, Azure Databricks, Azure Cosmos DB.
- Strong knowledge of data modelling techniques (3NF, dimensional modelling, data vault) and experience selecting appropriate patterns.
- Practical experience with data ingestion and transformation tools such as Azure Data Factory, Synapse Pipelines or Databricks.
- Familiarity with metadata management tools (Azure Purview or equivalent), data lineage and data quality frameworks.
- Strong SQL skills; experience optimising SQL for analytical workloads.
- Understanding of data security, privacy and regulatory requirements relevant to pharma/healthcare is desirable.
Preferred skills and attributes
- Azure certifications (e.g., Microsoft Certified: Azure Data Engineer Associate or Azure Solutions Architect Expert).
- Knowledge of big data processing (Spark) and of Python/Scala for data engineering tasks.
- Experience working in regulated industries (pharmaceuticals, healthcare) and with clinical/trial data models.
- Strong problem solving, attention to detail and pragmatic approach to delivery in agile teams.