Qureos


Data Platform Engineer

Job Description

Who is RHP Properties?

Headquartered in Farmington Hills, Michigan, RHP Properties (www.rhp.com) is the nation's largest private owner and operator of manufactured home communities. With more than 380 communities throughout 33 states, we continue to expand our footprint to provide accessible and affordable housing across the country. All of this would not be possible without the energy and drive of our talented employees! We invest in our employees, with regular training, opportunities for advancement, and fun events to bring everyone together.

RHP Properties is seeking a Data Platform Engineer to lead the design, implementation, and evolution of the company's Microsoft Fabric data platform. This role will play a critical part in modernizing the enterprise data ecosystem by migrating legacy reporting and data systems into a scalable, governed, and high-performance Lakehouse architecture. The position combines data architecture, engineering, and platform ownership, requiring both strategic thinking and hands-on implementation. The successful candidate will design data models, build robust pipelines, integrate enterprise systems, and establish engineering best practices that support reliable analytics across the organization. This role focuses on data platform engineering and architecture rather than report development, ensuring RHP has a strong foundation for analytics, automation, and future AI initiatives.

As RHP transitions toward a modern Fabric-based analytics ecosystem, this role will lead the technical design and implementation of the enterprise data platform. You will work closely with leadership, analysts, and engineering teams to build a trusted enterprise data foundation that enables:

  • Scalable data pipelines
  • Standardized semantic models
  • Governed self-service analytics
  • High-performance reporting
  • Advanced analytics and AI readiness

The role requires a strong engineering mindset and experience building production-grade data platforms rather than simply developing reports or dashboards.

Key Responsibilities

Platform Architecture & Migration

  • Design and implement RHP’s Microsoft Fabric data architecture including Lakehouse, Warehouse, Pipelines, and Notebooks.
  • Lead migration of legacy data sources including SQL Server, flat files, SharePoint, APIs, and SFTP feeds into the Fabric platform.
  • Implement medallion architecture (Bronze / Silver / Gold) and enterprise data modeling standards.
  • Optimize performance, refresh speed, and scalability of data pipelines and semantic models.
  • Define architectural patterns that support long-term platform scalability.
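To make the medallion pattern concrete, here is a minimal illustrative sketch of a Bronze / Silver / Gold flow. It uses plain Python structures in place of Lakehouse tables, and the sample "rent payment" records, field names, and function names are assumptions for illustration only, not RHP data or RHP's actual design:

```python
# Bronze: data landed as-is, including a duplicate row and an invalid row.
RAW_ROWS = [
    {"community_id": "C01", "amount": "950.00", "paid_on": "2024-01-05"},
    {"community_id": "C01", "amount": "950.00", "paid_on": "2024-01-05"},
    {"community_id": "C02", "amount": None,     "paid_on": "2024-01-06"},
]

def to_silver(rows):
    """Silver: deduplicate, drop invalid records, and cast types."""
    seen, cleaned = set(), []
    for row in rows:
        key = tuple(row.values())
        if row["amount"] is None or key in seen:
            continue
        seen.add(key)
        cleaned.append({**row, "amount": float(row["amount"])})
    return cleaned

def to_gold(rows):
    """Gold: aggregate into a reporting-ready shape (total paid per community)."""
    totals = {}
    for row in rows:
        totals[row["community_id"]] = totals.get(row["community_id"], 0.0) + row["amount"]
    return totals

silver = to_silver(RAW_ROWS)
gold = to_gold(silver)
print(gold)  # {'C01': 950.0}
```

In a Fabric implementation, each layer would typically be a Delta table refreshed by Pipelines or Notebooks rather than in-memory lists, but the layering idea is the same: land raw, clean once, aggregate for consumption.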

Data Engineering & Integration

  • Build robust and scalable data ingestion pipelines using modern ELT/ETL patterns.
  • Develop Python and SQL-based transformation frameworks to support repeatable data workflows.
  • Integrate internal and external systems through APIs, secure data transfers, and automated pipelines.
  • Implement data validation, logging, monitoring, and error handling to ensure reliability.
  • Develop reusable frameworks for data ingestion and transformation.
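The validation, logging, and error-handling bullets above can be sketched as a simple ingest-and-quarantine pattern. The rule names, fields, and `ingest` helper below are hypothetical illustrations, not a prescribed framework:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

def validate(record):
    """Return a list of rule violations for one ingested record."""
    errors = []
    if not record.get("community_id"):
        errors.append("missing community_id")
    occupied = record.get("occupied_sites")
    if not isinstance(occupied, int) or occupied < 0:
        errors.append("occupied_sites must be a non-negative integer")
    return errors

def ingest(records):
    """Load valid records, quarantine invalid ones, and log both counts."""
    loaded, quarantined = [], []
    for record in records:
        errors = validate(record)
        if errors:
            # Bad rows are set aside for review instead of failing the run.
            log.warning("quarantined %r: %s", record, "; ".join(errors))
            quarantined.append(record)
        else:
            loaded.append(record)
    log.info("loaded=%d quarantined=%d", len(loaded), len(quarantined))
    return loaded, quarantined

good, bad = ingest([
    {"community_id": "C01", "occupied_sites": 180},
    {"community_id": "",    "occupied_sites": -3},
])
```

The design choice worth noting is that invalid rows are quarantined and logged rather than silently dropped or allowed to abort the pipeline, which is what makes monitoring and alerting on data quality possible downstream.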

Analytics Enablement

  • Design and maintain enterprise semantic models that serve as the trusted layer for reporting.
  • Optimize Power BI datasets and semantic models for performance and usability.
  • Enable self-service analytics by providing structured, governed datasets for analysts and business teams.
  • Partner with analytics teams to improve data accessibility and reduce reporting inefficiencies.

Data Governance & Quality

  • Implement enterprise data governance standards including naming conventions, lineage tracking, and access controls.
  • Establish automated data quality validation frameworks.
  • Ensure consistency, reliability, and traceability of enterprise data assets.
  • Maintain documentation of data models, transformations, and platform architecture.

Data Platform Reliability

  • Implement monitoring and observability across pipelines and data models.
  • Establish alerting mechanisms for data failures and performance degradation.
  • Improve reliability and resilience of production data workloads.
  • Identify and resolve bottlenecks in data pipelines and processing layers.

Engineering Practices

  • Implement Git-based version control for data platform artifacts.
  • Develop CI/CD workflows for deployment and promotion of data pipelines and models.
  • Promote testing, documentation, and maintainable engineering practices.
  • Encourage reusable data engineering patterns and modular pipeline design.

Technology Stack

The data platform environment includes technologies such as:

  • Microsoft Fabric (Lakehouse, Warehouse, Pipelines, Notebooks)
  • Power BI Semantic Models
  • Python and SQL
  • Azure Storage / Data Lake
  • API integrations
  • Git version control
  • Data pipeline orchestration and monitoring tools

Job Requirements

Required Technical Skills

  • Experience working with modern cloud data platforms such as Microsoft Fabric, Azure Synapse, Databricks, Snowflake, or similar.
  • Strong Python and advanced SQL skills.
  • Experience designing data warehouses and dimensional data models.
  • Experience building data pipelines and ELT/ETL frameworks.
  • Experience integrating data via APIs and automated data ingestion pipelines.
  • Experience optimizing semantic models and Power BI datasets.
  • Understanding of modern data architecture patterns such as medallion architecture and star schemas.

Professional Attributes

Successful candidates will demonstrate:

  • Strong ownership mindset and execution discipline
  • Ability to solve complex technical problems
  • Focus on automation and process improvement
  • Curiosity and adaptability with emerging technologies
  • Attention to detail and commitment to high-quality engineering standards
  • Strong communication skills and stakeholder empathy

Qualifications

  • Experience migrating legacy systems to cloud-based data platforms.
  • Experience implementing CI/CD pipelines and DevOps practices.
  • Experience with Git version control workflows.
  • Experience implementing data governance frameworks.
  • Familiarity with real estate, property management, or operational analytics.
  • Minimum 3–5 years of hands-on experience delivering production data solutions.
  • Proven track record building scalable data platforms, pipelines, or analytics architectures.
  • Experience working with cross-functional teams to deliver enterprise data initiatives.

Success Indicators (First 6 Months)

A successful hire will:

  • Establish foundational Fabric platform architecture
  • Deploy production-grade data pipelines
  • Improve performance and reliability of enterprise datasets
  • Standardize semantic models and reporting logic
  • Reduce manual reporting and data inefficiencies
  • Improve visibility and trust in enterprise data

We Are Proud to Provide the Following:

  • Access to benefits including medical, dental and vision insurance
  • 401K with company match
  • Short-term and long-term disability
  • Life insurance
  • Generous Paid Time Off and holidays
  • Flexible spending account

© 2026 Qureos. All rights reserved.