DevCo is seeking a full-time Data Engineer to join our team in Bellevue, WA in a hybrid in-office capacity.
The Data Engineer owns data ingestion, transformation, and warehouse architecture across DevCo’s operational and financial systems. This role designs and maintains scalable, production‑grade pipelines that integrate data from third‑party platforms and internal systems, ensuring high‑quality, reliable datasets for analytics, BI, and decision‑making.
About The Company
DevCo Residential Group is an integrated development and investment company focused on multi-family communities. Founded in 1994, the company and its affiliates develop, own, and manage over 14,000 affordable and market-rate apartment units throughout the United States. Headquartered in Bellevue, Washington, DevCo is one of the largest providers of affordable housing in Washington State.
Mission
DevCo Residential Group’s mission is to develop, construct and manage high-quality multifamily housing that provides stability, fosters growth and delivers long-term value to our residents and stakeholders.
Vision
DevCo’s vision is to be a leading developer, builder and manager of quality multifamily housing throughout the western US.
Values
- Quality: We deliver excellence in every aspect of our work.
- Commitment: We honor our promises with unwavering dedication.
- Teamwork: We achieve more together through collaboration and respect.
- Integrity: We uphold the highest ethical standards in all we do.
Pay Details:
$90,000 to $105,000 annual salary, plus a performance bonus target of 10%
Schedule:
Monday-Friday 8am-5pm (Hybrid in-office format)
Benefits Offered
- 100% company-paid medical benefits for employee coverage.
- 100% company-paid dental and vision benefits for employee coverage.
- Healthcare and dependent care flexible spending accounts.
- Company-paid life insurance, AD&D, and long-term disability benefits for employee coverage.
- Best-in-class voluntary insurance benefits.
- Pre-tax and Roth 401(k) programs with a company match equal to 100% of the first 4% contributed by the employee.
- Discretionary bonus programs.
- Eligibility for consideration for a 30% housing discount.
- Employee assistance program (EAP) with 24/7 counseling services.
- Company-sponsored industry training and certifications.
- 3 weeks of paid time off each year.
- Up to 12 paid holidays each year.
Job Responsibilities
Data Integration & Pipeline Engineering
- Design, build, and maintain automated data pipelines consuming data via APIs, SFTP, flat files, databases, and third-party connectors.
- Own integrations with Yardi, Procore, Smartsheet, Northspyre, HappyCo, and other operational platforms.
- Implement modern ELT/ETL workflows using orchestration and transformation frameworks.

Data Warehouse Architecture
- Architect and maintain the enterprise data warehouse, including schema design, partitioning strategy, indexing, and performance optimization.
- Develop layered data models (raw, curated, analytics-ready) that support enterprise reporting and BI.

Data Quality, Reliability & Observability
- Establish data quality checks, reconciliation rules, freshness monitoring, and anomaly detection.
- Build logging, alerting, and monitoring to ensure pipeline reliability and SLA adherence.
- Manage scalability and performance as data volume and usage grow.

Documentation & Governance
- Document data sources, pipeline logic, schemas, lineage, and ownership.
- Support data governance standards, including security, access controls, and handling of sensitive data.

Collaboration
- Partner closely with the Data Analyst to ensure the warehouse supports semantic modeling and reporting needs.
- Collaborate with Finance, Development, Construction, Property Management, and Accounting teams to validate business logic and requirements.
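To give candidates a concrete flavor of the data quality and observability work described above, here is a minimal sketch of the kind of checks this role would build. The row shape, field names, and freshness SLA are hypothetical illustrations, not DevCo's actual schema or tooling:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical rows from an upstream extract (e.g., a rent-roll feed).
ROWS = [
    {"unit_id": "A101", "rent": 1450.0, "loaded_at": datetime.now(timezone.utc)},
    {"unit_id": "A102", "rent": None,   "loaded_at": datetime.now(timezone.utc)},
]

def quality_report(rows, freshness_sla=timedelta(hours=24)):
    """Basic data-quality checks: completeness and freshness."""
    now = datetime.now(timezone.utc)
    null_rent = sum(1 for r in rows if r["rent"] is None)
    stale = sum(1 for r in rows if now - r["loaded_at"] > freshness_sla)
    return {"row_count": len(rows), "null_rent": null_rent, "stale_rows": stale}

report = quality_report(ROWS)
print(report)  # {'row_count': 2, 'null_rent': 1, 'stale_rows': 0}
```

In practice, checks like these would run inside an orchestration framework (e.g., Airflow or Prefect) and feed the alerting described above.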
Qualifications
- 2+ years of experience in data engineering or analytics engineering.
- Strong SQL skills and experience building production data pipelines.
- Experience integrating with APIs and third-party SaaS platforms.
- Hands-on experience with cloud data warehouses (Snowflake, BigQuery, Redshift, Azure Synapse).
- Familiarity with orchestration and transformation tools (e.g., Airflow, Prefect, dbt, Fivetran, Stitch).
- Strong understanding of dimensional and analytical data modeling best practices.
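As a small illustration of the dimensional-modeling expectation above, here is a toy star schema: a fact table of leases joined to a property dimension and aggregated by a dimension attribute. All table, column, and property names are hypothetical:

```python
# Hypothetical property dimension, keyed by surrogate property_id.
dim_property = {
    "P1": {"property_name": "Cedar Flats", "city": "Bellevue"},
    "P2": {"property_name": "Maple Court", "city": "Tacoma"},
}

# Hypothetical fact table: one row per lease, with a foreign key into the dimension.
fact_leases = [
    {"property_id": "P1", "monthly_rent": 1450.0},
    {"property_id": "P1", "monthly_rent": 1600.0},
    {"property_id": "P2", "monthly_rent": 1200.0},
]

def rent_by_city(facts, dim):
    """Aggregate a fact measure (monthly_rent) by a dimension attribute (city)."""
    totals = {}
    for row in facts:
        city = dim[row["property_id"]]["city"]
        totals[city] = totals.get(city, 0.0) + row["monthly_rent"]
    return totals

print(rent_by_city(fact_leases, dim_property))
# {'Bellevue': 3050.0, 'Tacoma': 1200.0}
```

In a real warehouse this join-and-aggregate would be expressed in SQL over fact and dimension tables; the Python above just shows the shape of the model.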
DevCo Residential Group is an equal opportunity employer.