About Trailhead Biosystems
Trailhead Biosystems is an early-stage, fast-moving biotech company building next-generation stem cell products. Data is central to everything we do: scientific data from laboratory workflows, operational data from business systems, and data powering active AI initiatives. We need someone who owns the full infrastructure stack that supports all of it.
About the Role
This is a hands-on, full-stack data infrastructure role. You own the databases, pipelines, APIs, cloud systems, and internal tooling that the entire company depends on, including systems that directly support our AI initiatives. You also build simple internal applications that make data accessible to the people who rely on it. You will inherit an active, production-grade stack and are expected to contribute meaningfully from early on.
Responsibilities
Data Infrastructure
- Design, build, and maintain relational databases supporting scientific research, AI applications, operational systems, and business data, including schema design, normalization, query optimization, indexing, partitioning, and migrations
- Build and maintain pipeline orchestration workflows (e.g. Airflow, Prefect, Dagster) that move and transform data reliably across internal and external sources, including cloud object storage
- Develop and maintain data transformation and testing frameworks (e.g. dbt or equivalent) with full test coverage; own the data build and quality lifecycle
- Build and maintain Python-based REST APIs serving internal applications and AI systems
- Manage integrations with external scientific data sources and internal laboratory data systems
- Own database security end-to-end: access controls, encryption at rest and in transit, audit logging, backup and disaster recovery
- Manage cloud infrastructure across compute, storage, identity, and networking layers
- Document all systems, schemas, pipelines, and processes without exception
Internal Tooling and AI Infrastructure
- Build lightweight internal data tools and dashboards using Python-native frameworks (e.g. Streamlit, Django) and BI tools (e.g. Power BI) for data entry, visualization, and operational reporting
- Support AI initiative infrastructure: work comfortably at the intersection of data systems and AI applications by integrating with model APIs, supporting vector search infrastructure, and building interfaces that surface AI outputs to internal users
- Collaborate with the AI Engineer on data layer requirements: you own the data infrastructure, they own the model layer
- Stay current with the modern data and AI infrastructure stack; bring informed opinions on tooling decisions as the company's needs evolve
Systems and Infrastructure Context
- Understand how the hardware and network layer affects data system performance — disk I/O, memory pressure, network latency, and storage architecture are not abstract concepts to you
- Troubleshoot across the full stack when needed: application, database, operating system, and hardware
- In a small company environment, you are the technical anchor. You coordinate with our external IT partners intelligently — scoping issues accurately, communicating clearly, and following through to resolution — without this becoming the center of your day
- Comfortable handling routine infrastructure and IT matters (user access, connectivity, device setup) as they arise, as part of a broader ownership mindset rather than a defined job function
Requirements
- Deep PostgreSQL expertise: schema design, query optimization, indexing, partitioning, and migration management with the ability to defend every architectural decision made.
- Production pipeline experience: hands-on building, deploying, and operating data pipelines using modern orchestration tools, including real failure handling in production environments.
- Data transformation and testing: practical experience with modern transformation frameworks, dependency management, and data quality testing in live production settings.
- Python proficiency: production-grade, clean, maintainable code that others can read, extend, and operate independently.
- REST API development: deployed APIs built with modern Python frameworks, incorporating proper validation, authentication, and production-grade serving.
- Internal tooling: demonstrated ability to build functional data applications and dashboards for non-technical users using Python-native tools.
- Cloud infrastructure: hands-on management of compute, storage, identity, and networking on a major cloud platform (Azure preferred), operated in production, not just studied.
- Docker, Git, and database security: containerization experience, disciplined version control, and direct ownership of access control, encryption, and backup and recovery strategies.
- Infrastructure literacy: working knowledge of how disk I/O, memory pressure, network latency, and hardware conditions surface as data system problems with the ability to troubleshoot across every layer.
- AI infrastructure awareness: experience building the data layer that AI systems depend on, including reliable pipelines, low-latency retrieval, and vector search infrastructure, with an understanding of what those systems require from the data layer beneath them.
- Small company mindset: demonstrated experience owning problems end to end in a lean environment, without defined team structures absorbing the slack.
- Collaborative by default: a track record of working effectively across scientific, commercial, and technical teams, contributing to shared goals, sharing context proactively, and making the people around them more effective rather than operating in isolation.
- Communication and documentation: a history of surfacing issues proactively, documenting systems to a standard that others can operate independently, and translating technical context clearly for non-technical stakeholders without creating bottlenecks.
- Continuity and adaptability: proven ability to inherit in-progress work and continue it cleanly, reprioritize without losing momentum, and support teammates during periods of transition or shifting priorities.
Pay: $85,000.00 - $100,000.00 per year
Benefits:
- 401(k)
- 401(k) matching
- Dental insurance
- Health insurance
- Paid time off
- Retirement plan
- Vision insurance
Application Question(s):
- Will you now or in the future require sponsorship for employment visa status (e.g., H-1B visa status)?
Ability to Commute:
- Beachwood, OH 44122 (Required)
Ability to Relocate:
- Beachwood, OH 44122: Relocate before starting work (Preferred)
Work Location: In person