Beaverton, United States
Job Summary
Senior Data Engineer
Teema
Contract
Hybrid | Beaverton, OR, United States
Founded in 2020 in Portland, Oregon, my client has grown to over 75 employees and serves more than 100 customers.
Data Engineer (Healthcare Domain)
Join a company at the forefront of data innovation and AI.
Role Overview:
We are seeking a Data Engineer with strong healthcare experience to help us leverage our data assets effectively. Our platform handles billions of data rows monthly, impacting millions of users. Your healthcare industry expertise will support our strategic use of data for meaningful outcomes.
What You'll Work On:
50% – Building, scaling, and maintaining data pipelines.
20% – Assisting in the implementation of DataOps methodologies.
20% – Writing, optimizing, and tuning queries and algorithms.
10% – Supporting, monitoring, and maintaining data pipelines.
We foster a collaborative environment that values proactive communication, continuous learning, and teamwork. Expect strong leadership support when encountering and communicating roadblocks.
Who Thrives in This Role?
Healthcare domain expertise – You have a practical understanding and experience within the healthcare industry, including knowledge of healthcare data standards and regulations.
Self-Starter & Curious Learner – You're proactive in solving problems, stay updated on industry trends, and continuously seek to enhance your skills (yes, listening to Databricks and data engineering podcasts counts!).
Databricks Familiarity – Hands-on experience with Databricks technologies across Azure, AWS, or GCP.
Python/Scala Proficiency – Strong skills in Python or Scala.
Spark Experience – Familiarity with Spark (Databricks), Delta Lakehouse, Delta Live Tables, and Unity Catalog is beneficial.
Skills & Qualifications Required:
Must be a US citizen or Green Card holder with excellent communication skills.
6+ years of relevant healthcare data experience.
Practical knowledge of Spark (Scala or Python) and Databricks.
Solid backend engineering skills for data processing, focusing on scalability, availability, and performance optimization.
Familiarity with algorithms and data structures.
Experience with workflow orchestration tools such as Databricks DLT, Azure Data Factory (ADF), Azure Synapse, Airflow, etc.
Understanding of distributed systems architecture.
Proficiency with REST APIs.
Experience working with NoSQL databases.
Recent professional experience involving Scala/Python and Spark (Databricks).
Education:
B.E./B.Tech in Computer Science & Engineering.