FDE: Senior Data Engineer (Forward Deployed Engineer)

Location: Remote | Type: Contract



About Newpage Solutions



Newpage Solutions is a global digital health innovation company helping people live longer, healthier lives. We partner with life sciences organisations, including pharmaceutical, biotech, and healthcare leaders, to build transformative AI- and data-driven technologies that address real-world health challenges.

From strategy and research to UX design and agile development, we deliver and validate impactful solutions using lean, human-centered practices.

We are proud to be a ‘Great Place to Work®’ certified company for the last three consecutive years. We also hold a top Glassdoor rating and are named among the "Top 50 Most Promising Healthcare Solution Providers" by CIOReview. As an organisation, we foster creativity, continuous learning and inclusivity, creating an environment where bold ideas thrive and make a measurable difference in people’s lives.



Your Mission



We are seeking a Senior Data Engineer to design and build the data integration, storage systems, and data infrastructure that enable analytics and AI-powered applications. In this role, you will navigate complex enterprise data landscapes: building relationships to gain data access, investigating undocumented schemas, and delivering robust, maintainable integration solutions that serve multiple teams. You will also optimise AI tool usage across your area, establishing practices that balance AI speed with verification rigour.

In this forward-deployed role, you will be embedded directly with business units including Commercial, Manufacturing, and R&D. You will seek out undefined data problems rather than avoiding them: embedding with users to discover latent data needs, coaching engineers on data discovery techniques, and turning ambiguity into clear data strategies. As part of the discovery-to-scale pipeline, you will lead pattern generalisation initiatives, defining criteria for when to generalise data solutions versus keeping them custom, and establishing intake processes for solutions validated by Forward Deployed Engineers.

This role encompasses multi-team initiatives with area-wide impact. You will set your own direction within broader strategic goals, influence teams in your area, and be recognised as a domain expert. You will drive through ambiguity to deliver results at scale.


What You’ll Do

Responsibilities:

  • Business: Immerse yourself in operations until you think like an insider. Rapidly acquire domain expertise through direct observation, translate between business and engineering seamlessly, and mentor engineers in your area on immersion. Influence senior stakeholders effectively, manage complex stakeholder landscapes with competing agendas, and build trust rapidly with new stakeholders.
  • Delivery: Lead rapid delivery initiatives across teams in your area, coach on prototype-first approaches for data solutions, and establish trust through consistent fast delivery. Navigate complex enterprise data landscapes, build relationships to gain data access, handle undocumented schemas through investigation, and define clear criteria for prototype-to-production transitions.
  • People: Build high-performing teams across your area, navigate complex interpersonal dynamics, foster psychological safety, and create environments where diverse perspectives are valued. Influence through communication at all levels — from frontline to executive. Handle difficult conversations skillfully and train engineers in your area on effective communication.
  • AI-Augmented Development: Optimise AI tool usage across teams in your area, train engineers on AI-augmented workflows, evaluate new AI development tools, and establish practices that balance AI speed with verification rigour.
  • Scale: Design complex data architectures spanning multiple systems across teams. Make strategic trade-offs between consistency, performance, and maintainability. Establish and enforce quality standards across teams in your area, mentor engineers on effective code review, and drive testing strategies for data pipelines.
  • Documentation: Define documentation standards across teams in your area, create documentation systems and templates for data pipelines, train engineers on spec-driven development, and ensure data lineage documentation quality across projects. Lead pattern generalisation initiatives, defining criteria for when to generalise versus keep custom.
  • Reliability: Define reliability standards across teams in your area, drive post-incident improvements systematically for data pipeline failures, design capacity planning processes for data infrastructure, and mentor engineers on SRE practices applied to data systems.
  • Process: Lead lean transformations across teams in your area, design flow-optimised processes for data delivery, coach engineers on lean principles, and balance speed with sustainability. Establish metrics that drive improvement.

Behaviour

  • Own the Outcome: Drive accountability culture focused on outcomes not deliverables. Own business relationships and impact metrics across your function. Make trade-offs between custom data solutions and generalisable work. There is no "I must run this by X." Ensure verification rigour for AI-generated code — particularly critical for data transformations.
  • Be Polymath Oriented: Champion cross-disciplinary learning. Create holistic data solutions spanning technical and business domains. Embody the Renaissance Engineer ideal. Translate specialised data engineering knowledge into accessible explanations. Think like a business insider.
  • Communicate with Precision: Create spec-driven development practices for data pipelines. Mentor others on precise communication around data definitions, quality expectations, and pipeline behaviour. Span C-level executives to frontline workers. Drive clarity as a core value across your function. Represent the organisation externally.
  • Don't Lose Your Curiosity: Drive team curiosity through challenging questions about data quality and sources. Create environments where exploration and experimentation with data are encouraged. Model problem discovery orientation. Seek out ambiguity in data landscapes rather than avoiding it.
  • Think in Systems: Shape systems design practices across your function for data architectures. Conduct chaos engineering experiments on data pipelines. Influence cross-team architecture decisions. Create clarity from complexity. Bridge technical data systems with business processes.

Practitioner-level Skills

  • Data Integration: You navigate complex enterprise data landscapes across teams, build relationships to gain data access, handle undocumented schemas through investigation, and build robust, maintainable integration solutions. You mentor engineers in your area on data integration challenges.
  • Data Modeling: You design complex data architectures spanning multiple systems across teams. You make strategic trade-offs between consistency, performance, and maintainability. You mentor engineers in your area on data modelling best practices.
  • Architecture & Design: You design complex multi-component systems end-to-end, evaluate architectural options for large initiatives across teams, guide technical decisions for your area, and mentor engineers on architecture. You balance elegance with delivery needs.

Working-level Skills

  • Code Quality & Review: You produce consistently high-quality, well-tested data pipeline code. You review AI-generated code critically and never ship code you don't fully understand. You identify edge cases and ensure adequate test coverage.
  • Full-Stack Development: You build complete applications rapidly across any technology stack for teams in your area. You select the right tools for each problem, balance technical debt with delivery speed, and mentor engineers on full-stack development.
  • DevOps & CI/CD: You build complete CI/CD pipelines end-to-end, manage infrastructure as code, implement monitoring, and design deployment strategies for your data services.
  • Cloud Platforms: You design cloud-native data solutions, manage infrastructure as code, implement security best practices, and make informed service selections. You troubleshoot cloud-specific issues.
  • Site Reliability Engineering: You design observability strategies for your data services, lead incident response, implement resilience testing, and conduct blameless post-mortems. You balance reliability investment with feature velocity.
  • AI-Augmented Development: You integrate AI tools strategically into your development workflow. You review AI-generated code with the same rigour as human code and never ship code you don't fully understand.
  • AI Evaluation & Observability: You design evaluation frameworks with custom evaluators tailored to your use case. You build golden datasets, establish annotation workflows with clear rubrics, and run experiments to compare prompt and model changes systematically.
  • Problem Discovery: You seek out undefined data problems rather than avoiding them. You embed with users to discover latent data needs, coach engineers in your area on problem discovery techniques, and turn ambiguity into clear problem statements.
  • Rapid Prototyping & Validation: You lead rapid delivery initiatives across teams in your area, coach on prototype-first approaches, establish trust through consistent fast delivery, and define clear criteria for prototype-to-production transitions.
  • Business Immersion: You immerse yourself in operations until you think like an insider. You rapidly acquire domain expertise through direct observation, translate between business and engineering seamlessly, and mentor engineers in your area on immersion.
  • Stakeholder Management: You manage multiple stakeholders with different interests, navigate conflicting priorities diplomatically, and build trust through consistent delivery. You tailor communication to each audience.
  • Multi-Audience Communication: You influence through communication at all levels — from frontline to executive. You handle difficult conversations skillfully, train engineers in your area on effective communication, and represent teams across the function.

Foundational-level Skills

  • AI Literacy: You evaluate AI solutions critically for specific use cases. You understand bias, fairness, and hallucination risks. You make informed decisions about when AI helps versus when traditional approaches are better.
  • Technical Writing: You create comprehensive documentation for complex data systems. You write precise specifications that enable accurate AI-generated code, establish documentation practices for your projects, and ensure docs are discoverable.
  • Team Collaboration: You facilitate collaboration across the team, resolve minor conflicts before they escalate, enable others to succeed, and contribute positively to team dynamics and morale.
  • Pattern Generalisation: You extract reusable components from field data solutions, design appropriate abstractions that balance flexibility with simplicity, and collaborate with FDEs to validate generalised solutions in new contexts.
  • Data Analysis: You perform exploratory data analysis independently, create effective visualisations, and identify patterns in data. You ask good questions about data quality and context.
  • Knowledge Management: You create searchable knowledge articles proactively, maintain team documentation, and organise information so others can find it.


What You Bring


  • Bachelor’s degree in Computer Science, Data Engineering, Software Engineering, or a related field, with 7+ years of relevant professional experience.
  • Deep production experience with Python, SQL, and cloud data platforms (AWS preferred — including services such as Glue, S3, Redshift, Lambda, and EMR; Snowflake, Azure, or GCP also valued) is required.
  • Proven expertise in large-scale data integration, distributed data processing (such as Spark), and enterprise data architecture is essential.
  • Experience with enterprise data platforms (such as Salesforce Data Cloud or similar CRM data systems) is strongly valued.
  • Demonstrable fluency with AI coding tools (such as Claude Code, Cursor, GitHub Copilot, or similar) and hands-on experience architecting data infrastructure that supports generative AI applications (data preparation for LLMs, vector databases, RAG systems, evaluation pipelines) is essential.
  • Experience leading data initiatives across multiple teams, mentoring engineers, and navigating complex enterprise data landscapes with undocumented or messy data sources is required.
  • Experience in an embedded, forward-deployed, or consulting-style engineering model is a strong plus.

Additional Expectations

  • Exceptional communication skills and a demonstrated ability to influence at senior levels.
  • Comfort leading across ambiguous data environments, building trust-based relationships with commercial stakeholders at all levels, developing junior and mid-level data engineers, and making strategic decisions that balance delivery speed with data quality and architectural integrity.
  • You will shape data engineering practices across your area and be expected to lead multi-team data initiatives.


What We Offer



At Newpage, we’re building a company that works smart and grows with agility, where driven individuals come together to do work that matters. We offer:

  • A people-first culture - Supportive peers, open communication and a strong sense of belonging
  • Smart, purposeful collaboration - Work with talented colleagues to create technologies that solve meaningful business challenges
  • Balance that lasts - We respect your time and support a healthy integration of work and life
  • Room to grow - Opportunities for learning, leadership and career development, shaped around you
  • Meaningful rewards - Competitive compensation that recognises both contribution and potential


Ready to Apply?



Let’s build the future of health together. Apply below or reach out to:
Ramadevi.bhumireddy@newpage.io


More about Newpage

Newpage is a digital health solutions company devoted to advancing quality of life by enhancing health and optimising people’s longevity. We do this by passionately building futuristic technologies for global organisations across the healthcare ecosystem. We participate at every stage, from problem definition, strategy and service design, and user research to UX design and agile software development, utilising lean practices to deliver and validate highly innovative digital health solutions that drive user value and business transformation.

Newpage is recognised by CIOReview among the “Top 50 Most Promising Healthcare Solution Providers” and is Great Place to Work® Certified (GPTW) for 2023 and 2024.
