Qureos

Senior Data & Integration Engineer

The Senior Data and Integration Engineer architects, develops, and optimizes enterprise data and integration solutions that enable seamless, reliable data flow across Parker’s Kitchen’s technology ecosystem. This role serves as a technical leader and subject-matter expert in ETL/ELT pipeline design, API integration architecture, data warehousing strategy, and cloud-based data platforms. The ideal candidate will drive technical standards, mentor junior engineers, and work closely with business stakeholders, developers, and data teams to deliver scalable, high-quality data infrastructure that supports operational excellence and strategic decision-making.


ESSENTIAL DUTIES AND RESPONSIBILITIES


Data Architecture & Pipeline Engineering:

  • Architect, develop, and maintain complex ETL/ELT pipelines to extract, transform, and load data from multiple sources into data lakes and data warehouses, ensuring scalability and fault tolerance.
  • Define and enforce data transformation standards to ensure data quality, consistency, and performance across all pipelines.
  • Automate data ingestion and transformation processes using tools such as Azure Data Factory (ADF), Apache Airflow, SSIS, or equivalent platforms.
  • Evaluate emerging data technologies and recommend adoption strategies aligned with organizational goals.
  • Design and implement data lake and Lakehouse architectures to support analytics, reporting, and operational workloads.

Integration Architecture & API Development:

  • Design, develop, and govern API-based integrations between enterprise applications, cloud platforms, and third-party services, establishing reusable integration patterns.
  • Work with RESTful and SOAP APIs, JSON, XML, webhooks, and event-driven architectures.
  • Implement and enforce authentication and security protocols (OAuth 2.0, JWT, API keys, mTLS) across all integration points.
  • Define integration architecture standards, including error handling, retry logic, rate limiting, and monitoring practices.
  • Lead integration efforts for new vendor platforms and system migrations, serving as the technical point of contact.

Database Engineering & Cloud Platforms:

  • Design and optimize database structures for data warehousing, reporting, and real-time analytics solutions.
  • Write and optimize complex SQL queries, stored procedures, views, and materialized views to support data extraction and analytics workloads.
  • Work with relational databases (SQL Server, PostgreSQL, Snowflake) and NoSQL solutions, recommending the appropriate technology for each use case.
  • Develop and deploy data solutions in cloud environments (primarily Azure), leveraging cloud storage, compute services, and serverless technologies for scalable data processing.
  • Contribute to Infrastructure as Code (Terraform, ARM templates) and CI/CD pipeline development for data platform deployments.

Data Governance, Security & Compliance:

  • Implement and champion data governance best practices, including metadata management, data lineage tracking, data cataloging, and access controls.
  • Ensure compliance with data privacy regulations (PCI DSS, state privacy laws) and organizational security policies.
  • Define and monitor data quality metrics, SLAs, and alerting thresholds for all production data pipelines.
  • Conduct security reviews of data integrations and recommend hardening measures.

Technical Leadership & Collaboration:

  • Serve as a technical mentor and resource for peers and cross-functional teams on data engineering best practices.
  • Create and maintain technical documentation, architecture diagrams, runbooks, and process workflows.
  • Lead troubleshooting and root cause analysis for data integration failures and performance degradation.
  • Support cross-functional teams in identifying data-related issues and translating business requirements into technical solutions.
  • Participate in architecture review boards and contribute to the organization’s technology roadmap.
  • Drive continuous improvement initiatives to reduce technical debt and improve pipeline reliability and performance.

EDUCATION AND REQUIREMENTS

Required:

  • Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
  • 5+ years of progressive experience in data engineering, data integration, or related roles, with demonstrated ownership of enterprise-scale solutions.
  • Deep expertise in ETL/ELT tools (Azure Data Factory, Apache Airflow, SSIS, Informatica, or equivalent).
  • Advanced SQL proficiency and strong experience with database technologies (SQL Server, Snowflake, PostgreSQL).
  • Extensive hands-on experience designing and consuming APIs, web services, and integration platforms.
  • Strong experience with cloud data platforms, preferably Azure (Synapse, Data Lake Storage, Functions).
  • Proficiency in scripting languages (Python, PowerShell, or Bash) for automation and data processing.
  • Experience with DevOps practices, CI/CD pipelines, and Infrastructure as Code (Terraform, ARM templates).
  • Proven ability to lead technical initiatives, define standards, and influence architecture decisions.
  • Strong problem-solving, analytical, and communication skills with the ability to translate complex technical concepts for business stakeholders.

Preferred:

  • Experience in retail, convenience store, or food service technology environments.
  • Experience with data observability and pipeline monitoring tools.
  • Knowledge of event-driven architecture, message queues (Kafka, Azure Service Bus), and stream processing.
  • Experience with Snowflake administration, optimization, and Snowpark.
  • Relevant certifications (Azure Data Engineer Associate, Snowflake SnowPro, AWS Data Analytics, etc.).

Physical Requirements:

  • Office environment with occasional requirements to work outside of normal business hours.
  • On-call rotation for data platform emergencies and critical pipeline failures.
  • Ability to lift and carry up to 50 pounds.
  • Ability to work in confined spaces and to climb ladders and stairs.

Equal Opportunity Employer
This employer is required to notify all applicants of their rights pursuant to federal employment laws. For further information, please review the Know Your Rights notice from the Department of Labor.


Parker’s is an equal-opportunity employer committed to hiring a diverse workforce and sustaining an inclusive culture. Parker’s does not discriminate on the basis of disability, veteran status, or any other basis protected under federal, state, or local laws.

© 2026 Qureos. All rights reserved.