The Enterprise BI & Analytics Team is the central hub of analytics and data engineering at WebstaurantStore. We transform billions of data points across marketing, supply chain, operations, and customer experience into trusted data foundations and actionable insights. Leveraging a modern data stack, scalable cloud technologies, and innovative DataOps practices, the team empowers smarter decisions, rapid growth, and new opportunities across the business.
As a Senior Data Platform Engineer, you will architect and own the next generation of our enterprise data integration and ingestion platform. You will lead the creation of a unified ingestion framework into Snowflake, building scalable, repeatable, cloud-native, and highly performant data pipeline patterns that power analytics and data products across the organization. You will guide our transition from SQL Server to Snowflake, mentor engineers, establish engineering best practices, and collaborate closely with stakeholders to deliver reliable, production-grade cloud data solutions.
Senior BI Engineers are technical leaders and trusted partners to BI developers, analysts, and cross-functional teams. You will shape the future of our enterprise data platform by driving ingestion architecture, building modern pipelines, and enabling high-impact engineering capabilities across the business.
Key Areas of Ownership
Data Platform Engineering & Cloud Architecture
- Lead the design, development, and implementation of a standardized cloud data ingestion framework into Snowflake.
- Architect scalable, secure, and reusable ingestion patterns for diverse data sources, including APIs, databases, cloud storage, flat files, and SharePoint.
- Build and maintain Fabric Pipelines to support enterprise-scale orchestration and workflow automation.
- Develop Python-based ingestion utilities, automation scripts, and reusable components.
- Establish best practices for ingestion performance, observability, resiliency, error handling, logging, and monitoring.
- Analyze existing ingestion processes and drive modernization, consolidation, and data platform optimization initiatives.
- Support and guide the organization through the transition from SQL Server to Snowflake.
Leadership & Team Enablement
- Serve as the technical authority for data integration and ingestion architecture, providing direction, mentorship, and oversight to BI developers.
- Delegate work effectively and coach team members on ingestion patterns, frameworks, and best practices.
- Conduct code reviews, provide architectural guidance, and ensure adherence to data engineering standards and DevOps principles.
- Collaborate with BI Managers to align resources, timelines, and priorities.
- Promote a culture of knowledge sharing, continuous improvement, and engineering excellence.
Planning, Analysis & Delivery
- Partner with analysts and business stakeholders to understand ingestion requirements and translate them into scalable cloud data solutions.
- Lead the planning, design, and analysis of ingestion workflows and data engineering projects.
- Evaluate new data sources and determine optimal ingestion strategies.
- Contribute to the long-term data platform roadmap and cloud architecture strategy.
- Ensure solutions meet enterprise standards for data quality, governance, and security.
Physical Requirements
- Work is performed while sitting/standing and interfacing with a personal computer.
- Requires the ability to communicate effectively using speech, vision, and hearing.
- Requires the regular use of hands for simple grasping and fine manipulations.
- Requires occasional bending, squatting, crawling, climbing, and reaching.
- Requires the ability to occasionally lift, carry, push, or pull medium weights, up to 50 lbs.
Remote Work Requirements
- Access to a reliable and secure high-speed internet connection. Cable or fiber internet connections (at least 75 Mbps download / 10 Mbps upload) are preferred, as satellite connections often cannot support the technologies used to perform day-to-day tasks.
- Access to a home router and modem.
- A dedicated home office space that is noise- and distraction-free, with a strong wireless connection or a wired Ethernet connection (wired is preferred, if possible).
- A valid physical address (apartment, suite, etc.). PO Boxes are not supported, as a physical address is required for you to receive your computer equipment.
- The desire and ability to work and communicate with other team members via chat, webcam, etc.
- Legal residency in one of the following states: AK, AL, AR, AZ, CT, DE, FL, GA, IA, ID, IN, KS, KY, LA, MD, ME, MI, MN, MO, MS, NC, ND, NH, NM, NV, OH, OK, PA, SC, SD, TN, TX, UT, VA, VT, WI, WV, or WY. H-1B visa sponsorship is not available; W2 only.
Required Qualifications
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field; or equivalent professional experience.
- 7–10 years of professional experience in data engineering.
- 5+ years of hands-on experience with Snowflake, including ingestion, transformation, optimization, and platform performance tuning.
- 4–6 years of experience building data pipelines using Microsoft Fabric Pipelines or equivalent orchestration tools.
- 4–6 years of professional Python experience focused on automation, ingestion utilities, and framework development.
- 5+ years of enterprise SQL experience, including advanced SQL Server work and hands-on Change Data Capture (CDC) implementation and support.
- 5+ years of experience working in Agile/Scrum environments.
- 3+ years of experience leading data engineering initiatives or serving as a technical lead on ingestion projects.
Preferred Qualifications
- Experience working with event-driven architecture, streaming ingestion, or data contract–based integration models.
- Experience using Qlik Replicate for data movement and replication.
- Experience using dbt Cloud for ELT transformation and orchestration.
- Microsoft Azure experience.
- Certifications in dbt Cloud, Snowflake, or Microsoft Fabric preferred but not required.
This role does not require a degree. We value relevant skills, experience, and alignment with our core values above all else.
Skills & Competencies
- Deep expertise in building and maintaining cloud-based data integration pipelines across diverse platforms.
- Strong proficiency with Snowflake architecture, Snowflake ingestion patterns, and SQL.
- Advanced Python skills for automation, ingestion utilities, and framework development.
- Hands-on experience with Microsoft Fabric Pipelines or similar orchestration tools.
- Strong understanding of data modeling, data warehousing, and ETL/ELT best practices.
- Ability to lead technical initiatives, mentor team members, and influence engineering direction.
- Strong analytical and problem-solving skills with the ability to design scalable solutions.
- Excellent communication skills, capable of translating complex technical concepts for non-technical audiences.
- Ability to manage multiple priorities in a fast-paced environment.
- High degree of adaptability, emotional intelligence, and a collaborative mindset.
- Customer-focused approach with a commitment to delivering high-quality solutions.
About WebstaurantStore
WebstaurantStore is the foodservice professional’s premier source for restaurant equipment, supplies, and knowledge online. Our purpose is to empower and equip people to run their businesses more profitably and efficiently.
Benefits
- Medical
- Vision
- Dental
- PTO
- Paid Maternity Leave
- Paid Parental Leave
- Life Insurance
- Disability
- Dependent Care FSA
- 401(k) Matching
- Employee Assistance Program
- Wellness Incentives
- Company Discounts
- AT&T & Verizon Discount
- Bonus Opportunities
- Accident Insurance
- Critical Illness Insurance
- Adoption Assistance
- On-Site Fitness Centers
- Dog-Friendly Offices