Onsite: Green Bay, WI
POSITION SUMMARY:
Responsible for the development and maintenance of data pipeline infrastructure, ensuring the seamless flow of data for various needs within the organization. Work alongside data professionals, contributing to the design and implementation of data architectures that meet the evolving needs of the business. Support the transformation of raw data into usable information for analysts and other internal stakeholders.
DUTIES AND RESPONSIBILITIES:
DATA PIPELINE DEVELOPMENT
- Assist in building and maintaining data pipelines from internal databases and SaaS applications
- Support the development of ETL (Extract, Transform, Load) processes
- Assist in implementing data validation and quality checks to ensure accuracy and consistency
- Write clean, documented, and maintainable code following established standards
DATA INFRASTRUCTURE SUPPORT
- Assist in maintaining databases, data warehouses, and data lake components
- Support the configuration and monitoring of data infrastructure
- Assist in troubleshooting basic data processing issues and escalate complex problems
- Participate in code reviews and incorporate feedback from senior engineers
DATA INTEGRATION
- Assist with integrations from various data sources into the data ecosystem
- Assist in maintaining connections with internal and external APIs
- Support data ingestion processes for structured and unstructured data
- Document integration processes and data flow for transparency and standardization
COLLABORATION & DOCUMENTATION
- Create and maintain systems and process documentation
- Collaborate with various engineers and analysts to support their work
- Participate in team meetings and contribute to planning discussions
- Actively seek out learning opportunities and keep up with advancements in data engineering practices
MINIMUM KNOWLEDGE, EXPERIENCE & SKILLS REQUIREMENTS:
- Bachelor’s degree in computer science, data engineering, information systems, or a related field, or equivalent experience
- 1-2 years of experience in data engineering, software development, or related roles
- Experience using Python, Java, or Scala for data processing (Python preferred)
- Familiarity with data-related Python packages (e.g., pandas, NumPy)
- Basic understanding of SQL and relational databases
- Basic knowledge of version control systems (Git)
- Understanding of data modeling fundamentals preferred
- Knowledge of Linux/UNIX operating systems preferred
- Exposure to cloud platforms (AWS, Azure, or Google Cloud Platform)
- Able to take direction, prioritize work, and manage multiple activities simultaneously
- Excellent communication skills
- Able to work independently and as part of a team
- Willing to share knowledge and experience with other members of the team
- Strong analytical and problem-solving skills
- Attention to detail and commitment to data quality
- Solid planning and organizational skills
- Proficiency with Microsoft Office Suite of programs
ESSENTIAL FUNCTIONS & WORK REQUIREMENTS:
- Ability to effectively communicate at all levels within the organization through written and two-way verbal communication
- Able to read and write at a high school graduate level
- Able to sit or stand for extended periods of time
- Able to operate various office equipment (e.g., personal computer, telephone, fax machine, copier, etc.)
- Able to lift 10 to 20 pounds
- Able to work overtime and regular and/or extended (evenings, nights, and weekends) office hours to meet established deadlines
- Able to travel independently to support Company objectives and personal development
These statements are intended to describe the general nature and level of work performed by teammates assigned to this job classification. They are not intended to be an exhaustive list of all responsibilities, duties, and skills required.