
Overview:

The Data Engineer will play a critical role in building and maintaining the data infrastructure that powers analytics, reporting, and machine learning initiatives across the company. This role bridges the gap between raw data and actionable insights by ensuring data pipelines are reliable, scalable, and optimized for performance. The Data Engineer will partner with data scientists, analysts, and product teams to deliver high-quality, well-structured data that supports decision-making and drives business growth.

Prodege:

A cutting-edge marketing and consumer insights platform, Prodege has charted a course of innovation in the evolving technology landscape, helping leading brands, marketers, and agencies uncover answers to their business questions, acquire new customers, increase revenue, and drive brand loyalty and product adoption. Bolstered by a major investment from Great Hill Partners in Q4 2021 and the strategic acquisitions of Pollfish, BitBurst, and AdGate Media in 2022, Prodege looks forward to continued growth and innovation that empowers our partners to gather meaningful, rich insights and better market to their target audiences.

As an organization, we go the extra mile to “Create Rewarding Moments” every day for our partners, consumers, and team. Come join us today!

Primary Objectives:

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Ensure data quality, integrity, and availability for analytics and reporting.
  • Collaborate with cross-functional teams to understand data needs and deliver solutions.
  • Optimize performance of databases and data workflows for efficiency and scalability.
  • Implement best practices for data governance, security, and compliance.
  • Support data scientists and analysts by providing clean, well-structured data sets.
  • Monitor and troubleshoot data pipeline issues to minimize downtime.

Qualifications - To perform this job successfully, an individual must be able to perform each job duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

Detailed Job Duties:

  • Build, manage, and improve data pipelines to support analytics, business intelligence, and machine learning initiatives.
  • Extract, transform, and load (ETL) data from multiple sources into cloud-based data warehouses (e.g., Snowflake, Redshift, BigQuery).
  • Implement data models and database structures to ensure scalability and high performance.
  • Collaborate with stakeholders to define requirements for data architecture and reporting needs.
  • Ensure data accuracy through testing, validation, and ongoing monitoring processes.
  • Optimize SQL queries, ETL jobs, and data flows for performance and cost-effectiveness.
  • Maintain compliance with data security, privacy regulations, and governance policies.
  • Document processes, workflows, and architecture to ensure transparency and knowledge sharing.
  • Identify and implement automation opportunities to streamline data operations.
  • Troubleshoot and resolve production issues quickly to maintain consistent data availability.

What does SUCCESS look like?
Success in this role means delivering reliable and efficient data pipelines that empower teams with accurate, timely, and accessible data. The organization will benefit from improved decision-making, enhanced analytics capabilities, and seamless integration of data sources into business processes. The Data Engineer will be recognized for enabling scalable data operations and contributing to the company’s overall data strategy.

The MUST Haves:

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field.
  • Three or more (3+) years of experience as a Data Engineer or in a similar role.
  • Strong proficiency in SQL and data modeling principles.
  • Hands-on experience with cloud platforms such as AWS, GCP, or Azure.
  • Proficiency with ETL tools and frameworks (e.g., dbt, Airflow, or similar).
  • Experience with programming languages such as Python or Scala for data engineering tasks.
  • Knowledge of data warehousing concepts and technologies.
  • Understanding of data security, privacy, and governance practices.
  • Strong problem-solving, troubleshooting, and analytical skills.
  • Excellent communication and collaboration abilities.

The Nice to Haves:

  • Master’s degree in Data Engineering, Computer Science, or a related field.
  • Experience with real-time data processing tools (e.g., Kafka, Spark Streaming).
  • Knowledge of containerization and orchestration (Docker, Kubernetes).
  • Familiarity with machine learning pipelines and integration.
  • Experience with data cataloging and lineage tools.
  • Certifications in cloud platforms or data engineering specialties.
