
GCP Data Engineer


Location: NJ

Duration: Long-Term

Job Summary
We are seeking a skilled GCP Data Engineer to join our dynamic data team. The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines and data warehouses utilizing cloud-based technologies. This role involves working with big data tools, implementing ETL processes, and collaborating with cross-functional teams to enable data-driven decision-making. The position offers an exciting opportunity to work with cutting-edge cloud platforms such as Google Cloud Platform (GCP), AWS, Azure Data Lake, and related technologies to support enterprise analytics and modeling initiatives.

Responsibilities

  • Design, develop, and optimize scalable data pipelines using GCP, AWS, and Azure Data Lake services.
  • Build and maintain robust data warehouses using SQL, Oracle, Microsoft SQL Server, and the Hadoop ecosystem, including Apache Hive and Spark.
  • Develop ETL workflows using tools like Talend, Informatica, and custom scripting in Python and Bash/shell (see the illustrative sketch after this list).
  • Integrate Linked Data sources and implement RESTful APIs for seamless data access across platforms.
  • Collaborate with data scientists on model training and analysis to support advanced analytics projects.
  • Ensure data quality, consistency, and security across all platforms while adhering to best practices in database design.
  • Participate in Agile development cycles to deliver iterative improvements on data solutions.
  • Monitor system performance and troubleshoot issues related to data processing pipelines.
  • Document technical specifications and provide ongoing support for deployed solutions.
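
As a rough illustration of the scripting this role involves, here is a minimal Python sketch of one pipeline step, loading a CSV export from Cloud Storage into BigQuery; the project, dataset, table, and bucket names are placeholders, not details of this position:

    # Minimal illustrative ETL load step; all resource names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")   # assumes GCP credentials are configured
    table_id = "example-project.analytics.daily_events"   # placeholder destination table

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # skip the CSV header row
        autodetect=True,      # infer the table schema from the file
    )

    # Start the load from Cloud Storage and wait for it to finish.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/exports/daily_events.csv",   # placeholder source file
        table_id,
        job_config=job_config,
    )
    load_job.result()

    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")

A production pipeline would wrap a step like this with scheduling (for example, Cloud Composer), data quality checks, and monitoring, per the responsibilities above.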

Skills

  • Strong experience with cloud platforms including GCP, AWS, and Azure Data Lake.
  • Proficiency in programming languages such as Java, Python, VBA, and Bash/shell scripting.
  • Extensive knowledge of big data tools in the Hadoop ecosystem, including Spark and Apache Hive.
  • Expertise in ETL processes using Talend, Informatica, or similar tools.
  • Solid understanding of SQL databases such as Microsoft SQL Server and Oracle, as well as data warehouse architectures.
  • Familiarity with Looker for data visualization and analytics reporting.
  • Experience with Linked Data concepts and RESTful API integration.
  • Knowledge of model training techniques for predictive analytics.
  • Strong analytical skills combined with database design expertise to optimize data flow architectures.
  • Ability to work within Agile teams in a fast-paced environment while managing multiple priorities.

This role offers an engaging environment where innovative thinking is valued alongside technical expertise in cloud-based big data solutions. The successful candidate will be instrumental in transforming complex datasets into actionable insights that drive strategic business decisions.

Job Type: Contract

Pay: $25.73 - $30.98 per hour

Work Location: In person
