Qureos

FIND THE RIGHT JOB.

Data Platform Architect

Job Requirements

Hires in

Not specified

Employment Type

Not specified

Company Location

Not specified

Salary

Not specified

Master Works is excited to invite applications for the position of Enterprise Data Platform Architect. In this strategic role, you will guide the design and implementation of enterprise-wide data platforms that enable effective data management, analytics, and governance. You will work with stakeholders across the organization to develop data architecture strategies that empower the business while ensuring compliance with industry standards. Your expertise will be central to optimizing data flow, storage, and accessibility, making data a valuable asset for decision-making. As a champion of best practices in data architecture, you will lead initiatives to promote data integrity, security, and scalability throughout the enterprise, transforming the way Master Works leverages its data assets for business success.

Key Responsibilities

  • Architect, implement, and maintain enterprise-scale data solutions, combining data virtualization (Denodo) and big data ecosystem technologies (Cloudera, Hadoop, Spark, Hive, Kafka, etc.).
  • Integrate complex structured and unstructured data sources (SQL/NoSQL, cloud platforms, applications) into unified, high-performance data layers.
  • Design, optimize, and monitor large-scale data pipelines, virtual views, and workflows for high-performance, low-latency access.
  • Implement and enforce data governance, security, and access control policies across all data platforms.
  • Collaborate with data engineers, analysts, and business stakeholders to translate requirements into scalable and robust solutions.
  • Troubleshoot, monitor, and continuously improve system performance, reliability, and scalability.
  • Maintain best practices, documentation, and knowledge sharing for enterprise data platforms.
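The "unified, high-performance data layer" responsibility above can be sketched in miniature. The toy example below (an illustration only, not part of the posting; it uses Python's built-in sqlite3 rather than Denodo) shows the core idea of data virtualization: consumers query a single view while the underlying sources stay separate.

```python
import sqlite3

# Two independent "sources" loaded into one engine; at enterprise scale a
# virtualization layer such as Denodo federates them without copying data.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source 1: an orders table, as if from a transactional system.
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 100, 25.0), (2, 101, 40.0)])

# Source 2: a customers table, as if from a second system.
cur.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(100, "Acme"), (101, "Globex")])

# The "virtual view": one consolidated layer that hides the source joins.
cur.execute("""
    CREATE VIEW customer_spend AS
    SELECT c.name, SUM(o.amount) AS total
    FROM orders o JOIN customers c ON o.customer_id = c.customer_id
    GROUP BY c.name
""")

rows = cur.execute("SELECT name, total FROM customer_spend ORDER BY name").fetchall()
print(rows)  # [('Acme', 25.0), ('Globex', 40.0)]
```

Consumers never touch `orders` or `customers` directly; governance and access-control policies can then be enforced at the view layer, which is the pattern the role describes.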

Requirements

  • Extensive experience with Denodo Platform, Cloudera Hadoop ecosystem, and enterprise data virtualization.
  • Strong expertise in SQL, data modeling, query optimization, and distributed computing concepts.
  • Proficient in big data tools: Spark, Hive, Impala, HBase, Kafka, and Sqoop.
  • Solid understanding of ETL processes, data integration, and cloud data services.
  • Proven ability to manage complex, enterprise-scale data projects with high-quality results.
  • Excellent problem-solving, analytical, and communication skills.
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • Minimum of 7 years of experience in a related field.

© 2025 Qureos. All rights reserved.