
Datamart Developer

We are seeking an experienced Datamart / Semantic Layer Developer to build and implement business-oriented datamarts and semantic layers on the Teradata EDW, CDP Hive, and Trino platforms. The candidate must have strong SQL development skills, dimensional modeling knowledge, telecommunications domain expertise, and the ability to translate technical specifications into optimized analytics solutions.

Experience Required: 5+ years in datamart development and semantic layer implementation

Core Responsibilities

Datamart Development

  • Develop and implement star schema and snowflake schema dimensional models on Teradata EDW
  • Build subject-area datamarts (Customer, Revenue, Network, Product, Finance) based on design specifications
  • Create and optimize fact tables, dimension tables, bridge tables, and aggregate tables
  • Implement slowly changing dimension (SCD) logic (Types 1, 2, and 3) and dimensional hierarchies; a Type 2 sketch follows this list
  • Develop complex SQL queries, stored procedures, and views for datamart population
  • Implement data transformation and aggregation logic for business metrics and KPIs
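
For illustration, a minimal SCD Type 2 load in Teradata-style SQL. All table and column names (dim_customer, stg_customer, segment, rate_plan) are hypothetical, and the surrogate key is assumed to be an identity column the database populates automatically:

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE dim_customer
    SET end_date = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE current_flag = 'Y'
      AND customer_id IN (
          SELECT s.customer_id
          FROM stg_customer s
          JOIN dim_customer d
            ON d.customer_id = s.customer_id
           AND d.current_flag = 'Y'
          WHERE d.segment <> s.segment OR d.rate_plan <> s.rate_plan);

    -- Step 2: insert a new current version for changed or brand-new customers.
    INSERT INTO dim_customer
        (customer_id, segment, rate_plan, effective_date, end_date, current_flag)
    SELECT s.customer_id, s.segment, s.rate_plan,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id
     AND d.current_flag = 'Y'
    WHERE d.customer_id IS NULL;  -- no current row: new, or expired in step 1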

Semantic Layer Development

  • Develop semantic layers using TIBCO Data Virtualization on Teradata and CDP platforms
  • Build semantic models using Trino for distributed query processing and data access
  • Create virtual views, materialized views, and business-friendly data abstractions
  • Implement business logic, calculated measures, KPIs, and derived metrics in the semantic layer (an ARPU view is sketched after this list)
  • Develop data access policies, row-level security, and governance rules
  • Optimize semantic layer performance through caching, indexing, and query optimization
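
TIBCO Data Virtualization views are authored in its own studio, but the underlying shape of a business-friendly abstraction is plain SQL. A minimal sketch of a monthly ARPU view, assuming hypothetical fact_revenue and dim_subscriber tables:

    -- Business view exposing a calculated measure (ARPU) by month and segment.
    CREATE VIEW sem_monthly_arpu AS
    SELECT r.month_id,
           c.segment,
           SUM(r.revenue_amt)               AS total_revenue,
           COUNT(DISTINCT r.subscriber_key) AS active_subscribers,
           CAST(SUM(r.revenue_amt) AS FLOAT)
             / NULLIF(COUNT(DISTINCT r.subscriber_key), 0) AS arpu
    FROM fact_revenue r
    JOIN dim_subscriber c
      ON c.subscriber_key = r.subscriber_key
    GROUP BY r.month_id, c.segment;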

Multi-Platform Development

  • Work across Teradata, CDP Hive, and Trino platforms for datamart and semantic layer implementation
  • Develop HiveQL queries and tables in CDP (Cloudera Data Platform) environment
  • Integrate data from Teradata EDW and CDP Hive through Trino for unified semantic access
  • Create cross-platform queries and federated views using Trino connectors
  • Implement partitioning, bucketing, and optimization strategies in Hive tables (a Hive DDL and federated Trino query are sketched after this list)
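
As a sketch of the cross-platform pattern: an illustrative partitioned and bucketed Hive table, then a Trino query joining it to a Teradata dimension. The catalog names (hive, teradata) and all object names are assumptions; the Teradata catalog in particular depends on which connector the deployment provides:

    -- HiveQL: partitioned, bucketed usage table (illustrative).
    CREATE TABLE dm.usage_daily (
        subscriber_id BIGINT,
        usage_mb      DOUBLE,
        revenue_amt   DECIMAL(18,2)
    )
    PARTITIONED BY (event_date DATE)
    CLUSTERED BY (subscriber_id) INTO 64 BUCKETS
    STORED AS ORC;

    -- Trino: federated join across the assumed hive and teradata catalogs.
    SELECT c.segment,
           SUM(u.revenue_amt) AS revenue
    FROM hive.dm.usage_daily u
    JOIN teradata.edw.dim_subscriber c
      ON c.subscriber_id = u.subscriber_id
    WHERE u.event_date = DATE '2025-01-31'
    GROUP BY c.segment;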

Implementation & Optimization

  • Translate design documents (HLD, LLD) and mapping specifications into SQL code
  • Develop ETL/ELT processes to populate datamarts from EDW sources
  • Optimize query performance using indexing (PI, SI, NUSI), statistics, partitioning, and aggregations
  • Conduct unit testing, data validation, and reconciliation between source and target (a partitioning and reconciliation sketch follows this list)
  • Debug and troubleshoot performance issues in datamarts and semantic layers
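
A minimal illustration of these points in Teradata-style SQL: a fact table with an explicit primary index and daily range partitioning, statistics collection, and a source-to-target reconciliation check. All object names are hypothetical:

    -- Teradata: fact table with primary index and daily range partitioning.
    CREATE TABLE dm.fact_sales (
        txn_id         BIGINT,
        subscriber_key INTEGER,
        txn_date       DATE,
        txn_amt        DECIMAL(18,2)
    )
    PRIMARY INDEX (subscriber_key)
    PARTITION BY RANGE_N (txn_date BETWEEN DATE '2024-01-01'
                          AND DATE '2026-12-31' EACH INTERVAL '1' DAY);

    COLLECT STATISTICS COLUMN (txn_date) ON dm.fact_sales;

    -- Reconciliation: compare row counts and amounts for one load date.
    SELECT 'source' AS side, COUNT(*) AS row_cnt, SUM(txn_amt) AS total_amt
    FROM edw.sales_txn
    WHERE txn_date = DATE '2025-01-31'
    UNION ALL
    SELECT 'target', COUNT(*), SUM(txn_amt)
    FROM dm.fact_sales
    WHERE txn_date = DATE '2025-01-31';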

Collaboration & Documentation

  • Work closely with datamart designers, EDW developers, BI teams, and business analysts
  • Implement business requirements and KPI calculations as per specifications
  • Create technical documentation: SQL scripts, deployment guides, data lineage
  • Support UAT activities and assist business users in validating data accuracy
  • Provide production support and resolve data or performance issues

Requirements

Required Skills

SQL & Development (Required - Strong)

  • Teradata (Must Have): Advanced SQL development, stored procedures, performance tuning, utilities (BTEQ, TPT)
  • Strong understanding of Teradata architecture, indexing (PI, SI, NUSI), partitioning, and statistics
  • CDP Hive: HiveQL development, table creation, partitioning, bucketing, optimization in Cloudera environment
  • Trino (PrestoSQL): SQL development using Trino, federated queries, connector configuration
  • Expert-level SQL across multiple platforms for complex queries and transformations
  • Oracle SQL and PL/SQL development experience

Semantic Layer & Tools (Required)

  • TIBCO: Hands-on development experience with TIBCO Data Virtualization for semantic layer implementation
  • Experience creating virtual views, business views, and semantic models in TIBCO
  • Understanding of data virtualization concepts and query federation
  • Knowledge of BI tool integration with semantic layers

Dimensional Modeling Knowledge

  • Strong understanding of star schema and snowflake schema dimensional models
  • Knowledge of fact table design, dimension design, and SCD implementations
  • Ability to translate dimensional models into physical database objects (see the sketch after this list)
  • Understanding of dimensional modeling best practices (Kimball methodology)
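
As an illustration of that translation, a minimal star schema fragment in Teradata-flavored DDL. Names are hypothetical; the surrogate keys follow the Kimball practice of meaningless integers:

    -- One dimension and one fact; further dimensions follow the same pattern.
    CREATE TABLE dim_product (
        product_key  INTEGER NOT NULL,   -- surrogate key
        product_code VARCHAR(20),        -- natural/business key
        product_name VARCHAR(100),
        product_line VARCHAR(50)
    ) PRIMARY INDEX (product_key);

    CREATE TABLE fact_daily_revenue (
        date_key       INTEGER NOT NULL, -- references dim_date
        product_key    INTEGER NOT NULL, -- references dim_product
        subscriber_key INTEGER NOT NULL, -- references dim_subscriber
        revenue_amt    DECIMAL(18,2),
        usage_mb       DECIMAL(18,2)
    ) PRIMARY INDEX (subscriber_key);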

Telecommunications Domain (Required)

  • Understanding of telecom business processes, KPIs, and data flows
  • OSS: Network performance, inventory, fault management metrics
  • BSS: Billing, customer analytics, revenue, churn, product performance
  • Telecom KPIs: ARPU, churn rate, CLTV, network utilization, revenue metrics (a churn-rate calculation is sketched after this list)
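
For example, monthly churn rate is commonly computed as churned subscribers over the active base. A minimal SQL sketch against a hypothetical subscriber_month_status table (one row per subscriber per month):

    -- Churn rate = churned subscribers / total subscribers, per month.
    SELECT month_id,
           CAST(SUM(CASE WHEN status = 'CHURNED' THEN 1 ELSE 0 END) AS FLOAT)
             / NULLIF(COUNT(*), 0) AS churn_rate
    FROM subscriber_month_status
    GROUP BY month_id;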

Professional Skills

  • Full SDLC experience (Agile/Scrum, Waterfall)
  • Strong analytical and debugging skills for performance troubleshooting
  • Good communication skills for technical collaboration
  • Unix/Linux scripting for automation (a plus)
  • Version control: Git, SVN

Preferred Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or related field
  • Experience with data profiling and data quality tools
  • Knowledge of ETL tools (Ab Initio, Informatica)
  • Understanding of data governance and metadata management
  • Experience with BI tools: Tableau, Power BI, Qlik


Key Deliverables

  • Developed and deployed datamarts (star/snowflake schema) on Teradata
  • Semantic layer implementations using TIBCO and Trino with business views and virtual tables
  • Optimized SQL code, stored procedures, and views for datamarts
  • HiveQL scripts and tables in CDP environment
  • Technical documentation: SQL scripts, deployment guides, data lineage
  • Unit-tested code with data validation and reconciliation reports
  • Performance tuning recommendations and optimization implementations
