ETL Engineer

Position Summary:
The ETL Engineer is responsible for designing, developing, and maintaining data pipelines that extract, transform, and load data into analytics platforms, including SQL Server and ClickHouse for real-time and high-performance querying. This role ensures data integrity, scalability, and performance for BI, reporting, and advanced analytics.

Job Responsibilities:
  • ETL Pipeline Development
    • Design and implement ETL workflows using SQL Server Integration Services (SSIS), Azure Data Factory, Databricks, and ClickHouse.
    • Develop transformations for structured and semi-structured data; optimize for speed and reliability.
  • ClickHouse Integration
    • Build and maintain ingestion pipelines for ClickHouse using batch and streaming methods.
    • Optimize ClickHouse schemas, partitions, and materialized views for query performance (see the schema sketch after this list).
  • Data Integration
    • Connect multiple sources (SQL Server, APIs, cloud storage, Kafka/Event Hubs) into centralized data platforms (see the streaming sketch after this list).
  • Performance & Optimization
    • Monitor SSIS ETL jobs and tune SQL Server and ClickHouse queries for low latency and high throughput.
    • Implement compression, sharding, and replication strategies in ClickHouse (see the replication sketch after this list).
  • Data Quality & Governance
    • Apply validation checks, error handling, and lineage tracking.
    • Ensure compliance with security and governance standards.
  • Collaboration
    • Work with data engineers, analysts, and BI teams to deliver reliable datasets.
    • Support real-time analytics and dashboards powered by Power BI and ClickHouse.
  • Documentation
    • Maintain clear documentation for ETL processes, SQL Server and ClickHouse schemas, and operational playbooks.
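
The sketches below illustrate the kind of ClickHouse work this role involves. All table, column, cluster, and broker names are hypothetical placeholders, and the clickhouse-connect Python client is assumed rather than mandated.

Schema sketch: a partitioned MergeTree table plus a materialized view that pre-aggregates daily rollups for fast dashboard queries.

    import clickhouse_connect

    client = clickhouse_connect.get_client(host="localhost", port=8123)

    # Raw events table; monthly partitions support pruning and lifecycle management.
    client.command("""
        CREATE TABLE IF NOT EXISTS events (
            event_time DateTime,
            user_id    UInt64,
            event_type LowCardinality(String),
            value      Float64
        ) ENGINE = MergeTree
        PARTITION BY toYYYYMM(event_time)
        ORDER BY (event_type, event_time)
    """)

    # Materialized view that maintains daily aggregates at insert time, so
    # dashboards read a small summary table instead of scanning raw events.
    client.command("""
        CREATE MATERIALIZED VIEW IF NOT EXISTS events_daily
        ENGINE = SummingMergeTree
        PARTITION BY toYYYYMM(day)
        ORDER BY (event_type, day)
        AS SELECT
            toDate(event_time) AS day,
            event_type,
            count()            AS events,
            sum(value)         AS total_value
        FROM events
        GROUP BY day, event_type
    """)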
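
Streaming sketch: ingestion from Kafka into the events table above via ClickHouse's Kafka table engine. The broker, topic, and consumer-group values are assumptions; Azure Event Hubs exposes a Kafka-compatible endpoint and can be wired the same way.

    import clickhouse_connect

    client = clickhouse_connect.get_client(host="localhost", port=8123)

    # Kafka-engine table: a consumer handle, not storage.
    client.command("""
        CREATE TABLE IF NOT EXISTS events_queue (
            event_time DateTime,
            user_id    UInt64,
            event_type String,
            value      Float64
        ) ENGINE = Kafka
        SETTINGS kafka_broker_list = 'broker:9092',
                 kafka_topic_list  = 'events',
                 kafka_group_name  = 'clickhouse-etl',
                 kafka_format      = 'JSONEachRow'
    """)

    # Materialized view that continuously drains the queue into the
    # MergeTree table created in the schema sketch.
    client.command("""
        CREATE MATERIALIZED VIEW IF NOT EXISTS events_consumer
        TO events AS
        SELECT event_time, user_id, event_type, value
        FROM events_queue
    """)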
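
Replication sketch: per-column compression codecs plus replicated, sharded storage behind a Distributed table. The cluster name, ZooKeeper path macros, and sharding key are deployment-specific assumptions.

    import clickhouse_connect

    client = clickhouse_connect.get_client(host="localhost", port=8123)

    # Shard-local replicated table; Delta and Gorilla codecs suit time-series columns.
    client.command("""
        CREATE TABLE IF NOT EXISTS metrics ON CLUSTER analytics_cluster (
            ts     DateTime CODEC(Delta, ZSTD(3)),
            metric LowCardinality(String),
            value  Float64 CODEC(Gorilla, ZSTD(1))
        ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/{shard}/metrics', '{replica}')
        PARTITION BY toYYYYMM(ts)
        ORDER BY (metric, ts)
    """)

    # Distributed table that fans reads and writes across shards by hash of metric name.
    client.command("""
        CREATE TABLE IF NOT EXISTS metrics_all ON CLUSTER analytics_cluster
        AS metrics
        ENGINE = Distributed(analytics_cluster, currentDatabase(), metrics, cityHash64(metric))
    """)
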
Required Skills:
  • Strong experience with ETL tools (Azure Data Factory, SSIS, Databricks).
  • Proficiency in SQL and relational database concepts.
  • Familiarity with ClickHouse (schema design, ingestion, optimization).
  • Familiarity with cloud platforms (Azure preferred).
  • Programming skills in Python or Scala for data processing.

Preferred Qualifications:
  • Experience with streaming data (Kafka, Azure Event Hubs).
  • Familiarity with big data frameworks (e.g., Apache Spark).
  • Understanding of DevOps practices for data pipelines.
  • Exposure to BI tools (Power BI, Tableau) integrated with ClickHouse.

Education & Experience:
  • Bachelor’s degree in Computer Science, Data Engineering, or related field.
  • 3+ years of experience in ETL development and data integration.

This is an on-site position: in the office Monday through Thursday (9-5), with remote work on Fridays only (no exceptions).
