Qureos

Data Engineer

Job Requirements

Hires in: Not specified
Employment Type: Not specified
Company Location: Not specified
Salary: Not specified

This role is for one of Weekday's clients.

Min Experience: 4 years

Location: Pune, Bangalore

Job Type: Full-time

We are looking for an experienced Data Engineer with strong capabilities across SQL, ETL, Data Warehousing, BI, DevOps, and CI/CD—combined with hands-on experience in core banking systems and regulatory reporting pipelines. The ideal candidate will be highly technical, detail-oriented, and comfortable working with complex financial data environments.

Requirements

Key Responsibilities

1. Data Engineering & ETL Development
  • Design, build, and optimize ETL workflows for extracting data from Core Banking Systems (CBS) to support regulatory, analytics, and reporting needs.
  • Develop ingestion pipelines using Apache NiFi, Airflow, SSIS, Informatica, Python, PySpark, and Databricks (illustrative PySpark and Airflow sketches follow this list).
  • Integrate and manage streaming solutions using Kafka or RabbitMQ.
  • Ensure pipeline reliability, performance, data accuracy, and high availability.

2. Data Warehousing & Data Modelling
  • Build and maintain staging, integration, and semantic layers within enterprise data warehouses.
  • Apply dimensional modelling techniques (star/snowflake schemas) and data vault concepts when needed.
  • Optimize schema structures across SQL and distributed data environments for scalability and speed.

3. SQL & Database Engineering
  • Write advanced SQL queries, stored procedures, and transformations.
  • Work extensively with SQL Server, Oracle, and MySQL.
  • Conduct performance tuning, indexing, and query optimization for large datasets.

4. Big Data & Distributed Processing
  • Develop scalable transformation pipelines using Databricks, PySpark, and distributed processing engines.
  • Handle large-scale data processing across structured and semi-structured datasets.

5. BI, Reporting & Visualization Support
  • Build clean and standardized datasets for regulatory and management reporting.
  • Develop dashboards and visualizations using Power BI.
  • Collaborate with reporting teams to ensure data quality, clarity, and compliance.

6. DevOps & CI/CD Automation
  • Implement CI/CD workflows using Azure DevOps, Git, Jenkins, or similar tools.
  • Manage version control, automated deployments, environment provisioning, and test automation.
  • Ensure secure, scalable, and efficient deployment of data engineering and ETL components.

7. Banking Domain Expertise
  • Collaborate with finance, compliance, risk, and operations teams.
  • Mandatory experience working with core banking systems such as T24, Finacle, or Flexcube.
  • Build and maintain regulatory reporting pipelines (CBUAE, SAMA, QCB, BASEL, IFRS, ALM, Liquidity, Credit Risk, etc.).
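
The sketch below is a minimal, illustrative PySpark example of the kind of CBS extraction and staging transform described above. It is not the client's actual pipeline: the table names, column names, JDBC URL, and credentials are hypothetical placeholders.

  # Illustrative only: placeholder schema, JDBC URL, and credentials.
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("cbs_daily_extract").getOrCreate()

  # Extract: read one day of transactions from a (hypothetical) CBS replica.
  transactions = (
      spark.read.format("jdbc")
      .option("url", "jdbc:oracle:thin:@//cbs-replica:1521/CBSPDB")  # placeholder
      .option("dbtable", "CBS.ACCOUNT_TXN")                          # placeholder
      .option("user", "etl_user")
      .option("password", "***")  # in practice, read from a secret store
      .load()
      .where(F.col("TXN_DATE") == F.current_date())
  )

  # Transform: standardize names and shape a simple fact-style layout.
  fact_txn = (
      transactions
      .withColumn("txn_amount", F.col("TXN_AMT").cast("decimal(18,2)"))
      .withColumnRenamed("ACCT_ID", "account_key")
      .withColumnRenamed("TXN_DATE", "txn_date")
      .withColumnRenamed("TXN_TYPE", "txn_type")
      .select("account_key", "txn_date", "txn_amount", "txn_type")
      .dropDuplicates()
  )

  # Load: append into a warehouse staging table (Delta shown as one option).
  fact_txn.write.format("delta").mode("append").saveAsTable("staging.fact_account_txn")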

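A minimal Airflow sketch (Airflow 2.4+ style) of how such an extraction could be scheduled daily follows; the DAG id, command, and file path are illustrative assumptions, and a Databricks environment might use a Databricks submit-run operator instead of spark-submit.

  # Illustrative scheduling sketch; ids and paths are placeholders.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.bash import BashOperator

  with DAG(
      dag_id="cbs_daily_extract",
      start_date=datetime(2025, 1, 1),
      schedule="@daily",
      catchup=False,
  ) as dag:
      # Run the PySpark extraction job once per day.
      run_extract = BashOperator(
          task_id="run_cbs_extract",
          bash_command="spark-submit /opt/jobs/cbs_daily_extract.py",  # placeholder path
      )
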
Required Skills

  • Databases: SQL Server, Oracle, MySQL
  • ETL Tools: Apache NiFi, Airflow, SSIS, Informatica
  • Big Data: Databricks, PySpark
  • Messaging: Kafka, RabbitMQ
  • Programming: Python (required), Java (preferred)
  • Reporting: Power BI
  • Data Warehousing: Star/Snowflake schema, dimensional modelling
  • DevOps/CI/CD: Git, Jenkins, Azure DevOps
  • Nice to Have: MongoDB, Neo4j (NoSQL/Graph DBs)
  • Domain Expertise: Core banking systems and regulatory reporting

Experience

  • 4-7 years of experience in data engineering, ETL development, SQL, BI, and DevOps.
  • Mandatory: Experience working within banks or financial institutions, including exposure to core banking systems and regulatory frameworks.
