About the Role
We are looking for a motivated Data Engineer to join our Data Management and AI team and contribute to building scalable data platforms that power analytics and AI initiatives across our digital banking ecosystem. In this role, you will design and develop data pipelines, streaming infrastructure, and data warehouse solutions that enable reliable, high-performance data processing. This is an excellent opportunity for a junior data engineer looking to grow in a fast-paced fintech environment while working with modern data technologies.
Team Overview
The Data Management and AI team works across the entire technology stack within QNBeyond Plus. Our team collaborates with engineering, analytics, and product teams to manage, process, and analyze the data that enables innovative digital banking experiences for our customers.
Key Responsibilities
Data Pipeline Development
- Design and build scalable data pipelines for batch and real-time processing.
- Implement ETL/ELT workflows that integrate data from APIs, databases, and event streams.
Streaming & Event Processing
- Develop and maintain streaming data pipelines using tools such as Kafka or similar platforms.
- Ensure low-latency, reliable, and fault-tolerant data delivery.
Data Transformation
- Develop data enrichment and transformation workflows that convert raw data into business-ready datasets.
- Implement SQL transformations across layered architectures (raw → bronze → silver → gold).
Data Warehouse Development
- Design and maintain data warehouse models and schemas.
- Optimize query performance and storage design for analytical workloads.
Data Quality & Reliability
- Implement data validation and quality checks.
- Monitor data pipeline performance and ensure reliability.
Collaboration
- Work closely with engineering, analytics, and product teams to deliver production-ready data solutions.
What We Are Looking For
You are:
- Self-motivated, proactive, and results-driven
- A team player who enjoys collaboration and learning
- A problem solver who approaches challenges analytically
- Customer-oriented and committed to delivering high-quality data solutions
- Detail-oriented, especially regarding data quality and compliance
- Adaptable and eager to grow in a fast-paced fintech environment
Required Qualifications
Education
- Bachelor’s degree in Computer Science, Computer Engineering, or a related field
Technical Knowledge
- Understanding of computer systems and cloud environments
- Understanding of network protocols and system connectivity
Systems Skills
- Experience using Linux command-line tools
- Familiarity with SSH, SFTP, Bash, IP addressing, ports, and networking concepts
Data Engineering Skills
- Experience developing and optimizing ETL/ELT pipelines
- Experience with tools such as Informatica PowerCenter, Kafka, NiFi, or similar
Data & Analytics
- Strong SQL skills
- Understanding of data warehousing concepts and schema design
- Experience with data modeling and data lifecycle management
Programming
- Programming experience in Python
- Scripting experience with Bash or similar tools
Experience
- 2+ years of experience with ETL or data engineering workflows
- Experience with Informatica PowerCenter and MDM
- Experience developing and scaling data models within modern data warehouse architectures
- Experience implementing data enrichment and transformation logic
- Experience optimizing queries and storage design for analytics workloads
- At least 1 year of experience in fintech, regulated environments, or PCI DSS compliance (preferred)
- Familiarity with performance monitoring tools