Qureos


# Senior Data Engineer

## About the Role

We are seeking an experienced **Senior Data Engineer** with expertise in cloud data engineering to join our team. This is an exceptional opportunity for a seasoned professional to work on complex data solutions, leveraging Azure cloud services and the Snowflake data platform in a fully remote environment.

---

## Position Details

- **Experience Required:** 8+ years

- **Location:** Offshore/Remote (Work from anywhere)

- **Employment Type:** Full-time

---

## Mandatory Skills & Key Technologies

### Core Technical Expertise

- **Python** - Advanced proficiency in Python for data engineering and ETL development

- **ETL Engineering** - Strong hands-on experience in designing and implementing ETL pipelines

- **Microsoft Azure** - Solid experience with Azure cloud services and data platform

- **Snowflake** - Proven expertise in Snowflake data warehouse implementation and optimization

- **Data Modeling** - Strong understanding of data modeling concepts, dimensional modeling, and data warehousing principles

- **Solution Architecture** - Experience in designing end-to-end data solutions and architectural frameworks

---

## Required Technical Skills

### Cloud & Data Platform

- **Azure Services:** Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure SQL Database, Azure Blob Storage

- **Snowflake:** Data warehousing, SnowSQL, stored procedures, data sharing, performance tuning

- Expertise in cloud-native data architecture and best practices

### ETL & Data Engineering

- Design and development of complex ETL/ELT workflows

- Data pipeline orchestration and automation

- Real-time and batch data processing

- Data quality and validation frameworks

- Performance optimization and troubleshooting
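
To give a flavor of the data-quality and validation work listed above, here is a minimal sketch in Python. The rule names and columns (`order_id`, `amount`) are invented for illustration, not taken from any specific framework:

```python
# Minimal data-quality validation sketch: each rule is a (name, predicate)
# pair applied to every record; failing pairs are collected for reporting.
# Column names ("order_id", "amount") are hypothetical.

def validate(records, rules):
    """Return a list of (rule_name, record) pairs that failed validation."""
    failures = []
    for record in records:
        for name, predicate in rules:
            if not predicate(record):
                failures.append((name, record))
    return failures

rules = [
    ("order_id present", lambda r: r.get("order_id") is not None),
    ("amount non-negative", lambda r: r.get("amount", 0) >= 0),
]

records = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": None, "amount": 10.0},
    {"order_id": 3, "amount": -5.0},
]

failures = validate(records, rules)  # two records fail, one rule each
```

In a production pipeline this pattern is usually backed by a dedicated framework rather than hand-rolled lambdas, but the shape — declarative rules applied uniformly to incoming records, with failures routed to a quarantine or report — is the same.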

### Python & Programming

- Advanced Python programming for data engineering

- Experience with Python libraries: Pandas, NumPy, PySpark, SQLAlchemy

- Object-Oriented Programming (OOP) principles

- RESTful API development and integration
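
As an illustration of the Pandas-based transformation work implied above, a minimal extract-transform sketch (the `region`/`units`/`unit_price` schema is invented for illustration):

```python
import pandas as pd

# Minimal ETL transform sketch: drop rows missing a key column,
# derive a measure, and aggregate. The schema is hypothetical.
raw = pd.DataFrame({
    "region": ["EU", "EU", "US", None],
    "units": [3, 5, 2, 4],
    "unit_price": [10.0, 10.0, 20.0, 15.0],
})

clean = raw.dropna(subset=["region"]).copy()             # reject incomplete rows
clean["revenue"] = clean["units"] * clean["unit_price"]  # derived measure
summary = clean.groupby("region", as_index=False)["revenue"].sum()
```

The same clean-derive-aggregate shape scales to PySpark for larger volumes, with `DataFrame` operations swapped for their Spark equivalents.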

### Database & Data Modeling

- **SQL Databases:** PostgreSQL, MySQL, SQL Server

- **NoSQL Databases:** MongoDB, Cosmos DB

- Data modeling techniques (Star schema, Snowflake schema, Data Vault)

- Database design, normalization, and optimization

- Writing complex SQL queries and stored procedures
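
As a toy illustration of the star-schema modeling called for above, a one-dimension, one-fact example using Python's built-in SQLite (all table and column names are invented):

```python
import sqlite3

# Toy star schema: a dimension (dim_product) and a fact (fact_sales)
# linked by a surrogate key. Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (product_key INTEGER REFERENCES dim_product, qty INTEGER);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales  VALUES (1, 3), (1, 2), (2, 7);
""")

# The classic star-schema query: aggregate the fact table,
# sliced by an attribute from the dimension.
rows = conn.execute("""
SELECT d.name, SUM(f.qty)
FROM fact_sales f JOIN dim_product d USING (product_key)
GROUP BY d.name
ORDER BY d.name
""").fetchall()
```

A real warehouse would have many dimensions around each fact table (hence the "star"), but every report query follows this join-and-aggregate pattern.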

### DevOps & Tools

- **Version Control:** Git, Azure DevOps, GitHub

- **CI/CD:** Experience with automated deployment pipelines

- **Containerization:** Docker, Kubernetes (preferred)

- **Orchestration Tools:** Apache Airflow, Azure Data Factory

- Monitoring and logging tools

---

## Key Responsibilities

### Solution Design & Architecture

- Design scalable and efficient data solutions on Azure and Snowflake

- Create solution architecture documents and technical specifications

- Define data integration strategies and best practices

- Evaluate and recommend technologies for data engineering needs

### ETL Development & Implementation

- Develop and maintain robust ETL/ELT pipelines using Python and Azure services

- Implement data ingestion from multiple sources (APIs, databases, files, streaming)

- Build data transformation logic following business requirements

- Ensure data quality, consistency, and reliability across pipelines

### Data Platform Management

- Design and implement Snowflake data warehouse solutions

- Optimize Snowflake performance (clustering, partitioning, query optimization)

- Manage Azure data services and infrastructure

- Implement data security and governance policies

### Collaboration & Leadership

- Collaborate with data analysts, data scientists, and business stakeholders

- Provide technical guidance and mentorship to junior team members

- Participate in code reviews and ensure coding standards

- Document technical processes and create knowledge base articles

---

## Required Qualifications

- **8+ years** of experience in data engineering, ETL development, or related roles

- Bachelor's or Master's degree in Computer Science, Engineering, or related field

- Strong understanding of data warehousing concepts and methodologies

- Proven track record of delivering complex data solutions

- Excellent problem-solving and analytical skills

- Strong communication skills for remote collaboration

---

## Preferred Qualifications

- Azure certifications (Azure Data Engineer Associate, Azure Solutions Architect)

- Snowflake certifications (SnowPro Core, Advanced Architect)

- Experience with data governance tools and frameworks

- Knowledge of machine learning pipelines and MLOps

- Experience with Agile/Scrum methodologies

- Exposure to real-time streaming technologies (Kafka, Event Hubs)

---

## Ideal Candidate Profile

We're looking for someone who:

- Has deep expertise in Python-based ETL development

- Demonstrates strong Azure and Snowflake knowledge

- Can architect scalable and efficient data solutions

- Possesses excellent data modeling and design skills

- Is self-motivated and thrives in a remote work environment

- Has strong analytical and problem-solving abilities

- Stays current with data engineering trends and technologies

- Communicates effectively across distributed teams

- Takes ownership and delivers high-quality solutions

---

## What We Offer

- **Fully remote/offshore** work model - Work from anywhere

- Opportunity to work with cutting-edge cloud and data technologies

- Challenging and impactful projects

- Professional growth and continuous learning opportunities

- Collaborative and inclusive work culture

- Flexible working hours

- Performance-based incentives

**Job Types:** Full-time, Permanent

**Pay:** ₹1,500,000.00 - ₹2,100,000.00 per year

**Benefits:**

- Health insurance

- Life insurance

- Provident Fund

**Work Location:** Remote

© 2025 Qureos. All rights reserved.