Qureos

Senior Data Engineer - AWS

Valleysoft | Center of Excellence is a regional IT services provider based in Egypt, serving clients globally since 2006. The company collaborates with global partners like Oracle to address diverse business and technical challenges, from enterprise application development to process management. Valleysoft's vendor-neutral and process-oriented approach, coupled with operational maturity, ensures high-quality and cost-effective services for clients.

Overview

We are looking for a skilled Senior Data Engineer with strong AWS expertise to design, develop, and optimize cloud-based data pipelines and solutions. The role involves hands-on development, collaboration with cross-functional teams, and contributing to scalable, high-performance data platforms. The ideal candidate has solid AWS experience, strong programming skills, and an understanding of data engineering best practices.

Key Responsibilities
  • Develop and maintain batch and real-time data pipelines on AWS
  • Implement ETL/ELT solutions using AWS Glue, Spark on EMR, and other AWS services
  • Assist in workflow orchestration using Apache Airflow (MWAA)
  • Support serverless data processing using AWS Lambda
  • Participate in designing data warehousing solutions using Amazon Redshift
  • Design and manage data storage strategies with Amazon S3 and DynamoDB
  • Apply data governance and access controls using AWS DataZone
  • Monitor and troubleshoot data platform issues using Amazon CloudWatch
  • Collaborate with Analytics, BI, Data Science, and Business teams
  • Write clean, efficient code and conduct peer code reviews
  • Follow security, compliance, and data governance standards

Requirements

Required Skills & Experience
  • 5–7 years of experience in Data Engineering or related roles
  • 3+ years of hands-on experience designing data solutions on AWS
  • Proficiency with: AWS Glue, Amazon EMR (Spark), AWS Lambda, Apache Airflow (MWAA), Amazon EC2, Amazon CloudWatch, Amazon Redshift, Amazon DynamoDB, AWS DataZone, Amazon S3
  • Strong programming skills in Python
  • Advanced SQL skills with performance tuning expertise
  • Understanding of data warehousing, data modeling (Star/Snowflake), ETL/ELT best practices
  • Familiarity with Agile/Scrum delivery environments
  • Problem-solving and analytical skills

Nice to Have
  • AWS Professional Certification (Solutions Architect / Data Analytics Specialty)
  • Experience with real-time streaming (Kinesis, Kafka)
  • Exposure to data governance and metadata management tools
  • Knowledge of cost optimization in AWS

© 2026 Qureos. All rights reserved.