Position Responsibilities
Job Summary
As a Senior Data Engineer, you will design and implement robust data architectures for enterprise platforms, including data lakes, warehouses, and lakehouse solutions. You will build scalable data pipelines, ensure data quality and reliability, and drive modernization initiatives. Collaborating closely with data science and analytics teams, you will deliver advanced data solutions while supporting compliance and governance requirements. This role offers opportunities to mentor junior engineers, evaluate emerging technologies, and contribute to technical excellence across the organization.
Key Responsibilities
Design and implement data architecture for enterprise data platforms (data lakes, warehouses, lakehouse architectures).
Build scalable data pipelines using modern ETL/ELT frameworks (e.g., Airflow, dbt); ensure data quality, reliability, and performance.
Implement Change Data Capture (CDC) and real-time streaming solutions using technologies such as Kafka and Spark Streaming.
Optimize data infrastructure for performance and cost, including partitioning, compression, and caching strategies.
Establish data governance practices: data lineage, quality monitoring, schema management, and access controls.
Lead technical initiatives for data platform modernization; evaluate and adopt new technologies and tools.
Mentor junior data engineers, conduct code reviews, and establish best practices.
Collaborate with data science and analytics teams to deliver data solutions for advanced analytics.
Support compliance requirements for data security, privacy, and regulatory needs (e.g., GDPR, SOC2).
Qualifications
Required Qualifications
5+ years of data engineering experience building large-scale data platforms on cloud infrastructure.
Extensive experience with cloud data platforms: AWS (Redshift, Glue, EMR, S3), Azure (Synapse, Data Factory, ADLS), or GCP (BigQuery, Dataflow, GCS).
Expertise in SQL; proficiency in Python or Scala; experience with Spark for distributed data processing.
Hands-on experience with data pipeline orchestration and transformation tools (Airflow, dbt, Luigi, or similar).
Experience with streaming platforms (Kafka, Kinesis, Pub/Sub) and event-driven architectures.
Deep knowledge of relational databases (PostgreSQL, Oracle, SQL Server) and NoSQL databases (MongoDB, Cassandra, DynamoDB).
Strong skills in dimensional modeling, data normalization, and schema design.
Experience with Infrastructure as Code tools (Terraform, CloudFormation, or similar).
Bachelor’s degree in Computer Science, Engineering, or related field.
This position requires U.S. citizenship.
Preferred Qualifications
Master’s degree in Computer Science, Engineering, or related field.
Experience with data compliance requirements (FedRAMP, ITAR, HIPAA).
Experience with lakehouse platforms (Databricks, Delta Lake).
Experience with CDC tools (GoldenGate, Debezium).
Oracle Cloud experience.