
Databricks Certified Data Engineer / AWS Solutions Architect

Job Overview
Devout, Inc. is seeking a highly skilled and motivated Databricks Certified Data Engineer / AWS Solutions Architect to join our data team. In this role you will design, develop, and optimize scalable data solutions on cloud platforms and big data technologies: architecting robust data pipelines, implementing analytics solutions, and ensuring seamless integration across diverse data sources. Your expertise will help the organization turn data into actionable insights that drive strategic decision-making and operational excellence.

Duties

  • Design, build, and maintain scalable data pipelines using Databricks, Spark, and Hadoop ecosystems to support complex analytics and machine learning models.
  • Develop and optimize cloud-based data architectures on AWS, including S3, Redshift, Lambda, and EC2, ensuring high availability and security.
  • Collaborate with cross-functional teams to gather requirements and translate them into efficient ETL (Extract, Transform, Load) processes using tools such as Talend or Informatica, or custom scripting in Python or Bash.
  • Implement database solutions with Microsoft SQL Server, Oracle, and Azure Data Lake for structured and unstructured data storage.
  • Design and develop RESTful APIs for seamless data integration across platforms and applications.
  • Conduct analysis of large datasets using SQL, Python, Spark, and other analytical tools to derive actionable insights.
  • Support model training activities by preparing datasets and ensuring data quality for machine learning initiatives.
  • Maintain comprehensive documentation of architecture designs, workflows, and best practices aligned with Agile development methodologies.

Requirements

  • Proven certification as a Databricks Certified Data Engineer and AWS Solutions Architect demonstrating expertise in cloud architecture and big data processing.
  • Extensive experience working with AWS cloud services, including S3, EC2, Lambda, Redshift, and related tools.
  • Strong programming skills in Java, Python, or Bash/shell scripting for automation and pipeline development.
  • Deep understanding of big data frameworks, including Hadoop ecosystem components (HDFS, Apache Hive), Spark (PySpark), and related technologies.
  • Hands-on experience with ETL tools like Talend or Informatica for efficient data integration workflows.
  • Proficiency in data warehouse design principles using SQL Server and Oracle databases; familiarity with Looker or similar BI tools for reporting.
  • Knowledge of Linked Data concepts for semantic web applications is a plus.
  • Ability to analyze complex datasets with strong attention to detail; excellent problem-solving skills are essential.
  • Experience working within Agile teams to deliver iterative solutions in fast-paced environments.
  • Familiarity with analysis techniques involving model training and predictive analytics is advantageous.
  • U.S. citizenship required; position requires a Public Trust clearance.
  • Hybrid schedule: 2-3 days per week on-site.

Join us to leverage your expertise in cloud architecture and big data engineering! Be part of a forward-thinking organization committed to innovation through cutting-edge technology solutions that transform how we analyze and utilize data across industries.

Pay: $70.00 - $85.00 per hour

Work Location: Hybrid remote in Washington, DC 20006

© 2026 Qureos. All rights reserved.