
Data Architect

Job Requirements

Hires in: Not specified
Employment Type: Not specified
Company Location: Not specified
Salary: Not specified


Responsibilities:
Data Pipeline Development: Design, build, and maintain robust, scalable data pipelines using Databricks and Apache Spark, primarily in Scala, to ingest, process, and transform large datasets from various sources (see the sketch after this list).
ETL/ELT Implementation: Develop and optimize extract, transform, load (ETL) and extract, load, transform (ELT) processes within the Databricks environment to ensure data quality, consistency, and availability for analytical purposes.
Experience with Snowflake.
API Development: Design, develop, and deploy secure and efficient RESTful APIs using AWS services like Lambda and API Gateway to expose data or trigger data processing workflows.
Cloud Infrastructure Management: Utilize AWS services for data storage, compute, and other infrastructure components related to data engineering solutions.
Performance Optimization: Optimize Spark jobs and Databricks clusters for performance and cost efficiency.
Data Governance and Security: Implement data governance best practices, ensuring data security and compliance within the Databricks and AWS environments, including securing API endpoints.
Collaboration: Work closely with data scientists, analysts, and other engineering teams to understand data requirements and deliver solutions.
Monitoring and Troubleshooting: Monitor data pipelines, identify and resolve issues, and ensure the reliability and availability of data systems.
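
To illustrate the pipeline and ETL/ELT responsibilities above, here is a minimal Scala sketch of a batch ingest-transform-load job on Databricks with Delta Lake. The S3 path, column names, and target table name are hypothetical placeholders, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal batch pipeline sketch: read raw JSON from object storage,
// apply light cleansing, and append to a Delta table.
object OrdersIngestJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("orders-ingest").getOrCreate()

    // Ingest: raw files landed in S3 (placeholder path).
    val raw = spark.read.json("s3://example-bucket/raw/orders/")

    // Transform: drop incomplete records, fix types, stamp the load date.
    val cleaned = raw
      .filter(col("order_id").isNotNull)
      .withColumn("order_ts", to_timestamp(col("order_ts")))
      .withColumn("ingest_date", current_date())

    // Load: append to a Delta table partitioned by load date (placeholder name).
    cleaned.write
      .format("delta")
      .mode("append")
      .partitionBy("ingest_date")
      .saveAsTable("analytics.orders_bronze")
  }
}
```
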
Qualifications:
Proficiency in Scala: Strong programming skills in Scala, specifically in the context of Apache Spark.
Databricks Expertise: In-depth experience with Databricks platform, including Databricks notebooks, Delta Lake, and cluster management.
Apache Spark Mastery: Extensive experience with Apache Spark for big data processing, including Spark SQL, Spark Streaming, and Spark MLlib.
AWS Cloud Experience: Hands-on experience with AWS services, particularly Lambda, API Gateway, S3, EC2, and potentially other relevant services like Glue or Redshift.
API Development: Proven ability to design and implement secure RESTful APIs, including authentication and authorization mechanisms (see the sketch after this list).
Data Warehousing Concepts: Understanding of data warehousing principles, data modeling, and schema design.
SQL Proficiency: Strong SQL skills for data manipulation and analysis.
Problem-Solving: Excellent analytical and problem-solving skills with a focus on delivering high-quality data solutions.
Communication: Effective communication skills to collaborate with technical and non-technical stakeholders.
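
As a rough illustration of the API development qualification above, the sketch below shows a Scala handler for an AWS Lambda function fronted by API Gateway, using the aws-lambda-java-core and aws-lambda-java-events libraries. The class name, query parameter, and response payload are assumptions for illustration, not requirements from this posting.

```scala
import com.amazonaws.services.lambda.runtime.{Context, RequestHandler}
import com.amazonaws.services.lambda.runtime.events.{APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent}

import scala.jdk.CollectionConverters._

// Lambda handler behind an API Gateway proxy integration.
class DatasetStatusHandler extends RequestHandler[APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent] {
  override def handleRequest(request: APIGatewayProxyRequestEvent, context: Context): APIGatewayProxyResponseEvent = {
    // Read an optional "dataset" query-string parameter (hypothetical key).
    val dataset = Option(request.getQueryStringParameters)
      .flatMap(params => Option(params.get("dataset")))
      .getOrElse("unknown")

    // In a real deployment this would query the pipeline's metadata store;
    // authentication and authorization would be enforced via an API Gateway authorizer or IAM.
    new APIGatewayProxyResponseEvent()
      .withStatusCode(200)
      .withHeaders(Map("Content-Type" -> "application/json").asJava)
      .withBody(s"""{"dataset":"$dataset","status":"ok"}""")
  }
}
```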

About Virtusa

Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a global team of 27,000 people who care about your growth, and who seek to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us.

Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
