Qureos


Data Architect - Databricks


Position: Data Architect (Databricks, Snowflake)


Location: Nagpur, Pune, Chennai, Bangalore


Type of Employment: Full-time


Purpose of the Position:

Design and optimize cloud-native data architectures on platforms like Databricks and Snowflake, enabling scalable data engineering, advanced analytics, and AI/ML solutions aligned with business needs.


Key Result Areas and Activities:

  • Design and implement Lakehouse architectures using Databricks, Delta Lake, and Apache Spark.
  • Lead the development of data pipelines, ETL/ELT processes, and data integration strategies.
  • Collaborate with business and technical teams to define data architecture standards, governance, and security models.
  • Optimize performance and cost-efficiency of Databricks clusters and jobs.
  • Provide technical leadership and mentorship to data engineers and developers.
  • Integrate Databricks with cloud platforms (Azure, AWS, or GCP) and enterprise systems.
  • Evaluate and recommend tools and technologies to enhance the data ecosystem.
  • Ensure compliance with data privacy and regulatory requirements.
  • Contribute to proposal and pre-sales activities.

Essential Skills:

  • Expertise in data engineering, data architecture, or analytics.
  • Hands-on experience with Databricks and Apache Spark.
  • Hands-on experience with Snowflake.
  • Strong proficiency in Python, SQL, and PySpark.
  • Deep understanding of Delta Lake, Lakehouse architecture, and data mesh principles.
  • Deep understanding of Data Governance and Unity Catalog.
  • Experience with cloud platforms (Azure preferred, AWS or GCP acceptable).

Desirable Skills:

  • Good understanding of CI/CD pipelines.
  • Working experience with GitHub.
  • Experience delivering data engineering solutions in other tools while balancing architecture requirements, required effort, and customer-specific needs.

Qualification:

  • Bachelor’s degree in computer science, engineering, or a related field
  • Demonstrated continued learning through one or more technical certifications or related methods
  • 10+ years of relevant experience with ETL tools

Qualities:

  • Proven problem-solving and troubleshooting abilities, with a high degree of adaptability; well-versed in the latest trends in the data engineering field
  • Ability to handle multiple tasks effectively, maintain a professional attitude, and work well in a team
  • Excellent interpersonal and communication skills, with a customer-focused approach and keen attention to detail

Years of Experience

10 to 14 years

Location

Pune, Maharashtra, India


© 2026 Qureos. All rights reserved.