Qureos


Architect - Azure

India

While technology is the heart of our business, a global and diverse culture is the heart of our success. We love our people and take pride in fostering a culture built on transparency, diversity, integrity, learning and growth.

If working in an environment that encourages you to innovate and excel, in both your professional and personal life, interests you, you will enjoy your career with Quantiphi!
Role: Architect - Azure
Experience Level: 10 Years and above
Work location: Mumbai, Bangalore & Trivandrum
Role & Responsibilities:
  • Work with a team of architects, engineers, project managers and customer stakeholders to solve big data problems by:
  • Creating a migration strategy for existing data platforms, from on-premise to the cloud or from one cloud to another. The plan could include providing a point of view on the choice of cloud and/or cloud services, and re-architecting the platform for the chosen cloud with the latest trends and technologies in mind.
  • Architecting and implementing a completely new Data platform aligning with the data strategy and business objectives of the client.
  • Designing and building a cloud migration strategy for cloud and on-premise data architectures, applications, pipelines, etc.
  • Diagnosing and troubleshooting complex distributed data systems problems and developing solutions with a significant impact at a massive scale.
  • Providing technical leadership in building applications using native Azure services that are customer-facing, scale to user requirements, and follow best practices in cybersecurity and data protection.
  • Building ingestion tools and processing jobs that handle several terabytes to petabytes of data per day.
  • Applying strong hands-on experience with Azure Synapse Analytics, Azure Data Factory and the Microsoft Fabric platform.
  • Applying a good understanding of data solution paradigms such as data lake, data mesh, data fabric and data lakehouse.
  • Designing and developing next-gen storage and compute solutions for several large customers in the healthcare & life sciences space.
  • Contributing to proposals and RFPs by providing effort estimates, solution designs, etc.
  • Communicating with a wide set of teams, including Infrastructure, Network, Engineering, DevOps, SiteOps teams, Security and cloud customers.
  • Building advanced tooling for automation, testing, monitoring, administration, and data operations across multiple cloud clusters.
  • Developing a deeper understanding of data modeling and governance.
Must Have Skills:
  • 12+ years of hands-on experience in data architectures, cloud, data structures, distributed systems, Hadoop and Spark, SQL & NoSQL databases
  • Strong project experience in setting up infrastructure for hosting applications on Azure using tools and technologies like Docker, Kubernetes, Logic Apps, Azure Virtual Machines, Azure Functions, Azure Event Grid, Azure Container Instances, etc.
  • Strong software development skills in at least one of: Python, PySpark, Java, Scala
  • Strong experience working with SQL and/or PL/SQL on any RDBMS
  • Experience in building and deploying Azure-based data solutions at scale using Azure as well as other third-party products like Databricks, Snowflake, etc.
  • Experience in developing large-scale enterprise big data solutions (migration, storage, processing) to accommodate structured, semi-structured, and unstructured data.
  • Experience with medical imaging formats such as DICOM and WSI, and health data formats such as HL7 and FHIR
  • Experience in building and supporting large-scale systems in a production environment
  • Design and development of ETL/ELT pipelines
  • Requirement gathering and understanding of the problem statement
  • End-to-end ownership of project delivery
  • Design and documentation of the solution
  • Knowledge of RDBMS & NoSQL databases
  • Experience with any of Kafka, Kinesis or Cloud Pub/Sub
  • Experience with the big data offerings of the major cloud platforms: Azure, plus AWS or GCP
  • Experience with Spark on any processing framework or distribution, such as CDH, HDP, EMR, Google Dataproc or Databricks
  • Experience with any of Airflow, Oozie, Apache NiFi or Google Dataflow for workflow and message/event solutions
  • Exposure to reporting tools (at least one of Power BI, Tableau, Looker)
  • Experience in code versioning using Git and DevOps for CI/CD
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

© 2025 Qureos. All rights reserved.