Mandatory Skills
Data Warehousing, Azure Data Engineer, Azure Data Lake, Databricks, PySpark, Python
Skills to Evaluate
Data Warehousing, Azure Data Engineer, Azure Data Lake, Databricks, PySpark, Python
Experience
4 to 6 Years
Roles & Responsibilities
- Develop and implement scalable data processing solutions using Azure Databricks, including ETL processes that efficiently handle large data volumes.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs and objectives.
- Optimize data storage and retrieval processes to enhance performance and reduce latency.
- Ensure data quality and integrity by implementing best practices for data governance and validation.
- Stay updated with the latest trends and technologies in data engineering, and evaluate their applicability to the organization's needs.
Required Skills
- Proficiency in Databricks and experience with Spark for big data processing.
- Solid programming skills in Python or Scala for developing data pipelines (see the sketch after this list).
- Knowledge of SQL and experience with relational databases and data warehousing concepts.
- Familiarity with cloud platforms, particularly Azure, as they relate to data storage and processing.
- Understanding of data modeling, ETL processes, and data integration methodologies.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities to work effectively in a team-oriented environment.
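To illustrate the pipeline skills listed above, the sketch below shows a minimal PySpark ETL job: read raw files from Azure Data Lake, apply basic cleansing, and write a curated Delta table. The storage paths, column names, and table layout are hypothetical placeholders, and the Delta write assumes a Databricks runtime (or another environment with Delta Lake available).

```python
# Minimal sketch of a PySpark ETL job on Databricks.
# Paths, columns, and the "orders" dataset are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw CSV files landed in the data lake (hypothetical path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")
)

# Transform: deduplicate, filter out invalid rows, derive a date column.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_amount") > 0)
       .withColumn("order_date", F.to_date("order_timestamp"))
)

# Load: write the curated data as a Delta table, partitioned by date.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("abfss://curated@examplelake.dfs.core.windows.net/orders/")
)
```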
| Skill Area | Tools & Technologies |
| --- | --- |
| Data Engineering | Databricks, Apache Spark, Delta Lake |
| Programming | Python, SQL, PySpark |
| Cloud Platforms | Azure |
Education Qualification
B.Tech