We have an urgent requirement for a Spark Scala Developer for our client based in Dubai.
Tech Stack (must-have): Scala, Apache Spark (Structured Streaming), Kafka, MQTT, MongoDB, Redis, PostgreSQL, Docker/Kubernetes
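For candidates who want a concrete picture of the stack, below is a minimal sketch of a Spark Structured Streaming job that consumes from Kafka, the heart of this role's pipelines. It assumes a local broker at localhost:9092, a topic named events, and the spark-sql-kafka-0-10 connector on the classpath; the broker address, topic, and app names are illustrative placeholders, not details of the client's systems.

import org.apache.spark.sql.SparkSession

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    // Local mode is for illustration only; a real deployment runs on a cluster.
    val spark = SparkSession.builder()
      .appName("kafka-stream-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Read a stream of raw records from Kafka (placeholder broker and topic).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()

    // Kafka keys and values arrive as binary; cast to strings and count per key.
    val counts = raw
      .select($"key".cast("string"), $"value".cast("string"))
      .groupBy($"key")
      .count()

    // Print running counts to the console; a production job would instead write
    // to MongoDB, Redis, or PostgreSQL through the appropriate sink or connector.
    val query = counts.writeStream
      .outputMode("complete")
      .format("console")
      .start()

    query.awaitTermination()
  }
}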
Key Responsibilities
- Design, develop, and maintain scalable data processing applications using Apache Spark and Scala
- Work with large datasets to extract insights and build data pipelines
- Collaborate with data scientists and analysts to understand requirements and deliver data-driven solutions
- Optimize Spark jobs for performance, scalability, and reliability
- Troubleshoot and resolve issues with Spark applications
- Develop and maintain data processing workflows using Spark, Scala, and other relevant technologies
Requirements
- Strong experience with Apache Spark and Scala programming
- Proficiency with Spark Core, Spark SQL, and Spark Streaming (see the sketch after this list)
- Experience with data processing, data warehousing, and data analytics
- Strong understanding of data structures, algorithms, and software design patterns
- Excellent problem-solving skills and attention to detail
- Ability to work with large datasets and distributed systems
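As a concrete illustration of the Spark Core and Spark SQL proficiency asked for above, this short sketch touches all three core abstractions (RDD, DataFrame, Dataset) on the same data. The Event case class and the sample rows are made-up placeholders.

import org.apache.spark.sql.SparkSession

object AbstractionsSketch {
  // Placeholder record type for illustration.
  case class Event(id: Long, category: String, amount: Double)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("abstractions-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val events = Seq(Event(1L, "a", 10.0), Event(2L, "b", 5.0), Event(3L, "a", 7.5))

    // RDD: low-level distributed collection, no query optimizer.
    val rdd = spark.sparkContext.parallelize(events)
    val rddTotal = rdd.map(_.amount).reduce(_ + _)

    // DataFrame: schema-aware rows, planned by the Catalyst optimizer.
    events.toDF().groupBy("category").sum("amount").show()

    // Dataset: compile-time typed API with the same optimizer benefits.
    val dsTotal = events.toDS().map(_.amount).reduce(_ + _)

    println(s"RDD total = $rddTotal, Dataset total = $dsTotal")
    spark.stop()
  }
}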
Nice To Have
- Experience with Hadoop, Hive, and other big data technologies
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP)
- Knowledge of data visualization tools (e.g., Tableau, Power BI)
- Experience with Agile development methodologies and version control systems (e.g., Git)
Skills
- Programming Languages: Scala, Java, Python
- Big Data Technologies: Apache Spark, Hadoop, Hive
- Data Processing: Data pipelines, data warehousing, data analytics
- Data Structures: RDDs, DataFrames, Datasets
- Performance Optimization: Spark job optimization, performance tuning (see the sketch after this list)
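As a small illustration of the performance-tuning skills above, the sketch below shows two common levers: broadcasting the small side of a join to avoid a shuffle, and setting the shuffle partition count. The stand-in tables and the figure of 200 partitions are illustrative assumptions, not tuned values for any particular cluster.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

object TuningSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("tuning-sketch")
      .master("local[*]")
      .getOrCreate()

    // The shuffle partition count is often the first knob to adjust (assumed value).
    spark.conf.set("spark.sql.shuffle.partitions", "200")

    val facts = spark.range(0, 1000000).toDF("id") // stand-in fact table
    val dims = spark.range(0, 100).toDF("id")      // stand-in dimension table

    // Broadcasting the small dimension table turns a shuffle join into a map-side join.
    val joined = facts.join(broadcast(dims), "id")

    // Cache results that several downstream actions reuse.
    joined.cache()
    println(joined.count())

    spark.stop()
  }
}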