1. Good understanding of Hadoop concepts, including HDFS and MapReduce.
2. Hands-on experience with the Spark framework, Unix scripting, Hive queries, and writing UDFs in Hive. Theoretical knowledge and POCs alone will not suffice.
3. Good knowledge of the Software Development Life Cycle and the project development lifecycle.
4. Associates should be able to work independently and have strong debugging skills in both Hive and Spark. They should have experience developing large-scale systems, experience with debugging and performance tuning, excellent software design and communication skills, and the ability to work with client partners. Hands-on experience in Apache Spark with Python or another language is required (Python/PySpark preferred).

Desired Candidate Profile
1. Experience in the Banking and Finance domain.
2. Experience with Agile methodology.
3. Knowledge of job-scheduling tools such as Autosys.
4. Knowledge of Kafka.
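To illustrate the "writing UDFs in Hive" requirement above: custom Hive UDFs are typically written in Java, but Hive also supports streaming scripts through the TRANSFORM clause, which lets a Python script process tab-separated rows from stdin. The sketch below is illustrative only; the column names and the high-value threshold are hypothetical, not part of the job posting.

```python
import sys


def tag_transaction(line):
    # Hive's TRANSFORM streams each row to the script as
    # tab-separated fields terminated by a newline.
    fields = line.rstrip("\n").split("\t")
    # Hypothetical rule: flag amounts above 1000 as HIGH.
    amount = float(fields[1])
    flag = "HIGH" if amount > 1000.0 else "OK"
    # Emit the original fields plus the new flag column,
    # again tab-separated, for Hive to read back.
    return "\t".join(fields + [flag])


if __name__ == "__main__":
    for line in sys.stdin:
        print(tag_transaction(line))
```

A script like this would be wired into a query along the lines of `ADD FILE tag.py; SELECT TRANSFORM(id, amount) USING 'python tag.py' AS (id, amount, flag) FROM txns;` (table and column names hypothetical).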