Position: Big Data Architect
Experience: 10-12 years
Key Responsibilities:
- Data Architecture Design: Designing and implementing scalable and resilient data
architectures for both batch and streaming data processing.
- Data Modeling: Developing data models and database structures to ensure efficient
data storage and retrieval.
- Data Security and Governance: Ensuring data security, integrity, and compliance
with relevant regulations.
- Data Integration: Integrating data from various sources, including legacy systems,
into the big data infrastructure.
- Performance Optimization: Monitoring and optimizing the performance of the big
data infrastructure.
- Collaboration: Working with data scientists, engineers, and business stakeholders
to understand requirements and translate them into technical solutions.
- Technology Selection: Evaluating and recommending new technologies and tools
for data management and processing.
- Mentorship: Providing guidance and mentorship to junior team members.
- Problem Solving: Identifying and resolving complex data challenges.
- Pre- and Post-Sales Support: Participating in the pre- and post-sales process, supporting both the sales and professional services teams.
Skills and Qualifications:
- Bachelor’s / Master’s degree in computer science, computer engineering, or a related field.
- 10+ years of overall experience, including at least 2 years as a Big Data Architect.
- Strong understanding of big data technologies: Hadoop, Spark, NoSQL databases, and cloud-based data services (AWS, Azure, GCP).
- Experience with open-source ecosystem programming languages and frameworks (e.g., Python, Java, Scala, Spark).
- Proficiency in data modeling and database design: SQL and NoSQL.
- Experience with ETL processes: Extracting, transforming, and loading data.
- Strong analytical and problem-solving skills.
- Good communication and collaboration skills.
- Experience with data security and governance principles.
- Knowledge of API design and development.
- Understanding of data visualization techniques.
- Strong understanding of authentication (e.g., LDAP, Active Directory, SAML, Kerberos) and authorization configuration for Hadoop-based distributed systems.
- Familiarity with DevOps methodologies and toolsets, along with automation experience.
Job Types: Full-time, Permanent
Work Location: In person