Synopsis of the role
In this role, you will work with the Data Architect to develop and deploy cutting-edge data management solutions involving large datasets. You will also develop and maintain ongoing program/project metrics, presentations, analyses, dashboards, and other materials to inform a wide variety of stakeholders. This role is based in our Bangalore office.
What you’ll do
- Work with key stakeholders to understand their needs and develop new or improved data management processes
- Work in a cross-functional team
- Provide hands-on support and guidance in the design and development of solutions using large datasets within GCP/Ignite platform
- Deploy innovations, including but not limited to AI/ML techniques, for data management in the Ignite internal data lake
- Understand the various aspects of architecture practice: business, application, data, security, infrastructure, and governance
- Define and capture business and functional requirements for key projects or assignments
- Facilitate team meetings involving business and technical resources
- Coordinate, execute, and monitor processes to ensure project objectives are met
- Design and build data infrastructure
- Ingest and integrate data
- Model data and manage databases
- Transform and process data
- Manage data security and governance
What experience you need
- Bachelor's degree in Computer Science, Engineering, or another related discipline
- 5 to 8 years of experience in data management, data engineering, and/or data analysis in the credit risk or financial services domain
- Proficiency in SQL and a solid understanding of relational databases, with experience writing advanced SQL queries, stored procedures, views, and triggers, as well as performance tuning and query/cost optimization
- Good understanding of ETL and data tools, including Python/PySpark, SQL, Apache Spark, Apache Beam, Airflow, and RDBMSs (SQL Server, Oracle, MySQL, etc.)
- Understanding of data modeling and integration concepts, with the ability to design and implement data architectures.
- Strong programming skills in languages such as Python, PySpark, Scala, C#, or Java, with experience in data manipulation, transformation, and analysis
- Experience with big data technologies and frameworks, such as Hadoop, Spark, or Hive.
- Experience designing and building scalable, efficient data infrastructure on the GCP platform
- Understanding of SDLC, data warehousing, data modeling, and ETL concepts
- Familiarity with data quality and governance principles.
- Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering challenges
- Strong collaboration and communication skills, with the ability to work effectively within a team and interact with stakeholders
- Knowledge of Agile methodology and experience working within an agile team
- A data engineering certification is an added advantage
- Knowledge of cloud platforms, preferably GCP, is an added advantage
What could set you apart
- Knowledge of credit bureau data
- Experience with graph databases
- Experience using AI/ML tools to solve large-scale business problems
- Experience in data visualization and storytelling
#India