Job Overview
This role involves managing, organizing, and ensuring the integrity of customer data, often focusing on tasks such as data cleansing, reconciliation, validation, maintenance, and reporting. You will also have the chance to devise innovative solutions that provide a better way of working towards the goal of the squad.
In this role, you will work closely with the Product Owner, Customer Journey Experts, Data Analysts, and Data Operations Engineers. This role reports functionally to the Product Owner and has a hierarchical lead locally.
Key Responsibilities
Resolve incidents caused by incorrect data migration through data cleansing and correction, ensuring that the source and target systems stay synchronized.
Work with data analysts to provide better reporting and visualization of the reconciled data.
Perform batch execution and monitoring on data by module and release.
Propose new solutions for maintaining, reporting on, and visualizing data.
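The reconciliation work described above can be illustrated with a minimal sketch. The record shapes, field names, and sample values here are hypothetical, chosen only to show the idea of comparing records keyed by ID across a source and a target system:

```python
def reconcile(source: dict, target: dict) -> dict:
    """Compare records keyed by ID and report discrepancies.

    Returns IDs missing from each side and IDs whose records differ.
    """
    missing_in_target = sorted(source.keys() - target.keys())
    missing_in_source = sorted(target.keys() - source.keys())
    mismatched = sorted(
        key for key in source.keys() & target.keys()
        if source[key] != target[key]
    )
    return {
        "missing_in_target": missing_in_target,
        "missing_in_source": missing_in_source,
        "mismatched": mismatched,
    }

# Hypothetical sample data: record 3 never migrated, record 4 exists
# only in the target, and record 2 was corrupted during migration.
source = {1: {"name": "Alice"}, 2: {"name": "Bob"}, 3: {"name": "Carol"}}
target = {1: {"name": "Alice"}, 2: {"name": "Bobby"}, 4: {"name": "Dan"}}

print(reconcile(source, target))
# → {'missing_in_target': [3], 'missing_in_source': [4], 'mismatched': [2]}
```

In practice the same comparison would typically run as SQL joins or set operations against the actual source and target databases; this is only a conceptual sketch.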
Key Capabilities/Experience
Experience working in Financial Services and/or KYC processes.
Experience in handling data operations or data migration.
Experience in reading and searching through aggregated logs.
Ability to extract key observations from data and communicate them clearly in executive reporting.
Ability to facilitate decision-making to drive complex design issues to a conclusion.
Experience working in a Production environment and performing data cleaning and correction.
Experience running batch executions and monitoring data-related processes.
Minimum Qualifications
Must have basic to intermediate Unix / Linux knowledge.
Must have basic to intermediate SQL query knowledge.
Must have experience searching through aggregated logs and using dashboarding tools (such as the ELK Stack).
Good to have a SAS Enterprise Guide background, creating projects or programs.
Good to have exposure to IBM InfoSphere MDM.
Good to have basic knowledge of Python and/or scripting tools.
Good to have experience with a data-scheduling workflow tool (such as Apache Airflow).
Good to have Excel and VBA experience.
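The log-searching skill listed above can be sketched in a few lines of Python. The log format and message contents below are hypothetical examples, not taken from any specific system in this role:

```python
import re

def find_errors(lines, pattern=r"\bERROR\b"):
    """Return log lines matching the given pattern (ERROR entries by default)."""
    rx = re.compile(pattern)
    return [line for line in lines if rx.search(line)]

# Hypothetical aggregated log lines.
logs = [
    "2024-05-01 INFO batch started",
    "2024-05-01 ERROR record 42 failed validation",
    "2024-05-01 INFO batch finished",
]

print(find_errors(logs))
# → ['2024-05-01 ERROR record 42 failed validation']
```

With a tool like the ELK Stack, the equivalent filtering would be done through queries in Kibana rather than hand-written code; this sketch only shows the underlying idea.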
Plus points:
Ability to transform user information requirements into requirements for the data management department.
Have conceptual knowledge or experience in setting up a basic data Pipeline.
Exposure to various BI tools, such as Power BI and/or Cognos, for visualization and reporting.
Knowledge of ETL concepts and data-ingestion tools.
An innovative mindset to experiment with and apply new solutions.