Banks struggle to ensure quality and timeliness of data


Senior executives in data management, technology and risk management from leading banks in Southeast Asia shared their perspectives on challenges and how they are designing data strategies and architecture to address the evolving customer and business needs.

As economies become more digital and competition grows, banks find it increasingly important to deliver a unique and contextual customer experience. Most banks still grapple with vast amounts of structured and unstructured data across silos, fragmented applications and a complex legacy technology environment. Han Hwee Chong, group chief risk officer at RHB Group; Kamaruzaman Bin Mohd Noah, head of operations for Malaysia and regional transformation at Maybank (Malaysia); Hendy Gunawan, head of data governance at Maybank (Indonesia); Razak Idris, head of transformation management office at Bank Muamalat; Swee Leong Ang, managing director and regional head of group legal and compliance at CIMB Malaysia; together with Koh Ee Laine, principal consultant, Southeast Asia, Commvault; Thomas De Souza, CTO of financial services, Europe, Middle East and Africa at Hitachi Vantara; and Suraj Kotipalli, APAC business leader, content and data intelligence solutions at Hitachi Vantara, discussed how banks are rethinking technology and productising data to drive business value, real-time risk management and ransomware resilience.

Chong said speed, and building systems that enable banks to apply analytics and personalise customer interactions, are key to real-time data capability. Noah emphasised replicating data within the organisation, the use of data for robotic process automation and application programming interfaces, and building an interconnected system around it.

Ang highlighted the importance of data hygiene and keeping systems clean as one of the best practices in real-time risk management, threat detection and data security. Gunawan said his bank prioritises customer data that is more sensitive from a risk management perspective.

Idris added that people, process, governance and technology are the key factors they focus on when ensuring standardisation of data processes. De Souza underscored the value of data, especially with the changes in customer demand and the emergence of various intelligent technologies. Laine pointed out that active monitoring of live and backup data lessens ransomware risks.

The panellists also discussed how using data effectively requires the right combination of strategy, technology, data processes and data frameworks to gain a competitive edge and deliver a better customer experience.


The following is the edited transcript of the interview:

Neeti Aggrawal (NA): Welcome to our RadioFinance virtual dialogue, “Are banks ready for data-driven customer experience and real-time risk management?” We have with us a distinguished group of senior executives in technology, risk and data management from some of the leading financial institutions in Southeast Asia. They will share with us their perspectives on how they are rethinking technology and prioritising data to drive business value, real-time risk management and resilience. Today we'll look at some of the key areas in data strategies and how banks are designing their data frameworks and strategies: how banks can transform towards an agile and future-oriented data architecture to drive revenue and mitigate risk; how to operationalise real-time analytics and artificial intelligence (AI) to drive contextualised experiences for customers; how to build an effective protection, resilience and intelligence framework to address the growing ransomware challenge; and the complexities of best practice in real-time risk management, data security and data immutability.

We have with us Han Hwee Chong, group chief risk officer at RHB Bank; Kamaruzaman bin Mohd Noah, head of operations for Malaysia and regional transformation at Maybank; Hendy Gunawan, head of data governance at Maybank; Swee Leong Ang, managing director and regional head of group legal and compliance at CIMB; Thomas De Souza, CTO of financial services, Europe, Middle East and Africa at Hitachi Vantara; Suraj Kotipalli, APAC business leader, content and data intelligence solution at Hitachi Vantara; and Koh Ee Laine, principal consultant, Southeast Asia, Commvault. Thank you so much for joining us today, and I look forward to a lively discussion. 

Financial sector dynamics have evolved rapidly with the entry of new, nimble and agile players, and customers now expect a more seamless, contextual and personalised service from their banks. To provide this, banks need to utilise data more effectively and apply advanced analytics to better understand customers and predict their needs more accurately. Real-time analytics can allow them to have more meaningful customer interactions, as well as drive revenue opportunities through precise and targeted marketing. However, achieving this is not easy, as banks face numerous challenges. Increased digital connectivity, with billions of interconnected devices, has led to an explosion of structured and unstructured data for banks. Banks are saddled with complex legacy technology architecture and numerous data silos. Integrating these to achieve a single customer view and real-time risk management remains a challenge for many. They struggle to ensure the quality and timeliness of data to drive accurate business decisions with speed. Data security and compliance challenges have also increased.

Recently, we've read reports that revealed ransomware attacks increased by almost 150% in 2020. Cyberattacks have become much more sophisticated, innovative and organised, and are even targeting backup data. The repercussions for banks are enormous, not just monetary but also reputational. Given this, banks need to rethink their data architecture and analytics framework much more strategically and holistically. They need to address evolving customer needs and ensure data governance, immutability and data protection in real time. Coming to the first topic of discussion today, which is transforming towards an agile and future-oriented data architecture to drive revenue.

Now, to drive intelligence-based decisions and revenue, the data architecture would need to integrate multiple data systems in silos. I want to understand where the banks are on this journey. Hendy, if you can share with us, where is your bank on this journey? What has been your experience, as well as some of the key challenges you face in this whole process? Share with us the initiatives on how you are addressing these challenges.

Challenges faced due to legacy systems, different data formats and speed of technology maturity

Hendy Gunawan (HG): The biggest challenge that we face right now is all the legacy systems that we have in our banks, with different systems for funding, credit cards, loans, etc. They have a lot of different data formats. That's our biggest challenge right now if we want to integrate them. Some of the initiatives that we currently have with the head office in Malaysia, including future initiatives, move towards integration, for example, enterprise data management tools and a data science platform. We work together with all related parties in Malaysia and Singapore so that we can have more or less common tools or platforms we can work on together. The data integration from the data warehouse, or even from extensible markup language (XML) data, can then be managed better in the future.

NA: Legacy systems and different data formats are big challenges that most banks have. Share with us some of the initiatives on how you're integrating these multiple systems. Are you prioritising which systems to integrate first? Perhaps you can share more about that.

HG: Among the systems that we try to integrate, the priority is the risk and financial-related systems because we have regulatory reporting for finance and for risk. We have Basel and all of those kinds of initiatives. Those two sets of systems and reports are our priority over the others. But then again, for data science and analytics we use the same data, so we more or less move together. The main subject that we are very interested in right now is risk.

NA: Han, share with us some of your experience in this journey so far and the initiatives you've undertaken. We heard from Hendy about the challenges they had with regard to legacy systems. How is your bank building agile data flows to collate all this customer data? How are you utilising this data with speed?

Han Hwee Chong (HHC): The challenge that we are facing, and to a certain extent across the region, is the different speeds of maturity of the changes that are required. We are talking about the IT platform, the integration at the front end, and even resource challenges and methodologies. Certain things are potentially easier on the methodology side. Given that we have the skill set in-house, we are able to develop that, but rolling it out can itself be a challenge, depending on whether the systems and people are ready. When we look at agility, we are looking at more targeted areas: what do you want to achieve?

Digitalisation is one key area for us, and we have been digitalising the bank for quite some time now, even before my time. Agile is a way of working and we as a bank are very big on that. We have flipped quite a few areas to agile. In fact, we are pretty much at the last mile, with potentially only some of the support function areas yet to flip. But that doesn't mean we are not performing or not managing in an agile manner. The key for us at the moment is digital, and making sure that the platform is ready and connectivity is good.

How do we then personalise it for our customers? Those are the key areas of focus. The products that we are going to come up with need to be looked at from a risk, personalisation and regulatory perspective in a more holistic manner, rather than as single products, which is how we have been looking at them for a long time. A mortgage was a mortgage, a personal loan was a personal loan, and it's no longer that way. If we look at a personal loan as one single product, from a risk management perspective it's straightforward; from a system perspective it is straightforward. But when you want to personalise that, we want to do it at a customer level and apply analytics and AI to it. All the banks, I'm sure at different speeds, are embarking on this. The real challenge is how we make sure that everything moves in tandem, because if one part of it is lagging, it will put everything behind.

NA: Speed is one key part, and building the system in such a way that you're able to apply analytics and personalise the interaction is the other part. I'm aware that RHB has been working towards an agile system. Give us some understanding of how you're building agile data flows. Are there specific initiatives taken towards having those real-time or agile data flows by connecting your different systems?

HHC: We kind of move backwards; our starting point is always customer experience. We start off with customer experience and then move backwards to see what we need to do, rather than coming from an almost bottom-up view of what we need to do and therefore, potentially, what we can do. Those are the keys, in our view, to driving the objectives that we are striving for.

NA: At this point let me pass to Kamaruzaman. Tell us what changes are required in your data and technology foundation to address this real-time data capability. What are some of the key challenges you faced? How did you build that foundation, especially targeted towards advanced analytics?

Kamaruzaman bin Mohd Noah (KMN): We embarked on this journey quite some time ago, at least five years back. Data today has become a lot more meaningful in Maybank, as mentioned by my colleague, as well as my fellow banker in RHB, Han. Back then, we knew what we wanted. This came from multiple items, or multiple problem statements, for lack of a better word.

From there, we had the challenge of selecting the right data and then pinpointing the right source of this data. At the same time, that provided us with the opportunity to find the right tool to get all this data. The source of this data, as we are all aware, is our back-end systems. It came from our host systems and even the loan origination system (LOS), which contains a lot of data. If we talk about outstandings, impairments or arrears, the data are all at our fingertips. By knowing what we want, we then work backwards to what sort of data we want to see and from which source. From my own experience, the same data is sometimes available on multiple platforms within the bank. It all boils down to what exactly you want. I would call it the data currency: whether you want it at T-1 or in real time at T+0. At the same time, when we were part of this project, we always asked the question: yes, we have this data, but what sort of detail do you want? Do you want just connectivity, or just static account data? We went down to that level of detail. I want to point out that this created an opportunity for us to find the right tools to use in the bank. I'm happy to share that in the last five years we have explored robotic process automation (RPA) because, as I'm sure all of you would agree, when it comes to data somebody in the bank needs to download this data and filter this data.

In Asia, when I move around the region, they like to use the phrase data massaging. It is all about filtering the data. We use Python in finding the right indicators, supported by the right parameters. I'm sure these are nothing new to you, and we use a lot of application programming interfaces (APIs) in our data interaction. When we do this, we look at it from all angles, whether from a risk perspective, a financial perspective, a regulatory perspective, as I mentioned just now, and, last but not least, a profitability perspective. For us, at least in Malaysia, we know that for transparency in financial reporting, our tools have helped a lot in Financial Reporting Standard (FRS) reporting when it comes to greater transparency. The key point I want to share with you is that we have moved tremendously in using tools to help us, especially when it comes to risk management and forward-looking risk management as well.
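Kamaruzaman describes this only at a high level, but a minimal sketch of the kind of Python-based data filtering, or "data massaging", he refers to might look like the following. The column names, threshold and the flag_risk_accounts helper are hypothetical illustrations, not Maybank's actual pipeline.

```python
# Illustrative sketch only: hypothetical column names and thresholds,
# not any bank's actual data pipeline.
import pandas as pd

def flag_risk_accounts(loans: pd.DataFrame, dpd_threshold: int = 30) -> pd.DataFrame:
    """Filter ('massage') raw loan data down to accounts needing attention."""
    # Keep only active accounts with an outstanding balance
    active = loans[(loans["status"] == "active") & (loans["outstanding"] > 0)]
    # Simple indicator: days past due beyond the threshold, or already impaired
    at_risk = active[(active["days_past_due"] >= dpd_threshold) | (active["impaired"])]
    # Return the columns a downstream report or RPA bot would consume
    return at_risk[["account_id", "outstanding", "days_past_due", "impaired"]]

if __name__ == "__main__":
    sample = pd.DataFrame({
        "account_id": ["A1", "A2", "A3"],
        "status": ["active", "active", "closed"],
        "outstanding": [12000.0, 0.0, 5000.0],
        "days_past_due": [45, 0, 90],
        "impaired": [False, False, True],
    })
    print(flag_risk_accounts(sample))
```

In practice, the filtered output would typically feed an RPA bot or a downstream report via an API rather than being printed.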

NA: We have with us Razak Idris, head of the transformation management office at Bank Muamalat. Share with us how you are prioritising real-time data flows, how you're building the right foundation for real-time data and advanced analytics in your bank, and what kind of impact it brings.

Razak Idris (RI): Muamalat is one of the smallest banks here in Malaysia. From our perspective, size gives us the flexibility and agility to implement specific parts and to nail down exactly what we need to do. In this case, to create that agility of data flow from front to back, we created a few interfaces that allow the customer to directly input their data. We route it into the back end of our processing quite fast and back to the bank, and that allows us to speed up the processing. Take electronic know your customer (eKYC) checking, for example. That happens to be better than GT, but it's not much different from the other banks in Asia. Our infrastructure is still a bit of a legacy. Our core banking solution has a lot of limitations that we have to address; that is one of the key challenges we are still trying to address. We somehow managed to resolve it through a combination of automation plus a bit of human intervention here and there. It's a meaningful approach. The other thing is similar to the other banks in Malaysia: regulation has been quite a key challenge, convincing our regulator that we are doing well, that we are addressing all the regulatory requirements, and that we are taking our customers' requirements to the core. These are our key challenges.

NA: Legacy is one challenge most of the banks are facing, as we've heard repeatedly from our panellists today. At this point, Ang, I would like to take some feedback from you as well. What is an effective strategy for data architecture, and how do you drive revenue capabilities and advanced analytics? Share some of the initiatives that you have undertaken that have brought about change in your business revenues and profitability.

Effective strategy and design for the data architecture includes risk management, optimising data along with agile technology and data flows 

Swee Leong Ang (SLA): I wish I could say it's all about revenue, but as my colleagues from the other two banks have mentioned, the first lens we look through is a combination of what the reporting or use case objectives are. Some could be revenue, some risk management. The Maybank speaker talked about better transparency in FRS reporting. The strategy for many of us at the moment is more about how we optimise pre-existing use; it may not be revenue-driven. It may be efficiency and effectiveness. Many banks in particular have legacy data architectures. You can call it a data warehouse or an enterprise information system (EIS). It's just that the technology has now evolved to a point where we can aggregate and do better metadata and master data management than before. In optimising, the lenses we look through are multifold. One is strategic: are we at a stage where we don't have a platform or do have a platform? Is the strategy to migrate, whether to cloud or to an enterprise-wide platform, or is it an incremental step? There's no one size fits all in strategy. The most important part right now is to assess what you have, the maturity of your platform, and the priorities in which use cases need to be enhanced further. Within our bank, we are running multiple initiatives based on the various data sets available. Some are focused on robotics and process optimisation; some are more focused on revenue and insights. In each case, the appropriate approach is to do a detailed business study, do your justification and match the justification to the returns. It may not necessarily be revenue; it could be cost minimisation, more efficient or better coverage in analytics, or customised behaviour and products. The main thing is proper quantification. The days of a simple business case are gone. People need to analyse and justify the population of data and what they expect to see as the return.

NA: There are various aspects; it's not just revenue, it's cost as well as risk management. Data is permeating across business functions. Again, coming back to creating real-time data flows, as we've been discussing, are there major changes that you made that improved your functions in that way? Has CIMB taken initiatives toward building agility in its data through cloud or other initiatives?

SLA: We need to look at a bank slightly differently from other industries. Banks are financial service providers. That sounds very fancy until you realise that finance means numbers, and numbers mean data. Probably the most critical item is our data in whatever form, from bank accounts to transactional flows. The adoption of cloud in particular brings with it a lot of recent regulation. If we look at other countries, for example, China has implemented its data security law. There's not much of a technical element to it, but they are essentially treating data like a valuable commodity. Within the laws, they talk about export controls and sanctions. If you look at financial services, the cloud will not necessarily be an enterprise-level migration. It could be based on certain use cases and classes of data that the bank either deems to be more mature and ready or less vulnerable to attacks. I can't speak for everyone here, but in ASEAN in particular, we've noted that cloud migration of the data itself has been taking baby steps. Most of the cloud utilisation has been more on software as a service, tools where the data remains on premise. But there's no one size fits all. Our data is not just our most valuable asset. Banks have a fiduciary role to play, and we have to protect that data for the good of our account holders, lenders, borrowers and investors.

NA: At this point, let me ask Thomas from Hitachi Vantara to share his perspective. Give us the industry experience: what is an effective strategy and the right design for data architecture to drive revenue and the various other cost efficiencies and requirements that banks have for their data?

Thomas De Souza (TDS): We're witnessing a bit of a sea change in data architecture, with three key trends. We are seeing a change in architecture, and we're seeing it across existing BI systems. As other participants mentioned earlier, there's an evolution of the existing systems in place, legacy things like big data and other data platforms. What we've had before is starting to change. One of the key pivotal components of this is the emergence of cloud-native technologies and, from a data analytics perspective, high-performance computing. This gives us new capabilities: faster analytics, real-time analysis and much larger data sets. If we start to embrace AI, machine learning (ML) and deep learning, we start to open up new capabilities within the types of products and services that financial services organisations can provide. When you think about this change in architecture and how it impacts the operating model, business and products from a financial services perspective, I would agree with Ang: the opportunity is beyond revenue, because it could be efficiency, cost reduction and increases in control. These are all important components of running a bank. One of the big things for me is that it increases agility. These sets of technologies and architectural changes allow you to create a common converged platform and a set of tools that reduce complexity. This makes data available to a wider community and makes the data more embeddable and usable, whether that's advanced analytics components or new types of data, both structured and unstructured. It starts to create a new common platform for using data to drive revenue. As for the other benefits, what I'm seeing when we work with some of our clients on things like risk, which was mentioned, is that clients are looking at new mechanisms, not just things like value at risk or expected shortfall. They're using other ML methods like XGBoost to assess risk within their organisation.
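De Souza names XGBoost only in passing, but a minimal sketch of the idea, scoring default risk with a gradient-boosted classifier on synthetic data, might look like this. The features, labels and parameters are placeholders, not any bank's actual risk model.

```python
# Minimal sketch: training an XGBoost classifier to score default risk.
# Features, labels and parameters are synthetic placeholders, not a bank's model.
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5000
X = np.column_stack([
    rng.normal(0.35, 0.2, n),   # credit utilisation ratio (synthetic)
    rng.integers(0, 90, n),     # days past due (synthetic)
    rng.normal(5000, 2000, n),  # monthly income (synthetic)
])
# Synthetic default label loosely driven by the first two features
y = (0.6 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 0.2, n) > 0.8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1, eval_metric="logloss")
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

A bank would, of course, train on governed, validated data and compare the model against traditional measures such as value at risk and expected shortfall before relying on it.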

We're seeing natural language processing, which is a very large component of ML, being used for robo-advisors. We're seeing everyday reporting activities where before we used to churn them out, cookie cutter; we're starting to analyse those reports, both structured and unstructured data, to derive meaning. We're seeing some different use cases. For me, if we focus on revenue, the top-line revenue growth opportunity is around new classes of products. Private wealth advisors are big in Southeast Asia and Singapore, and you could see these new types of robo-advisors providing in-depth analysis of people's portfolios and investments, the type of service that you would have got from a private banker in the past. If you extend that out, you could see someone like myself choosing a bank account because I could have a specific personality or advisor to help me. Technology is changing, and it's having a major impact on financial services right across the board.

NA: You mentioned a few emerging technologies that are having a lot of impact across banks: AI as well as advanced analytics. Now at this point, I want to talk a bit about ransomware and cyberattacks. We've seen a surge in these attacks across banks during the pandemic. Banks need a multi-layered defence mechanism to protect their systems and data from such attacks. How do banks build an effective protection and resilient data framework against ransomware and cyberattacks?

Han, what are some of the essential elements to ensure there is minimal impact from these ransomware attacks and to maintain business continuity during an attack? Maybe you can talk a bit about the challenges first.

HHC: The challenge is always going to be the speed at which these perpetrators penetrate the wall that we've built, which seems to be much faster than what a bank is able to cope with. While we continue to harden our wall, at least from an RHB perspective, the key for us is the speed of detection and, if we do get penetrated, how we respond and how fast we respond. In this day and age, we can't assume that we won't get penetrated, so monitoring, detection and how we react to it are quite key. The usual protection of our data, what we need to do there, is rather business as usual (BAU) work for us on a daily basis. If we see any vulnerability, then we will put more controls around it. Practicality-wise, the challenge is whether we still have sufficient skilled resources in town. From a Malaysian industry perspective, the skill set is still rather limited compared with some more advanced countries. Those are the areas where we need to be a bit more creative in how we look at cyber risk altogether.

NA: The speed of detection, reaction and skilled resources, you managed to touch upon some very important points here. What about data storage, ensuring the data is completely secured at the back end even if there is an attack? Any particular initiatives around data storage or ensuring that the data is completely secured in such an attack?

HHC: As a bank, we embarked on multi-layered security quite some time ago. But as mentioned, given that new risks evolve and emerge on a daily basis, unfortunately we don't know what we don't know. What we do know is to make sure that we continue to test those multiple layers and try to penetrate them ourselves to identify vulnerabilities. Those are key. We as a bank are looking at how we manage emerging risks to make sure that we are ahead of the curve. If you ask me, I don't think we are there yet, but we are definitely putting in place all the necessary building blocks to look at all those emerging threats and risks that we should be proactively managing before they even hit us. Those are key for us at this stage.

NA: Ang, perhaps a view from you as well. How do banks need to redesign their data protection and storage to ensure ransomware resilience? What has been the strategy and design in your bank?

Having a multi-layered approach to address cyberattacks and ransomware

SLA: We need to understand the fundamental operation, approach or vector of ransomware. Ransomware is not something that happens instantaneously. The defence perimeter of an institution has to be compromised first. The ransomware will then build up its little building blocks, I use the term loosely, until day zero, the strike point, when everything starts to propagate and PCs and servers start going down. The main challenge with modern-day technology is that when you talk about storage and BCM, a lot of systems replicate in near real time. My backup or disaster recovery site will have almost identical settings to my production environment. That's where ransomware can become very dangerous, because it has probably already built the same components, the same infected malware, in your backups as well. If you cut over and swing to the DR site, or to your recovery site, you will probably go down after that as well. So it comes back to the earlier point about risk and impact assessment, and working out the issue or the root cause, so that when we cut over, we don't infect the other side as well. It's quite technical, and people will have to work out that maybe their current systems are already equally infected and they may have to restore an image from a month ago. The main reason why ransomware and resilience are very challenging is that it's not just flipping a switch or triggering a tool; a lot of analysis comes into play. The other part that Han briefly mentioned is whether we can be, or are, ahead of the curve.

A lot of the time today, ransomware and other perpetrators appear to be, or are alleged to be, state-sponsored or state-linked actors. We must humble ourselves and accept that there is a high likelihood that these actors may have better resources than even a bank. I know that sounds very frightening, but if you look at their ability to attack infrastructure, financial institutions and telecommunications, these people are well resourced, well motivated and probably have higher-end tools and capabilities than some of the commercial providers. So the strategy is continuous evolution, analysis and reaction, but the main thing is hygiene. The funny thing about data, since we're on the subject, is that data and system hygiene is very important. You need to keep your systems clean, and vet or determine what happens when you open up new connectivity. There's no secret sauce to this; the regulators have put a lot of thought into it as well, and the guidelines across ASEAN and Asia are very prescriptive. The main things are, one, proper interpretation of those guidelines and, two, comprehensive implementation of those requirements.

NA: You touched upon some of the key areas here as well: how to stay ahead of the curve, which is very difficult because of the resources available to these actors and the speed of change, and how to ensure the data at your back end and in disaster recovery is equally secured so that you detect the threat in time. Ee Laine, can you share with us the experience of banks in the region? What is an effective strategy and what are some of the best practices that you have seen?

Technology is a key enabler of data architecture, but people and processes must also be addressed 

Koh Ee Laine (KEL): I echo what Ang has said, it's true. We cannot keep pace with the perpetrators; they come up with new techniques faster than we can respond. Having a multi-layered approach is key, and this applies not just to your production data but to your backup data. The last thing we need is to be hit with a ransomware attack and not be able to recover. As Ang mentioned, it's all about the speed of response and your speed of recovery. Being able to recover as fast as possible in the event you need to is very key. Whatever data protection solution we put in place, and how we secure our backup data, is crucial. Having measures to ensure proper authorisation, authentication and access to the backup data, ensuring that only the right people have access to it, and having locks in place to ensure that your backup data is not tampered with are some of the key areas you need to consider in the design of your data protection solution. Aside from that, there is active monitoring of both the live data and the backup data. When you see anomalies or changes in the backup patterns, being able to detect and respond as fast as possible allows us to mitigate the risks when it comes to a ransomware attack.
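The panellists do not describe specific tooling, but the backup-pattern monitoring Koh mentions can be illustrated with a very simple statistical check: flag a backup job whose size deviates sharply from recent history, a pattern that mass encryption by ransomware often produces. This is an illustrative sketch only, not Commvault's detection logic; the data and threshold are hypothetical.

```python
# Illustrative sketch of the idea only: flag a backup job whose size deviates
# sharply from recent history. Not Commvault's actual detection logic.
from statistics import mean, stdev

def is_anomalous(history_gb: list[float], today_gb: float, z_threshold: float = 3.0) -> bool:
    """Return True if today's backup size is a statistical outlier versus recent history."""
    mu, sigma = mean(history_gb), stdev(history_gb)
    # Guard against a flat history, then apply a simple z-score test
    return sigma > 0 and abs(today_gb - mu) / sigma >= z_threshold

history = [120, 118, 121, 119, 122, 120, 117]  # daily backup sizes in GB (synthetic)
print(is_anomalous(history, 410))  # True: a sudden jump worth investigating
```

In a real deployment the same idea would be applied to change rates, file-type mixes and deduplication ratios, with alerts feeding the incident-response process rather than a print statement.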

NA: Suraj, a comment from you as well, from Hitachi’s point of view on some of the best practices you have seen.

Suraj Kotipalli (SK): We have seen an almost 200% increase in ransomware attacks in this post-pandemic world. Everyone needs a very good ransomware mitigation strategy, and it can be broken into four areas. One is having a very good communication and training policy, which is the cyber resiliency or cyber insurance policy within an organisation. This includes training employees on how to look at the suspicious emails that are coming in: when to open them and when not to, they need to be very clear. These attacks will happen, and organisations are vulnerable given the skills and agility of the hackers, so the question is how we react. What strategies do you have for employees to respond, and how do you communicate with stakeholders? That is one area: having good communication and cyber insurance policies. Second is ensuring the right level of authentication and encryption, because the data needs to be protected not only where it resides but also in transit. Data is not just in one place, it is moving. It is going to the cloud, it is coming back, it is going to vendors and partners. We need to ensure data in transit as well as data at rest are both encrypted, while also having a good authentication policy. The third important point on ransomware protection is that we can have great strategies at the back end, at the data centre level, and a great set of tools and technologies, but what we are seeing in the industry is that data at the endpoints, especially the laptops, desktops or mobiles that users are carrying, is the most vulnerable. Post-pandemic, people are working from everywhere. People are using their own routers, their own networking and their own devices to connect. How do we ensure that the data is protected there? Technology at the centre is good, but there's a lot of other data on these endpoint devices, so we need good technologies to automatically sync the data from the endpoints into the data centre. In the event of an attack, how do we help these users immediately recover the data from the central location and give it back to them? A lot of banks have a policy of not paying the ransomware hackers, whatever happens. That's the policy. We don't want to give the hackers an incentive by paying them, which means we need to see how we can give the data back to the users.

The last point here is that the last line of defence is backup. When an attack happens, the only way you can recover and get the data back is if you have this backup, and these ransomware hackers are attacking the backups these days. That means using some kind of object storage solution and working with partners such as Commvault to have a good backup solution that is completely locked. We have a kind of object lock data immutability solution whereby, in whatever scenario, even if a ransomware attack happens, they cannot change that particular data. It will create another version of it, but not modify the original one. These are different strategies that we can take altogether, because no single strategy works on its own. You need to look at the holistic view to protect yourself and have a good mitigation strategy. As Han and Ang said, attacks are going to happen. Nobody is 100% foolproof on this, so the question is whether we have a good policy for recovering from it or mitigating that risk.
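One widely used way to get the write-once, read-many behaviour Kotipalli describes is object lock on an object store. The sketch below uses AWS S3 Object Lock via boto3 purely as an illustration of the pattern; the bucket and key names are hypothetical, the bucket must have been created with Object Lock enabled, and this is not the specific Hitachi or Commvault solution discussed on the panel.

```python
# One way to get WORM-style immutability: S3 Object Lock in COMPLIANCE mode.
# Sketch only: bucket/key names are hypothetical and the bucket must have been
# created with Object Lock enabled. Not the specific solution the panellists describe.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")

def write_immutable_backup(bucket: str, key: str, data: bytes, retain_days: int = 30) -> None:
    """Store a backup object that cannot be overwritten or deleted until retention expires."""
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=data,
        ObjectLockMode="COMPLIANCE",  # retention cannot be shortened, even with admin credentials
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=retain_days),
    )

# Example (hypothetical bucket and key):
# write_immutable_backup("bank-backup-vault", "core/2024-01-31.bak", b"...backup bytes...")
```

In COMPLIANCE mode the object cannot be modified or deleted until the retention date passes, which is what stops an attacker who has stolen credentials from tampering with the backup copy.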

NA: It's not just data in transit or backup, it's the real-time data as well. Good to hear various perspectives on how to ensure data immutability there. The other angle that we want to touch upon is data governance. Data security and data immutability aside, ensuring data quality, reliability and proper data governance has become critical to making data actionable and insightful. What are some of the challenges with regard to data governance? How are you ensuring proper data governance at your bank, especially since the structured and unstructured data volumes are so large? How are you addressing this data governance issue?

HG: I can mention some of the biggest challenges, because they are very big for us. First of all, people, which is probably quite surprising, but a lot of people, even within the bank, see data governance as an additional responsibility or job instead of seeing it as a value, even though data is the new oil. It will need more time. If I think back 15 years ago, when I started in data analytics as a data scientist, people didn't even know what that was, especially in Indonesia. Maybe it will take a couple of years before data governance is seen as part of data management, and before data quality is seen as important in financial services and other industries. The second challenge is the systems. The third challenge is data protection, especially since we have a lot of structured and unstructured data. What we have done, especially at Maybank Indonesia, is take care first of the personally identifiable information (PII) and the sensitive data. For other data, we can have a more relaxed policy. Even if we use the cloud, as some colleagues mentioned earlier, we usually choose which information we can transfer to the cloud platform. For PII and all other sensitive information, we have stricter policies and we ensure that only a few people can access that information. For other information, we usually give more freedom.

NA: PII and customer data are much more sensitive for the bank; you have to be much more careful about that. You're prioritising the data that is more sensitive with regard to risk management. Razak, how are you handling this issue of fragmented data ownership and the standardisation of data processes in order to have better data governance?

RI: From our perspective, we always go back to three key things. When we approach these projects or programmes, we look at people, process and governance, and the technology part. On process and governance, a central council that we call the Data Management Council looks at the overall data management and data quality initiatives, as well as all the policies, procedures and challenges that we need to address as a bank. This central governance enables us to centralise all the key common issues that happen across the different business units and lines of business as well as operations. The council then drives and determines what types of policies, processes and initiatives we need to undertake to address all the important or key challenges of any programme.

On the process part, we looked at all sorts of policies and processes. We came out with data policies, data dimension policies, even data quality policies and processes. That helps us to standardise across the board, and we try to define or assign data ownership to the business and make sure that each type of data is properly defined with its respective owners and what those owners are supposed to do. On the people side, we look at capability and training to ensure that our people have the right skill sets and knowledge to manage all this data. We designed our training programme based on what they need and at which level, from management to working levels. We took a carrot-and-stick approach: we tied the data governance and data quality perspective to key performance indicators (KPIs) to make sure people implement all these policies and procedures and make it a day-to-day practice. From the technology side, we implemented technologies like the data warehouse and data lakes, and we implemented data quality tools and data capture tools to make sure that all these initiatives are put in place and we are able to measure what sort of impact we are getting from them. That allows us to improve our action plans.

NA: It's the ownership of data, the resources and technology as a major enabler. You mentioned a good point about KPIs and how you're measuring the impact so that it makes a difference. Ang, perhaps a perspective from you around data immutability. How are banks ramping up their risk management strategy in data security and regulatory compliance?

SLA: Data governance and data quality guidelines have been around in banks for almost 10 years, primarily driven by the ISO and the Basel BCBS 239 requirements. It's very hard for a bank to assess credit or market risk if the data quality is not there. The processes we talked about have been there for a while: governance, councils, communities and committees. When people bring up terms like data immutability, banks will be more familiar with terms like non-repudiation and integrity, meaning that your account number, balance and ID are what they should be. It's slightly different again. The security part is to ensure that whatever we have is not susceptible to unauthorised access, deletion or alteration, whether through ransomware, hacking or fraud. The main thing is that whatever data we hold represents an accurate and timely description of the customer, the accounts and even the bank's own financials. When you talk about risk management or strategies, I don't want to go back to security in layers and the like. A lot of effort these days is focused on cleaning up and ensuring data quality, not just in the old-fashioned way of checking that the ID number is correct in system A and system B. A lot of banks, not only CIMB, are doing this, taking the opportunity to come up with or enhance their existing single customer view and adding the layer of what people call data lineage: where did that phone number come from? How sure are we that the date of birth or the length of the relationship with the bank is accurate? How complete are the details about the customer's portfolio and liabilities with the bank? We see a lot of tools and investment, not only within ourselves but within the industry, focused on master data management. Some vendors now rebrand or package these as part of the data governance tool set. I'm not saying it hasn't been there before, but coming back to the earlier point about strategy, as you build bigger data architectures you want to ensure that the quality of the inputs is there. Depending on the maturity of the bank and whether it has already consolidated a lot of its data, the efforts will then be, A, to secure what you have and, B, to establish a process where you can manage the various datasets before they become consolidated, to ensure data quality. That's a different concept from the more technical aspect of immutability, which has been made popular by folks promoting blockchain and the like. That's good for transactional immutability, but not necessarily for the larger data sets that a bank holds.
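Ang's points about data quality and lineage lend themselves to simple, rule-based checks. The sketch below shows the general idea in pandas: completeness, validity and a crude source tag per attribute. The column names, rules and source systems are hypothetical, not CIMB's framework.

```python
# Sketch of rule-based data quality checks with a simple lineage tag recording
# which source system an attribute came from. Columns, rules and sources are hypothetical.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": ["C1", "C2", "C3"],
    "date_of_birth": ["1980-04-12", None, "1975-02-30"],
    "phone": ["+60123456789", "012-3456789", None],
    "phone_source": ["core_banking", "loan_origination", None],  # crude lineage tag
})

def quality_report(df: pd.DataFrame) -> pd.DataFrame:
    report = pd.DataFrame(index=df["customer_id"])
    # Completeness: is the attribute populated at all?
    report["dob_present"] = df["date_of_birth"].notna().values
    # Validity: does the date actually parse? ("1975-02-30" will not)
    report["dob_valid"] = pd.to_datetime(df["date_of_birth"], errors="coerce").notna().values
    # Format check on phone numbers: optional "+", then 9-15 digits
    report["phone_valid"] = df["phone"].fillna("").str.match(r"^\+?\d{9,15}$").values
    # Lineage: do we know which system the phone number came from?
    report["phone_has_source"] = df["phone_source"].notna().values
    return report

print(quality_report(customers))
```

Real master data management tooling layers many more rules, survivorship logic and full lineage graphs on top of this, but the basic checks follow the same completeness-validity-lineage pattern.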

NA: Suraj, from Hitachi’s point of view,  would you agree with what Ang just mentioned, that these are some of the key issues that we face in data governance today?

SK: Data governance has been a nice-to-have conversation for most organisations for 10 years; with the increasing focus on analytics and business intelligence and the importance of regulatory compliance, it has become mandatory. The issue is not just the structured data. Unstructured data is growing at an exponential rate, and it becomes even more challenging to bring governance and intelligence to that unstructured data. What we are seeing with banks is that intelligent data governance is more of a journey. It all starts with identifying the key sources of data within the organisation and whether they are consistently being used. Then, as everybody touched upon, detecting, assessing and cleansing the data to achieve better data quality. The next logical step is enhancing the metadata, which ensures that proper data management techniques can be adopted. Once we have all this, you can manage the data processes and automate them through data aggregation, centralisation and reconciliation, with the right level of controls throughout the data lifecycle.

Finally, bring the right level of detail that can be visually interpreted so that the business can make the right decisions and ensure it meets current and future regulatory responsibilities. These are what I call the aspects of data governance as a journey. What are we doing in this space? Being a data company for more than 60 years, we work with banks and financial institutions and bring the right level of data governance solutions, the tools, processes and expertise that we have in this area, to ensure that organisations have every bit of data available, which is the first step, and that it is insightful and, more importantly, actionable. That's how we're partnering with banks and financial institutions across Asia.

NA: The other key element that we've talked about is how to utilise this data, real-time analytics and AI to drive the customer journey. Even if you have the right data architecture and foundation, you need to apply advanced analytics on top of it to predict customer needs, personalise service and enable real-time decision-making. I want to hear from Kamaruzaman about what's happening at Maybank. How is your bank utilising real-time analytics and AI effectively to provide a more contextualised experience to customers? Perhaps you can share with us some tangible examples of how you've achieved this and how it has impacted your bank.

KMN: Yes, we do have that technology at the moment. It is currently used for some of the claims processes as well as for monitoring arrears. The challenge is having the proper data in the right format. If the data is clean, then the next tool takes on the filtering and the decision-making is done smoothly. But if you are dealing with data that needs to be looked into, that's where it hampers the decision-making process, and that's why it goes into the repair queue and back into the manual queue. We do have that in our setup today; for example, our insurance arm works with real-time data. All those applications that are submitted are transmitted via the tools that we have today, be it Python, AI or APIs. We are glad that COVID provided us with an opportunity to work differently. With these tools, we were able to expand our work-from-home opportunity. Certain processes and data are beyond the limit, and for those we will not be giving an assessment. Coming back to AI, we are currently looking at and exploring what further opportunities we can use it for.

At the end of the day, it is all about the risk element and how deep and how brave we are in taking that risk. For example, would you have the comfort to leave the computer to do your decision-making compared with having a four-eye check? To a certain extent, maybe some of us would want to explore this within a certain limit: below that limit, we leave it to the computer or the AI, and above it, we still leave it to ourselves. Or it could be a hybrid environment, a combination of the AI checking it first and a human filtering it afterwards.

In Maybank, we work closely with our risk unit as well as our audit unit to get comfort that, indeed, we want to do this, and to understand what risk elements we are dealing with in the event it does not work in our favour. I just wanted to share that, yes, in Maybank we are going in that direction.

NA: Han, how are you doing this at RHB? How are you prioritising AI, and can you give some tangible examples of how this has improved your business impact, profitability and revenues?

Data needs to be managed intelligently to ensure that it meets the business needs along with risk management, compliance and cost optimisation

HHC: We tackle AI and ML from a credit lifecycle perspective. The focus currently is on the transaction level: is there more insight that we can gather and therefore personalise for our client? We are currently applying more analytics all the way through to collections. From a risk management perspective, we continue to strengthen the credit side; that's ongoing. But other areas such as liquidity are getting a bit more critical, and with analytics we will be able to get insights to manage it better, even though our liquidity position is quite strong. That is an area of risk that we should not let go too far from our sights. Apart from that, from an ML and platform perspective, we are devising more of a hub-and-spoke model. We are building capabilities in different divisions rather than centralising analytics at group level. For the group level, the hub itself, the key element is to provide infrastructure: make sure the piping is there and build a proper data lake for the analytics needs across the bank. That currently works quite well for us.

NA: Good to know you're doing that hub and spoke model. Any specific areas you want to highlight where it has brought a significant impact on the bank? 

HHC: The credit part is definitely bringing us a fair bit of traction. When we look at the credit lifecycle, the first area that we tackled was the front end: increasing the acceptance rate and decreasing defaults, which are very basic. Incorporating new information and data into our analysis gives us a level of insight that we never had previously, because when we started building models, most of the banks here in Malaysia started from a browser perspective. Then we built on those capabilities from a commercial perspective, while on the compliance side we still comply with regulations. The key part for us is two-fold: one is the front end, on the transaction side, and the other is the back end, on the collection side. We have done a fair bit of analysis there and that has turned out quite well.

NA: Hendy, a perspective from you as well, especially with regard to how to optimise cost and efficiency in analytics. How can you rationalise data across the organisation to enable this real-time data more efficiently? Any specific initiative or best practice you want to talk about? 

HG: One of the initiatives is that we have started to move the data that we use for analytics to a cloud base. We try to have an end-to-end process from data gathering and storage through to usage, which can include data visualisation and a data science platform. We will also start to look at pricing and collection scoring engines. For a bank of our size in Indonesia, having cloud-based storage is more cost-effective than physical storage. There are limits, though, on the data that we can upload to or store in the cloud.

NA: Ang, a perspective from you as well: how are you optimising cost and efficiency in analytics? How are you rationalising data efficiently?

Unified customer experience and data availability is the key to digital transformation 

SLA: To build on the different points of the speakers, think of it as an ecosystem. Some of the tools started out with, say, risk management and risk models, but the base engine has other modules, and we've tried to encourage other people to build use cases on that. Similarly, we are a regional bank, so things that are successful in other countries can be replicated. One example is our chatbot, which is an AI-based tool. We have both a retail and a commercial or corporate version. The corporate version started out live in Singapore and we have since replicated it in Malaysia. What we find is that not only does it improve the customer experience, but in a lot of cases we reduce the number of call centre queries because the chatbot answers the basic questions. Second, the error rates in certain applications or account openings go down, because the chatbot has already guided the customer or the prospective customer through the process. Again, it is not always about cost. We started the AI chatbot more to help customers, and it picked up a lot during the COVID period because not everybody could go to the branch. We found the cost savings to be an added benefit, in addition to improving the customer experience.

NA: A chatbot is definitely a very good example; banks have implemented them to improve customer service and make customer interaction more efficient, and it can work well in areas such as customer experience. Suraj, could you share with us Hitachi Vantara's point of view on what you've seen and how banks can build this cost efficiency into analytics?

SK: As an end user of a bank, I have seen chatbots as well as very interactive, real-time analytics on my account services, where I can make fast decisions on my portfolios. The transformation of banks is bringing AI and real-time analytics into the solution. I'll give a slightly different perspective. What we're seeing is that this transformation is going to continue. Banks will embrace AI and ML technologies, but the key is how fast we can bring this to the market and how we can do it in a cost-effective way, without blowing up the cost of the infrastructure changes. The key is to bring in a hybrid cloud kind of solution. We heard the challenges that the banks are facing, where a lot of applications are still legacy and still on-prem, and they have concerns about data control. The cloud provides new, innovative AI and ML services that are readily available. The question is, how can we leverage those cloud services while keeping the on-prem applications and data controls or management, and bring them together to get the best outcome? That's why I say that hybrid cloud is very important: to use those services and immediately bring them to the market.

I want to give you a real-life example of what we have done for a large bank in the Asian region. We built a private cloud solution for the bank, in partnership with VMware, and helped modernise the data centre and its applications on-prem. The bank cannot bring all the services that are available in the cloud and build them from scratch on-prem. It wanted to use certain services in AWS, for example things like the recognition and transcription services. With tools such as VMware Cloud Foundation and Hitachi rack-scale systems, we helped these applications to be integrated from the private cloud solution into AWS, which helps the customer move applications to the cloud and bring them back, or move them across, while keeping the data under its control. It's not about completely moving the data into the cloud; it solves both problems. For unstructured data, we have the object storage solution for the data within the hybrid cloud. We put in object storage because the data is not just structured; there is so much unstructured data. What customers do is easily sync the data from on-prem to the cloud. It's just syncing, not moving the data completely: sync the data, then run the cloud services, process the data fast in the cloud and get the outcome, whether that's metadata enrichment, cleansing or any new, innovative AI and ML services, and then bring only the outcome back on-prem and delete the data in the cloud. So sync, process the data, get the outcome and delete it.

What happens now is that the data stays on-prem. It is moved to the cloud, processed there, we get the outcome we want, and then there is no data left in the cloud. So there is no worry that data is left over in the cloud creating risk exposure or data sovereignty issues. The beauty is that when you delete the data in the cloud, there is no ongoing storage cost either. It's very cost-effective that way; you're only using the service for those few minutes of running the transcription, risk or recognition service. This helps customers get the best out of both the cloud and on-prem, at the same time keeping control and keeping costs low. These are some nice technologies they're using, bringing these AI and ML services very quickly to their applications and helping the customer experience as well. This is what we're seeing, and I'm happy to say that we are partnering with banks in this area and helping on this journey. Again, this is part of the solution, because in this model you don't need to use just one cloud. You don't have to move everything to one cloud. You can take a few services from AWS, a few services from Azure and some services from, say, a Google solution, because they have so many services available and we want to leverage that. It gives banks the flexibility to move the data where they want, run the applications and services, and then bring the results back. That's the real beauty of this.
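The sync-process-delete pattern Kotipalli walks through can be sketched in a few lines. The example below uses boto3 and S3 purely as an illustration; the bucket name is hypothetical and run_cloud_analysis() is a placeholder for whichever managed AI/ML service is invoked. It is not the actual Hitachi and VMware integration described.

```python
# Sketch of the sync-process-delete pattern: stage data in the cloud, run a managed
# service over it, keep only the output on-prem, then delete the staged copy.
# Bucket name is hypothetical; run_cloud_analysis() is a placeholder.
import boto3

s3 = boto3.client("s3")
BUCKET = "bank-hybrid-staging"  # hypothetical staging bucket

def run_cloud_analysis(bucket: str, key: str) -> bytes:
    """Placeholder for invoking a managed cloud service (e.g. transcription or OCR)."""
    raise NotImplementedError("call the chosen managed service here")

def sync_process_delete(local_path: str, key: str, result_path: str) -> None:
    s3.upload_file(local_path, BUCKET, key)          # 1. sync a copy to the cloud
    try:
        outcome = run_cloud_analysis(BUCKET, key)    # 2. process it with a cloud service
        with open(result_path, "wb") as f:           # 3. bring only the outcome back on-prem
            f.write(outcome)
    finally:
        s3.delete_object(Bucket=BUCKET, Key=key)     # 4. delete the staged copy in the cloud
```

The finally block ensures the staged copy is removed even if the processing step fails, which is what keeps data residency exposure and cloud storage cost limited to the few minutes the service actually runs.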

NA: I would like to ask all our panellists for closing comments on how to build this data-driven customer experience.

HG: The data analytics and data governance journey, we can call it baby steps. Because the data is so big and complicated, it's impossible to have it all done in one to three years. It's a long journey, but if we don't take a step, then we won't get anywhere. Even the smallest step is very important. That's what we are trying to do.

RI: The key here is that it's not so much the technology. It's the people first, then the process, then the technology. People rarely address the root causes, the key things like people and process; they just jump to technology, whereas technology is just an enabler. The people and process must be fixed first, or must be ready, before we embark on all these digital transformation and data initiatives.

HHC: One piece of advice I would give is to look across all your capabilities. As Razak was saying, you need to look at people and process, and you need to look at technologies, and therefore look across where you are today: where do you need to bring things up first, and then, in tandem, how do you bring everyone together? Otherwise, you'll find yourself stuck in certain areas where, potentially, you have the tools but you don't have the people, or you have the people but you don't have the processes. Moving in tandem is quite key for us.

KMN: At the end of the day, what we keep in our day-to-day process forms part of data quality, since any bank, or indeed any industry today, relies on data to make decisions. I'm a strong advocate of clean data and data quality. That in itself is a big project if you were to do a clean-up, but as long as we start somewhere to get our data in order and in place, that's the right thing to do. I agree with some of our colleagues here: it starts with the people as well as the operating procedures. We need to be very open moving forward. As part of our digital transformation, at least for Maybank, we should be open to living in an environment where humans and technology are mixed together, living harmoniously. I'm a strong advocate of technology, and I believe we have tools in the market today that are able to help us when it comes to decision-making without compromising on risk.

SLA: In the process of digitalisation, banks in particular must remember that their data represents their customers. If we treat our data with the respect we give our customers, we will put the right processes, technology and policies in place.

TDS: Financial services are data-driven, and we are seeing a huge change in the technology landscape. We are seeing opportunities to drive top-line growth and drive efficiency within these organisations, and this is predicated on that change. In financial services, data has always been important, but it's especially important today because of the changes that we're seeing, both from the customer perspective and in the rise of different sorts of intelligent technologies.

KEL: We've all heard that a unified customer experience is very key, as is data availability, when it comes to digital transformation. Therefore, we need to ensure that whatever platform we have, we are able to intelligently manage our data and ensure that it gives us the agility, scalability, flexibility and security we need for our data.

NA: It's clear that using data effectively needs a combination of a holistic data strategy with technology, data processes, people and an analytics framework. You need to prioritise where you want to put what kind of framework, and how to go about it, to have the right data framework that can give you a unique competitive edge, not just in contextualised customer experience but in real-time decisions and greater cost efficiencies. We heard about the hybrid data cloud and how it can bring agile data and real-time analytics through AI and advanced analytics, and how that can improve your customer journeys. Along with that is how to address the ransomware and data security challenges, and how to ensure that data quality makes the data actionable and insightful, so that banks can reap the benefits they seek from their data. It was a very interesting and insightful discussion, where we got varied perspectives from all our panellists. I want to thank you all.



Keywords: Digital, Competition, Customer Experience, Banks, Structured Data, Unstructured Data, Silos, Fragmented Applications, Technology, Transformation, Data Governance, Financial Services, Data, Intelligence Solution, Risk Management, Resilience, Ransomware
Country: Malaysia, Indonesia
Region: Southeast Asia
