Banks challenged to productise data to deliver “moments of life” customer experiences


Senior executives in data, analytics and artificial intelligence (AI) from leading institutions in Asia Pacific review the key challenges in cloud adoption and effective use of data to generate desired business outcomes. However, in this digital journey, banks still have some way to go to reach their destination.

Balaji Narayanamurthy of Axis Bank, Devendra Sharnagat of Kotak Mahindra Bank, David Hardoon of Union Bank of the Philippines, Johnson Poh of United Overseas Bank, and Remus Lim of Cloudera shared their perspectives on operating in a multi-cloud environment and the issues they face in data integration, real-time intelligence and data governance.

Poh noted the struggle to bring together domain, data and analytical expertise to generate business and customer value. Narayanamurthy pointed out that the primary revenue driver is data analytics, as this helps the bank cross-sell products to more customers. Sharnagat, on the other hand, revealed that the hard part in constructing integrated real-time data is building the ecosystem around it and getting the right people and infrastructure.

Hardoon mentioned that the world is now more digital and financial services should be designed around a customer’s “moments of life”. The way to efficiently achieve it is through the cloud. Lastly, Lim highlighted the need for an open, scalable and agile approach to consolidating data into data warehouses and lakes.

The panellists also talked about the integration of multiple data systems to build data-driven architecture, the implementation of AI and machine learning (ML)-based advanced analytics on integrated data, the breaking down of operational and data silos to enable agility and efficiency, and the complexity of data compliance and governance.

The following is the edited transcript of the session:

Neeti Aggarwal (NA): Welcome to our RadioFinance virtual dialogue. We have with us a distinguished group of senior executives from some of the leading financial institutions in the Asia Pacific. They will share with us their perspective on driving data intelligence in a hybrid and multi-cloud environment. I'm Neeti Aggarwal, senior research manager at The Asian Banker. 

Data is undoubtedly the digital economy's most valuable resource. In today's hyper-connected world with expanding ecosystems, real-time data volumes are growing exponentially. Financial institutions have access to a massive pool of structured, unstructured and big data. Unlocking the power of this data effectively can empower them with a much deeper understanding of customers, the ability to offer innovative, personalised and contextualised services, and competitive differentiation. Applying artificial intelligence (AI) and advanced analytics to this data can enable unique, predictive, actionable insights and real-time decision capability, not just for customer service but also for risk detection and fraud prevention, to name a few. That said, many banks are still grappling with data silos, fragmented applications and a complex environment of legacy solutions. Given this technology debt, a key challenge for institutions is how to transition to a next-generation, data intelligence-driven organisation. On the other hand, new digital players that are agile, nimble and cloud native monetise data with speed to tailor differentiated customer experiences. There is an urgent need for institutions to not only develop a data foundation that is agile, scalable and efficient, but also utilise it for real-time customised service. Institutions increasingly explore integrated data architecture, supported by cloud-based infrastructure, for efficiency, scale and flexibility. 

Now in this session, we will debate key issues surrounding the implementation of an effective multi-cloud strategy to build the right data infrastructure and to drive real-time analytics. The topics for discussion will be: an effective data strategy to transform into an intelligence-driven organisation with real-time analytics and decision capabilities to deliver differentiated customer experience; designing the right cloud strategy to drive real-time capabilities with flexible, secure and scalable operations; the challenges and complexities in a multi-cloud environment and how to address these; and the integration of data systems in a hybrid environment to meet data governance and compliance requirements. 

Now I would like to take this opportunity to introduce our guests. We have with us Balaji Narayanamurthy. He is president and head of the Business Intelligence Unit at Axis Bank, with extensive experience in using banking, analytics and data to drive business decisions. David Hardoon is senior advisor for data and artificial intelligence at Union Bank of the Philippines, responsible for leading the bank-wide strategy for data and AI. Devendra Sharnagat is senior executive vice president for data analytics and customer value management at Kotak Mahindra Bank. He focuses on advancing analytics and big data for business insight and operational intelligence. Johnson Poh is executive director and head of group enterprise artificial intelligence at United Overseas Bank (UOB). He has been practising data science in finance, consulting and the public sector. And we have Remus Lim. He is managing director for ASEAN and India at Cloudera, where he has been leading teams to empower clients’ digital transformation journeys.

So having the right data architecture that integrates varied data systems and advanced analytics applications is a prerequisite to driving data-driven or intelligence-driven decisions and enhancing customer experience. Equally important is to support this with the right technology infrastructure to bring about the desired business outcomes. 

Let's talk about data architecture first. In order to compete with new digital players and meet evolving customer requirements, banks need to leverage a diverse set of structured and unstructured data effectively to get the business insights they are looking for. First, I would like to understand: what have been your key goals and strategic approach towards the integration of multiple data silos to build a data-driven architecture across your bank? And where is your bank on this journey of integrated real-time data availability to drive business value? David, could you share your perspective from your experience with Union Bank?

David Hardoon (DH): Let me kick this off by saying that the holy grail of integrating data silos is a never-ending, continuing journey. I want to start with that because it's something I personally believe: we don't get to a stage where, suddenly, that's it, there are no silos left within the organisation. At Union Bank, we've taken a relatively reasonable and pragmatic approach by building a sense, an understanding, of what the various data assets across the organisation actually are. What are we dealing with? Why are they there? There's a presumption as to why data is being collected; we don't collect it just because it's nice, warm and fuzzy. That also helps in terms of standardisation of the various data assets. 

Consolidate to reduce complexities

Second is consolidating the data and concurrently making it available from a quality assurance and usability perspective, be it for dashboarding, analysis or data science. That's something which is continuous. We're actually quite happy with that journey of consolidation across disparate data sets – it can be multiple systems – in order to facilitate it for the business units or the analysts to start making the insights. However, we discover every day that another data set has not yet been ported across. But it allows us to make sure everything that is consolidated – it could be a virtual or physical consolidation – has merit, has value. We know what we're dealing with. We know how to deal with it. It is absolutely critical that quality hygiene, the understanding of the data, is in place, because a lot of times we end up with a data swamp rather than a data lake. I'm happy to say we're quite far along the path, but it's taking a very systematic, very systemic approach in dealing with it.

Johnson Poh (JP): There are two parts to this question. The first part is what we are currently doing. The second part is what our goal is, as well as our strategic approach in enhancing the strategy moving ahead. This is pretty standard across corporate organisations where data is one of the key assets: we are hoping to make use of data to generate insights for the purpose of enhancing our business value as well as operational efficiency. Just like most organisations, we're at the stage of consolidation, where we are hoping to consolidate various data sources into a tiered system of data stores, including the data lake or the data warehouse, data marts and, more relevantly in my context and in the domain of data science and AI, a data discovery layer for the purpose of experimentation as well as insights generation. We're really at the consolidation stage, as well as at a stage where we are starting to think about how we can make use of data to generate an impact for our business and operations. 

Moving forward, in terms of our strategic approach, we're hoping to cultivate a progressive yet measured data-driven mindset in driving prioritised business outcomes. We also hope to ensure that we practise this in a responsible way; in other words, we place good emphasis on the ethical use of our data assets. And beyond technology, which forms the bedrock of data architecture and is essentially the enabler of data science as well as data-driven analysis, we believe that the processes that enable the accessibility, availability, quality and security of data assets are equally important. One of the key challenges that we grapple with on a day-in, day-out basis is this need to marry and bring together domain expertise as well as data and analytical expertise in order to generate the insights as well as the value that we look forward to.

NA: So data consolidation, a data discovery layer, as well as domain expertise – we touched upon some very interesting points here. At this point, Devendra, could you share your thoughts on both the objectives and your experience of how you are building this at your bank?

Devendra Sharnagat (DS): I see the data journey as like boarding a running train, because by the time you're trying to catch up, something has already moved forward. We started our journey four years back with objectives around business visioning, regulatory responses, application of machine learning and also real-time transactions. In this journey, the easier track to take is technology. The difficult one is building the ecosystem around it: getting the right infrastructure and the right people around that. That's where the whole crux lies. To bring that culture, it's very easy to put a data model in place, stating that every single element in the organisation serving those four core objectives of the business should be put into a proper data model. However, there is a tag to every delivery. There is also an issue of knowledge, and legacy knowledge, around every data element within the organisation. Model it right, get the glossary right, have the lineage documented properly – these are aspects which are old challenges for any organisation, including us, and we are progressing on them. The entire infrastructure and architecture that we have built keeps all these aspects in mind. There are also two more aspects which make it more complicated. One is the advent of cloud. Slowly, a lot of systems are moving to cloud, and a lot of the businesses being added are cloud native. We work in an environment that is now getting hybrid. You have on prem, you have a cloud, you have systems which you just bought. So how do you virtualise and interact across data sets? That's the core of the architecture. These are the challenges and these are what we are looking at. We are trying to build an ecosystem which helps us work across all of these – real time, cloud, hybrid – a system which allows the use of machine learning and AI on the fly.

NA: You touched upon a lot of areas here: data, cloud, as well as machine learning, AI. I'll request Balaji to share his thoughts on this as well. 

Managing complex data

Balaji Narayanamurthy (BN): In terms of our strategic approach towards data architecture, internally we are calling it our data stack 3.0. A lot of the focus in the initial phase was on risk management-related analytics, and we expanded that to helping our businesses understand how they are doing and helping our sales teams understand how we are doing it. From a data architecture standpoint, data stack 1.0 was largely about getting organised, building a data warehouse, building models and answering those questions. A few years later, we embarked on our data stack 2.0 journey. That was the world of big data. We moved to our second-generation data lake-based architecture centred around Hadoop. There, the focus was more about handling larger volumes of data and ML models, with attention on deep learning models. The focus was on building better models, handling more data, handling more unstructured data and solving a wider variety of business cases than we had solved earlier. In terms of data stack 3.0, the evolution is along two angles. One is the innovation side. How can we make things better and faster? How do we make things real time? How do we make it here and now, whether it is building and deploying models or getting insights? The other angle is around how we become more governed. It's no longer only about deriving value from data. 

Organisations have also realised that data can be a source of risk and issues in its own right, so putting more governance around data management has become a bigger focus. In order to do that, there is now a lot of focus, from an architecture standpoint, on cloud-native principles. I don't necessarily want to say cloud versus on prem, but applying cloud-native principles to data management is the theme of what we call our data stack 3.0 in terms of how we think about the architecture.

NA: Banks have always had data, and it's constantly increasing, as many of you shared as well. So the data volumes have really increased. The question is, why have the banks struggled? What were some of the challenges and constraints they faced in data management until now, in building the kind of data architecture where they could effectively utilise and optimise this data? And how can these be addressed? Devendra, I would like to hear from you on this topic first.

DS: I would say the biggest challenge is human beings, followed by technology, because technology is very easy to catch up on, especially in data architecture. The biggest piece is knowledge. Most banks across the world have a large dependency on partners who have ended up building a lot of the product processors. When you try to bring all of these together – maybe hundreds of product processors in one place – and try to build one model and data glossary, it's very important to capture a lot of this, which requires manual interaction. Legacy people – not only systems – who have had this knowledge may not be around, and a lot of knowledge lies with partners; bringing that together is the biggest challenge. Modelling it is easier than anything else. Once you have the eventual drivers which connect your product processors, getting the lineage in place is the easiest. But getting the glossary in place is not easy, because that requires people's interaction and knowledge. Those are the real challenges, especially touching upon real-time data. I also spoke about one of the objectives today: everybody wants it instant. Now on real-time applications, there are a lot of developments which go back to the product processors. Not all product processors in organisations have inbuilt capabilities. Having inbuilt application programming interfaces (APIs) or streaming engines is very important. We need an International Organization for Standardization (ISO) format output, and it's not easy to get that from every processor, not only for your legacy systems but also for future additions of technology and processors. So it's important to think through, from a data architecture point of view and in the application of production, how these two challenges – people and legacy knowledge, and processor integration – can be addressed. 

NA: David, if you could also share with us how you are addressing some of these key challenges that you face.

Aligning data with business models

DH: It's about how you end up productising the various applications and solutions from the data. There's always a high risk that you spend a lot of time trying to understand the data, get to do something with it, get to use it in things like little innovation sandboxes, but it doesn't result in an actual application. It doesn't get productised. So one of the key elements is putting a lot of focus, and to a certain extent a preamble, into making sure we have a clear understanding of how it's going to be used, how it's going to be operationalised, so it's very clear in terms of the underlying outcome. Again, there will always be errors, there will be issues, there will be certain rough edges. But as long as we know how we're going to use it, how it's going to be integrated from a business perspective, we're on the right track. That's number one. 

Number two is how we do this on a systematic basis. We don't want one-hit wonders; we do it again and again, and again. It's important to have a clear appreciation, an understanding, of the repeatable, replicable stages that are very possible in the world of data, but also of the areas, specifically when you're going into the world of data science, that are a bit more unique, peculiar or specific to the application area at hand. It's really the combination of the two. I call this an AI factory, although I hesitate to use the term AI because this could be in any facet, even ML to a certain degree. There are always ways of doing it at a more systematic and robust level. 

Finally, and this is where we are leaning, is making the most of the underlying environments that are available to us – in this particular case, cloud. One of the biggest challenges is that as you're consuming more data and integrating the various data silos – as we want to build more possibilities, analysis, applications – we're talking about an elasticity of compute. How do we make sure we're not limited by our own environment, by what we have? That's essentially one of the benefits cloud environments provide: you have inbuilt managed services and that elasticity that one essentially needs. You can see what I'm alluding to in how we're addressing it – we are very cloud-friendly, a cloud advocate, very much positioning a lot of the applications in that environment, making sure there's a very systematic approach in dealing with things. And finally, it's really critical to understand how solutions are going to be integrated with the business and operationalised.

NA: At this point, I would like to invite Remus into the discussion. Remus, you've been implementing platforms for various banks, so share with us your perspective on this. What could be an effective strategy towards building a data architecture among banks, specifically in Asia? And how can they improve their competitive capability through this? 

Remus Lim (RL): I totally agree with what David mentioned: data silos are not going away. We had this discussion with our customers 20 years ago, and in today's environment we're still talking about data silos and disparate data. Even 30 years ago, most organisations were trying to find the holy grail, as David mentioned, of building a single version of the truth in a data warehouse. As we all know, that didn't really go well – not because of the technology, but primarily because the environment, the business, is not static. It is dynamic and constantly changing. That era has made most organisations learn that we just have to deal with the constant change and the dynamics of the business. As we all know, IT or technology is there to serve a purpose. It is there to enable business. 

As for what makes a good strategy – and I'll consolidate based on what I heard from the panellists – there are a few key ingredients needed to make it effective. Number one is openness: the ability to allow interoperability across different environments and different data. Scalability is definitely one of the key issues and one of the key ingredients, to make sure that the systems and platforms they build are able to scale up. Last but not least is agility: how quickly can I go to market? You could build a fantastic data model that is able to detect fraud, but it doesn't help if it sits internally without being operationalised, as David mentioned earlier. So it is really important to have a system that is open, interoperable, scalable and agile so that you can react to the market early. If I may put it in layman's terms, it is basically to align that with a business outcome and not so much with the technology. The last thing we want to do is build technology that hinders the business, which we sometimes see when, due to process or due to the ideal technology that most organisations want to build, it hinders the objective or the outcome that the business wants to drive. One more item: in a bank, it's not easy managing all that, because you have to ring-fence it with the regulations, with governance, as well as with the security requirements. So it is a tough job.

NA: Yes, certainly, it's not an easy job. What we've heard from our panellists earlier is about driving towards an intelligence-driven organisation, bringing innovation and customer service. We talked about the implementation of AI, ML and advanced analytics on this integrated set of data to bring actionable insights. Now let us deep dive into some of these areas and how you are prioritising advanced analytics in building your data architecture. How are you prioritising real-time decision-making? How has this improved your customer experience and satisfaction? Share with us some examples of the way it is being done in your bank. Johnson, could you share your perspective on how you are doing that at UOB?

JP: The real-time technology is there. We can make use of a stack of tools and capabilities to build real-time streaming applications with Spark and Kafka, both in open source and at the enterprise level. So we are not short of technology. What is more relevant here is whether we have the use cases and the expectation to leverage these technologies. There are two angles from which we can perceive this: one from the customer lens and the other from the organisational lens. If there is a demand and expectation for that use case, for the application of the real-time capability, there will be a call to action. It's just like how COVID-19 was the accelerator with regard to our use of technology. Previously, there was so much consideration as to whether we were able to function effectively in a remote setting. But given the onset of COVID-19, there was a call to action. 

In a couple of weeks, we could see that everyone was functioning effectively, and in a very comfortable state, in a remote setting. The same goes for the use of technology and the application of AI capability with regard to customer engagement and enhancing business value. So customers' sensitivity and stickiness to the level of engagement will determine the use case for the application of real-time capabilities. At the same time, the organisational perspective, as well as the mindset on how we want to take a more customer-centric and long-term view with regard to the generation of business value, will be equally important when it comes to the take-off of the use and application of machine learning and AI. Some examples off the top of my mind include applications such as location-based customer recommendations and promotions and, from the business value generation angle, campaign optimisation modelling for customer marketing. At the operations management level, if you have seen our branch crowd status advisor, we made use of customer data and footfall data to forecast how crowded our branches are, so that we can manage both customer expectations and safety; at the same time, it serves as a good insight for operations management across our physical branches. Last but not least, real-time insights applications will also be relevant when it comes to market insights delivery.
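
As an aside for readers, the streaming pattern Poh describes can be made concrete with a short sketch using the open-source tools he names, Spark and Kafka. This is a minimal illustration only, not UOB's implementation: the broker address, topic name and event schema below are assumptions.

```python
# Minimal Spark Structured Streaming sketch: consume card transactions from
# Kafka and maintain a sliding-window spend aggregate in near real time.
# Requires the spark-sql-kafka connector on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("txn-stream").getOrCreate()

# Hypothetical schema for a transaction event
schema = (StructType()
          .add("card_id", StringType())
          .add("amount", DoubleType())
          .add("event_time", TimestampType()))

# Read the raw event stream (broker and topic are placeholders)
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "transactions")
       .load())

# Kafka payloads arrive as bytes; parse the JSON into typed columns
txns = (raw.selectExpr("CAST(value AS STRING) AS json")
        .select(from_json(col("json"), schema).alias("t"))
        .select("t.*"))

# Spend per card over a sliding five-minute window, tolerating late events
spend = (txns.withWatermark("event_time", "10 minutes")
         .groupBy(window(col("event_time"), "5 minutes", "1 minute"),
                  col("card_id"))
         .sum("amount"))

# In practice the sink would be a serving store feeding offers or alerts
query = spend.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```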

NA: Balaji, could you also share with us some of the areas where you've prioritised real-time decision-making and how this has improved the customer experience? Share with us the business outcomes you have seen from the initiatives you've undertaken.

BN: One part is real time and one is the overall business impact of analytics. I'll start with the latter. As a bank, we are a lender. The primary business revenue driver is data and analytics that help us cross-sell our lending products, and lending-related products, to a greater number of our customers as well as prospects. Using data and analytics around that is still the biggest one. In India, there is a lot of interest around new public data utilities that are becoming more ubiquitous. There is a lot of focus on the Unified Payments Interface (UPI), the payments platform that we have created, and what it means for the monetisation of data and helping our customers use that. Second, there is the whole account aggregator framework and data portability that is making initial inroads, where data is no longer the bank's; it belongs to the customer. It can move freely between banks at the insistence of customers. So it puts the onus not on the data itself but on how well lenders can use the data to service that customer. It puts the onus on modelling and advanced analytics to extract more value out of the data to service the customer, largely from lending. That's the big area of focus from our overall business standpoint. Now, more specifically, zooming in on real time. 

The use cases for real time are growing. But when it comes to real time, the emphasis is a lot more on customer experience, and there are two angles. One is from a marketing standpoint: when a customer is with us on the mobile app, the journey is digital. How can I, in that instance and based on the right context, offer the right product and improve conversion? That involves making offers available in real time. It's no longer about doing something and then pushing it; that works for the physical sales channel, but when you start selling digitally, real time becomes a lot more important. So the ability to serve things in real time, with sub-second latencies and at scale, becomes a big deal. The other area where real time is coming up is customer experience from a service standpoint. As a bank, there are service challenges that we face, and the question is how we can use data and real time to empower the multiple points of interaction with information about the customer. 

NA: What about the impact of these initiatives you've taken? You mentioned a few in the lending space as well as in real-time customer service. What outcomes in terms of business achievements have you seen from these initiatives? How successfully could you implement these?

BN: One of the metrics that I can share is the percentage of our sales that is database-driven. This is what we track: a database is put out and gets converted to business. In some of the newer unsecured products, the database is pretty much the only seller or salesperson, so 100% of some of our newer products, like buy now pay later, comes from the database. In unsecured products like credit cards and personal loans, upwards of 60% comes from data-driven lending programmes, and in secured products like home loans and auto loans, that number is around the 20% mark. On the deposit side, a significant percentage of our sales of deposit products like time deposits, as well as remittances, comes from data-driven sales. These are some of the numbers where we have seen impact. Some of the initiatives, particularly around personalisation using real-time data and the customer service experience, are somewhat new. 

NA: David, could you also share some of the areas where you are investing in real-time decisioning capability? If you could talk about how you have built that capability and what the impact of real-time capability has been.

DH: One of the key strategies from the get-go has been DSAi – data science and AI-based models on a microservices layer. So whether it is in batch mode or real time, it is natively conducive to that. To give an example of how it's been deployed, especially in the current situation of COVID-19, it's been extremely beneficial in lending for micro, small and medium enterprises (MSMEs), where we leverage data as well as alternative data – which is quite interesting because it moves away from traditional financial metrics in estimating the credit line as well as the underlying risk. So we are availing loans to the MSMEs that need them on a real-time basis, leveraging scoring engines that are alternative in nature, based on data science and AI. That's one example where it has a direct impact on the marketplace from a financial inclusion point of view. 

The reason I gave that example is that while on the retail side there may be obvious, studied differences across the different geographies, interestingly enough, the space of MSMEs and small and medium-sized enterprises (SMEs) is quite consistent across the globe. Then if you look at even higher-throughput types of real-time engines, in terms of transaction monitoring and fraud related to credit cards, the underlying ethos and approach is similar: because we're essentially leveraging data science models as microservices, these can be consumed, using techniques such as Kafka and Flink, on a real-time basis to score in seconds or microseconds and act or interact according to the various business demands. It's definitely demonstrating value, and it goes back to my point about taking that data, building those capabilities on top of it and delivering them. Having that intelligence embedded in a much more robust manner allows for increased safety and security while at the same time highlighting why we're doing it. So it's allowing for that interaction with consumers.
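
For illustration, the microservices scoring pattern Hardoon outlines might look like the sketch below: a service consumes transactions from a Kafka topic, scores each with a pre-trained model and publishes the score for downstream action. The topic names, feature fields and model file are hypothetical, and the kafka-python client here simply stands in for whatever enterprise streaming stack a bank actually runs.

```python
# Hedged sketch of a real-time scoring microservice over Kafka.
import json

import joblib  # assumes a scikit-learn-style model serialised to disk
from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

model = joblib.load("fraud_model.joblib")  # hypothetical pre-trained scorer

consumer = KafkaConsumer(
    "card-transactions",  # placeholder topic of incoming transactions
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for event in consumer:
    txn = event.value
    # Feature order must match training; these fields are illustrative
    features = [[txn["amount"], txn["merchant_risk"], txn["hour_of_day"]]]
    score = float(model.predict_proba(features)[0][1])  # P(fraud)
    # Downstream systems can block, alert or step up authentication
    producer.send("fraud-scores", {"txn_id": txn["txn_id"], "score": score})
```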

NA: Some very valid points in terms of building on a microservices data architecture. At this point, I'm going to invite Joe Rodriguez, senior managing director of financial services at Cloudera, to share his perspective on some of the key challenges that banks face in data integration and insight.

Joe Rodriguez (JR): Financial services organisations face some key data challenges. For several reasons, they tend to have multiple data silos; it's not uncommon to have redundant customer information among the different business units. Data volumes are growing exponentially, driven partly by regulatory compliance as well as by continually expanding customer information. They are ingesting new data from non-traditional sources, both structured and unstructured, in real time or batch, that can be used as alternative data for functions such as credit decisioning, and real-time data that can be further utilised for fraud detection and anti-money laundering (AML). What all this new and increased data provides, together with data analytics and machine learning, are actionable insights that allow banks to improve customer experience, reduce fraud, better manage risk and implement efficiencies. The newer strategy and approach to addressing these historical data challenges is to create a data strategy based on business outcomes first, which will then guide your cloud strategy. In other words, the cloud deployment strategy must support the data strategy.

So different organisations are going to be at different points in their data analytics and AI journey. 

There are different degrees of streaming and batch data, both structured and unstructured, and you need a platform that can handle both with a common governance layer. Near real-time and real-time data sources help make the data more relevant. Normal streaming and batch data come from core banking and lending operations in a pretty much structured format. As financial institutions start to evolve, they start to ingest near real-time streaming data that comes not only from customers but also from news feeds, and they start to capture more behavioural data that they can use to evolve their models and customer experience. Ultimately, they start to ingest more real-time streaming data, not only from standard sources like market and transaction data but also alternative sources such as social media and connected sources such as wearable devices. This gives them more and better data from which to extract intelligence, deliver personalised actions in real time and at the right time, and use machine learning and AI to drive anomaly detection and predict potential outcomes. In a retail banking example, by using all the data available – traditional and non-traditional, structured and unstructured, real time and batch – with data analytics, machine learning and AI, you can provide a better customer experience, reduce churn and produce personalised, targeted marketing.

NA: Devendra, at this point I would like to take your feedback on how Kotak Mahindra Bank is implementing AI and advanced analytics towards improving the actionable insights and business outcomes we've been talking about. Share with us some of the areas where you have seen significant impact from this implementation, and examples of some metrics around that.

DS: With respect to AI and ML, what I find most challenging and intriguing is the fraud detection area. With identity fraud and syndicated fraud on the upsurge as digital transactions have come into play – doubling or trebling post-COVID-19 – you realise that's one area where we have to be extremely agile, and there is no way a manual model will work there. We need a self-learning, machine learning capability. Another area where we need a lot of machine learning is the projection of non-performing assets (NPA). A lot of new developments keep coming in, and call centre volumes fluctuate whenever there is a new regulation or a new direction for customers, so we do see upsurges in call centres. 

Those interesting areas are quite new to us. Then there are the more traditional ones: lending algorithms, instant decisioning for loans and all the recommendation engines. We have close to five or six platforms on which, whenever customers interact with us, we are able to recommend the right product and the right offer to the customer, which requires instant decisioning based on condition, customer traversal and transaction behaviour. So these are the leading examples: fraud detection, lending, instant lending, recommendation engines and instant decisioning for cross-sell. When it comes to impact, there are two ways we measure. One is more centred on the investment in analytics. We run analytics like a business unit, where we see what kind of return on investment (ROI) we have brought for whatever has been invested. We are one of the most efficient in terms of the ROI or the cost of revenue that has been measured, and that also drives some investment in this area. But more importantly, it is what the models contribute at a bank level. There are a few businesses, especially with MSMEs, SMEs and business banking, where in lending you would still like to have touch and feel; while the analytics influences the decision, it's not straight-through processing (STP) or instant decisioning, so it's a marginal contribution. But when you go direct to customer, there are businesses where the contribution is as good as 100%. In fact, there are businesses emerging which are completely 100% cloud and 100% instant, where every single decision is based on data and analytics and the use of machine learning.

NA: Remus, at this point, if you could also share with us some examples of areas where advanced analytics implementation is bringing significant business impact for banks.

RL: Similar to what the panellists mentioned, the typical use cases we are seeing in most organisations are credit scoring and especially fraud detection. We also see models built more for research purposes, for internal learning, so that banks understand customer behaviour and customer lifetime value. Those are what we see most banks trying to do. We're also seeing banks trying to look ahead in terms of non-compliance. 

NA: I'll move to the other area we've talked about, which is cloud and how cloud is enabling data architecture as a technology. Institutions have the option of private, public, hybrid and multi-cloud. What are some of the biggest considerations and concerns in cloud adoption? Given these, what do you believe is the most effective cloud design and strategy to drive real-time capabilities and to optimise operations? David, if you could share your views on this. You mentioned Union Bank shifting a large part towards cloud, so tell us about your experience and your cloud strategy.

DH: In fact, it's not just shifting; the goal and the ambition is to be cloud-only for a large portion, especially by the end of next year. But it's also an element of just immense productivity and efficiency. Whereas previously everything was on prem – you had one engineer to however many servers and machines – now, moving to cloud, you're increasing that ratio by a significant factor. So there's a significant operational efficiency to be gained by doing it, as well as much more robust and resilient management from an equipment perspective: patches, review. It's important to put these strategies in place from a governance point of view, working closely, understanding the potential applications and contingencies. That's how we've been approaching it. 

We've been running various experiments, starting with putting non-critical applications on cloud and learning from the underlying process. An important message here is that you can't learn from just theorising; at some point, you have to stick your foot in the water and realise how hot or cold it is and how much you like it. Then it's taking that one step at a time. We're actually going the full monty, whereby even core banking systems are going to eventually be on cloud. Now this, especially in my specific field, which leverages data, provides immense efficiency. Let's say tomorrow I want to do some computing at whatever scale is necessary. In an on-prem environment, I have to go figure it out; I need to procure; it takes time. The benefit is the elasticity that comes organically from a cloud environment: it allows for as much as I need, or as little as I need, effectively, on top of providing access to a very cutting-edge environment. So the bank has been very focused and very strategic, because there is a very clear-cut digital agenda. And it's not just a digital agenda in terms of being digital. If you look at the Philippines, you're talking about just under 110 million people, of whom 65% – effectively about 65 to 67 million people – are either underserved or unbanked. If you want to serve 10 million or 20 million people, we're not talking in the hundreds of thousands; yes, you can go build it yourself, but it might take a while. So in order to meet those kinds of targets, those kinds of objectives, you need to run at an exponential level. The only pragmatic way of doing it is through a cloud environment. It's literally a case of 'the sun rises in the east'-type scenarios. 

NA: You mentioned efficiency and the fact that you were able to serve so many more customers. Are there any numbers or indicators that you want to share about how your efficiency has improved? 

DH: Essentially, you have an X-fold improvement in terms of individual management of the underlying infrastructure. Every engineer can now have 50 to 250 machines. It's an immediate, exponential growth element. But a great example, in an unfortunate situation, is when the world went into lockdown. Because the strategy was essentially to go cloud first, because the strategy was to build the environment on microservices, because the strategy was to be digital, which is based on cloud – essentially, overnight, the bank was able to keep onboarding customers in the same manner despite significant need. We're talking about the immense, heightened period of the lockdown in the Philippines, during which we onboarded over 300,000 to 400,000 customers. These are the kinds of things that would not be possible if you didn't have those building blocks in place. 

Finally, it's just the pragmatism of the ability to build solutions. It's not that you can't do it – you can. The challenge is time. We're dealing with a world that is far more digital out of necessity: imagine the logistics – ordering food, travel, aviation, social media. It's the same kind of inspiration I'm looking at, where the customer is saying: no, financial services should be designed around my moments of life. The only way to do that at that level of efficacy is from a cloud environment. Again, we're on the journey. We're not there yet. We're on the way, midway. But there's a long way to go.

NA: Balaji, if you could also share with us the cloud strategy at your bank and some of the outcomes that you've seen following this implementation. What type of cloud have you implemented? 

BN: Our cloud strategy has been focused heavily on the application side. The data stack is still at an early stage, but we can say that almost 100% of the newer applications we're building are happening on the cloud. That's something we want to do and are already doing. In terms of the data stack, it's no longer a choice: the data stack also has to move to the cloud. What's driving it? One is that the IT applications are moving there. Second is around infrastructure: getting it set up does take time. But more powerful is the cloud-native architecture. Of course, it's easier to do it on cloud. You could theoretically get enough hardware, put an OpenShift cluster on top and do all of that on prem the same way, but obviously it's easier on a cloud. The crux of it is cloud-native design principles, around both the IT applications and the data stack, which lend agility to the piece we talked about: real time. Organisations have to respond to marketplace changes really fast. Some of the mobile app upgrades you see from technology companies happen once every few days; with banks, typically, the time lag is larger. Closing that gap is largely about adopting cloud-native principles. So we want that, and we are doing it in our IT applications. The data stack is coming very close behind. 

In terms of business outcomes, the focus is still the same: how we bring agility and speed to our business deliverables. It could be how I drive more cross-sell of products to my existing customers, or how we create real-time alerts and interventions. It is the same with personalised customer experience: when the customer comes onto our digital platforms, it is not just a smooth journey but also an intelligent, personalised journey. That happens through data. But for most practical purposes, that can happen only if you are following cloud-native design principles and if you are on cloud. That's what we're striving for in terms of hard business outcomes on the data stack. On cloud, it's still early days – I want to be upfront about that. But the outcomes, when they happen, will be around these vectors. It will take us to the next frontier in terms of how quickly we can do things, at what scale, and at what cost. 

NA: Johnson, a few words from you on the complexities and challenges in a hybrid and multi-cloud environment. So what are some of the challenges that banks face in this environment? And how should these be addressed?  

Cloud cost efficiency

JP: Cloud is relevant for scalability, elasticity, as well as capability, especially in the realm of AI, machine learning and real-time data processing and insights delivery. At the same time, cloud also offers efficiency in terms of cost and processing capacity. When it comes to implementing a cloud strategy in a large corporate organisation with an incumbent stack of technology and platforms, it would be tough to go for a big bang-type approach unless there's a clear call to action. Given that there are so many challenges in terms of shifting mindsets, as well as managing governance processes and stakeholders, it would be more meaningful and more concrete for us – as well as for the industry, depending on which organisation you're from – to take a more progressive, measured approach, such as leveraging a cloud-first strategy in onboarding new projects and new use cases onto a cloud-based platform as much as possible. A progressive as well as measured approach offers us a lot more transitional capability to manage the risks while not stifling progress towards the state of scalability, elasticity and capability that we've spoken so much of. 

NA: Devendra, share with us the cloud strategy at Kotak Mahindra Bank, as well as the challenges you're facing in this entire process and your key considerations.

DS: I'll go back to David, where he spoke about 20 million-plus customers. David, in India, most of the leading banks are in that zone of 20 million to 40 million customers. Having said that, cloud is not the first strategy for us right now, because 99% of the data in Indian banks is still on premise, and migrating that data to cloud for any application is not going to be easy. There are businesses which are cloud first and where everything is cloud native; for those businesses, the entire stack, whether it is the analytical sandbox or the marketing automation platform, is on cloud. The way we have built our cloud strategy is gradual, step by step, starting with needs that cannot be answered efficiently on prem: real time, usage of machine learning, or marketing automation platforms, which from day one have been on cloud. 

I spoke about the cloud-native businesses. In these four areas, we have ensured that our journey is completely cloud-based because of the capability and agility the other speakers also spoke about. We are getting into Ozone, which is what we term multi-cloud, because there are a few applications which are very specific to a certain cloud stack, and we ensure that all of these are locally placed in India – we are not allowed to have servers, even for cloud, outside of India. The cloud dimension is much more complex: we have multi-cloud and we have on prem. So we're talking about a true hybrid, and in that situation, what we are getting into is what we call a data virtualisation platform. We are assessing whether or not we will get to a one-zero game where everything moves to cloud; I don't think that's going to happen soon, and we still operate on Ozone. But all our machine learning models are now on cloud, and all real-time applications are cloud-based. They do contribute significantly to the businesses where we have applied them, so it's a significant impact. 

NA: At this point, I would like to request Joe to share with us some case studies and examples of cloud data in Asia addressing data challenges. 

JR: DBS Bank has a reputation as one of the most innovative banks in Asia and the world. Six years ago, it set out on a quest to become a data-driven tech company, starting off with a $200 million investment in technology to trigger that transformation. Since then, DBS Bank has been continuously innovating. Over the past four years, the Cloudera platform has become essential to DBS Bank's data-first architecture, supporting nearly all its business units, from retail and wholesale banking to investment and compliance. The impact is driving positive outcomes throughout the business, from improving financial performance with analytics, to offering convenient services such as the mobile app, to delivering great customer experiences with AI. This new architecture has also helped DBS save millions in licensing costs, significantly reduced provisioning times and increased productivity.
Let's talk about Bank Mandiri, a very progressive Indonesian bank. As COVID-19 spread to Indonesia, the government quickly restricted citizen mobility and placed a COVID-19 protocol into effect. Bank Mandiri made plans to anticipate and prepare for the impact, quickly developing several dashboards in order to mitigate the risks of the pandemic. With Cloudera at the heart of its platform and using solutions like Cloudera Data Science Workbench, Bank Mandiri implemented analytics capabilities to help deal with the pandemic's effects. In less than 48 hours, the bank built a data mart and three dashboards: one for real-time monitoring of liquidity and daily bank branch transactions, a second to monitor employee health status, and a third for loan restructuring. These dashboards have been essential in adapting to the rapidly changing situation brought about by COVID-19. As one example, customers who need to alter their loan terms due to the changing economic situation now benefit from much faster loan restructuring, with the time to process these requests reduced from five days to one day. A unified life cycle is required to use, manage and secure data effectively. We've learned with our customers that every business use case needs multiple data and analytics functions to process diverse enterprise data. Getting value from data is complex because the data is complex, distributed and rapidly changing, especially with more and more unstructured and alternative data, and more data coming from the internet of things (IoT) and connected devices. 

NA: There are multiple facets of data integration and the challenges that Joe mentioned as well. One of the key areas often talked about is compliance requirements. Compliance requirements have expanded, new regulations are demanding disclosure of more granular data, and data privacy rules have evolved as well. Overall, in this hyper-connected world with growing data volumes, there is emerging complexity in data governance, data quality and meeting regulatory requirements. What implications does this have for banks' data management capability? If I could have a perspective from you, David: what are the key complexities emerging in data compliance and governance in this scenario? And how do institutions need to redefine their data governance and workflows to solve these complexities? 

A need for a mature data governance model 

DH: Data governance is not new. It's been around for decades, and it's something that we very much need to address. But where governance becomes critically important is actually in connecting it to how the data is going to be used. Governance for the sake of governance – this may sound a bit controversial – is entirely meaningless. That's why, when we start using data in reporting, in applications, in data science, we suddenly discover all these data quality issues. They were there all the time; it's just that we never actually did anything with the data. What I'm getting at is that where I believe data governance is, and should be, maturing is in interlinking it with usage within the organisation. That's number one. Number two, it's really critical to highlight that there is a genuine risk. A lot of times, we have an understandable fixation, an obsession, with data and the amount of value it's going to bring the organisation, but we need to realise that if it is done inappropriately, it's a risk – a genuine risk to the organisation. That's another dimension whereby governance is there to make sure those particular aspects are mitigated. 

The final one goes towards regulatory reporting, the more compliance-related data. The way I like to think of it isn't in the form of regulatory reporting and compliance. The reason why – again, just calling out the elephant in the room – is that a lot of times we take a tick-the-box type of approach: what is the minimum necessary to get the regulator off our back? I like to think of this type of data as data that is the heart and soul of the organisation. So it's not regulatory reporting data; it's operational data that is also used for regulatory reporting. It's data that we use for analysis, data that we use to build models, that is also used for compliance purposes. That would naturally dictate assurance that this data isn't just sitting alone, only being reported. If you think about it, it's very much in the spirit of the Basel Committee on Banking Supervision's BCBS 239, in terms of making sure that the quality is in place. And the best way to show quality is to use it. So those are some of the changes you are already seeing. Organisations are putting things in place, and we need to put a lot more emphasis on this, especially as we're getting a lot hungrier for data. It's a lot more apparent; it's flooding our systems. It's critical that we get this fundamental hygiene in place, because imagine, if we can't deal with the data today, how are we going to deal with the data tomorrow? We will suffer. That's just the honest truth. There's always the option of just throwing bodies at it, but I don't think that's a scalable approach.

NA: Johnson, maybe we could hear from you as well on the emerging complexity in data governance, and how institutions need to redefine their data governance workflows around that. 

JP: It is important to be clear about the different phrases that we use in the context of data governance. There are three main areas we often talk about and discuss: data privacy, data security and data governance. While they are similar in certain aspects, they pertain to different areas of the data development context. Data privacy essentially relates to the proper use and handling of data: how data is collected, whether it can be shared with third parties and, ultimately, how it is used. Then there's data security, which is about how we maintain and secure the confidentiality as well as the integrity of data. Last but not least, you have data governance, which relates more to the management of data – the people, the process, as well as the technology that an organisation needs to deploy in order to make sure that the right people get access to the right data. 

Given the increasing demands from compliance, moving ahead there will be more collaboration across these functions within an organisation. There will be a more well-defined overarching strategy for how we want to handle data in a responsible and well-structured manner. The second point we can expect is that, given the increased usage of data and the increased expectation with regard to the responsible and structured use of data, data policy will become more and more entrenched within our corporate culture. Third, with the increased use of artificial intelligence, machine learning and data-driven approaches, these capabilities can be used and applied in the context of data governance itself, especially where we are now, with a much larger volume of data to deal with and a lot more reporting requirements. 

Creating reliable data governance 

JP: So data intelligence, analytics and machine learning can be applied to make the governance process a lot more intelligent. Last but not least, automation. This is closely related to intelligence, with the use of automation as well as robotic process automation (RPA). Principles relating to governance by design will also greatly shape the way we practise data governance in the years to come.
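
The kind of automation Poh describes might look, in miniature, like the following sketch: scanning sample column values for likely personal data so a governance workflow can flag them for human review rather than relying on manual checks. The detection patterns are deliberately simplistic and purely illustrative.

```python
import re

# Illustrative patterns only; a production scanner would be far richer.
PII_PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "phone": re.compile(r"\+?\d[\d\- ]{7,}\d"),
}

def flag_pii_columns(table: dict[str, list[str]]) -> dict[str, set[str]]:
    """Return, per column, the set of PII types detected in its sample values."""
    flags: dict[str, set[str]] = {}
    for column, values in table.items():
        hits = {name for name, pattern in PII_PATTERNS.items()
                if any(pattern.search(v) for v in values)}
        if hits:
            flags[column] = hits
    return flags

sample = {
    "contact": ["jane@example.com", "+65 1234 5678"],
    "amount":  ["100.00", "250.50"],
}
print(flag_pii_columns(sample))  # e.g. {'contact': {'email', 'phone'}}
```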

NA: Remus, I would like to hear from you as well on data governance and the challenges around it, especially when banks have hybrid and multi-cloud environments. What are some of the complexities arising because of that? And how can these be addressed?

RL: I strongly believe it is not a technology question; the technology is available. Data governance has been around for many years. Organisations now have more choice: to put it on cloud, to push it to hybrid cloud, or to keep it on premises. The ultimate question is for organisations to review their policies as well as their processes. Technology can enable that, whether it is automating it or building intelligence around the non-compliance that Johnson mentioned. So technology-wise, it's all available. It's more about the people and the processes within the organisation that need to be built.

NA: Devendra, maybe you could share the perspective from Kotak Mahindra Bank as well. Just a quick overview of how you are managing some of the governance challenges that you are facing.

DS: I would say it's a paradox; data governance is such a complicated area. There are the regulatory requirements: fulfilling those has much to do with DPI, customer information and personal information, protecting it either through masking or through restricted access, and also maintaining the history of access and the state of the data wherever it resides. That is one area that is very critical. Other than that, I do see a balancing act between data democratisation and data bureaucracy.

So governance should not lead you to a situation where it becomes very bureaucratic beyond the regulatory requirements, which we do have to prioritise. Most of the organisations that I know of, even the bigger banks, are still on a journey of data adoption and data democratisation beyond what we call the central excellence unit, which works on reporting, AI and ML. The actual impact comes when layer two and layer three of the organisation start adopting data. So ensure that the data governance policies support that kind of democratisation and usage without hampering the actual transition and transfer of data, but in a very controlled manner. It's a balancing act between how we democratise without being bureaucratic when it comes to governance.
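
As a rough sketch of the controls Sharnagat mentions, masking personal fields for non-privileged users while keeping a history of every access might look like the following. The field names and the in-memory audit log are hypothetical stand-ins for a real entitlement system.

```python
import datetime

AUDIT_LOG: list[dict] = []  # hypothetical stand-in for a durable audit store

def mask(value: str, visible: int = 4) -> str:
    """Show only the last few characters, e.g. of an account number."""
    return "*" * max(len(value) - visible, 0) + value[-visible:]

def read_customer(user: str, record: dict, privileged: bool) -> dict:
    """Return the record, masking personal fields for non-privileged users,
    and record who accessed it either way."""
    AUDIT_LOG.append({
        "user": user,
        "record_id": record["id"],
        "privileged": privileged,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    if privileged:
        return record
    return {**record, "account_no": mask(record["account_no"])}

record = {"id": "c-101", "account_no": "9876543210"}
print(read_customer("analyst1", record, privileged=False))  # masked view
print(AUDIT_LOG[-1]["user"])  # the access history is retained
```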

NA: Balaji, some final thoughts from you as well on this topic?

BN: I see a massive focus on data governance coming of late. Data governance concepts have been around, but the focus and intensity have increased in the last couple of years, coming from three sources. One is the regulators: the data that you are reporting to me as a regulator must be beyond question, beyond doubt. Some of the BCBS 239 requirements that you find in the West are indirectly coming here as well. So the regulators are saying, put these processes in place so that whatever you're telling me is 100% right and won't change. The second big angle is coming from a data privacy standpoint. The General Data Protection Regulation (GDPR) and some of the other regulations, plus some of the technology-company battles around data, have brought this to the fore. Who does data belong to? Data belongs to the customer, not to a technology company, a bank or a fintech. You are a custodian of the data, not the owner of the data, and you have to use it accordingly.

That responsibility is the second dimension driving this, and it is largely coming from governments. India also has GDPR-like legislation that is currently lying in parliament and should become law pretty soon. So that is the privacy angle. The third angle is internal. We have large amounts of data, petabytes-plus. For us to stay on top of that and know exactly what is sitting where, the whole discipline of data governance, data discovery and lineage has become important. It's no longer something that you can manage just by picking up the phone and speaking to each other. These are the three dimensions driving a big area of focus, something that, as an organisation, we are taking very seriously and are investing significant amounts of time and money in.
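
A minimal sketch of the lineage tracking Narayanamurthy alludes to: recording which upstream datasets each derived table was built from, so "what is sitting where" can be answered without picking up the phone. All dataset names are hypothetical.

```python
from collections import defaultdict

# dataset -> set of datasets it was directly derived from
lineage: dict[str, set[str]] = defaultdict(set)

def record_derivation(output: str, inputs: list[str]) -> None:
    """Note that `output` was produced from `inputs`."""
    lineage[output].update(inputs)

def upstream(dataset: str) -> set[str]:
    """Walk the graph to find every source a dataset ultimately depends on."""
    sources: set[str] = set()
    stack = list(lineage.get(dataset, ()))
    while stack:
        parent = stack.pop()
        if parent not in sources:
            sources.add(parent)
            stack.extend(lineage.get(parent, ()))
    return sources

record_derivation("risk_report", ["loan_features"])
record_derivation("loan_features", ["core_banking_extract", "bureau_feed"])
print(sorted(upstream("risk_report")))
# ['bureau_feed', 'core_banking_extract', 'loan_features']
```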

NA: It's been a very interesting discussion today. I've heard a lot about how banks need to look at data strategy holistically, and how they need to devise a data strategy that drives their cloud strategy towards the effective use of data. A common governance layer is needed to manage this data landscape, and banks are adopting varied cloud strategies depending on the stage of the journey they are in. Along with that come the challenges of migration, which also shape banks' cloud strategies: how they look at data architecture and what the right strategy is to meet their own unique requirements. Some of the key areas we discussed were AI, ML and advanced analytics implementation, real-time integration, real-time decisioning, fraud management, and business areas such as lending, where these applications are finding a lot of focus and banks are generating significant business outcomes from these implementations.

There is also a view that banks need a more systemic mindset, a systemic approach, to generate the kind of data outcomes they are looking for. They need to look at it as a multi-year journey: where they want to be, where they are today and how to approach that, with cloud technology becoming a key enabler, bringing elasticity on demand and the flexibility to generate efficiency and scale. I want to thank you all, Balaji, David, Devendra, Johnson and Remus, for sharing your views with us.

Keywords: AI, ML, Advanced Analytics, Real-time Integration, Hadoop, API, Kafka, COVID-19, DSAI, MSMEs, Flink, AML, NPA, ROI, Multi-cloud, Ozone, Data Science Workbench, IoT, RPA, GDPR
