‘Open’ for Business: New Banking Models Demand Data Strength
While the premise of “Bank as API” has been around for a while, it’s fair to say that adoption of this concept grew considerably in 2018. Banks’ senior tech leaders have noted throughout this year that today’s disruptive tech demands a change in operating models, culture and perspective. Many digitalization initiatives have been geared towards this idea that a bank’s technology stack should be a clearinghouse for new applications—able to rapidly integrate the latest mobile payment channel, or facilitate sandbox-style development across enterprise functions, without much delay or reengineering. The network-effect benefits of openness are now seen as more important—and potentially more cost-effective—than operating as a black box.
This change in posture is actually the next step in a longer trend. After all, the notion of ceding a little autonomy isn’t entirely new to large banking institutions. Firms have long opened the gates to their kingdoms just a little, whether it be for messaging and trading protocol standardization, or post-trade processors and other industry utilities. While often successful, these convergence points represent the exception rather than the rule—and take many years to reach fruition.
Likewise, vendors of different stripes have aspired for many years to serve as a kind of universal API for various trading ecosystems and data functions. Some newcomers have done so explicitly from day one; others have taken the route of gradually building that capacity out, on top of core system offerings that already have a foothold. And there is a good argument that much of the streak of fintech acquisitions in recent years has been driven by this kind of goal (if not also attractive pricing).
Beyond that history, there are hints and clues out there from which Bank-as-API initiatives can surely learn: taking more of a venture capital (VC) perspective on your investments; encouraging incubators and hackathons to source talent; moving as much to the cloud as you can; and always thinking about user interface and experience (UI/UX), particularly for mobile.
The difference, and challenge, really comes down to scale and impact. It’s one thing for a startup to think this way and go out on a limb. But how does a large universal bank steer its way towards accepting open-source blockchain development, or collaborate in building the next Zelle? At the top line, it’s about accepting the risks and calibrating your aggressiveness against the possibility of disintermediation—or even self-cannibalization. Trying to lead out in front of an automated settlement initiative is tough because the front office may not realize the benefits (or really care). Re-engineering your payments infrastructure around a suite of apps is tougher still, because those same folks may envision themselves soon out of a job.
It often goes unsaid, but internal tech teams face the same problem as the wider world of as-a-service and open-source possibilities, attached to disruptive technologies like distributed ledgers, opens up. It’s about information gathering and moving with conviction—persuasion and negotiation as much as revolution.
So then, how and when to move? Even if your bank is chock full of institutional buy-in, there is also a critical practical matter lying just below: is your data management estate in the right shape to facilitate this kind of shift? After all, the world’s greatest API is still useless without the right data flowing through it, the right way. That list of questions grows longer in a time where inputs and partners are increasingly flexible:
- Are the proper controls in place to handle clients’ personally identifiable information (PII)?
- Can your data processors handle changes in format or velocity, matching on-the-fly changes one might see in a flexible API? What about managing the load?
- Conversely, how quickly can your platform spot abnormalities and address them, especially from a new data source?
- And how to do all of this in an environment where prolonged tech transformation and transition comes by design?
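To make the first few questions concrete, an ingest layer for an open API might start with a thin validation and monitoring shim. The sketch below is purely illustrative—the field names (`client_id`, `ssn`, and so on), required keys, and z-score threshold are assumptions, not any particular bank’s schema:

```python
from statistics import mean, stdev

# Hypothetical field conventions for an open-banking ingest layer.
PII_FIELDS = {"ssn", "account_number", "date_of_birth"}  # assumed names
REQUIRED = {"client_id", "amount", "currency"}           # assumed names


def validate(payload: dict) -> list[str]:
    """Return a list of issues found in one incoming record."""
    issues = []
    missing = REQUIRED - payload.keys()
    if missing:
        issues.append(f"missing required fields: {sorted(missing)}")
    exposed = PII_FIELDS & payload.keys()
    if exposed:
        issues.append(f"unmasked PII fields present: {sorted(exposed)}")
    return issues


class AnomalyMonitor:
    """Flags values far from the running mean (simple z-score check)."""

    def __init__(self, threshold: float = 3.0):
        self.history: list[float] = []
        self.threshold = threshold

    def is_abnormal(self, value: float) -> bool:
        flagged = False
        # Only judge once we have a modest baseline from this source.
        if len(self.history) >= 10:
            mu, sigma = mean(self.history), stdev(self.history)
            flagged = sigma > 0 and abs(value - mu) / sigma > self.threshold
        self.history.append(value)
        return flagged
```

A real deployment would sit far more machinery behind each check—schema registries, tokenization of PII, per-source baselines—but the shape is the same: reject what is malformed, mask what is sensitive, and flag what is statistically out of character for a given source.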
Little wonder why banks are eyeing data integrity as they move into a more digital, open era. Without it, realizing the benefits of this new model is difficult: you can’t properly identify opportunities or act on them quickly and effectively; you can’t maintain predictability and stability; and you can’t really keep up with the competition.
Like a Dream
In the end, the most exciting elements of open banking are novel: hitting on the next product idea, perfecting it in the sandbox, spinning it up, and getting it in front of as many customer eyes as quickly as possible. Doing that over and over vastly expands what a banking enterprise can be, and reimagines the technology capabilities right at the center of it all. In some ways, it reads like a dream.
For all of that, a central and age-old requirement—data quality—remains constant. Today it just looms far larger.
Jan's original article can be found on LinkedIn.