Did the Compliance ‘Golden Age’ Miss the Mark? Survey Says...
Like the years before it, 2019 is sure to have its share of fintech industry research. Having recently commissioned surveys of our own at Gresham, we find that research like this works best when it serves as a conversation starter: showing the industry something it didn’t already see, or disproving conventional wisdom. Often that interest is reflected in a survey’s design, with the specificity and subtlety of the questions telling the tale. Other times, it reveals itself by omission - by what is glaringly missing. Sometimes a bit of both is at play.
That was the case earlier this year when McKinsey & Company put out its 2019 benchmark survey of banking compliance functions, The Compliance Function at an Inflection Point. The consultancy rightly painted this as an interesting moment for compliance. After a decade of post-crisis years spent near the top of the institutional agenda, spend in this area is flattening out or even receding, so it’s a fair time to assess the progress that was made (or not) during compliance’s era in the spotlight.
The results were mixed, some quite predictably so. The study reported unevenness in both investment and functional maturity, as well as wide organizational diversity in banks’ compliance structures. Operations that are larger, more complex or vulnerable to abuses like rogue trading, and more immediately subject to regulation have built out bigger compliance programs. Interestingly, though, respondents didn’t perceive any correlation between the two variables - spend and effectiveness. The same sentiment played out in a separate subset of questions on technology.
In short, building bigger isn’t always seen as building better. And in the breakdown, the direction of respondents’ complaints piqued our interest.
Show Me the Data!?
Two details stand out. First is the anecdotal sentiment that compliance technology is being bolted on the wrong way, reducing its effectiveness and causing headaches for tech teams, who mostly attend to user issues rather than move on to the next thing. Second, and perhaps more mind-boggling, is that almost 80 percent of compliance cost is still - emphasis on still - tied up in personnel, compared to less than 10 percent in technology. If that split was more extreme in 2009, it probably wasn’t by much. In other words, despite all the technological advancement of the past few years, the default response to a compliance issue remains the same: bulk up, throw more humans at it, worry about the cost later.
What explains this? The theory here goes back to the top - and to an interest in what’s missing from the numbers, rather than what’s there. In its conclusion, McKinsey points out a healthy mix of tech trends bubbling up and things firms should be doing to rationalize personnel and level up. But reread the results with a quick “Control+F” and you’ll find that “data” is mentioned only four times. Similar markers, like “controls”, aren’t much more numerous - and when asked about directly, controls ranked among the lowest areas of functional maturity for G-SIBs and non-G-SIBs alike (see Exhibit 3).
Missing the Mark
Even taken as casual observation, that is a massive story on its own - and it is less a criticism of the survey than a measure of where we are. Firms simply aren’t looking at compliance as a data-first function, and to that extent, post-crisis strategies have clearly missed the mark.
Sure, there are legitimate reasons why. Compliance needs to be a lot of things to a lot of stakeholders, and that was as true in the past as it is today. It has always held a slightly floating, fuzzy role, touching several other parts of a large banking enterprise. No two compliance structures are the same, and human judgment matters quite a bit. That makes automation-based operating models harder to achieve. Likewise, the variety of pulls on transaction data - know-your-customer (KYC) checks, sanctions monitoring, suitability rules, trade compliance, and the ever-larger array of prudential reporting requirements - makes it difficult to streamline or reengineer the related data infrastructure on the fly. And it’s fair to say that data dexterity as we know it today is still a new phenomenon (remember that iPhones barely existed when the crisis hit).
No Time for Excuses
Still, these excuses aren’t good enough to throw up our hands and hold compliance back, particularly when enterprise data management (EDM) principles and innovation have arrived in many other parts of the bank, and when regulations governing data itself are coming into force and widening in scope.
Indeed, a data-first approach needn’t be mutually exclusive with addressing these challenges. Human judgment in compliance matters ought to be informed by accurate, available data. We’ve already seen developments like smart contracts automate multiple due-diligence checks alongside other middle-office tasks such as margin distribution. Meanwhile, reconciliation and regulatory reporting are powered by engines with far better rote performance than even a few years ago, cutting both time and the cost of exceptions management.
All of this relies on investment in data integrity and improved infrastructure further up the stack. Even if these efforts fall short of covering every last task, the McKinsey survey results highlight the need to change institutional disposition. Any discussion of banking compliance in 2019 should find data front and center, rather than buried. Bigger, as the study showed, isn’t better. Smarter is better.
Ian's original article was published on Finextra.