A powerful reconciliation and control platform is useless if it can't handle any data in any format, from anywhere across your organisation. So how do you ensure that it can?
Our three stages of data control podcast series sees Neil Vernon, Gresham's CTO, in conversation with Gert Raeves of Adox Research. Neil talks us through the three stages of data control: from the static and fixed reconciliation engines of old, through under-delivering offshored centres of excellence, to current state-of-the-art, flexible, end user-focused reconciliation solutions which support the highly automated reconciliation processes that a modern business needs.
Listen to 'The Three Stages of Data Control - Part 4' below
Data in financial institutions is increasing in volume and complexity all the time, which means that data quality and control challenges and reconciliation processes are too. How can firms identify what can be simplified versus what is inherently complex? Tech firms are relentlessly pursuing simplification where they can, particularly where outdated legacy technology is making life unnecessarily hard - but the quest for simple has its limits. Many business and tech processes are unavoidably complex - so how can a firm like Gresham help mitigate the operational and financial risk of complexity? That complexity is not just technological - it is also organisational, cultural, and driven by the unique nature of the financial services industry.
Neil discusses the root causes of complexity, their impact on firms' ability to change and be agile, and what tech firms can do to help.
“Replacing legacy means you need to be brave, bold. Don't copy something from a legacy system if you don't know what it is for - switch it off! If you have an agile platform you can quickly course-correct and implement just the controls you need.”
Catch up with Part 1: How to build a platform that can deal with any type of data
Catch up with Part 2: Debunking myths around silver bullet technologies
Catch up with Part 3: Collaborate to innovate