New rules are being written, demanding greater reporting from financial institutions to their respective regulators, and more rigorous stress testing for systemic players. To satisfy both these constraints, banks will need to store more data (both real-time and historic) in a consumable and accessible format. In turn, technology strategies must be implemented to deliver these operational requirements.
A previous post provided an overview of the Three Dimensions of Transparency. Below are details on how data best practices can help satisfy compliance requirements.
Data Management Best Practices
First and foremost, there are implications on data management best practices, and the database technologies needed. Canonical data models should be designed as a foundation, with the appropriate data management services built on top: acquisition, validation and normalisation; data quality management; event and exception management; reporting; and distribution and orchestration.
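As a minimal sketch of the acquisition, validation and normalisation services described above, the snippet below maps a raw feed record onto a canonical model. The field names (`CanonicalTrade`, `trade_id`, and so on) are hypothetical, chosen purely for illustration:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical canonical trade record: every upstream feed is mapped
# onto this single shape before any downstream service consumes it.
@dataclass(frozen=True)
class CanonicalTrade:
    trade_id: str
    instrument: str
    notional: float
    currency: str
    executed_at: datetime

def normalise(raw: dict) -> CanonicalTrade:
    """Acquisition -> validation -> normalisation for one raw record."""
    # Validation: reject records missing mandatory fields.
    for field in ("id", "isin", "amount", "ccy", "ts"):
        if field not in raw:
            raise ValueError(f"missing mandatory field: {field}")
    # Normalisation: uniform types, upper-case currency, UTC timestamps.
    return CanonicalTrade(
        trade_id=str(raw["id"]),
        instrument=raw["isin"],
        notional=float(raw["amount"]),
        currency=raw["ccy"].upper(),
        executed_at=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
    )
```

In a real stack this mapping would sit behind the data quality and exception management services, so that records failing validation are routed to an exception queue rather than silently dropped.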
The database platform must be trusted to guarantee the integrity of its data if it is to serve as a single view of the truth. It will need to deliver high performance for both reads and writes, and scale in a cost-effective way.
There will be an uptick in Business Intelligence to empower firm-to-regulator collaboration, which may strain existing databases not designed for this new load. The growing sophistication of Business Intelligence, combined with the demands of regulators, is likely to require the elastic compute that only the Cloud can offer. The right partner, offering the full range of Cloud services, will be fundamental: one that can support banks adapting to compliance both in the short term and over the long haul. Since regulations are not set in stone, a flexible approach to data and services is essential.
“What if” planning, stress testing and Monte Carlo simulations, with their intermittent demands for high bursts of processing, are also an ideal fit for Cloud computing.
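To make the burst-compute point concrete, here is a toy Monte Carlo loss simulation. The model (normally distributed daily returns, a 99% empirical loss quantile) is a deliberately simplified stand-in for real stress-testing workloads, whose parameter names are illustrative only. The point is that each path is independent, so the work fans out naturally across elastic Cloud capacity:

```python
import random

def simulate_portfolio_loss(n_paths: int = 100_000,
                            mu: float = 0.0005,
                            sigma: float = 0.02,
                            horizon_days: int = 10) -> float:
    """Monte Carlo estimate of the 99% loss quantile over a horizon,
    assuming i.i.d. normally distributed daily returns."""
    losses = []
    for _ in range(n_paths):
        # Each path is independent, so paths can be farmed out
        # to as many Cloud workers as are available.
        pnl = sum(random.gauss(mu, sigma) for _ in range(horizon_days))
        losses.append(-pnl)  # a loss is a negative P&L
    losses.sort()
    return losses[int(0.99 * n_paths)]  # empirical 99th percentile
```

Run on demand, a workload like this consumes a large amount of compute for minutes or hours and then nothing, which is exactly the profile pay-per-use Cloud infrastructure suits best.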
On the other hand, Cloud services have regulatory implications of their own, particularly around data privacy. Technology and regulation are out of step here, and firms may have to rely on private clouds and hybrid solutions until regulation catches up. For example, compute-intensive tasks may take advantage of public cloud infrastructure, whilst the data itself, normalised and accessible, remains in a private cloud.
Complex Event Processing
Finally, some scenarios cannot be satisfied efficiently by relational databases. They will need Complex Event Processing (CEP) technologies to be part of the stack, fully integrated with the databases. For example, operations that require summation over streaming data are cumbersome for RDBMSs but easy for CEP engines.
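The streaming-summation example can be sketched as a tiny CEP-style operator. Where an RDBMS would re-scan a table on every query, the operator below updates a running sum incrementally, in constant time per event (the class name and interface are illustrative):

```python
from collections import deque

class SlidingWindowSum:
    """Minimal CEP-style operator: maintains the sum of the last
    `size` events incrementally, O(1) per event, no table scans."""
    def __init__(self, size: int):
        self.size = size
        self.window = deque()
        self.total = 0.0

    def on_event(self, value: float) -> float:
        # Add the new event, evict the oldest once the window is full.
        self.window.append(value)
        self.total += value
        if len(self.window) > self.size:
            self.total -= self.window.popleft()
        return self.total
```

A production CEP engine generalises this pattern to arbitrary windows, joins and pattern matches over event streams, but the incremental-update principle is the same.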
Overall, organisations must be able to act on events in real-time. Rules engines offering the flexibility to adapt to new and yet unknown demands, and proven to scale to extreme data volumes, should be part of every firm’s data strategy.
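As a sketch of the flexibility a rules engine offers, the snippet below lets new condition/action rules be registered at runtime, so a new compliance check needs no redeployment. The engine and the large-trade rule in the usage note are hypothetical examples, not a reference to any particular product:

```python
from typing import Callable, List, Tuple

# A rule pairs a condition (does this event match?) with an action.
Rule = Tuple[Callable[[dict], bool], Callable[[dict], None]]

class RulesEngine:
    """Toy event-driven rules engine: rules are data, added at
    runtime, and evaluated against every incoming event."""
    def __init__(self) -> None:
        self.rules: List[Rule] = []

    def add_rule(self, condition: Callable[[dict], bool],
                 action: Callable[[dict], None]) -> None:
        self.rules.append((condition, action))

    def on_event(self, event: dict) -> None:
        # Fire the action of every rule whose condition matches.
        for condition, action in self.rules:
            if condition(event):
                action(event)
```

For instance, a rule flagging trades above a notional threshold could be added with `engine.add_rule(lambda e: e["notional"] > 1_000_000, alert)`, and changed the moment the threshold in a regulation changes.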
The Other Dimensions of Transparency
These best practices, designed to satisfy compliance requests for information, will of course benefit financial institutions’ own operations. The first post in this series spoke to the opportunity compliance brings. The final post looks concretely at how market dynamics, as well as internal management, are putting pressure on firms to improve transparency – and what technology is actually needed to deliver it.
This post is the second in a three-part series on Transparency: