The interwoven perks and pitfalls in building stress testing models
Stress testing has become an essential activity rather than simply best practice, says Martijn Groot, who adds that banks require comprehensive data management capabilities to reduce the operational risk of unknown data.
Martijn Groot, VP product management, Asset Control
Over the past five years, stress-testing has been coming of age, advancing from an add-on to normal day-to-day risk reporting, to being firmly entrenched as a distinct discipline under the umbrella of risk management.
Institutions are still surprised, however, by quite how much time is required to stay abreast of the regulations and requirements.
While stress testing has become part of the normal risk management cycle, increasingly operationalised and accepted as part of routine within financial institutions, many challenges around its implementation remain.
As organisations have experienced more stringent stress-testing requirements from different regulatory regimes, inherent new challenges have emerged across the board. Being able to fulfil these new requirements necessitates both company-wide transparency across product silos and high quality data – each of which many organisations lack.
For financial institutions, a lack of centralised data infrastructure creates significant foundational issues. Organisations can struggle particularly due to substantial pre-existing data gaps and missing historical data. For regulations such as BCBS 239, data quality and infrastructure are paramount; finding and correcting errors may therefore be an onerous task, but it is essential.
As the regulatory climate becomes more rigorous, it is fundamental to have data transparency and ensure rich input. Banks are often forced to spend large amounts of time simply avoiding and correcting errors in Excel, such as automatic rounding, because data quality is a fundamental precursor to creating a central data warehouse – something organisations have no choice but to prioritise.
Specific challenges that institutions face include: overcoming granularity differences and scenario augmentation in moving from macro to micro variables, mapping specific risk factors to regulator-imposed shocks, managing different sets of scenarios, and integrating market data sources with regulatory and internal scenarios.
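The mapping step above can be sketched in a few lines. This is an illustrative sketch only: every factor code, macro variable and shock size below is a hypothetical assumption, not part of any actual regulatory scenario.

```python
# Hypothetical mapping from internal risk-factor codes to the
# macro variable through which each one is shocked.
FACTOR_TO_MACRO = {
    "EQ_FTSE100": "uk_equity_index",
    "IR_GBP_5Y": "uk_rates_level",
    "FX_GBPUSD": "gbp_exchange_rate",
}

# A hypothetical adverse scenario: one shock per macro variable.
ADVERSE_SCENARIO = {
    "uk_equity_index": -0.30,    # 30% equity fall (relative)
    "uk_rates_level": 0.02,      # +200bp parallel shift (absolute)
    "gbp_exchange_rate": -0.15,  # 15% depreciation (relative)
}

def shock_factor(factor_code: str, base_value: float) -> float:
    """Apply the scenario shock mapped to a given internal risk factor."""
    macro = FACTOR_TO_MACRO[factor_code]
    shock = ADVERSE_SCENARIO[macro]
    if factor_code.startswith("IR_"):
        return base_value + shock      # interest rates: absolute shift
    return base_value * (1.0 + shock)  # prices and indices: relative shock
```

In practice the granularity problem is exactly that this mapping is many-to-one and incomplete: thousands of internal factors must be tied back to a handful of prescribed macro shocks, and the gaps have to be augmented with internal assumptions.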
In alignment with stress testing, data governance has come to the forefront for a lot of banks in the last five to seven years. With the growing importance of central reporting systems, data governance has become a process that needs to be thoroughly vetted, rather than a reactive function.
Institutionalising a new data strategy is a complex task that can be approached in a variety of ways. The 2015 FIMA benchmark survey reveals that 48 percent of organisations take a top-down approach and 27 percent have CDOs drive the global data strategy. Whichever way an organisation chooses to approach this challenge, it is clear that having a team that understands the business – and the variables it needs to account for – enables the best implementation model.
When it comes to reporting output data, the quality of output relies on the quality of the data going through the end-to-end processes. Institutions struggling with this can benefit greatly from implementing powerful quality checks and defining a common, unambiguous data dictionary.
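One minimal sketch of such a quality check, assuming a hypothetical data dictionary with field names and rules invented for illustration:

```python
# Illustrative sketch only: a tiny data dictionary with per-field
# quality rules. Field names, types and allowed values are assumptions.
DATA_DICTIONARY = {
    "instrument_id": {"type": str, "required": True},
    "close_price":   {"type": float, "required": True, "min": 0.0},
    "currency":      {"type": str, "required": True,
                      "allowed": {"GBP", "USD", "EUR"}},
}

def validate_record(record: dict) -> list:
    """Return a list of human-readable quality issues (empty if clean)."""
    issues = []
    for field, rule in DATA_DICTIONARY.items():
        value = record.get(field)
        if value is None:
            if rule.get("required"):
                issues.append(f"{field}: missing required value")
            continue
        if not isinstance(value, rule["type"]):
            issues.append(f"{field}: expected {rule['type'].__name__}")
            continue
        if "min" in rule and value < rule["min"]:
            issues.append(f"{field}: below minimum {rule['min']}")
        if "allowed" in rule and value not in rule["allowed"]:
            issues.append(f"{field}: unexpected value {value!r}")
    return issues
```

The point of the shared dictionary is that the same unambiguous rules run at every stage of the end-to-end process, so a bad record is flagged at ingestion rather than discovered in the reporting output.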
Effective data management is both a rich source of rapidly accessible information and a necessary precondition for stress testing. A firm's success depends on its ability to access and integrate the required data quickly and easily, with confidence that it is correct, current and complete. On top of this, stress testing features in new global regulation, including the Fundamental Review of the Trading Book (FRTB), which places additional demands on price histories and on the determination of 'modellable' risk factors.
In order to operate efficiently, banks require comprehensive data management capabilities to reduce the operational risk of unknown data. These track data quality and expose lineage and audit trails in an explicit, transparent and readily searchable form throughout the stress testing life cycle.
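A minimal sketch of what exposing lineage and an audit trail can mean in practice, assuming a hypothetical structure in which every value carries its source and a record of each change:

```python
# Illustrative sketch only: a value that records where it came from and
# every change made to it. The class and field names are hypothetical.
import datetime

class AuditedValue:
    """A data point carrying its source plus a searchable audit trail."""

    def __init__(self, value, source):
        self.value = value
        self.source = source  # lineage: the upstream feed or system
        self.audit_trail = [(self._now(), "created", source)]

    @staticmethod
    def _now():
        return datetime.datetime.now(datetime.timezone.utc).isoformat()

    def update(self, new_value, reason, user):
        """Change the value, logging who changed it and why."""
        self.audit_trail.append(
            (self._now(),
             f"changed {self.value!r} -> {new_value!r}: {reason}",
             user)
        )
        self.value = new_value
```

Because every correction is logged with its reason and author, the trail can be queried throughout the stress-testing life cycle, which is what turns lineage from an after-the-fact reconstruction exercise into something explicit and transparent.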
As stress-testing is now solidly established as an essential, rather than a best practice, it is paramount that financial institutions take steps to prioritise data quality to ensure they benefit from building stress-testing models. A centralised common market data and scenario data platform can be adopted in order to both keep pace with regulation and be prepared for more frequent and detailed stress testing – ensuring the most useful insights and results.
Organisations must make a commitment to culturally embrace stress-testing across all branches and levels. Managing this cultural change means that stress-testing can be accommodated throughout the organisation, ensuring that best practices become the norm, not the exception.
Contributed by Martijn Groot, VP product management, Asset Control