Posted on: March 26, 2021

Steve Byfield of Whistlebrook takes a look at how a technology-based solution could assist in addressing the ‘fourth pillar’ of the asset liability committee (ALCO) guidance.

The asset-liability committee (“ALCO”) is the primary balance sheet risk forum for building societies and banks, and the regulatory authorities recognise that an efficient ALCO process enables the effective management of risk and the optimisation of the balance sheet.

In fact, it would not be an exaggeration to say that the ALCO, more than any other governance forum, is critical to the long-term financial health and strategic decision-making of the organisation. In the era of the Senior Managers Regime, a robust ALCO infrastructure is recognised as an essential ingredient of best-practice corporate governance.

At the pinnacle of this ALCO infrastructure is the committee itself, but even a committee blessed with the finest minds in an organisation will only provide effective insight when supplied with timely, relevant, and accurate information.

Over a decade ago the FSA, as it then was, outlined in a “Dear CEO” letter the guidelines for effective ALCO practice, which were subsequently codified within Supervisory Statement LSS1/13.

Subsequent guidance has built upon the four themes identified within that statement. The four themes are:

  • Defining the committee’s key purpose.
  • The composition and authority of the committee’s members.
  • The degree of challenge observed at the committee.
  • The forward-looking nature of, and decisions made by, the committee.

The consensus seems to be that, over the last decade, the first three themes have evolved and strengthened across financial services organisations. Hence it is the last theme that this article explores further, considering how a technology-based solution could assist in addressing the fourth pillar.

Supervisory Statement LSS1/13 observed in 2013 that the ALCO “is often unduly preoccupied with monitoring and commenting on the past.”

The PRA commented that a good ALCO focuses “more on the effects of the future plans” and “strategy at the entity.”

For many organisations, it is the very processes in place to produce information for the ALCO that result in business considerations being more tactical and historic than forward-looking and strategic.

This will be explored further by considering the following:

  • How is ALCO reporting currently delivered?
  • What are the challenges in orchestrating the data to provide that reporting?
  • What are the risks inherent in spreadsheet reliance for stress testing?
  • Why is data consistency important?
  • How could a technology solution address these challenges?

How is ALCO reporting currently delivered?

The production of the ALCO pack typically involves many skilled individuals spread across different functional areas of a firm. The reporting content is drawn from multiple sources, both internal and external. The collation and orchestration of this data follows an iterative process as information becomes available, KPI and regulatory content is generated, and analysis of the output is undertaken.

The collation of data and the preparation of charts and tables based on multiple data sources is a significant and time-consuming overhead. The enormous effort expended by skilled resources in capturing the data necessarily reduces the time available for the all-important analysis and commentary, both crucial to the completion of the ALCO pack and to deriving business insight.

The ALCO pack is usually available for review by working day 10, but its full completion can be delayed if available resources are impacted by staff sickness, working-from-home restrictions or other business priorities.

In addition to the “boilerplate” reporting of historical trends and current positions, it is vital that firms explore future risk by modelling a variety of stress test scenarios.

It is surprisingly common to find that areas such as cash flow modelling, LCR modelling and liquidity forecasting are carried out in isolation from the firm’s core systems.  Indeed, many firms use spreadsheets to post-process collated data to measure projected risk outside of their core reporting platforms.  The resulting output then needs to be transcribed and transformed into reports for inclusion in the ALCO pack, with the inevitable delays in production and risk of error.
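
To make this concrete, the sketch below shows, in simplified form, the kind of projected-risk calculation that frequently lives in such a spreadsheet: a stylised Liquidity Coverage Ratio derived from high-quality liquid assets and stressed 30-day net cash outflows. The balances, haircuts and run-off rates are illustrative assumptions only, not regulatory parameters.

```python
# Stylised LCR: stock of HQLA divided by net cash outflows over 30 days.
# All balances, haircuts and run-off rates below are illustrative only.

hqla = {
    "level_1": 120.0,   # e.g. central bank reserves, gilts (0% haircut)
    "level_2a": 40.0,   # e.g. covered bonds (15% haircut)
}
haircuts = {"level_1": 0.00, "level_2a": 0.15}

outflows = {
    "retail_deposits": 500.0,   # assumed 5% run-off under stress
    "wholesale_funding": 80.0,  # assumed 40% run-off under stress
}
run_off = {"retail_deposits": 0.05, "wholesale_funding": 0.40}
inflows_30d = 12.0  # contractual inflows expected within 30 days

stock_hqla = sum(bal * (1 - haircuts[k]) for k, bal in hqla.items())
gross_outflows = sum(bal * run_off[k] for k, bal in outflows.items())
# Inflows are capped at 75% of gross outflows, as in the standard LCR design.
net_outflows = gross_outflows - min(inflows_30d, 0.75 * gross_outflows)

lcr = stock_hqla / net_outflows
print(f"LCR = {lcr:.1%}")  # a ratio of at least 100% is required
```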

The external observer can only conclude that either the incumbent asset and liability management (“ALM”) systems will not support automated production of this data, or the functionality within the ALM system is insufficiently flexible to support the needs of the firm.  For an ALM system to fully support the ALCO process, it must support the application of both parallel and non-parallel stresses.  It must also support the triggering of management actions and an assessment of their impact in mitigating the effects of particular stress scenarios.
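
To illustrate the distinction, the sketch below revalues a set of fixed cash flows under a parallel +200bp shift and under a non-parallel steepener, then applies a hypothetical management action (hedging half of the ten-year exposure) and reassesses the impact. The curve, cash flows, shocks and action are all assumptions chosen for illustration.

```python
# Illustrative zero-coupon curve (tenor in years -> annual rate) and cash flows.
base_curve = {1: 0.010, 3: 0.015, 5: 0.020, 10: 0.025}
cash_flows = {1: 100.0, 3: 100.0, 5: 100.0, 10: 100.0}

def pv(curve, cfs):
    """Present value of a set of cash flows under a given zero curve."""
    return sum(cf / (1 + curve[t]) ** t for t, cf in cfs.items())

# Parallel stress: every tenor shifts by the same amount (+200bp).
parallel = {t: r + 0.02 for t, r in base_curve.items()}

# Non-parallel stress: a steepener, with the shock growing with tenor.
steepener = {t: r + 0.002 * t for t, r in base_curve.items()}

base = pv(base_curve, cash_flows)
print(f"+200bp impact:    {pv(parallel, cash_flows) - base:,.2f}")
print(f"Steepener impact: {pv(steepener, cash_flows) - base:,.2f}")

# Hypothetical management action: hedge half of the ten-year exposure,
# then reassess the steepener impact on the reduced position.
hedged = dict(cash_flows)
hedged[10] *= 0.5
print(f"After hedge: {pv(steepener, hedged) - pv(base_curve, hedged):,.2f}")
```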

What are the challenges in orchestrating the data to provide that reporting?

The data required to complete the ALCO pack will be sourced from multiple disparate systems such as the core banking system, treasury management system and financial accounting ledgers amongst other sources.  Data orchestration acts to combine siloed data in these different systems to facilitate access for data analysis, calculations, and modelling.

Orchestrating data across multiple systems is typically problematic for several reasons:

  • Data types and formats differ between source systems.
  • Each system has its own proprietary data structures. If the source data is tracked for audit purposes in the source system, this can add complexity both to these data structures and to the logic required to select the appropriate records to carry forward.
  • Some measures may not be directly available from the source system, requiring transformation and/or amalgamation of data as part of the extraction process (a minimal sketch of this follows the list).
  • The output from extraction processes must be suitably structured to support downstream analysis and to feed into financial risk models.
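
As a minimal sketch of what meeting these challenges involves, the snippet below normalises hypothetical extracts from a core banking system and a treasury system into a common structure using pandas. The column names, units and date formats are invented for illustration; a production pipeline would be driven by configured interfaces to the real source systems.

```python
import pandas as pd

# Hypothetical extract from a core banking system: one row per account.
core = pd.DataFrame({
    "acct_id": ["A1", "A2"],
    "bal_pence": [1_250_000, 980_000],        # balances held in pence
    "open_dt": ["31/01/2021", "15/02/2021"],  # UK date format
})

# Hypothetical extract from a treasury system: different names and units.
treasury = pd.DataFrame({
    "deal_ref": ["T9", "T10"],
    "notional_gbp": [5_000_000.0, 2_500_000.0],  # already in pounds
    "trade_date": ["2021-02-01", "2021-02-10"],  # ISO date format
})

# Map each source onto a common schema: id, balance in GBP, effective date.
core_std = pd.DataFrame({
    "source": "core_banking",
    "id": core["acct_id"],
    "balance_gbp": core["bal_pence"] / 100.0,
    "effective_date": pd.to_datetime(core["open_dt"], dayfirst=True),
})
treasury_std = pd.DataFrame({
    "source": "treasury",
    "id": treasury["deal_ref"],
    "balance_gbp": treasury["notional_gbp"],
    "effective_date": pd.to_datetime(treasury["trade_date"]),
})

# A single, consistently structured data set for downstream analysis.
orchestrated = pd.concat([core_std, treasury_std], ignore_index=True)
print(orchestrated)
```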

Clearly, a comprehensive knowledge of all the source systems is required to meet these challenges head-on.  Changes in the structure of the source data must be monitored, assessed, and acted on to ensure that data orchestration can proceed without issue. This will be an ongoing housekeeping challenge as individual source systems are replaced, upgraded or otherwise evolve.

Without systemic orchestration of these data, the provenance of the data within the ALCO pack is reliant on manual processes and data extraction.  Not only does this increase the risk of unintended errors, and a resultant loss in data integrity, but the processes will be involved and time-consuming.

There are significant advantages to the firm in leveraging a single orchestrated data set for multiple purposes, for example, to drive regulatory reporting output, form the basis for risk modelling and to drive reporting in the ALCO pack.  Consistency in the output is key to confidence in the data, the conclusions that can be drawn from those data and the insight derived.

What are the risks inherent in spreadsheet reliance for stress testing?

The popularity of spreadsheets for stress testing is understandable given the flexibility of, and widespread familiarity with, Excel.  However, it is increasingly recognised that spreadsheet use suffers from governance deficiencies when it comes to documentation, testing and key-person risk.

The complexity of the data links that feed spreadsheets, the definition of stress tests and the extensive functional logic required to apply those stresses accurately all combine to create spreadsheets that are impenetrable to all but a limited number of specialised users.  This complexity is further compounded as firms expand their operations and their product ranges and data volumes grow.

Whilst some errors in formulae produce clearly identifiable effects, others may pass silently and unnoticed.  In addition, it is all too easy to substitute an absolute value into a calculated cell, breaking the data flow through the model; such fixes then propagate through future versions of the spreadsheet without anyone noticing, creating unintended consequences.
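
One simple mitigation is to scan a workbook periodically for cells that should contain formulae but hold hard-coded constants. The sketch below does this with openpyxl; the workbook name, sheet name and the column expected to contain formulae are all hypothetical.

```python
from openpyxl import load_workbook

# Load without evaluating formulae, so formula cells keep their "=..." text.
wb = load_workbook("alco_stress_model.xlsx")  # hypothetical workbook
ws = wb["Projections"]                        # hypothetical sheet name

# Cells in column D are expected to be calculated; flag any hard-coded values.
for row in ws.iter_rows(min_col=4, max_col=4, min_row=2, max_row=ws.max_row):
    for cell in row:
        is_formula = isinstance(cell.value, str) and cell.value.startswith("=")
        if cell.value is not None and not is_formula:
            print(f"Hard-coded value in {cell.coordinate}: {cell.value!r}")
```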

No system is 100% error-free; however, suppliers of financial systems provide products that operate under a regime of tight change control, providing the necessary support to deal with issues and changes as they arise.

The same cannot be said of spreadsheets.  To operate these under the same levels of governance, a firm must produce and maintain comprehensive supporting documentation.  Alterations to the spreadsheets should be subject to manual change control, and any changes should be widely and independently tested.

Even with these procedures in place, there is still an inherent key-person risk associated with these spreadsheets.  The absence of the authors, even temporarily, will often lead to operational challenges unless the knowledge and expertise have been disseminated more widely.

Why is data consistency important?

Some prime examples where data consistency is essential are listed below:

  • There should be consistency in the methods of extraction and post-processing of data between different iterations of the ALCO pack.
  • Measures sourced from multiple source systems should share a consistent methodology in their derivation.
  • The output submitted in a firm’s regulatory returns should be consistent with the output in the ALCO pack where there is duplicated or derived information.
  • Trend analysis relies on consistency in data and approach for meaningful comparisons.

The importance of the third item in the list above cannot be stressed enough.  Leveraging regulatory data for the purposes of ALM and, in the process, ensuring consistency between the two functional areas is key to engendering trust in the ALCO pack.  Questions arising from the ALCO pack should be focussed on what messages the data is conveying rather than on the accuracy of the data.
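
That consistency can be demonstrated on demand with a simple automated reconciliation. The sketch below compares figures shared between a regulatory return extract and the ALCO pack, flagging any divergence; the measure names and values are hypothetical.

```python
# Hypothetical figures from the regulatory return and the ALCO pack.
regulatory_return = {"LCR": 1.42, "NSFR": 1.18, "Total assets": 2_450.0}
alco_pack = {"LCR": 1.42, "NSFR": 1.17, "Total assets": 2_450.0}

TOLERANCE = 1e-6  # figures drawn from a shared data source should match exactly

for measure, reg_value in regulatory_return.items():
    pack_value = alco_pack.get(measure)
    if pack_value is None:
        print(f"{measure}: missing from the ALCO pack")
    elif abs(pack_value - reg_value) > TOLERANCE:
        print(f"{measure}: return={reg_value} vs pack={pack_value} -- investigate")
```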

Issues with data consistency undermine confidence in the output contained in the ALCO pack and consequently devalue any conclusions drawn from the data.  Data consistency needs to be provable if challenged.

Whilst the focus naturally falls on the source data, data consistency needs to extend further to encompass the parameter sets and assumptions used in generating reports, modelling stress scenarios and the availability and application of management actions.

As has been discussed, the ALCO pack is a strategic document underpinning the decision-making process at an executive level.  The implications of basing decisions on poor quality data can be harmful at best and catastrophic at worst.

How could a technology solution address these challenges?

A technology solution offers the opportunity to streamline the ALCO pack generation process by automating those processes that are not reliant on an analyst’s expertise.  Any solution must look to maximise the time available for analysis rather than preparation.
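
As a flavour of what that automation might look like, the sketch below renders a trend chart and its underlying table from an orchestrated data set straight into a PDF extract using pandas and matplotlib. The data set and layout are illustrative assumptions; a real solution would generate the full suite of KPI and regulatory content.

```python
import pandas as pd
import matplotlib.pyplot as plt
from matplotlib.backends.backend_pdf import PdfPages

# Illustrative orchestrated data set: month-end liquidity positions.
data = pd.DataFrame({
    "month": ["2020-10", "2020-11", "2020-12", "2021-01"],
    "lcr": [1.35, 1.38, 1.31, 1.42],
})

with PdfPages("alco_pack_extract.pdf") as pack:
    # Page 1: trend chart, regenerated automatically each reporting cycle.
    fig, ax = plt.subplots()
    ax.plot(data["month"], data["lcr"], marker="o")
    ax.axhline(1.0, linestyle="--", label="100% minimum")
    ax.set_title("Liquidity Coverage Ratio trend")
    ax.legend()
    pack.savefig(fig)
    plt.close(fig)

    # Page 2: the underlying figures as a table.
    fig, ax = plt.subplots()
    ax.axis("off")
    ax.table(cellText=data.values, colLabels=data.columns, loc="center")
    pack.savefig(fig)
    plt.close(fig)
```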

Many ALM solutions are predicated on the availability of a ‘single data source’ where data orchestration is configured in advance and applied consistently throughout the solution’s use.  Inevitably, some firms are deterred from adopting a particular solution because the upfront investment in time to configure this orchestration is perceived to be prohibitive.  This is compounded by the range of source systems actively used in the industry and the need to tailor the orchestration to specific systems. Whistlebrook delivers this “single data source” by way of the ALFI single trusted source of truth® data warehouse.

There are solutions that mitigate this initial cost through the establishment of strategic partnerships with the providers of the different core systems.  In doing so, permanent, standard interfaces can be built into these core systems, providing a ready-to-use interface with the ALM solution.  This can significantly reduce the resource burden in taking up a fully systemic approach to ALM.

The return on this initial investment can also be maximised in certain offerings with a common data source feeding multiple functional areas such as regulatory reporting, asset liability management, hedge accounting and effective interest rate assessments.  This also promotes data consistency across systems.

Despite their popularity, spreadsheets are increasingly considered antithetical to good governance, primarily because of an all-too-frequent lack of guaranteed and robust oversight.  As a serious alternative to established manual methods, a technology solution must address this by providing the necessary capabilities, with sufficient flexibility, for the financial modelling of stress test scenarios.

There is an increasing focus by the regulators on forward-looking analysis; however, it should be recognised that this is complementary to the analysis of historical trends and the current business position.

The analysis of historical trends is essential for insight into how positions are changing over time and will therefore influence the forward-looking analysis and conclusions.  It also provides the means to review the efficacy of management actions that have previously been taken.

As such, a technology solution must provide the platform to perform a complete and rounded analysis of both historical information and the future outlook.

Conclusion

For organisations to have fully effective asset and liability management, the complexities in orchestrating consistent and timely data need to be addressed.

It is only with the insights drawn from a comprehensive forward- and backward-looking assessment of the firm that the demands of regulators, non-executives and other stakeholders will be met.

Reliance on highly skilled staff spending extensive periods of time manipulating and consolidating data will not deliver the business effectiveness that is required.

The only means of liberating top talent to draw business insight is by way of a technology solution.

If you would like to learn more about how Whistlebrook helps organisations meet the data automation challenge for both ALM and ALCO pack production, please contact me at steve.byfield@whistlebrook.co.uk