Review of the analytical framework supporting financial policy at the Bank of England

Our Financial Stability Papers are designed to develop new insights into risk management, to promote risk reduction policies, to improve financial crisis management planning or to report on aspects of our systemic financial stability work.
Published on 19 August 2025

Financial Stability Paper No. 52

Oliver Bush, Anne-Caroline Hüser, Pippa Lowe, Rhiannon Sowerbutts and Matthew Waldronfootnote [1] footnote [2]

Executive summary

The UK financial system plays a critical role in the economic activity of households and businesses via its provision of services to them. These include financing of production and consumption, deposit and liquidity facilities and saving, investment, insurance and payment services. In turn, these services rely on a complex supply chain including technological, data, legal and other financial services.

The importance of a stable financial system follows from the importance of financial services for economic activity and hence economic welfare. Instability that undermines the reliable supply of such services can impose significant costs on the real economy by hampering households’ and firms’ ability to consume, produce and invest. Conversely, a stable financial system supports economic welfare by providing essential services to households and businesses – both in good times and bad.

The Financial Policy Committee (FPC) of the Bank of England (the Bank) is tasked with ensuring financial stability in the UK. It has been a statutory body for over 10 years, having been established as part of sweeping reforms in the aftermath of the Global Financial Crisis (GFC). The analytical framework supporting the FPC’s policymaking has served it well in that time.

Since the FPC’s inception, the nature of threats to the stability of the UK financial system has evolved, as have the analytical tools available to help tackle those threats. The FPC, and the analytical framework supporting its policymaking, has adapted accordingly, and is at the forefront of efforts to identify and tackle risks across the system as a whole.

After more than a decade of experience, it is timely to take stock of that analytical framework, and to ready it for the next decade. With this rationale in mind, this paper proposes some enhancements to that framework, which will inform Bank staff’s investment work over the next few years. The intention is that these proposals will also serve as a starting point for wider dialogue, forming the basis for further engagement and collaboration with researchers and practitioners outside the Bank.

The key enhancements proposed by this paper are summarised as follows. They are grouped according to three elements that are common to the analytical frameworks used by many financial policymakers. That is: i) policy objectives and instruments, ii) risk assessment and policy evaluation, and iii) data (and data analytics).

The objectives (and instruments) of financial policy

The financial policy landscape is inherently complex. It is characterised by a wide, overlapping set of policy instruments and necessarily broad objectives that are open to interpretation. And financial system activities, risks and frictions (ie impediments) in the efficient provision of services to the real economy are constantly evolving. These complexities underline the importance of two areas of further work.

  1. Building and communicating a more specific articulation of the FPC’s objectives. Key to this is describing more precisely what (not) achieving them looks like in terms of the provision of ‘vital’ financial services to the real economy. This should be supplemented by more quantitative work and research on the relationship between financial stability and economic welfare.
  2. Enhancing the FPC’s system-wide policy strategy. This is important for ensuring that the FPC deploys its tools as effectively as possible, enabling it to focus on what is most important to system-wide stability and economic welfare, and to understand the system-wide implications of its policy response. Importantly, this includes ensuring that risk is not simply transferred to other parts of the system that are less equipped to manage it. An enhanced strategy would include a more detailed understanding of the circumstances under which different interventions are most effective and of the interaction between tools and their impact on the FPC’s objectives and ultimately, the implications for economic welfare.

Risk assessment and policy evaluation

The Bank’s current approach to financial stability risk assessment – like that of many other financial policymakers – focuses on identifying features of the financial system that might make it prone to instability (‘vulnerabilities’) and propagate shocks to the real economy. Given the complexity and changing nature of the financial system, identifying all vulnerabilities and propagation channels is a challenging task. The FPC has approached this from an increasingly system-wide perspective, enabling it to quantify risks and their impacts more systematically over time.

Nonetheless, there are naturally opportunities to improve the framework. Expanding and strengthening system-wide scenario analysis is a key plank of ongoing efforts here. The absence of a holistic model that fully captures propagation of shocks through the financial system to the real economy means that, in practice, the approach to constructing scenario analysis (and any other risk assessment technique) will need to ‘join the dots’ by drawing on multiple inputs. Inputs include: i) models of parts of the system or behaviours, where these exist or can be efficiently developed and ii) other quantitative/qualitative evidence and assumptions about vulnerabilities and behaviours – for example, firm information on their responses to specified scenarios, behavioural rules of thumb, and data on household and firm exposures. Expanding this approach includes a range of measures to:

  1. Establish a coherent approach to ‘mapping’ the financial system, to systematically identify both new and existing vulnerabilities, and trace how shocks propagate to the real economy (and so how they might affect welfare). The resulting maps would be built and added to over time using a range of information. For example, through intelligence, data gathering, and system-wide exploratory scenario (SWES) exercises.
  2. Produce a system-wide dashboard of indicators for the vulnerabilities and propagation channels identified in the map and collate estimates and data that help size them.
  3. Expand modelling capability. First, continue to invest in models that help assess propagation of shocks by, from, and to banks – and other entities such as insurers and investment funds – in macroeconomic and financial market stress scenarios. Second, build a suite of system-wide models to analyse the impacts of financial market stresses on the financial system more broadly. This could reduce the need to rely on industry through SWES-type exercises to model specific scenarios, though engagement with firms will help corroborate modelled results and behavioural assumptions as the industry continues to innovate. Third, further develop modelling approaches to capture feedback mechanisms between the real economy and financial system not captured by other proposals here.
  4. Develop a modular approach to scenario analysis. This would build on mapping, indicators and models to provide an end-to-end approach to scenario analysis – from the scenario through the system to real economy impacts. Depending on the risk being assessed, different inputs would be needed to simulate different parts of the propagation chain, including those supplied by the mapping and indicators. The aim would be to standardise inputs as far as possible and – while recognising the caveats and qualifications – use a modular approach to combine them, to more fully simulate the propagation of shocks to the real economy. Scenarios for different parts of the system could also be connected using system-wide models or by joining the dots between different existing stress-testing exercises – banking, insurance, central counterparties, and SWES scenarios.
  5. Invest further in systemic risk (and less conventional) indicators, to act as ‘top-down’ cross-checks on ‘bottom-up’ analysis of the sort proposed above.
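The modular approach in point 4 above can be sketched in code. Everything below is a purely hypothetical illustration: the module names, interfaces and elasticities are assumptions of ours, not actual Bank models, tooling or calibrations.

```python
# Hypothetical sketch of a modular scenario pipeline. Module names,
# interfaces and numbers are illustrative assumptions, not actual
# Bank of England models or calibrations.

def market_shock(scenario):
    """Map a scenario narrative to an initial market repricing."""
    return {"gilt_yield_rise_bp": scenario["rate_shock_bp"]}

def fund_behaviour(market_state):
    """Behavioural rule of thumb: forced sales scale with the yield move."""
    return {"forced_sales_gbp_bn": 0.1 * market_state["gilt_yield_rise_bp"]}

def real_economy_impact(fund_state):
    """Stylised pass-through from fire sales to credit conditions."""
    return {"credit_spread_rise_bp": 2.0 * fund_state["forced_sales_gbp_bn"]}

def run_scenario(scenario, modules):
    """Push a scenario through each leg of the propagation chain in turn."""
    state = scenario
    for module in modules:
        state = module(state)
    return state

chain = [market_shock, fund_behaviour, real_economy_impact]
print(run_scenario({"rate_shock_bp": 100}, chain))  # {'credit_spread_rise_bp': 20.0}
```

The value of such a structure is that any leg can be swapped out – a behavioural rule of thumb replaced by a fitted model, or by firm-supplied responses from a SWES-type exercise – without rebuilding the rest of the chain.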

By identifying how the financial system propagates adverse shocks, this scenario analysis approach can help evaluate policy interventions in stresses. But it is less useful for evaluating the costs of such policies outside of stresses. This matters because the FPC must not exercise its functions in a way that would be expected to damage the medium or long-term growth of the UK economy: so, the costs of delivering financial stability are a critical consideration in its policymaking. The FPC’s focus also extends beyond preventing instability, to removing frictions that can hinder the financial system’s support of the real economy in normal times. This paper therefore proposes the following work on policy analysis:

  1. Continue to develop capabilities in modelling agent heterogeneity to support analysis of a range of policy interventions.
  2. Develop modelling (jointly with staff supporting the Bank’s Monetary Policy Committee) that analyses the interaction between the financial system and real economy (and so between financial stability and monetary policy).
  3. Explore development of structural modelling of the effects of policies that enhance resilience in core markets.

Data (and data analytics)

Data are essential for risk and policy assessment. Bank staff’s data and analytics strategy focuses on closing key data gaps, improving data access and usability, and extracting more information from existing data. Key priorities include:

  1. Increased use of automated dashboards and tools, using AI where appropriate. This can help improve the efficiency and effectiveness of risk monitoring.
  2. Identifying and addressing key data gaps beyond those already identified by the Bank and FPC.

Implications of this paper’s proposals

This paper’s proposals should enhance policymaking by enabling more precise assessment of how the FPC’s objectives can be undermined, and more effective evaluation of policy measures to address this. In turn this will support the FPC in being selective and deliberate in its focus. The aim is to provide the FPC with analysis that:

  • Takes an (even) more system-wide approach to risk assessment, with an emphasis on scenario analysis and the interaction between the financial system and the real economy. This includes continuing to i) build expertise in less well-charted areas like market-based finance, to tackle emerging risks and those outside the banking sector and ii) deepen understanding of the economic impact of any vulnerabilities that threaten financial stability and of inefficiencies that hamper growth. This can help develop a more specific description of what success and failure to meet the FPC’s objectives look like. A better understanding of the interaction between the financial system and the real economy can further enhance analysis of the impact of policy interventions on objectives for both stability and growth.
  • Identifies risks to the FPC’s objectives more systematically and comprehensively and enables a clearer comparison of those risks and policies to address them. This can be achieved in part by applying a consistent map of vulnerabilities and propagation channels to different shocks and harnessing advances in data analytics and modelling to quantify risks, alongside complementary approaches. A consistent approach matters because comparability of risks is key for policy strategy, including how and when to use ex-ante tools (like prudential buffers) versus ex-post interventions (like market operations) optimally.
  • Integrates a more systematic analysis of frictions underlying financial system vulnerabilities and inefficiencies. This can sharpen analysis of different policy interventions and may increase the number of viable policy options by revealing multiple points for intervention.

1: Introduction

The Financial Policy Committee (FPC) was established as a statutory body in 2013, with a primary objective to protect and enhance the stability of the UK financial system. The FPC was created as part of sweeping reforms to financial services regulation following the Global Financial Crisis (GFC).

Reflecting these origins, the early focus of the FPC was on the stability of the banking sector. Since then, the nature of threats to stability has evolved significantly. UK banks are now demonstrably more resilient. They were able to continue supporting the real economy through several recent stresses, such as the collapse of several mid-sized US banks, the failure of Credit Suisse, the pandemic and the sharp increase in interest rates that followed.

But as threats to stability from other sources have grown, the FPC’s focus has shifted accordingly. Notably, the role of non-banks, including market-based finance (MBF), has grown substantially. MBF has also been at the heart of several recent episodes of stress, including the global dash-for-cash in 2020 and the UK liability-driven investment (LDI) crisis in 2022. The financial system has also become more interconnected, including globally, and digitised – which has made operational resilience more important while, arguably, increasing risks to it at the same time.

The analytical framework supporting the FPC has adapted to these developments and is at the forefront of efforts to identify and tackle risks across the system.

The tools available to support financial policy analysis have evolved too. There has been material progress in data analytics, alongside advances in the academic literature on risk assessment and policy evaluation.

Finally, there is increasing focus on the financial sector’s role in supporting economic growth.

Over a decade of experience makes now a good time to take stock of the analytical framework that supports the FPC’s policymaking, and to ready it for the next decade.

This paper is structured around three key elements common to such analytical frameworks, stylised in Figure 1 (in white).

Figure 1: Elements of the financial policy analytical framework

Figure 1 presents the principal components of the analytical framework supporting financial policymaking. The figure illustrates how the three key components of the analytical framework interact. It is structured in three horizontal layers, each representing one of these components.
The top layer of the figure comprises two adjoining boxes: one for policy objectives and the other for policy instruments. The first of these two boxes shows that the FPC’s primary objective is financial stability, and its secondary objective is to support the economic policy of the government. The second box shows the policy instruments as FPC powers and tools. An arrow runs from the instruments box to the objectives box, signifying that the use of instruments is intended to achieve these objectives. Importantly, this arrow is double-headed. This is because not only do instruments achieve objectives, but objectives also guide both the selection and ongoing evaluation of instruments.
Moving to the middle layer, the transmission framework is split into two distinct but related boxes. The first box is risk assessment and assessment of inefficiencies. An arrow extends from it to the objectives box above it, demonstrating that identifying risks and inefficiencies is crucial for identifying ways in which the FPC’s objectives can be undermined. From this same box, another arrow points to the second transmission framework box, which is labelled policy evaluation. This connecting arrow illustrates that assessing risks and inefficiencies also informs the process of policy evaluation and selection. The policy evaluation box is also linked back to the policy instruments box in the top layer of the figure by a double-headed arrow. This illustrates the two-way relationship between policy evaluation and instrument use.
The bottom layer of the figure shows the third key component of the analytical framework, which is data and data analytics. This is symbolised by a single box. From here, two arrows extend upward to the transmission framework in the middle layer of the figure. One of these arrows goes to the box showing risk assessment and assessment of inefficiencies and the other to the box showing policy evaluation. This illustrates that data and analytics feed into both risk assessment and policy evaluation processes, including by enabling ongoing monitoring, identification, and analysis of risks and inefficiencies.
The overall flow of arrows and box placement in the diagram emphasises a cyclical and interconnected process, with information and evaluation continually circulating to support effective and adaptive financial policymaking.

The first is policy objectives and instruments; the aims of the policymaker and the tools available to it in pursuit of those aims. Second, a transmission framework that describes how the variable(s) targeted by policy can be affected by shocks and policy. In practice, separate frameworks are often needed for risk assessment and policy evaluation. Third, data (and data analytics) are used to describe the state of the system, help calibrate models and inform modelling.

The FPC’s primary policy objective is financial stability. Its responsibilities in this regard are formally described as relating ‘primarily to the identification of, monitoring of, and taking of action to remove or reduce, systemic risks with a view to protecting and enhancing the resilience of the UK financial system’.footnote [3] While particular systemic risks are emphasised – relating to markets, risk distribution and unsustainable levels of leverage, debt or credit growth – the FPC’s primary objective is not further defined.footnote [4]

The FPC is prevented from exercising its functions in a way that would be expected to damage medium or long-term growth of the UK economy. This can be thought of as a constraint on what the FPC should do in pursuit of its primary objective.

Further, the FPC also has a secondary policy objective to support the Government’s economic policy, including for growth and employment.

Reflecting the myriad ways in which financial stability can be appraised and undermined, the FPC’s objectives leave broad room for interpretation. This could make it harder for the FPC to gauge the scale of threats to its objectives and its performance against them.

To fulfil its functions, the FPC has a range of policy instruments available to it. These include legally-binding powers of direction to the PRA or FCA over certain macroprudential measures,footnote [5] and non-binding powers of recommendation to a broad set of UK bodies. The FPC also sets the UK’s countercyclical capital buffer (CCyB) applying to banks. Other instruments it can call upon include Bank of England (Bank) balance sheet tools, like market operations, and its own communications.

The Bank’s overall approach to risk assessment and policy evaluation draws on a comprehensive risk assessment framework. Like many other financial stability policy institutions, it has in practice focused on identifying financial system ‘vulnerabilities’ and the ‘resilience’ of key sectors and markets to shocks. But further developing understanding of the real economy effects of financial system propagation of shocks is a growing focus, as is expanding system-wide analysis and identification of the frictions underlying vulnerabilities (and inefficiencies). These developments are particularly relevant in the context of non-bank vulnerabilities for several reasons. These include that the frictions underlying them – and their impacts on the real economy – are typically less well understood than for the banking sector. The non-bank sector often also plays more of an indirect or intermediating role in supporting the real economy. Relatedly, it is also highly interconnected.

A further key challenge in the Bank’s approach to assessing risk and evaluating policy is measurement. There are naturally limits – including technical constraints – to quantifying risks and inefficiencies. But there is always scope to build on advances in the academic literature and practice.

Data (and data analytics) are used at the Bank to describe the state of the financial system, help calibrate models and inform analysis of shocks and policy interventions. There have been significant advances in data gathering, analysis and modelling since the FPC’s inception, particularly for non-banks. This includes the use of system-wide modelling and exercises, such as the Bank’s recent system-wide exploratory scenario (SWES). But gaps in data availability and use remain, and there is more to do to enhance understanding of system dynamics.

Some measurement limitations and data gaps will be inherently unresolvable. This means that quantitative analysis should be supplemented with alternative, complementary approaches such as supervisory and industry intelligence, horizon scanning and the use of SWES-style exercises to build and test understanding of the system. Bank staff already use these approaches in financial stability analysis, but more broadly and systematically integrating them in the analytical framework will enable them to be used more effectively.

The analytical framework described here supports the FPC’s key operating modes, set out in Figure 2.footnote [6] The first is to identify ways in which the FPC’s objectives can be undermined, for example by identifying risks, vulnerabilities and inefficiencies. Where an issue is identified as having the potential to undermine the FPC’s objectives, the FPC takes action, which can involve supporting actions by other authorities or taking direct action itself. If it is determined that the issue is not capable of undermining the FPC’s objectives, or is already being mitigated, the FPC shifts to monitoring mode.

Figure 2: The FPC’s operating modes

The figure shows three boxes representing each of the FPC’s operating modes. Starting at the bottom left is a box labelled identify. Where an issue is identified as having the potential to undermine the FPC’s objectives, the FPC takes action. This is illustrated by an arrow that extends clockwise from the identify box to a box labelled take action at the top of the figure. If it is determined that the issue is not capable of undermining the FPC’s objectives, or is already being mitigated, the FPC shifts to monitoring mode. This is signified by an arrow extending clockwise from the take action box to a box labelled monitor at the bottom right of the figure. The monitor box itself feeds back into the identify box, completing the clockwise flow of the arrows around the figure.

All aspects of the analytical framework described here support the three operating modes of the FPC shown in Figure 2. For example, the transmission framework – supported by data (and data analytics) – enables identification and assessment of the ways that the FPC’s objectives can be undermined, the evaluation of policy to tackle them and monitoring of their incidence and scale. And policy objectives and instruments determine the basis and means for FPC action. Given the importance of the analytical framework to the FPC’s policymaking, it is essential that it continues to evolve with developments in the financial system and remains at the forefront of efforts to identify and tackle risks across the system.

This paper explores the issues discussed above in further detail. Its scope is limited to that of the analytical framework underpinning the FPC’s policymaking. Neither the institutional framework for UK financial policy nor the FPC’s governance processes or operation are in scope. These elements are taken as given for the purposes of this paper. This paper does not undertake a full review of the analytical frameworks underpinning macroprudential authorities’ policymaking in other jurisdictions. However, the work proposed here will draw on international experience and best practice, taking account of similarities and differences between institutional arrangements.

The remainder of this paper is structured according to the three key elements of financial policy analytical frameworks outlined here. Section 2 discusses the objectives (and instruments) of financial policy. Section 3 considers its transmission frameworks – ie risk assessment and policy evaluation. Section 4 looks at data (and data analytics) and Section 5 concludes.

2: The objectives (and instruments) of financial policy

As illustrated by Figure 1, objectives and instruments are defining elements of any policy framework, determining what a policymaker is trying to achieve and how. Section 2.1 examines key insights from the literature on financial policy objectives. It includes a non-technical summary of the relevant concepts (Box A). Section 2.2 turns to how analysis supports the effective use of policy instruments. Section 2.3 assesses how these elements are reflected in the UK’s financial policy analytical framework. Section 2.4 concludes and suggests some enhancements to this framework.

2.1: The rationale for and features of objectives in financial policy

The financial system delivers significant benefits to society by overcoming some of the costs and barriers to activity that would exist without formal financial services. In a world of purely bilateral transactions, such ‘pure’ frictions (eg asymmetric information) and distortions (eg credit rationing) would severely limit risk sharing and efficiency. But financial systems are not perfect. They cannot (and do not) fix all frictions, and they can themselves create or worsen frictions. These ‘derived’ frictions and associated distortions (eg externalities) can, alongside non-rational behaviour, drive vulnerabilities (eg excessive leverage) and inefficiencies (eg misallocations) in the financial system and real economy. These can pose substantial real economic, and so welfare, costs. This justifies a role for public policy. Box A sets out more detail.footnote [7] This section starts by discussing what the overarching ambition of such policy should be, and the limitations to achieving this. It then considers key insights from the literature on appropriate policy objectives.

2.1.1: The overarching ambition of financial policy and limitations to achieving it

Since public policy ultimately aims to enhance welfare, it is useful to frame its goals through the First Fundamental Theorem of Welfare Economics. This states that, under conditions including perfect competition, complete markets and full or perfect information, outcomes are Pareto efficient.footnote [8] This is the ‘first-best’ (or ‘unconstrained efficient’) outcome. For policymakers, this implies intervening to reduce the welfare costs of financial frictions.

But in practice, many of the conditions underlying the First Fundamental Theorem do not hold. For example, some frictions and non-rational behaviours are inherent to financial activity and cannot be fully eliminated. Other frictions and distortions may be associated with fundamental features of supporting activities, such as the form and functioning of the legal system, technology, and tax.

For these reasons, the first-best outcome is unattainable. In that case, the theory of the second best applies (Lipsey and Lancaster (1956)) and the role of the policymaker is to intervene to achieve a ‘constrained efficient’ second-best outcome. That is, trying to maximise welfare subject to unavoidable constraints.
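In stylised terms (the notation here is illustrative, not drawn from this paper), the second-best problem is a constrained welfare maximisation:

```latex
\max_{p} \; W\bigl(x(p)\bigr)
\quad \text{subject to} \quad
g_j\bigl(x(p)\bigr) \le 0, \qquad j = 1, \dots, J
```

where $p$ denotes the policy instruments, $x(p)$ the resulting allocation, $W$ economic welfare, and each $g_j$ an unavoidable friction or institutional constraint. The first-best outcome corresponds to solving the same problem with the constraints removed.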

Some second-best policy interventions may appear to lean against the (unachievable) first-best outcome. For example, it is well established that in some cases a policymaker can achieve better outcomes by limiting the excessive build-up of private debt (eg Lorenzoni (2008)), such as where borrowers’ reactions to collateral constraints in stresses impose negative externalities. This is despite the fact that, without the social planner’s intervention, private debt is typically sub-optimally low compared with the first-best outcome.

Consistent with this, second-best interventions that improve overall welfare are likely to involve trade-offs. Some agents may be made better off and others worse off. And economic welfare may be increased in some states/times but lowered in others. In some cases, interventions that improve welfare might exacerbate inefficiencies to reduce vulnerabilities – ie redistributing costs from stressed states of the world to less stressed ones.

Any trade-offs must be carefully balanced within the policy framework. Ultimately, interventions should only be made when they deliver net welfare gains. For example, building resilience ex ante is only justified where the benefits (in terms of reduced welfare costs of crises) are not outweighed by any costs (for example, to growth).footnote [9]
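As a stylised illustration (the symbols are ours, not this paper’s), an ex-ante resilience measure is worthwhile only if

```latex
q \times \Delta L \;>\; c
```

where $q$ is the probability of a crisis, $\Delta L$ is the reduction in welfare losses conditional on a crisis that the measure delivers, and $c$ is its ongoing cost (for example, to growth). In practice each of these quantities is uncertain and hard to measure, which is part of what makes such judgements difficult.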

Many policy models targeting second-best or constrained efficient outcomes assume that the policymaker has perfect information (and perfect instruments), while relaxing assumptions such as perfect information for other agents. But in practice policymakers often have both imperfect information about frictions and behaviour, and imperfect tools to tackle them (Section 2.2.1 discusses this in more detail).

2.1.2: Insights from the literature on desirable features of financial policy objectives in the real world

As alluded to in the previous section, the literature is useful for understanding the role of particular frictions and distortions, like pecuniary externalities, in reducing welfare. But it is less useful in guiding what policy objectives should be in practice. This is because most papers assume that a single policymaker (or social planner) directly maximises welfare. There are two practical problems with doing this. First, it is not possible to measure economic welfare in a sufficiently robust way to guide policy. Second, different aspects of public policy are generally delegated to different policymaking bodies.

The practical solution to these problems is for a (delegated) policymaker to minimise a ‘loss function’footnote [10] that defines their policy objectives in an approximately welfare-consistent way, when accounting for the loss functions and actions of other policymakers. The resultant terms in the loss functions represent specific intermediate targets. There are two distinct, although connected, challenges to this approach: i) the form of delegated loss functions and ii) their delegation to multiple policymakers.
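In stylised form (illustrative notation, not from this paper), such a delegated loss function penalises squared deviations of intermediate targets $x_i$ from their desired levels $x_i^{*}$, with weights chosen to be approximately welfare-consistent:

```latex
\mathcal{L} = \mathbb{E}_t \sum_{s=0}^{\infty} \beta^{s} \sum_{i} w_i \bigl( x_{i,t+s} - x_i^{*} \bigr)^{2}
```

where $\beta$ is a discount factor and the $w_i$ are the relative weights on each intermediate target.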

Form of delegated loss functions

Some of the literature has explored what financial stability terms a financial policy loss function could include. These papers often extend otherwise standard monetary macroeconomic (‘New Keynesian’) models by incorporating financial stability-relevant frictions or distortions. These result in terms beyond the conventional inflation and output gap terms typically included in loss functions delegated to monetary policymakers:

  • Ferrero et al (2024) incorporate a household collateral constraint and limits to risk sharing between borrowers and savers. These frictions give rise to two financial stability-relevant terms in the loss function. These are the gap between borrowers and savers in marginal utilities of: i) non-durable consumption and ii) housing. These terms reflect both the incomplete risk sharing and the pecuniary externality driven by the collateral constraint.
  • De Paoli and Paustian (2017) incorporate a moral hazard problem between banks and depositors, which leads to a binding leverage constraint and an endogenous spread between lending and deposit rates. These give rise to an additional effective interest rate term in the loss function.

While expanding the literature to map a broader range of frictions into loss functions would be valuable,footnote [11] it would not alone yield a usable financial policy loss function. As Ferrero et al (2024) show, welfare-consistent loss functions often include terms that are difficult to measure. Moreover, the number, complexity, and model-specific nature of financial frictions would require numerous tailored terms.footnote [12] These challenges likely explain why the literature has yet to: i) determine whether a simple loss function could approximate optimal welfare-consistent policy for financial stability and ii) reach a consensus on what terms it might contain. By contrast, these issues have been overcome for monetary policy with an acceptance that minimising a weighted average of deviations of inflation from target and the output gap is likely to be a good approximation to a welfare-consistent loss function.footnote [13]
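The monetary policy consensus referred to above is commonly expressed as a quadratic loss function. A schematic version (the notation here is illustrative, not drawn from the papers cited) is:

```latex
% Schematic welfare-consistent monetary policy loss function:
% the expected discounted sum of squared inflation deviations and output gaps.
L_0 = \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^{t}
      \left[ \left( \pi_t - \pi^{*} \right)^{2} + \lambda \, x_t^{2} \right]
```

where \(\pi_t\) is inflation, \(\pi^{*}\) the inflation target, \(x_t\) the output gap, \(\beta\) the discount factor and \(\lambda\) the relative weight on output stabilisation. The financial stability terms discussed above would enter as additional squared terms with their own weights.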

Delegation to multiple policymakers

In the real world, policy is typically delegated to multiple policymakers. Reasons include the benefits of institutional independence, specialisation and expertise. The sheer breadth of public policy also makes it impractical for a single body to manage it all effectively.

In general, microprudential policy targets the safety and soundness of individual financial institutions, while macroprudential policy targets system-wide stability.

The case for separation of micro and macroprudential policy is largely a practical one. It would be impractical for a single policymaker to oversee the financial system’s vast range of entities and activities, and their interactions with each other and the real economy.

However, theoretically distinguishing the roles of micro and macroprudential policy is difficult, partly because both contribute to system resilience and can interact with each other. These interactions are explored further in Section 3 and are central to understanding how micro and macroprudential goals may align or conflict. The absence of a clean separation in roles could lead to ambiguity in policymakers’ objectives and accountability.

Regardless of how the split is defined, this discussion suggests that practical issues around the delineation of roles and interactions between micro and macroprudential policy are inevitable. This emphasises the importance of co-ordination between them: failures can prevent delegated policymakers from achieving the level of welfare that a single, unified policymaker might. Useful insights can be drawn from papers that formally study the interactions between policymakers. For example, De Paoli and Paustian (2017) study co-ordination between macroprudential and monetary policymakers. The authors assign the monetary authority a welfare-consistent loss function containing standard inflation and output gap terms, and the financial stability authority one containing output gap and effective interest rate terms. They find that non-co-ordinated (Nash equilibrium)footnote [14] policy leads to welfare losses compared to the co-ordinated (ie social planner) equilibrium, with the size of these additional losses depending on the precise institutional arrangement between the two policymakers.footnote [15] Poor co-ordination between macro and microprudential policymakers might also be expected to lead to welfare losses. These losses could be high, given the strong links between microprudential tools and macroprudential goals. Macroprudential authorities, with their system-wide perspective, are well placed to facilitate co-ordination and reduce these costs.
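The welfare cost of non-co-ordination can be illustrated with a stylised linear-quadratic sketch. The instruments, coefficients and loss weights below are illustrative assumptions, not the calibration in De Paoli and Paustian (2017): each authority sets one instrument, both instruments depress a shared output gap, and each authority’s loss contains its own target plus that shared term.

```python
# Stylised two-policymaker game (illustrative, not De Paoli-Paustian's model).
# Monetary authority sets u1 against inflation pi = e_pi - u1.
# Financial authority sets u2 against a credit spread s = e_s - u2.
# Both instruments depress the output gap x = -(u1 + u2).
# Each authority minimises its own loss: L1 = pi^2 + lam*x^2, L2 = s^2 + lam*x^2.

def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - a12 * b2) / det, (a11 * b2 - b1 * a21) / det)

def total_loss(u1, u2, e_pi, e_s, lam):
    """L1 + L2 evaluated at the chosen instrument settings."""
    pi, s, x = e_pi - u1, e_s - u2, -(u1 + u2)
    return pi**2 + s**2 + 2 * lam * x**2

def nash(e_pi, e_s, lam):
    # Each authority's first-order condition, taking the other's choice as
    # given: pi + lam*x = 0 and s + lam*x = 0.
    return solve_2x2(1 + lam, lam, lam, 1 + lam, e_pi, e_s)

def coordinated(e_pi, e_s, lam):
    # Joint minimisation of L1 + L2: pi + 2*lam*x = 0 and s + 2*lam*x = 0.
    return solve_2x2(1 + 2 * lam, 2 * lam, 2 * lam, 1 + 2 * lam, e_pi, e_s)

e_pi, e_s, lam = 1.0, 1.0, 0.5
w_nash = total_loss(*nash(e_pi, e_s, lam), e_pi, e_s, lam)
w_coord = total_loss(*coordinated(e_pi, e_s, lam), e_pi, e_s, lam)
print(w_nash, w_coord)  # Nash total loss exceeds the co-ordinated one
```

The mechanism is the one described above: under Nash, each authority ignores the cost its tightening imposes through the other authority’s loss function, so the joint (social planner) solution achieves a lower combined loss.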

Overlaps and interaction between the prudential policies of different country authorities are also relevant. For example, BIS (2017) demonstrates that macroprudential tools – such as loan-to-value limits or countercyclical capital buffers – can have unintended effects on other economies. Moreover, market-based finance (MBF) activities in particular tend to operate on a global basis, which increases the degree of international interconnectedness. For these reasons, international fora, which enable co-operation among macroprudential authorities from different countries, are very important.

2.2: Instruments of financial policy

This section first discusses the overarching ambition of policy instruments and constraints on achieving that. It then considers the necessary features of analytical frameworks supporting the use of policy instruments.

2.2.1: Overarching ambition of policy instruments

The previous section argued that financial policymakers’ overall aim should be to maximise economic welfare under ‘second-best’ conditions. They should do this by tackling frictions and their effects, accounting for trade-offs,footnote [16] such as between effects on efficiency and stability.

Doing this requires the policymaker to have enough instruments to target financial frictions and their effects effectively.footnote [17] But, as noted briefly in the previous section, there are limits to what policymakers can achieve in practice, because:

  • Policymakers have imperfect information about frictions, distortions and non-rational behaviour, and so face uncertainty about the effects of interventions. Frictions may also involve significant complexities. For example, they and their costs may be time-varying. In some cases, this is caused by constraints on agents (such as collateral constraints) which only bind occasionally. In others it is caused by the possibility of sudden runs, freezes or disruptions in intermediary funding markets.footnote [18] Some frictions and distortions may also interact with others. For example, asymmetric information can contribute to missing or incomplete markets. And financial services mechanisms that evolved to tackle one friction or distortion may worsen others. For example, the use of collateral may help mitigate the impacts of asymmetric information in lending, but this can create collateral constraints which may drive fire sales under certain conditions. These complexities can make choosing and calibrating an appropriate intervention difficult, particularly for ex-ante instruments.
  • Policymakers may have imperfect instrument sets and face constraints in using them. Imperfections include unclear objectives, gaps in tools’ (collective or individual) coverage, and implementation lags. Some tools may also affect other frictions or distortions. For instance, enhancing resilience at the firm level can sometimes worsen the impact of system-wide vulnerabilities (Section 3). Relatedly, some tools may be unable to tackle frictions and their effects on their own and so multiple instruments may be needed. All these issues can make policymaking more complex.
  • Unclear objectives can also make deploying policy effectively more difficult. As can – often relatedly – co-ordination problems between policymakers. Other factors that might also undermine policy effectiveness include: i) pressure for looser policies as memories of crises fade and create complacency (Palley (2011) for a broader discussion) and ii) ‘leakage’ to jurisdictions and sectors beyond policymakers’ reach, including resident entities domiciled elsewhere (Reinhardt and Sowerbutts (2015) and Frost et al (2019)).

2.2.2: Necessary features of analytical frameworks supporting the use of policy instruments

The challenges set out in the previous section highlight the need for a strong analytical framework to guide policy instrument use. This includes: i) a clear understanding of how instruments ‘work’, ie how they affect frictions, behaviours and/or their effects, ii) mitigants and responses to challenges in using instruments and iii) well-defined objectives and a coherent strategy for instrument use. These are explored in turn:

i) A clear understanding of how instruments ‘work’ is critical for their effective use. In this context, Box B summarises key financial policy instruments and the frictions, distortions and behaviours they target.

ii) Mitigants or responses to challenges in using policy instruments can help improve their effectiveness. For example:

  • Co-ordination problems between policymakers can be mitigated through clear objectives and supportive governance. This includes clear and agreed responsibilities, structured processes for information sharing and cooperation, and a clearly-articulated strategy for instrument use – all of which help clarify accountability.
  • International leakage can be reduced in several ways, including with international standards that address cross-border frictions, ensure a level-playing field, and reduce the risk of financial instability spilling over to other countries.footnote [19]
  • A cautious approach to policy action may not always be appropriate where policymakers lack perfect information. Bahaj and Foulis (2017) show that the conventional assumption that policy should act more cautiously in response to uncertainty (as per Brainard (1967)) may not always apply in a financial stability context. If the costs of instability outweigh the benefits of looser policy in the policymaker’s objective function, then uncertainty can justify more aggressive intervention.footnote [20] Bahaj and Foulis (2017) also consider Knightian uncertainty, where policymakers cannot (reliably) assign probabilities to rare events like financial crises. In such cases, a ‘robust control approach’ (Hansen and Sargent (2001, 2008)) – applying the policy that minimises losses in the worst-case scenario – may be appropriate. But Bahaj and Foulis (2017) suggest that this approach has its limits if it leads to the financial system being overregulated and inefficient in normal times. Ultimately, uncertainty requires a pragmatic approach: focusing on interventions that matter most for policy objectives while remaining alert to unintended consequences (Lipsey (2007)).
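The contrast between expected-loss and robust control decision rules in the bullet above can be sketched minimally. The loss numbers and the crisis probability below are illustrative assumptions, not estimates from Bahaj and Foulis (2017):

```python
# Stylised policy choice under uncertainty (illustrative numbers).
# Losses for each policy stance under two scenarios. A probability over
# scenarios is assumed for the expected-loss case; under Knightian
# uncertainty no such probability is available, motivating minimax.
losses = {
    "loose":    {"normal": 1.0, "crisis": 10.0},
    "moderate": {"normal": 2.0, "crisis": 5.0},
    "tight":    {"normal": 3.0, "crisis": 3.5},
}

def expected_loss_policy(losses, p_crisis):
    """Pick the policy minimising expected loss, given a crisis probability."""
    return min(losses, key=lambda pol: (1 - p_crisis) * losses[pol]["normal"]
                                       + p_crisis * losses[pol]["crisis"])

def minimax_policy(losses):
    """Robust control: pick the policy minimising the worst-case loss."""
    return min(losses, key=lambda pol: max(losses[pol].values()))

print(expected_loss_policy(losses, p_crisis=0.1))  # "loose"
print(minimax_policy(losses))                      # "tight"
```

With these numbers, the expected-loss rule selects the loose stance when crises are seen as unlikely, while the worst-case rule selects the tight stance – illustrating both the appeal of robust control and its cost (over-tight policy in normal times) noted above.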

iii) Well-defined objectives and a coherent strategy for deploying instruments can help address policy implementation challenges in several ways. These include: a) anchoring policy expectations and enhancing policy credibility, b) aiding prioritisation, c) guiding policy instrument selection where frictions and effects of policy on them may interact, and where there are complexities in the nature of frictions (such as non-linearities and variation over time), d) supporting co-ordination with other policymakers and e) strengthening the framework against undue pressure to loosen or tighten policy.

2.3: Objectives and instruments in the UK’s financial policy framework

The literature discussed in the previous section offers valuable insights and guidance in several areas. These include: the overarching aim of public policy, optimal policy in response to certain frictions and the costs of co-ordination problems. However, the literature offers little consensus in other areas, such as on specific and tangible financial stability objectives.

The formulation of the FPC’s objectives and the key institutional arrangements underpinning its activities are broadly consistent with many of the key features of this literature.

The FPC’s objectives are set out in its establishing Act.footnote [21] Its primary objective is financial stability.footnote [22] Specifically, the Act tasks the FPC with identifying, monitoring, and taking action to remove or reduce systemic risks (defined as risks to the stability of the UK financial system as a whole or a significant part of it). The Act highlights three key sources of systemic risk: i) structural features of financial markets, such as connections between financial institutions, ii) the distribution of risk within the financial sector and iii) unsustainable levels of leverage, debt, or credit growth. The Government must also provide a written remit to the FPC at least once a year which gives guidance on the FPC’s objectives.

However, beyond this, the Act does not further detail the FPC’s primary objective. In practice, the FPC interprets and communicates its objective via the Bank’s Financial Stability Strategy and other publications.footnote [23] The most recent Strategy, published in 2023, defines a stable financial system as one ‘that has sufficient resilience to be able to facilitate and supply vital services by financial institutions, markets and market infrastructure to households and businesses, in a manner that absorbs rather than amplifies shocks’.footnote [24]

The FPC also has a secondary objective – subordinate to the primary objective – to support the Government’s economic policy, including its objectives for growth and employment. The Government’s annual remit letter sets out what the FPC should consider as part of its secondary objective. The Act further specifies that the FPC should not ‘exercise its functions in a way that would, in its opinion, be likely to have a significant adverse effect on the capacity of the financial sector to contribute to the growth of the UK economy in the medium or long term.’ This serves as a constraint on how the FPC pursues its primary objective.

The allocation of a primary financial stability objective and secondary economic policy objective to the FPC is consistent with two key insights from theory discussed in the previous sections. First, the huge costs of financial crises justify a primary focus on financial stability for a policymaker with a system-wide perspective and purview. Second, it is arguably easier to assign a primary financial stability objective to a single policymaker than to do so for the objective of supporting the Government’s economic policy. This is because the latter can be materially influenced by many policy domains. While financial stability is also influenced by other policymakers, there are fewer relevant actors, making it more feasible to allocate a coordinating role to a single policymaker (the FPC).

The FPC’s remit notes that its primary and secondary objectives will often be complementary. It encourages the FPC to support the secondary objective, where that is not in conflict with its primary objective. But the remit also notes that when conflicts between the two objectives do arise, they should be managed and communicated transparently.

Overall, the FPC’s objectives leave broad room for interpretation. This breadth reflects the myriad ways in which financial stability – and, even more so, the financial sector’s contribution to the Government’s economic policy – can be appraised and undermined. In other words, it is not a deficiency in the FPC’s remit. Such room for interpretation is also consistent with the state of the literature, which offers no consensus on specific and concrete financial stability objectives. But it could create ambiguity around the FPC’s objectives and pose challenges to policymaking in certain circumstances.

The Act also sets out the FPC’s powers and instruments, and specifies several elements of FPC governance and processes, such as requirements around the publication of meeting records.

To fulfil its functions, the FPC has a range of policy instruments it can call upon.

Regular and ad-hoc communications by the FPC, Bank and PRA are an important part of the financial policy toolkit. Key tools here include: i) FPC meeting records, ii) Financial Stability Reports, iii) publications relating to stress tests on banks, CCPs, insurers and the broader market, iv) speeches and v) other publications such as Financial Stability in Focus papers. These communications variously intend to achieve aims such as supporting accountability, policy signalling and expectation management and raising awareness of risks and structural changes affecting them.

Other tools include the setting or application of capital buffers, powers of direction, and recommendation. The FPC can also advise the Bank on the use of its balance sheet for the purpose of protecting and enhancing the stability of the UK financial system.footnote [25] These are summarised in Table 2.A alongside some examples of their use.

Table 2.A: tools and instruments that the FPC can call upon, and examples of their use

Instrument: Capital buffers and legally binding powers of direction to the PRA or FCA over specific macroprudential tools(a)

Details:

  • UK Countercyclical capital buffer rate (CCyB): applies to UK exposures, set quarterly.
  • Other Systemically Important Institutions (O-SII) buffer: the FPC is responsible for the framework and reviewing it at least biennially.(b)
  • Sectoral capital requirements, leverage ratio requirements and buffers, and mortgage lending limits.

Examples of use:

  • UK CCyB rate was lowered in March 2020 in response to the economic shock from the pandemic (later raised in July 2022).
  • Direction in 2015 specifying the minimum level of the leverage ratio for UK banks and building societies as a complement to minimum capital requirements.(c)

Instrument: Non-binding powers of recommendation to the PRA, FCA, HMT, and other public bodies

Details:

  • Recommendations can cover any area within these bodies’ remit but cannot relate to a specified regulated entity. They can be used to tackle risks outside the FPC’s defined instrument set, including for non-banks.

Examples of use:

  • Recommendations made in 2023 following the LDI crisis – including to the Pensions Regulator (TPR) – regarding the resilience of LDI funds, TPR’s remit and collaboration with other regulators.

Instrument: Bank of England balance sheet tools

Details:

  • Tools include: liquidity facilities (eg repo facilities), funding schemes and asset purchases.

Examples of use:

  • Contingent Term Repo Facility and Term Funding Scheme (with additional incentives for SMEs) were used in March 2020 amid market and economic stress driven by the pandemic.(d)
  • A targeted and temporary government bond purchase programme was used in the LDI crisis in 2022 to stabilise the gilt market.

Footnotes

  • (a) These buffers and tools apply to UK-authorised banks, building societies, investment firms and regulated lenders (and in some cases, qualifying parent undertakings and subsidiaries) where relevant for the buffer or tool in question. For example, the CCyB applies to all banks, building societies and investment firms (other than those exempted by the FCA) incorporated in the UK. By contrast, the UK legislation implementing the O-SII buffer restricts its application to ring-fenced bodies and large building societies. Mortgage lending limits apply to regulated lenders above a de minimis threshold.
  • (b) The PRA applies this O-SII framework to firms. The FSB (in conjunction with the BCBS) is responsible for the framework for global systemically important banks (G-SIBs, G-SIIs in the UK framework) but it is applied to UK firms by the PRA. Technically a firm can be both an O-SII and a G-SII; in such cases the higher buffer applies.
  • (c) The 2015 Direction (and accompanying Recommendation) were updated in 2021 with a new Direction (and Recommendation).
  • (d) The Contingent Non-Bank Financial Institution Repo Facility (CNRF) was introduced in 2025.

The Bank and PRA also have a role in financial intermediation, which can support both financial system stability and efficiency. Roles here include: (Bank) operation of payment and settlement infrastructure (RTGS and CHAPS), a founding role in securities settlement infrastructure (CREST)footnote [26] and a (PRA) role in governance and oversight of the Financial Services Compensation Scheme.

This set of tools and instruments is also consistent with many insights from the literature. For example – as considered optimal in response to certain time-varying frictions and distortions – the FPC has an explicit countercyclical tool, the CCyB, at its disposal. The breadth of tools that the FPC can call upon also reflects the many ways its objectives can be undermined and supported. For example, it has broad powers of recommendations over UK policymakers and can call upon both ex-ante tools (such as the CCyB) and ex-post tools (such as Bank market operations).
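The countercyclical operation of the CCyB rests on a simple aggregation rule under the Basel framework: a bank’s institution-specific buffer rate is the average of jurisdictional CCyB rates, weighted by its relevant risk-weighted credit exposures (this is also the basis of the reciprocity arrangements mentioned later in this section). A minimal sketch, with illustrative rates and exposures rather than actual policy settings:

```python
# Institution-specific countercyclical buffer rate under the Basel framework:
# the weighted average of jurisdictional CCyB rates, weighted by the bank's
# risk-weighted private sector credit exposures in each jurisdiction.
# Rates and exposure figures below are illustrative assumptions.
ccyb_rates = {"UK": 0.02, "US": 0.00, "DE": 0.0075}  # jurisdictional rates
exposures = {"UK": 200.0, "US": 50.0, "DE": 50.0}    # risk-weighted exposures

total = sum(exposures.values())
bank_rate = sum(ccyb_rates[j] * exposures[j] for j in exposures) / total
buffer = bank_rate * total  # required CCyB capital on these exposures

print(round(bank_rate, 6), round(buffer, 3))
```

When the FPC moves the UK rate, the effect on a given bank therefore scales with the UK share of its exposures, which is how the tool targets UK credit conditions while foreign authorities’ rates apply to their own exposures.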

The UK’s financial stability framework also helps to address problems associated with the delegation of policy to multiple policymakers. Several bodies and committees have financial stability policy responsibilities or influence, including the FPC, PRA, Financial Market Infrastructure Committee (FMIC), FCA and HMT. And various elements of the institutional framework aim to clarify relative responsibilities and establish the basis of effective co-ordination. These include: i) statutory remits and accountability mechanisms that – where relevant – describe each body’s role in, or duty to consider, financial stability, ii) a coordinating role for the FPC in financial stability policymaking, iii) formal and informal co-ordination structures,footnote [27] iv) the housing of the FPC, PRA, FMIC and balance sheet operations together in the Bank and v) other collaboration, such as through joint surveys.

2.4: Summary and proposed enhancements to the analytical framework

Many of the general challenges in using policy to achieve a constrained-efficient or second-best outcome have been tackled by the frameworks underpinning UK financial policymaking. However, there is always room for improvement – not because of deficiencies, but because both the ways that the FPC’s policy objectives can be undermined and the analytical tools available to address them have evolved.

These general challenges, how the UK’s financial policy framework and supporting analytical framework have helped to address them – and how the analytical framework could be further enhanced – are summarised as follows.

General challenge 1: inherent difficulty in specifying precise and concrete financial policy objectives.

The UK’s framework for the FPC sets out broad primary and secondary objectives that are necessarily open to interpretation. But, like other similar frameworks, this reflects many challenges including the wide-ranging, uncertain and hard-to-measure ways in which financial stability and contributions to the Government’s economic policy can be undermined. While some flexibility in objectives is understandable and unavoidable – and may even be beneficial as the financial system evolves – insufficient clarity could create ambiguity around the FPC’s role.

What the UK’s financial policy frameworks do to address these challenges.

Several communications, including publications and speeches, aim to clarify the FPC’s objectives and set out how threats to them can affect the provision of services,footnote [28] for example through their impact on the availability of credit for UK households and firms. But these communications stop short of implying quantitative or even very precise qualitative objectives.

Suggested enhancements to the analytical framework.

Efforts to specify the FPC’s objectives in precise and practical terms continue. Further work here can include:

  • Continuing to develop and communicate a more specific articulation of the FPC’s financial stability goals, building on work already done to enhance the mapping of the provision of financial services to real economic activities.
  • As a complement to that work, the FPC could interpret its objectives in further detail. Box C shows two ways in which this might be done. The first interprets the objectives by identifying the underlying conditions that characterise financial instability and inefficiency – termed ‘instability conditions’ and ‘inefficiencies’. The second proposes ‘financial stability outcome indicators’ which measure how the system performs when subjected to an adverse shock. Such indicators could be the basis for work on loss functions described in the next bullet.
  • Conducting research and analysis to work towards an approximately welfare-consistent financial policy loss function, recognising that this cannot be expected to produce anything like a ‘definitive’ loss function. But it would improve the theoretical underpinnings of the objectives of policy, including an understanding of their links to underlying frictions. And it should help rank different threats to the FPC’s primary and secondary objectives viewed together. Such a loss function should: i) be as parsimonious as possible, ii) contain interpretable and measurable (or estimable) terms, iii) contain terms that can be influenced by policy and iv) reasonably approximate welfare maximisation (in that minimising it supports financial policy goals).

General challenge 2: policymakers may have imperfect tools and face constraints in using them.

Imperfections can include imperfect information, unclear objectives, gaps in coverage and imprecision. Policy effectiveness may also be undermined by ‘leakage’ and complacency over crises that leads to pressure for looser policy.

What the UK’s financial policy frameworks do to address these challenges:

  • Within the regulatory perimeter, the FPC can call upon a broad set of tools to address relevant financial frictions and their effects. Safeguards against policy leakage include CCyB reciprocity and monitoring risks beyond the regulatory perimeter.
  • Ongoing work aims to better link these tools to specific vulnerabilities and their impact on the real economy (Section 3), which should help further enhance instrument selection.
  • The FPC’s independence and formally-defined powers, along with a clearly-defined formal role for Government, should remove ambiguity over relative roles in financial policymaking.
  • The FPC has published its approach to using instruments including the CCyB, leverage ratio and housing tools.footnote [29]

Suggested enhancements to the analytical framework.

Enhancing the FPC’s strategy for intervening in a system-wide manner would help further address some of the challenges discussed here. It would also help ensure that the FPC deploys its tools as effectively as possible, enabling it to focus on what is most important to system-wide stability and to understand the system-wide implications of its policy response. Possible further work here includes that on: i) the FPC’s use of different instruments in different circumstances, including the use of ex-ante and ex-post tools and ii) accounting for interactions in both frictions – including between those that undermine the primary and secondary objectives – and tools to address them. This could build on existing work to map frictions to vulnerabilities to effects on the real economy (Section 3).

General challenge 3: different aspects of financial policy are delegated to different policymaking bodies.

This necessarily reflects the breadth of policy scope. But the existence of multiple policymakers can hamper the effectiveness and consistency of policymaking. For example, it could result in risks ‘falling through the cracks’ or contradictory/inconsistent policy.

What the UK’s financial policy frameworks do to address these challenges.

As discussed in Section 2.3, several features of the UK institutional framework aim to clarify responsibilities and establish effective co-ordination between policymakers with financial stability policy roles or influence. The analytical framework underpinning financial stability policymaking supports co-ordination in several ways. These include how analysis is done – such as information sharing, joint analysis and exercises – and what analysis is done. For example, identifying vulnerabilities and their causes can help identify which policymakers should act and how. This also extends to the international level, where the UK plays a leading role in providing analysis around activities undertaken on a global basis, including through its membership of the Financial Stability Board.

Suggested enhancements to the analytical framework.

The enhancements proposed here – providing greater clarity over the interpretation of the FPC’s objectives and enhancing its policy strategy – should help further illustrate its role relative to other policymakers, and support co-ordination with them. And proposals to enhance the risk assessment and policy evaluation frameworks (Section 3) should help identify possible actions for different policymakers by supporting the comprehensive and systemic assessment of risks, their effects and causes.

Box A: Non-technical summary of the case for a public financial policy role

The financial system delivers significant benefits to society by overcoming some of the costs and barriers to activity that would exist without it. These costs and barriers are driven by the ‘pure’ frictions and distortions that would arise in a world of purely bilateral transactions (top of Figure A).

  • ‘Frictions’ here refer to impediments or imperfections that make it harder or more costly for ‘agents’ (such as households, financial- and non-financial firms) to engage in financial activity like lending or insuring against risk. A common example is asymmetric information. For instance, in lending, borrowers typically know more about their own behaviour than lenders. Without mitigating mechanisms, this may enable borrowers to act in ways that increase their risk of default.
  • ‘Distortions’ here refer to the direct consequence or manifestation of frictions. For example, in response to asymmetric information, lenders may ration credit, limiting access even to some creditworthy borrowers.footnote [30]

Both frictions and distortions can lead to inefficient or ‘suboptimal’ outcomes in terms of ‘economic welfare’. This describes household well-being (‘utility’) with respect to economic factors such as consumption and employment, aggregated across households (hereafter ‘welfare’).footnote [31]

Figure A: Mapping from frictions to real effects

The top of Figure A starts with pure frictions and distortions. An arrow extends down from these frictions and distortions to a box labelled financial services mechanisms, which have developed in the financial sector to overcome or exploit pure frictions and distortions. The box gives some examples of such mechanisms: specialisation, pooling and matching. Two arrows then extend from this financial services mechanisms box, passing through a label of derived frictions and distortions, to the bottom two boxes of the diagram.
The left-hand box of these two is labelled inefficiencies. These apply in normal times as well as stresses. Three examples of inefficiencies are shown in this box: misallocation of capital, under/over-supply of capital and imperfect consumption profiling. The right-hand box is labelled vulnerabilities. These can crystallise in, create or exacerbate stresses. Two examples of vulnerabilities are shown in this box: first, an example of microfinancial vulnerabilities – mismatches and exposures; second, an example of macrofinancial vulnerabilities – correlation.
Arrows then extend from the inefficiencies and vulnerabilities boxes to a label of sub-optimal welfare at the bottom of the diagram, illustrating their effects on the real economy.

The financial system helps tackle these frictions and distortions through ‘financial services mechanisms’ (top box in Figure A) including specialisation, pooling, and matching. These support risk transfer, transformation, and other efficiencies.

But the financial system does not fix all frictions (or resulting distortions), and in some cases can manifest, create or worsen them.

The resulting (‘derived’) set of frictions and distortions can drive vulnerabilities and inefficiencies in the financial system and real economy (upper and central sections of Figure A).

‘Vulnerabilities’ here refer to features of the financial system, or its use, that create or exacerbate sensitivity of agents to shocks (relative to a frictionless world). These can be distinguished between: ‘microfinancial’ vulnerabilities (financial and operational susceptibility to shocks at an entity level) and ‘macrofinancial’ vulnerabilities (‘topological’ features of the system and real economy that broaden the incidence and impact of microfinancial vulnerabilities). Box E discusses vulnerabilities in further detail.

  • Externalities are a key distortion that drives vulnerabilities because they can encourage behaviours that can harm the system, such as excessive risk taking. Externalities arise where an activity imposes costs (or benefits) on third parties not directly involved in the activity and so did not participate in the decision to undertake it.footnote [32] Externalities are ultimately caused by frictions – such as missing or incomplete markets for those costs (benefits) – which mean the effect on third parties cannot be ‘internalised’ (ie priced and/or allocated to the activity/agents responsible for it).
  • In finance, a prominent form of externality is ‘pecuniary’ externalities. These arise when actions affect others through changes in the prices or values of financial instruments. For instance, a large sale – including forced or ‘fire’ sales – of securities commonly used as collateral can lower market value. This, in turn, can harm other holders of those securities by reducing the value of their collateral – thereby constraining their borrowing capacity.
  • One important manifestation of externalities in this context is excessive leverage, which can contribute to crises. For example, Bianchi and Mendoza (2018) find that crises can be driven by pecuniary externalities; households and firms overborrow in booms (facilitated by high collateral values) and to amplify the effect of downturns, including through fire sales (when hit with falling collateral value and related debt constraints).footnote [33] More examples of how specific frictions drive vulnerabilities are discussed in Box E and Annex A.
  • Other frictions and distortions that stem from the financial sector’s scale, structure, or practices can also contribute to vulnerabilities. For example, disruption of a service provider with market power can have a bigger impact on the real economy, if that means it is harder or more costly for its users to switch to substitute services. And asymmetric information between a firm’s stakeholders and its managers – that can arise in large, complex financial firms – can encourage actions by managers that may not align with stakeholder interests, such as excessive risk-taking.
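The fire-sale mechanism described above can be illustrated with a short numerical sketch. This is a stylised toy model only – the position size, leverage cap and price-impact coefficient are illustrative assumptions, not calibrated to any real market or to the Bianchi and Mendoza (2018) framework – but it shows how a binding collateral constraint can turn a modest price shock into a larger fall via forced selling.

```python
# Stylised fire-sale feedback loop. All parameter values are illustrative
# assumptions. A leveraged investor holds q units of an asset funded partly
# by debt, subject to a leverage cap (assets/equity).
def simulate_fire_sale(shock, q=100.0, p=1.0, debt=80.0, lev_cap=5.0,
                       impact=0.0005, max_rounds=100):
    """Return (final_price, units_sold) after forced deleveraging.

    Each round: if leverage breaches the cap, the investor sells just enough
    at the current price to restore it; the sale moves the price down
    linearly, which can trigger further forced sales - the pecuniary
    externality imposed on other holders of the asset.
    """
    p *= (1.0 - shock)                 # exogenous initial price fall
    sold = 0.0
    for _ in range(max_rounds):
        equity = q * p - debt
        if equity <= 0.0:              # equity exhausted: full liquidation
            sold += q
            q = 0.0
            break
        if q * p / equity <= lev_cap:
            break                      # constraint slack: no forced selling
        s = q - lev_cap * equity / p   # units to sell to restore the cap
        debt -= s * p                  # proceeds repay debt (equity unchanged)
        q -= s
        sold += s
        p -= impact * s                # price impact felt by all holders
    return p, sold

no_leverage_price = 1.0 * (1 - 0.05)   # price fall absent forced sales
fire_sale_price, units_sold = simulate_fire_sale(0.05)
```

In this sketch the final price falls by more than the initial 5% shock, because each round of forced selling depresses the collateral value of the remaining holdings and triggers further selling.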

‘Inefficiencies’ here refer to barriers to, or increased costs of, providing and using financial services (relative to a frictionless world) that apply outside of or in addition to vulnerabilities – that is, including in normal times or the ‘steady state’. Inefficiencies can result from a range of frictions and distortions, including to the (explicit and implicit) pricing of risks and activities. In theory, inefficiency can manifest in sub-optimal degrees and distributions of risk-taking, debt and investment. For example, the inability of smaller firms to access certain financial services that larger firms can (or to access them at the same cost) can arise because of information frictions. Another example is ‘zombie lending’.footnote [34] This refers to lending (typically by banks) to firms that are non-viable or unproductive and survive only because of such credit.

Non-rational behaviour can also contribute to vulnerabilities (and their crystallisation), and to inefficiencies.footnote [35] There is strong evidence that agents’ behaviour can deviate from rationality (eg Kahneman (2002)). For example, overextrapolation of price rises during booms can fuel credit and asset price cycles that later collapse. This is consistent with the Kindleberger (1978) and Minsky (1977) view of crises. Credit and asset booms that are not driven by fundamentals can also distort the allocation of resources, separately from their role in crises – for example, by diverting resources from more productive activities.

Both vulnerabilities and inefficiencies can hamper households’ and firms’ ability to engage in consumption and production, but in different ways. The effects of vulnerabilities (largely) arise upon their crystallisation, whereas inefficiencies arguably always apply. This distinction is not hard and fast of course. Many inefficiencies and vulnerabilities are related. Some may result from the same frictions. Some may trade off with or worsen each other. For example, crises driven by vulnerabilities can increase the longer-term cost of funding for providers of financial services. This can increase the cost of their services to the real economy.

The real economic – and so welfare – costs of vulnerabilities and inefficiencies can be substantial (bottom of Figure A).

  • Some vulnerabilities can create ‘systemic risks’ to stability: those that are material enough to affect real economic activity when they crystallise, through effects on the availability/reliability of financial services or amplification of economic shocks. More costly and frequent financial crises, and amplification of economic shocks, can materially harm economic activity. This includes through increased economic volatility, which can harm consumption and production, including through higher uncertainty. The real economic impacts of crises and stresses can be temporary (as for the gilt market stress in 2022) or longer lasting, with persistent scarring effects on economic output (as for the GFC).footnote [36]
  • Inefficiencies can lower activity levels and growth. For instance, limited access to finance may restrict innovation, for example by curbing research and development expenditure. And access to services like insurance – often essential for commercial operations – can affect firm activity. Another example discussed earlier is zombie lending. This can prolong the life of inefficient firms and so drag on productivity.

The scale of these costs justifies a role for public policy – financial policy – in helping to reduce or prevent them. One way to distinguish between the types of policy that target vulnerabilities and inefficiencies is to consider the probability of the outcomes that the policy focuses on. Policy aimed at addressing vulnerabilities typically focuses on the occurrence of ‘tail’ instability events. While these events are rare, the high costs when they do occur – for example because of non-linearities (discussed earlier) – justify such a focus. Policy targeting inefficiencies, by contrast, tends to focus on more likely, ‘normal times’ outcomes. Of course, policy targeting one type of outcome can have spillover effects on others (Box B discusses trade-offs in policy).

Box B: Summary of key financial policy instruments and the frictions, distortions and behaviours they target

This box discusses financial policy tools that are – or could in theory be – used by financial policymakers. There are three broad types: regulation, a public role in financial intermediation, and communication.

Regulation aims to align firm behaviour with the public interest:

  • Resolution frameworks (including eg resolution plans and loss-absorbing capital) help reduce externalities associated with firm failures, for example by ensuring continuity of critical functions.
  • Governance and disclosure. Governance rules can curb excessive risk-taking by managers, by enhancing their accountability. Disclosure and transparency requirements can reduce information asymmetries and improve risk pricing.
  • Prudential regulation includes: i) shock-absorbing tools (eg capital and liquidity requirements and buffers), and ii) exposure-limiting tools (eg limits on lending, collateral rules). Both help build/preserve resilience. And both tackle externalities; but the former can be thought of as doing this by forcing firms to internalise the costs of their risk-taking, the latter by restricting it. Some tools available to policymakers are static (eg clearing rules), while others could be dynamic or state-contingent (eg countercyclical capital buffers or LTV limits). The literature generally finds that optimal use of dynamic/state-contingent tools is countercyclical, tightening when constraints are loose and easing when they bind.footnote [37]
  • System-shaping regulation. Structural reforms like central clearing or ring-fencing can reduce systemic risk by limiting unmanageable exposures or insulating households from financial firms’ risky activities.
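The dynamic, state-contingent tools mentioned above can be sketched with a simple rule. The mapping below follows the Basel III ‘guide buffer’ rule of thumb for the countercyclical capital buffer: a zero buffer when the credit-to-GDP gap is below 2 percentage points, rising linearly to 2.5% when the gap reaches 10 points. In practice, buffer decisions (including the FPC’s) involve judgement over many indicators rather than any mechanical rule, so this is purely illustrative.

```python
# Sketch of a state-contingent buffer rule, following the Basel III 'guide
# buffer' rule of thumb. Thresholds and the maximum rate are the standard
# Basel guide values; actual policy is judgement-based, not mechanical.
def guide_buffer(credit_gap_pp, lower=2.0, upper=10.0, max_rate=2.5):
    """Map a credit-to-GDP gap (percentage points) to a buffer rate (%)."""
    if credit_gap_pp <= lower:
        return 0.0
    if credit_gap_pp >= upper:
        return max_rate
    return max_rate * (credit_gap_pp - lower) / (upper - lower)

# Buffer tightens as the credit gap widens in a boom, and releases in a bust.
rates = [guide_buffer(g) for g in (-3.0, 2.0, 6.0, 12.0)]
```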

A public role in financial intermediation and central bank balance sheet tools can also support financial stability and efficiency by helping to address frictions – like incomplete markets or co-ordination failures – that industry may be unable to resolve alone. Examples include central banks operating payment systems or playing a role in industry-run risk-sharing schemes. Central banks can also offer public insurance through their balance sheets, such as through liquidity support, funding facilities, and asset purchases. These tools are typically used ex post (ie after the stress has hit), in contrast to the ex-ante nature of many prudential tools. Externalities can mean that in normal times financial entities under-insure against future stresses, which is (constrained) inefficient. In theory, ex-ante regulation can deliver the (constrained) efficient level of self-insurance. But frictions like incomplete markets can mean some constraints still bind, forcing entities into actions like fire sales.footnote [38] The impacts (and sometimes incidence) of such actions can be reduced by ex-post tools.footnote [39] This distinction should help limit moral hazard, along with other measures like only offering support to firms subject to robust prudential standards. There is support in the literature too for using a combination of ex-ante and ex-post tools.footnote [40]

Communication by financial policymakers includes publishing data and analysis, illustrating how risks and shocks can play out in practice, and providing advice and setting expectations. Sharing data and insights helps market participants assess and manage risk and be better prepared for shocks, while setting expectations can encourage resilience.

Box C: Interpreting and operationalising the FPC’s objectives

This box expands on the work agenda in the main text by setting out our current thinking on interpreting and operationalising the FPC’s objectives. It is illustrative and partial but gives a sense of the work’s direction of travel. 

We interpret the FPC’s objectives as avoiding financial instability (primary objective)footnote [41] and, subject to that, supporting the Government’s economic policy by reducing inefficiencies in the financial system (secondary objective). The primary objective relates to financial dynamics when adverse shocks have pushed the economy away from the steady state (ie bad times); the secondary objective relates to how well the financial system supports the economy at the steady state (ie normal times).

We proceed on the basis that a financially unstable economy is one that, when subjected to a shock, amplifies it either through the response of the financial system or via the real economy’s exposure to it.footnote [42] Amplification typically occurs through financial service disruptionfootnote [43] and households and firms having to cut back on spending and production to meet their obligations to the system (‘real economy amplifiers’).

Various ways of interpreting and operationalising the FPC objectives are possible. This box pursues two.

The first expands on the objectives by setting out in more detail the properties of a financially unstable economy and an inefficient financial system. We call these properties ‘instability conditions’ and ‘inefficiencies’ respectively and we organise them by type of financial service. These are useful in providing a more detailed narrative of the latent properties which make a system unstable or inefficient and can help guide policymakers in determining which vulnerabilities to investigate further. But they are less useful in identifying clear indicators that can be used to measure the FPC’s success in achieving its objectives. 

The second approach therefore starts by considering the desirable properties of indicators that could potentially be used to measure the FPC’s success, particularly in relation to financial stability. Such ‘financial stability outcome indicators’ measure outcomes when an unstable system is subjected to an adverse shock. As set out in the main text, they could in turn be used as building blocks for a loss function. This work is very preliminary and so mentions only a few illustrative examples, before setting out some issues that require further thought. 

Approach 1: Instability conditions and inefficiencies 

Finance and credit 

This category includes services that help firms and households raise funds such as business loans, consumer finance, property lending, public and private equity, corporate bonds and syndicated lending.

Instability conditions include: 

  • Banks and other lenders have insufficient balance sheet capacity to meet borrowing demands in bad times due, for example, to unstable funding sources or an inadequate supply of loss-absorbing equity. 
  • Provision of underwriting services for primary markets is fragile or secondary markets provide unreliable exit opportunities for investors. 
  • Liquidity in core markets is fragile, for example because they are dominated by investor bases that take concentrated, correlated or leveraged positions which they seek to exit during stress. 
  • Excess supply of credit relative to the ability of firms and households to service debts, resulting in a debt overhang that drives cuts in spending during stress. 

Inefficiencies include: 

  • Bias towards less productive uses of finance (eg property lending or buyouts), due to an over-reliance on collateral or lack of ability to assess risk.
  • Inadequate infrastructure to support investor interest in secondary markets for corporate bonds and loans, such as reliable ratings and market-making services. 
  • Barriers to the provision of venture capital and private equity financing, such as complex regulatory and compliance requirements and a limited, non-diverse, investor base. 

Hedging and insurance

This category includes derivatives that businesses use to hedge costs and risks and general insurance that is accessed by both firms and households. Instability conditions include: 

  • Derivatives providers or central counterparties lack models for assessing complex risks, mechanisms for matching and managing them, or sufficient balance sheet capacity to bear losses and meet obligations. 
  • Margins required by derivative providers and/or central counterparties are procyclical, exposing derivative users to excessive liquidity risk. 
  • Insurers have insufficient balance sheet capacity to meet demand during stress, due to an inadequate supply of loss-absorbing equity or inefficient pooling of risk. 
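The margin procyclicality condition above can be made concrete with a stylised sketch. Initial margin is modelled here as a simple multiple of trailing volatility – a rough stand-in for a VaR-style model; the multiplier and numbers are assumptions, not any CCP’s actual methodology. The point is that margin demands rise exactly when volatility spikes, generating liquidity calls on derivative users at the worst time.

```python
# Stylised illustration of procyclical, volatility-based initial margin.
# The multiplier and input values are illustrative assumptions only.
def initial_margin(position_value, trailing_vol, multiplier=3.0):
    """Margin requirement as a multiple of trailing daily volatility."""
    return multiplier * trailing_vol * position_value

calm = initial_margin(1_000_000, 0.01)    # 1% daily vol in calm markets
stress = initial_margin(1_000_000, 0.03)  # vol triples in a stress
margin_call = stress - calm               # extra liquidity demanded in the stress
```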

Inefficiencies include: 

  • Mispricing of risk due to the presence of speculative and/or unsophisticated investors in the market or excessively high costs of regulation. 
  • Lack of education, particularly for small and medium-sized businesses, as to the hedging benefits of derivatives and how to access them. 
  • Bias towards insurance products that are diversifiable versus more bespoke products, due to a reliance on pooled risk models or lack of re-insurance opportunities. 

Long-term savings vehicles 

This category includes non-deposit savings vehicles used to generate investment returns for both firms and households and to provide retirement income. Instability conditions include: 

  • Fixed return scheme providers use leverage to match liabilities and manage the risks of doing so in a procyclical way.

Inefficiencies include: 

  • Inadequate matching of investments to investor preferences, due to overly restrictive mandates or lack of expertise on the part of fund managers. 
  • Limited access for retail investors to products meeting preferences due to minimum participation limits or concentration of products marketed by retail brokers. 
  • Lack of education as to the benefits for retirement income, and hence future consumption, of fixed return schemes. 
  • Inadequate matching of fixed return scheme liabilities due to insufficient supply of long-dated assets or a lack of expertise on the part of scheme managers. 

Liquidity and payment services 

This category includes deposit services, revolving credit facilities and payments services that firms and households access to smooth cashflows and facilitate spending. Instability conditions include: 

  • Non-bank private sector money issuers lack mechanisms to manage cashflows, including meeting outflows on demand, so that people may lose trust in the money they use.
  • Revolving credit providers lack mechanisms for managing cashflows and balance sheet capacity to meet borrowing demands in bad times. 
  • Infrastructure to meet preferences in making payments and to ensure people have trust in payments services is fragile. 

Inefficiencies include: 

  • Barriers to entry to new forms of money that are more efficient and/or offer greater financial inclusion, due either to the monopoly power of existing payment providers or the restrictive regulation of new types of payment provision. 
  • Under-pricing of settlement risk, deterring existing or new payment providers from serving riskier customer types and marketplaces. 

Approach 2: Financial stability outcome indicators 

Desirable properties of financial stability outcome indicators 

Ideally, financial stability outcome indicators would be clearly related to welfare, amenable to policy control and capable of being reliably and simply measured.footnote [44] These three properties are discussed in turn. 

The role of financial stability policy is to reduce the social costs of financial instability. A financial stability outcome indicator which is clearly related to welfare would therefore reliably signal when such social costs are high and when they are low.

Objectives should be achievable. From this it follows that policymakers ought to be able to achieve success as measured by the outcome indicator. In other words, policymakers should be able to control the indicator, even if that control is imprecise and subject to lags.

Clarifying the interpretation of FPC objectives and supporting policy design and evaluation would be best served by outcome indicators which are easy to understand, transparent and verifiable. Such indicators would enable straightforward comparisons across policy proposals and over time.

Assessing whether or not an indicator has these properties requires judgement. For example, different people (or different models) may well come to different conclusions about how well an indicator signals when financial frictions are particularly damaging to welfare, or whether policy can control the indicator.

It may prove challenging to find quantitative indicators which policymakers are confident possess all three of these desirable properties. It is therefore also worth exploring qualitative benchmarks, which are less easily measurable, but may be more robustly linked to welfare. 

Some illustrative examples 

Indicators relating to payments services are likely to concern the trust people have in privately issued money balances and the operation of payments systems. Regardless of the source, disruptions to the safety of money balances and the transfer of funds – whether between individuals or institutions, or between forms of money – can be highly costly, given how central payments are to economic activity.

Accordingly, a qualitative benchmark for the safety of money balances could be trust in money and a quantitative one could be a fixed exchange rate between different forms of money.footnote [45] This approach builds on the concept of the singleness of money.footnote [46] 

Likewise, a qualitative benchmark for payments transactions could be how well the payments infrastructure is operating. An example of a quantitative outcome indicator might be the proportion of payments completed as promised by the payments service provider. This approach aligns with the spirit of the impact tolerances for critical payments already set out by the FPC.footnote [47]

Outcome indicators relating to credit intermediation are likely to concern the terms on which households and businesses can access credit. Unlike in the case of payments reliability, some variation in these terms reflects the credit system reacting efficiently to fundamentals. Ideally, there would be no inefficient fluctuations in the cost of credit intermediation (such as those resulting from amplification through fire sales). A qualitative benchmark could therefore be the gap between the actual and the efficient cost of credit intermediation. An example of a narrow quantitative measure could be the spread between corporate and government bond yields. Because it is so hard to measure the cost of credit intermediation in bank loan markets, the best quantitative measures there may be partial. One such example is the spread between bank and government funding costs.
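The narrow quantitative measure mentioned above – the spread between corporate and government bond yields – amounts to a simple calculation. The sketch below uses invented yield figures purely for illustration; a real indicator would need maturity-matched yields and careful adjustment for fundamentals.

```python
# Sketch of the corporate-government bond spread as a (partial) proxy for
# the cost of credit intermediation. All yield figures are illustrative.
def credit_spread_bp(corporate_yield, government_yield):
    """Spread, in basis points, between corporate and government bond
    yields of matched maturity."""
    return (corporate_yield - government_yield) * 10_000

normal = credit_spread_bp(0.052, 0.040)  # eg 5.2% vs 4.0% in normal times
stress = credit_spread_bp(0.075, 0.042)  # spreads widen in a stress
```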

Outcome indicators relating to other intermediation including hedging and insurance are again likely to concern the terms on which households and business can access these services. 

Other considerations 

Work on outcome indicators would need to consider a much wider set of factors than those raised in this box. Examples include the number of outcome indicators and their relative importance. Furthermore, the work would need to incorporate financial system inefficiencies in the steady state (secondary objective), risks to the public finances,footnote [48] the social costs of amplification relating to over-indebted households and non-financial businesses, and the international role of the UK financial sector.

3: Risk assessment (and policy evaluation)

The purpose of risk assessment is to quantify how financial frictions and behaviours transmit and amplify (together, ‘propagate’) shocks to the real economy and ultimately economic welfare. In thinking about how to enhance the existing framework it is useful to start with the features of the ideal risk assessment framework:

  • Includes all financial frictions, behaviours and vulnerabilities at the entity- and system-level relevant for financial stability.
  • Captures how these frictions, behaviours and vulnerabilities vary across time, horizons, and system states, such as stressed periods. For example, entities may amplify shocks more when their shock-absorbing capacity is already depleted. An ideal framework would reflect these dynamics, including by identifying contingent risks.
  • Captures the general equilibrium consequences of shocks, including the (two-way) interaction between different frictions, vulnerabilities and propagation mechanisms and their macroeconomic effect.
  • Is able to assess the impacts of different types of shock drawn from well-defined probability distributions, as well as to capture endogenous financial cycles. This includes shocks that are both exogenous and endogenous to the financial system (eg Covid and the GFC respectively).
  • Quantifies propagation mechanisms in an empirically-accurate way to enable scaling and ranking of risks.

A framework incorporating these elements would enable consistent tracking of systemic risks across time and scenarios. It would make it easier to monitor the impact of changes in vulnerabilities – such as bank or corporate leverage – on overall risk. It would also help identify which frictions and vulnerabilities matter most, supporting prioritisation.footnote [49]

In its purest form, the ideal risk assessment framework is not achievable in practice. It would require a fully specified, empirically grounded, system-wide general equilibrium ‘macrofinancial’ model (‘macrofinancial’ here referring to the interaction between the macroeconomy and the financial system), capturing all relevant markets, institutions, services and real-economy linkages. Such a comprehensive ‘transmission framework’ is well beyond current modelling and data capabilities.

Nonetheless, meaningful progress toward something resembling the ideal is possible. In this context, this section discusses key challenges in systemic risk assessment and suggests enhancements that could help overcome them. Section 3.1 briefly summarises the Bank’s current approach to financial stability risk assessment. Sections 3.2 to 3.4 consider possible enhancements in risk mapping, scenario analysis and indicators respectively. Section 3.5 discusses supplementary approaches as an alternative to quantitative analysis. Section 3.6 covers models for policy analysis. Section 3.7 summarises and brings together proposals for further work.

3.1: The Bank’s current approach to financial stability risk assessment

The broad contours of the financial stability risk framework at the Bank of England are similar to those used by other financial stability policymakers (Figure 3). The FPC has most recently applied this framework to MBF and operational resilience. In doing so, consistent with the FPC’s primary task to identify, monitor and remove or reduce systemic risks, the focus has been on identifying vulnerabilities and the system’s resilience to them (left-hand and central sections of Figure 3). Work is ongoing to further develop the understanding of i) the effect of systemic risks crystallising on the real economy and ii) feedbacks between the financial system and real economy (right-hand section of Figure 3).

Figure 3: Narrative framework for identification and monitoring of systemic risk

Figure 3 illustrates, at a high level, the narrative framework for identification and monitoring of systemic risk used at the Bank.  It shows a series of boxes connected by arrows flowing from left to right.  These arrows represent transmission channels. The first box at the left-hand of the figure is labelled triggers.  This flows rightwards to financial conditions, which itself flows to vulnerabilities and resilience, which are shown as an upper and lower layer of the same box.  This box then flows to the ultimate impact on the real economy.  The Figure also shows with a dotted line macro-financial feedbacks between the boxes for financial conditions, vulnerabilities and resilience and impact on the real economy.

This focus on vulnerabilities and resilience also applies in routine monitoring of financial stability risks via a set of indicators that contain information about the current state of the financial system.footnote [50] These include, for example, the credit to GDP ratio, private non-financial credit growth, corporate bond spreads, the tier 1 capital ratio of major UK banks, and the share of new residential mortgages with high LTVs. Many of the indicators in the set have been shown to be empirically associated with the probability of financial crises occurring and/or with the depth of recessions and the speed of recoveries.footnote [51]
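The construction of one such indicator – a credit-to-GDP gap – can be sketched as follows. For simplicity the trend here is a trailing moving average; the Basel-style gap that typically features in monitoring instead uses a one-sided HP filter, so this is an illustrative stand-in only, with invented data.

```python
# Sketch of a credit-to-GDP gap indicator: deviation of the credit-to-GDP
# ratio from trend. Trailing mean used as a simplified stand-in for the
# one-sided HP filter; data below are stylised, not actual UK series.
def credit_to_gdp_gap(credit, gdp, window=8):
    """Return the gap series (percentage points) vs a trailing-mean trend."""
    ratio = [100.0 * c / g for c, g in zip(credit, gdp)]
    gaps = []
    for t in range(len(ratio)):
        start = max(0, t - window + 1)
        trend = sum(ratio[start:t + 1]) / (t + 1 - start)
        gaps.append(ratio[t] - trend)
    return gaps

# Stylised boom: credit grows faster than GDP, so the gap opens up.
gdp = [100 + 0.5 * t for t in range(12)]
credit = [120 + 2.5 * t for t in range(12)]
gaps = credit_to_gdp_gap(credit, gdp)
```

A widening positive gap of this kind is one of the signals that has been found to be empirically associated with subsequent crises.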

In parallel, Bank staff have developed models and exercises to quantify systemic risks in particular parts of the financial system. One example is stress testing. The Bank Capital Stress Test (BCST, formerly the Annual Cyclical Scenario), assesses whether UK banks hold sufficient capital to maintain lending to the real economy during severe downturns.footnote [52] It is supported by several models/frameworks, encompassing both disaggregated asset-level impairment models and top-down ‘ready reckoners.’ Those models can also be combined with expert judgement to conduct desk-based variants of the BCST without participating banks submitting their own stressed projections, as was done in 2024. The Bank also conducts stress tests for insurers and central counterparties (CCPs).footnote [53] Similar to the BCST, the objective of these tests is to assess sector and firm resilience to defined stress scenarios.

Staff have extended this approach to MBF as risks there have grown. This includes the Bank’s recent system-wide exploratory scenario (SWES).footnote [54] The latter was conducted with around 50 industry participants – including banks, insurers, CCPs, hedge funds, asset managers, and pension funds. It aimed to test the resilience of core UK markets to non-bank and bank behaviour during market stresses.footnote [55] Like the BCST, the SWES offers valuable granular insights into exposures, interconnections, and behaviours that would be difficult to obtain otherwise. These efforts are supported by a wide range of data sources, including macroeconomic statistics, regulatory, financial market and transaction data, and firm and loan-level datasets.

These activities are a core element of the Bank’s risk assessment capabilities. They could be usefully supplemented with further work – some of which is already in train – to articulate how: i) frictions and behaviours drive vulnerabilities and ii) vulnerabilities propagate shocks to the real economy. This would help make the risk assessment framework (even) more system-wide and help in refining policy strategy. The following sections suggest ways this work could be taken forward, grouped broadly into mapping, modelling, and indicators.

3.2: Possible enhancements to financial system mapping for risk assessment

This section discusses developing mapping of the key components and features of the financial system relevant to financial stability policy. This exercise aims to advance a structured framework for assessing financial stability risks, building on existing and ongoing work. The approach treats the full system map as an ‘atlas,’ from which targeted maps – such as for core markets or major exposures – can be drawn based on specific risks or policy questions. Box F sets out an example of how this mapping approach could be applied in the analysis of risks from private equity in comparison to LDI funds.

3.2.1: Overview of the mapping framework

Figure 4 sets out a high-level schematic of the mapping framework. It shows four major ‘nodes’ of the system.

Starting from the real economy (right-hand side of Figure 4), households and firms connect to the financial system through the end-user servicesfootnote [56] they use. These take two forms. First, provider-intermediated services delivered by firms – like banks and insurers – that fully intermediate between the ‘supply chain’ for such services and their users. Second, market-intermediated services – like equity issuance – that are delivered through financial markets but typically accessed by users via intermediaries like investment banks. The end-user services themselves also rely on intermediate services (for example, payments infrastructure) and markets (for example, for funding).

Figure 4: High-level schematic of the risk and resilience mapping framework

Figure 4 shows a high-level schematic of the risk and resilience mapping framework.  It shows the four nodes represented by four boxes labelled the real economy, end-user services, intermediate services and markets.  The figure starts on the left-hand side with the intermediate services box at the top and the markets box at the bottom. The figure shows how service disruption can transmit between markets and intermediate services, denoted by a two-headed arrow between these two boxes.  Examples of service disruption transmitted from markets to intermediate services shown include disruption resulting from losses and higher funding costs. Examples of service disruption transmitted from intermediate services to markets shown include disruption resulting from input intermediate services such as liquidity supply, and losses in position/asset value.  Arrows then extend from the intermediate services and markets boxes to the end-user services box in the centre of the diagram, showing how these can affect end-user services.  Examples of service disruption shown include those arising from losses, higher funding costs and disruption to services inputting into end-user services.  These arrows are double-headed, illustrating that disruption and losses can transmit back from end-user services to intermediate services and markets.  Finally, and moving to the right-hand side, the figure shows transmission of the effects of service disruption from end-user services to the real economy, both from and through end-user services.  This includes disruptions and amplifiers originating from markets that reach end-users through market-intermediated end-user services. The effects on the real economy shown include service disruption, such as outages, and real economic amplifiers, such as the effects of high levels of indebtedness. 
The figure also shows that disruption and losses can flow back again from the real economy to end-user services and markets via market-intermediated services, denoted by double-headed arrows flowing between them.

There are two relevant dimensions to the propagation of shocks. First, shocks can affect the terms or availability of service supply (the service channel) or be transmitted through financial channels, including, for example, financial market losses or demands for liquidity such as collateral calls (the financial channel). Second, propagation can be direct, for example if the availability of services is impacted by a cyber attack, or indirect, for example if higher funding costs for banks result in them charging higher lending rates to end users. And the real economy can itself feed shocks back to the financial system, for example to lenders through losses on lending.footnote [57]

Systemic risk manifests in relation to the real economy in two simultaneous but interdependent ways:

  • ‘Service disruption’: both outright interruption of services and other deterioration in their availability and pricing (such as unduly procyclical provision).
  • ‘Real economy amplifiers’: household and firm vulnerabilities or constraints that cause them to amplify economic shocks and which arise from – or are worsened by – the way that financial services have been/are provided to them. For example, tighter credit constraints and higher initial debt servicing costs may mean that more indebted borrowers cut back their spending by more when economic shocks hit, thereby amplifying those shocks.footnote [58]

Both financial service disruption and real economy amplifiers matter because they affect welfare via their effects on consumption and production. Consistent with that, the following discussion of how to operationalise the mapping framework starts (in Section 3.2.2) with mapping propagation to the real economy. It then traces propagation ‘backwards’ (Sections 3.2.3 to 3.2.5) to the mechanisms and frictions that drive it. This ‘real economy first’ approach ensures that risk assessment remains grounded in its ultimate purpose.

3.2.2: Mapping to the real economy

The first step in mapping the propagation of shocks to the real economy is to identify touchpoints of the financial system with real economic activity (Breeden (2024)). These can be broadly grouped by type of service provided: i) finance and credit, ii) hedging and insurance, iii) long-term savings vehicles and iv) liquidity and payments services. These services are vital to the functioning of the real economy. They ensure that households and businesses can make transactions and manage and take risks – for example, through payments, credit card, mortgage and insurance services for households. They facilitate corporate financing activity across a range of forms – bank lending, commercial paper, the bond market, leveraged lending, private debt, private equity and venture capital. They also enable firms to insure against and hedge risks, for example through the use of derivatives to hedge commodity, interest rate and currency risks. Further, financial services facilitate the financing of the Government.

It follows from the importance of these services to end-users that the financial markets and ‘intermediate’ services supporting their provision matter too. For example, the gilt market – and the associated repo market – underpins a wide set of other transactions, through its role in pricing risk-free assets and in helping liquidity to flow around the system. That ultimately supports the provision of services that households and businesses use to borrow, save, invest, make payments and insure themselves against shocks.

The second step in mapping propagation to the real economy is to compile a library of metrics that help to size the costs of propagation at these touchpoints (ie in terms of real economic impacts). This will reflect, in part, the systemic importance of the touchpoints (Box D).

Mapping the costs of propagation presents several challenges.

Mapping and sizing propagation costs is difficult if shocks (and propagation) occur rarely. For instance, while digitalisation and interconnectedness have increased operational risks, no associated disruptions have yet caused significant macroeconomic harm. Of course, it would be wrong to take from this that the risk is necessarily low. But it does mean that there are no episodes on which to base an assessment of its potential effects.

Empirical evidence is incomplete, often missing key propagation channels. Work to fill these gaps is discussed elsewhere in this section and in Section 4. But some gaps will remain. The use of models of the sort discussed in Section 3.3 can help here.

And it can be difficult to compare metrics that size: i) propagation channels and ii) their impacts on the real economy. For an example of the former, it is unclear how a shift in the Excess Bond Premiumfootnote [59] of Gilchrist and Zakrajšek (2012) would compare to a similar shift in bank credit supply. While standardising metrics – eg in terms of standard deviations – could improve consistency, precise comparability requires the more integrated modelling of the sort discussed in the next section. For an example of the latter, estimates of the economic impact of propagation are often – naturally – expressed in varying units (eg consumption, investment), complicating comparison.

The proposals in Section 2 to conduct research and analysis that works towards an approximately welfare-consistent financial stability loss function are relevant here. These could help to start framing propagation and its effects in a more common way (in the context of possible precursors to loss function terms). Expressing propagation effects in common units would aid comparison across propagation mechanisms and over time and so support prioritisation. This is important given limited resources for monitoring and mitigation.footnote [60] Converting estimates into GDP terms, though argued to be a poor proxy for overall welfare,footnote [61] could be a practical way to consistently compare propagation channels.
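As a purely stylised illustration of the standardisation idea – not a description of any Bank methodology, and with entirely hypothetical metric histories – expressing each metric in standard deviations from its own history puts otherwise incomparable measures on a common scale:

```python
import statistics

def standardise(history, latest):
    """Express the latest reading in standard deviations from its own history."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return (latest - mu) / sigma

# Entirely hypothetical metric histories: a bond-spread measure (percentage
# points) and a bank credit supply index (index points).
excess_bond_premium_history = [0.2, 0.3, 0.1, 0.4, 0.2, 0.3]
credit_supply_index_history = [101, 99, 103, 100, 98, 102]

# Both readings are now in the same units (standard deviations), so the
# relative abnormality of the two moves can be compared directly.
ebp_move = standardise(excess_bond_premium_history, 0.9)
credit_move = standardise(credit_supply_index_history, 92)
```

Standardisation of this kind supports ranking which channel is moving most abnormally, but, as noted above, it does not deliver the precise comparability that more integrated modelling would.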

3.2.3: Describing key nodes in the supply chain of service provision to end users

Figure 4 shows the four key financial system nodes discussed in Section 3.2.2: markets, end-user and intermediate financial services, and the real economy. More detailed maps can disaggregate these nodes into their constituent entities (such as sectors or firms) or activities, as needed. The appropriate level of disaggregation depends on: i) the diversity of constituents in the node and its relevance for systemic risk and ii) the nature of the shock being considered. For example, shocks to services that are universally used – like payment systems – are likely to affect node constituents more uniformly. Table B1.A in the annex outlines a taxonomy for node mapping.

The node mapping exercise could be formalised through a network model, though data limitations constrain its completeness. Some gaps may be addressed using external sources (Section 4) or through exercises like the SWES, which offer valuable granular insights into exposures and interconnections that would be difficult to obtain otherwise. Existing Bank and regulatory datasets also offer significant potential. For example, for a granular view of how securities and liquidity flow through and across financial markets, it would be desirable to link participating entities by combining the Securities Financing Transactions Repository, MiFID II and EMIR datasets to create a network map of securities, gilt repo and derivatives markets.
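To make the idea concrete, a minimal sketch of such a network map follows. The records, field names, entities and notionals are all invented for illustration; linked regulatory transaction data would in practice be far richer and messier:

```python
from collections import defaultdict

# Toy transaction records standing in for linked SFTR/MiFID II/EMIR data.
# All field names, entities and notionals are invented for illustration.
transactions = [
    {"lender": "Bank A", "borrower": "Fund X", "market": "gilt repo", "notional": 50},
    {"lender": "Bank A", "borrower": "Fund Y", "market": "gilt repo", "notional": 30},
    {"lender": "Bank B", "borrower": "Fund X", "market": "derivatives", "notional": 20},
]

def build_network(records):
    """Aggregate bilateral transactions into directed, weighted edges per market."""
    edges = defaultdict(float)
    for r in records:
        edges[(r["lender"], r["borrower"], r["market"])] += r["notional"]
    return dict(edges)

network = build_network(transactions)

# A first-pass interconnectedness measure: each entity's gross exposure.
exposure = defaultdict(float)
for (lender, borrower, market), notional in network.items():
    exposure[lender] += notional
    exposure[borrower] += notional
```

Even this crude edge-and-exposure view illustrates how linked datasets could reveal which entities sit at the centre of several markets at once.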

For banking, market infrastructure, and insurance, mapping will mainly draw on existing supervisory information. However, supplementary sources, such as market intelligence and qualitative data, including on how these services interact with the real economy, remain valuable.

Supplementary sources matter even more for non-banks and MBF, where vulnerabilities and their real economic effects are less well charted. The use of SWES-style exercises to build and test understanding of the system is particularly valuable here. And given the system’s evolving nature, incorporating horizon scanning and industry insights is essential to keep the risk map current and responsive.

3.2.4: Vulnerabilities

The next part of the mapping exercise is to map vulnerabilities. For this purpose, a distinction can be drawn between ‘microfinancial’ vulnerabilities (financial and operational susceptibility to shocks at an entity level) and ‘macrofinancial’ vulnerabilities (‘topological’ features of the system and real economy that broaden the incidence and impact of microfinancial vulnerabilities). Box E sets out further detail.

One way of characterising micro versus macrofinancial vulnerabilities is to distinguish between the ‘depth’ and ‘width’ of vulnerabilities.

  • Depth describes the severity of microfinancial vulnerabilities. That is, the degree of sensitivity to a shock at any given entity/node. For example, leverage can deepen vulnerability to shocks, by reducing entities’ ability to absorb losses.
  • Width describes how broadly these microfinancial vulnerabilities and their impacts are spread across the system by macrofinancial vulnerabilities. For example, shocks in core markets that much of the system is connected to, like gilt markets, can have very broad effects.

Some financial system features can drive both depth and width. For example, leverage creates microfinancial vulnerabilities such as refinancing dependencies. But it can also drive macrofinancial vulnerabilities. For example, by creating or increasing interconnections between borrowers and lenders.

Other features might instead trade off depth and width, particularly once behaviour is overlaid. For instance, entities’ attempts to reduce the impact of microfinancial vulnerabilities may exacerbate the impact of macrofinancial vulnerabilities: firms may sell assets or raise margin requirements on counterparties to protect themselves during stress. Though rational individually, such actions can amplify systemic risks, for example by driving asset price declines and liquidity strains.

3.2.5: Identifying how risks might manifest

Having mapped the nodes and key vulnerabilities, the map can then be used to identify how risks might manifest.

In doing this, the map will need to account for how propagation might change in different conditions. For example, due to non-linearities and changes in vulnerabilities in stressed conditions, such as contingent exposures and loss of entity resilience.

The map will also need to integrate international considerations. This presents several challenges. While the approach outlined here is theoretically location-agnostic, in practice, geography affects vulnerabilities and real economy impacts. For example, where cross-border considerations introduce different risks and/or involve different financial activities. This is especially relevant to the UK, which is an open economy with a large financial sector, and therefore particularly exposed to global developments. This applies to both the real economy (such as foreign exchange risk and related hedging) and the financial system (eg cross-border risk transfer and booking models). Another challenge is quantifying how global financial activity contributes to UK systemic risk.footnote [62]

In addition to propagation channels, behavioural responses to them (and the distribution of such responses) also play a critical role in determining how shocks affect the real economy. The modelling of behaviour is discussed in Section 3.3.

3.3: Possible enhancements to scenario analysis

Scenario analysis – such as the BCST, SWES and CCP and insurance stress-testing set out in Section 3.1 – is a key tool in the UK’s risk assessment framework. These exercises draw on and complement the mapping exercise by helping to assess propagation channels in the context of specific shocks or types of shock. They can also be used to account for behavioural responses, using various methods (discussed further later in this section). These can range from simple rules of thumb or assumptions about behaviour, to industry/firm information on their likely reactions to specified events, to complex models that attempt to predict behavioural reactions.

In general, any one stress-testing exercise – while very valuable for bringing different elements of the propagation of shocks together – can be resource-intensive and will not fully capture feedback loops between the financial system and the real economy.

For both these reasons, it is desirable to supplement stress tests with desk-based capability that can be used for scenario analysis in a more nimble and flexible way.

3.3.1: Developing desk-based scenario analysis capability

Scenario analysis is a practical and widely used approach to risk assessment. It can enable ‘joining of the dots’ between different sectors and facilitates ‘end-to-end’ analysis from shocks to their ultimate impact on the real economy. The more specific functions of scenario analysis include:

  • Scenario impacts: quantification of financial propagation to the real economy in a set of adverse scenarios, facilitating comparison both between different scenarios and of the same scenario over time.
  • Counterfactual analysis: quantification of scenario impacts in hypothetical states of the world. This enhances understanding of the effects of specific vulnerabilities and helps identify thresholds at which they become significant.footnote [63]
  • Sensitivity analysis: quantification of the effect of changing one or more assumptions about the behaviour of one or more agents (such as the impact of sudden deleveraging by hedge funds). This will pick up non-linearities.
  • Reverse scenario analysis: starting from a targeted degree of impact, this supports analysis of the conditions – including combinations of vulnerabilities and behaviours – under which such impacts might arise.
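The sensitivity analysis function can be illustrated with a deliberately toy fire-sale loop (every parameter below is invented): sweeping a single behavioural assumption – the propensity to sell into falling prices – shows how amplification rises non-linearly:

```python
def amplified_impact(initial_shock, sale_propensity, price_impact, rounds=50):
    """Iterate a stylised fire-sale loop: losses trigger sales, sales move
    prices, and price moves create further losses. All parameters invented."""
    total = initial_shock
    extra = initial_shock
    for _ in range(rounds):
        extra *= sale_propensity * price_impact
        total += extra
    return total

# Sensitivity analysis: sweep the behavioural assumption (sale propensity)
# while holding the shock and price-impact coefficient fixed.
results = {k: amplified_impact(1.0, k, 0.9) for k in (0.2, 0.5, 0.8)}
```

The amplified impact converges towards 1/(1 - k*0.9), so equal steps in the behavioural assumption produce increasingly large steps in impact – precisely the kind of non-linearity the sensitivity analysis bullet refers to.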

In practice, the effectiveness of scenario analysis depends on its design – particularly the scenarios (Box G on scenario design) and inputs used. The latter can be grouped into two broad categories:

  • Models of the system and behaviours, where these exist or can be efficiently developed.
  • A modular approach. This combines a range of inputs, which can include models as well as other quantitative/qualitative evidence and assumptions about vulnerabilities and behaviours.

Whichever approach is used, applying an ‘Occam’s razor’ principle suggests two things. First, each approach should include only the financial propagation mechanisms relevant to the specific scenario. Only certain vulnerabilities are relevant to each shock. For instance, insurers are typically much less exposed to refinancing risk than highly leveraged firms. Second, system-wide approaches should focus on propagation channels with significant real-economy impact. These points support using multiple targeted models, alongside at least one system-wide model that captures propagation across key sectors in stresses or downturns.

In all cases, scenario analysis will depend on timely, granular data and on behavioural assumptions, which may not accurately describe the real world. And as the financial system evolves, both model structures and behavioural assumptions can become less accurate.

3.3.2: Using models for scenario analysis

Designing an effective modelling approach for scenario analysis involves the usual challenges in applied model design. Feasibility constraints mean not all financial amplification mechanisms can be captured in a single model. And in some areas, there is insufficient knowledge for detailed modelling without significant research. Even within the feasible set of models, there are the usual trade-offs between model size and tractability, and between system-wide coverage and detail.

Recent years have seen major advances in models for financial stability scenario analysis (Aikman et al (2023a)). This new generation of models goes beyond first-round effects by incorporating institutions’ behavioural responses and their systemic impacts. Aikman et al (2023a) identify three key drivers of shock amplification: i) the size of firms’ capital and liquidity buffers; ii) their asset liquidation strategies under stress; and iii) how firms are interconnected. Effective models must therefore include assumptions about firm responses to actual (and ideally, expected) changes to cash flow, profit, and balance sheets and capture system interconnections.footnote [64]

These considerations effectively narrow the choice of model type down to semi-structural models, network models (with some added behavioural assumptions), and agent-based models. Table 3.A sets out the pros and cons of these model types for use in financial stability scenario analysis, as well as some example models from the literature.footnote [65]

Table 3.A: Pros and cons of candidate alternative model types for financial stability scenario analysis

  • Semi-structural. Pros: flexible and adaptable; capable of good empirical fit. Cons: empirical identification (given many equations and parameters); not well suited to granular modelling; usually (near) linear. Examples: Budnik et al (2020); Catalan and Hoffmaister (2022).
  • Network. Pros: readily captures granular information and interconnections between financial (and non-financial) entities (though lack of data availability can reduce these benefits). Cons: requires behavioural assumptions to be added; not dynamic (without the addition of dynamic behavioural assumptions). Examples: Covi and Hüser (2024); Hüser et al (2024); Sydow et al (2024).
  • Agent-based. Pros: well designed to capture heterogeneity and granular information; behavioural responses embedded in model design. Cons: computationally expensive; needs careful justification of ad hoc behavioural assumptions; system-wide estimation challenging. Examples: Bardoscia et al (2024); Liu et al (2020).

Footnotes

  • Table 9 from Aikman et al (2023a) and the text therein provide further discussion of semi-structural and network models.

Based on scenario design considerations, the existing modelling landscape, and the practical issues discussed here, there are two areas in which investment in modelling for scenario analysis would be valuable:

  1. Further development of Covi and Hüser (2024) for quantifying macroeconomic recessionary scenarios. This microstructural stress-testing model simulates the impact of rising corporate defaults on bank and insurer capital using a network model. It incorporates amplification through fire sales and solvency contagion. Rather than producing just point estimates, the methodology produces full distributions of profit, loss, and capital outcomes. The model provides a principled method for modelling how the complex interactions between banks and non-banks shape shock propagation in stresses.footnote [66] Recent in-house extensions include a banking liquidity channel and exploratory work on adding investment funds (via the Lipper TASS dataset).footnote [67] In practical terms, starting with the Covi and Hüser model is desirable because it is UK-calibrated and already in-house. This lowers the cost of implementation, extensions and data updates.
  2. Development of system-wide models for financial market stress scenarios to include:
  • Development of a desk-based, system-wide model using regulatory data and SWES insights to simulate market price shocks and their amplification via fire sales. This can be viewed as a more granular extension of the representative-agent system-wide model of Aikman et al (2019), which captures key players in the market in the spirit of the Duffie (2011) ‘10-by-10-by-10’ proposals.footnote [68] This will allow for scenario analysis – more quickly and at lower cost than a full SWES exercise – as the financial system and risk-taking behaviours evolve. For example, during the episode of market volatility in April 2025, Bank staff used an early version of the model to estimate the losses LDI funds had incurred and were able to quantify the potential scale of LDI-related activity in real time (as described in Box D of the July 2025 Financial Stability Report). Importantly, this desktop modelling approach will rely on continued collaboration with SWES participants to corroborate modelled results and the behavioural assumptions used.
  • Production of a network financial market simulation model with appropriate propagation mechanisms. Building on planned network mapping of entities in securities, gilt repo, and derivatives markets (Section 3.2.3), this model could adopt approaches such as Koijen and Yogo’s (2019) ‘demand system asset pricing’ or a semi-structural method drawing on the post-SWES modelling agenda described above. Relative to that work, this would draw on broader data sources and capture more interconnections. However, it may give an incomplete picture of entities’ balance sheet positions (due to flow-based data) and face tractability challenges given dataset complexity. It remains to be seen whether data limitations could be overcome.

Relative to the ideal framework, however, this would still leave some gaps in capability.

First, neither proposed modelling approach includes a ‘macroeconomic block’, meaning that propagation to and from the real economy is incomplete. Impacts stop at intermediate financial outcomes (eg asset prices), requiring off-model analysis to flesh out empirical links and complete the risk assessment. Feedback loops from the real economy back to the financial system are also omitted.

Adding such propagation to these models is possible in principle,footnote [69] but is likely to be very challenging in practice. Absent a fully-integrated approach, the best alternative is to include models explicitly designed to capture macrofinancial amplification (Box H).

Second, neither of the modelling avenues proposed for building a macrofinancial scenario capability caters for operational shocks. As summarised in Box B of Adeney et al (2024), the literature is largely confined to modelling of cyber events with a US focus. Substantial further work would be needed to develop a UK-focused model capable of quantifying operational risk scenarios. More subtly, broader operational considerations and structural changes (like the adoption of AI) can also affect financial propagation mechanisms. For example, operational issues materially exacerbated the fire sale amplification in gilt markets during the LDI episode (Table 1).footnote [70]

3.3.3: Using a modular approach for scenario analysis

Using a modular approach – as discussed in this section – can help address the two key gaps in the use of modelling for scenario analysis discussed in Section 3.3.2. First, it can provide an end-to-end approach to scenario analysis – from the scenario through the system to real economy impacts. Second, it can enable operational risk analysis, by incorporating information on operational dependencies and behavioural reactions to operational shocks.

This could approximate some benefits of a comprehensive model – like capturing feedback loops and expressing risks in comparable terms. This approach can also blend sophisticated analytics with more ‘rough and ready’ assumptions or estimates where necessary. It can draw on a very wide range of information. This can include:

  • Information and insights from the mapping exercise. As discussed previously, this can draw on a huge range of data including supervisory information and market intelligence.
  • Models of the type discussed in the previous section. These can be combined with other modules that translate the outputs of these models into real economic effects. Different parts of the system could also be connected using system-wide models or by joining the dots between existing different stress testing exercises – banking, insurance, central counterparties, and market-focused SWES scenarios.
  • Firm information on their responses to specified scenarios. This can include fully modelled responses as in stress tests with industry participation. But desk-based versions can also draw on collaboration with industry to validate assumptions, as discussed in the previous section.
  • Systemic risk (and other) indicators discussed later in this section. These can for example enable assumptions to be made about relationships between different parts of the transmission chain, even where a full understanding of the drivers of that relationship is not (yet) possible.
  • Simple assumptions like ‘rules of thumb’ on behaviour and exposures where necessary.

In this spirit, Figure 5 illustrates a modular approach to systemic risk assessment. It starts by mapping micro and macrofinancial vulnerabilities (boxes). Behavioural assumptions (arrows labelled ‘behaviour’) are then overlaid to reflect how entities respond to these vulnerabilities, ‘activating’ the map and enabling assessment of real economy impacts.footnote [71] In the absence of a single unified model, different behavioural ‘modules’ can be linked to relevant parts of the map to simulate similar dynamics.

Figure 5: Stylised illustration of modular approach to systemic risk assessment (a)

Figure 5 presents a stylised illustration of the modular approach to systemic risk assessment. The left-hand side of the figure is structured as a series of interconnected modules, each representing a distinct component of the propagation of shocks to the real economy. The right-hand side summarises very briefly how each of these distinct components can be assessed.
Starting at the top of the figure with the initial shock, this can be specified as a shock, shocks, type of shock or shocks (eg collection of features of a shock), scenario or scenarios.  The next key component is microfinancial vulnerabilities, which can be assessed through mapping. The behavioural reaction to shocks combined with such vulnerabilities is shown as a downward linking arrow and can be assessed using modelling or making assumptions about behaviour.  The breadth of impact of these vulnerabilities and behaviours across the system can then be described by macrofinancial vulnerabilities, which can be assessed through mapping.  The behavioural reaction of provision of vital services to the real economy, and the real economy itself, to the propagation of such shocks is the final connecting arrow.  Again, this can be assessed using modelling or making assumptions about behaviour.   The ultimate impact on the real economy via the provision of vital services can be assessed through mapping the costs of propagation for real economic activity.

Footnotes

  • (a) Amplification could be conceptually illustrated by an increasing size of boxes and arrows moving from top (shock) to bottom (impact on the real economy). The FPC’s mandate is to build resilience/limit exposure such that the financial system absorbs rather than amplifies the shock (reducing the size of the boxes and arrows and so the ultimate effect on the real economy and welfare).

For instance, one module could estimate how fund redemptions respond to shocks using simple regressions or rules of thumb. The outputs of that module (redemption sensitivities) could then calibrate fund behaviour in a system-wide stress model. The resulting price impacts – eg on bonds – could feed into further modules estimating effects on bank lending or collateral values. This modular approach supports practical, step-by-step quantification of shock transmission to the real economy. While it may miss complex interactions captured by more integrated models, it offers a foundation that can be built on over time.
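That chain of modules can be sketched in a few lines. This is a toy pipeline, not any model in use at the Bank, and every elasticity and pass-through number below is an invented placeholder:

```python
# Module 1: rule of thumb mapping a market shock to fund redemptions.
# The elasticity of 2.0 (% of AUM redeemed per 1% price fall) is a placeholder.
def redemptions(shock_pct, sensitivity=2.0):
    return sensitivity * shock_pct

# Module 2: asset sales from redemptions move prices further (placeholder coefficient).
def price_impact(sales_pct, impact_coeff=0.3):
    return impact_coeff * sales_pct

# Module 3: lower collateral values tighten bank lending (placeholder pass-through).
def lending_effect(total_price_fall_pct, passthrough=0.5):
    return passthrough * total_price_fall_pct

shock = 5.0                        # initial 5% price fall
sales = redemptions(shock)         # redemptions as % of AUM
extra_fall = price_impact(sales)   # further price fall, percentage points
credit_tightening = lending_effect(shock + extra_fall)
```

Each module can be upgraded independently – the rule of thumb replaced by a regression, the price-impact step by a system-wide stress model – without redesigning the rest of the chain, which is the practical appeal of the modular approach.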

Further work – some of which is already ongoing – is required to build this mapping. This includes advancing understanding of the criticality of each type of service and, by extension, their providers. Of relevance to this is how ‘systemic’ each service is (Box D). Other work – discussed later in this section – includes work progressing the FPC’s macroprudential approach to operational resilience.

3.4: Additional systemic risk (and more heterodox) indicators

As a complement to the mapping and scenario analysis discussed in the previous sections, it is also desirable to develop a small suite of indicators that measure systemic risk over time. Relative to naturally disaggregated mapping and modelling approaches, these indicators are more aggregate by design and so may facilitate more consistent comparison over time. Importantly, they do not always rely on an understanding of the exact mapping of vulnerabilities and behaviour to systemic risk. For these reasons, systemic risk indicators provide a natural ‘top-down’ cross-check on conclusions drawn from more disaggregated or ‘bottom-up’ risk assessments.

Examples of potentially useful systemic risk indicators – in addition to indicators already in use at the Bank – include:

  • A Composite Indicator of Systemic Stress (CISS), which uses standard portfolio theory to aggregate five market-specific indices created from 15 separate financial stress measures (Kremer et al (2012)).
  • Asset-price based indicators of systemic risk, like SRISK (Brownlees and Engle (2017)), the Excess Bond Premium (Gilchrist and Zakrajšek (2012)), MES (Acharya et al (2017)), and crash risk (Martin and Shi (2023)).
  • A measure of financial market risk perceptions shown to be associated with macroeconomic outcomes (Pflueger et al. (2020)).
  • A credit market sentiment index comprising economic activity and credit market sentiment factors, and the probability that the economy is in an adverse state, constructed by the Federal Reserve Bank of Richmond.footnote [72]
  • A measure of aggregate leverage in financial markets combined with a stability condition which can be used by policymakers as an early warning indicator (Adrian, Borowiecki and Tepper (2022)).
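The portfolio-theoretic aggregation behind CISS-style indicators can be illustrated with a deliberately simplified sketch. The weights and correlations below are fixed and invented (the actual methodology uses time-varying cross-correlations estimated from the data); the point is only that simultaneous stress across markets scores higher than stress concentrated in one market:

```python
# CISS-style aggregation: weighted sub-indices combined as in portfolio
# theory, so stress hitting several markets at once scores higher.
# Weights and correlations are fixed and invented for illustration.
def ciss(stress, weights, corr):
    v = [w * s for w, s in zip(weights, stress)]
    n = len(v)
    return sum(v[i] * corr[i][j] * v[j] for i in range(n) for j in range(n))

weights = [0.2] * 5  # five market segments, equal weights
corr = [[1.0 if i == j else 0.8 for j in range(5)] for i in range(5)]

one_market_stressed = ciss([0.9, 0.1, 0.1, 0.1, 0.1], weights, corr)
all_markets_stressed = ciss([0.5] * 5, weights, corr)
```

With high cross-correlations, moderate stress in every market produces a higher reading than severe stress in a single market – the systemic, rather than idiosyncratic, signal such indicators are designed to capture.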

Consideration should be given to the extent to which individual indicators:

  • Provide new information relative to other approaches and indicators. Most valuable are those indicators that shed light on areas where data gaps and challenges to mapping/modelling are greater, such as for non-banks and MBF.
  • Predict the build-up of risk, as opposed to summarising the current state of the system. For example, there is evidence that some of the asset-price based indicators referenced here have predictive power (Acharya et al (2024)). By contrast, the CISS metric is more of a summary statistic of current conditions.
  • Distinguish between effects driven by i) financial frictions and ii) fundamentals. For example, rising corporate bond spreads due to weaker business conditions do not indicate financial propagation, whereas those driven by bond fire sales do.footnote [73] Broader Bank research on how to distinguish between such effects includes Banks et al (2024). This looks at the extent to which changes in bank lending are unwarranted given changes in macroeconomic conditions.footnote [74] Extending this analysis to include non-banks/MBF would be a valuable next step.footnote [75]

Alternative ways of thinking about the financial system may also help summarise its current state. For example, the ecological network approach described in Ulanowicz (2020) could be applied to analysis of market stability, recognising that systems can be inherently unstable and sustainability requires a balance between efficiency and flexibility. This approach could be applied to the proposed network modelling of core UK financial markets described earlier. More broadly, adapting stability metrics from the natural sciences, as suggested by Haldane and Turrell (2017), warrants further exploration.

3.5: Broader limitations to quantitative analysis and supplementary approaches

Implementing the proposals discussed in preceding sections would materially improve the financial stability risk assessment framework used at the Bank. Nonetheless, several factors mean that gaps to the ideal framework will inevitably persist. For example:

  • Models are necessarily gross simplifications of the real world. They can provide insights, but not definitive answers. In particular, behaviour is always difficult to model accurately, especially where there is little evidence on which to base assumptions or where sectors include both domestic and global participants who may respond to shocks differently.
  • The financial system is complex and is constantly evolving. This means that despite efforts to update them, the map and models can be out-of-date in certain respects, to varying degrees.
  • Data gaps can materially hamper quantitative risk assessment. Some can be filled, including via exploratory exercises, but some will always remain.

These types of limitation are generic to all analytical frameworks supporting policy, but are especially relevant for financial stability, given the complexity of the policy problem and the pace of financial system evolution.

More generally, the limitations of quantitative analysis imply that complementary, more qualitative approaches should also be part of the risk assessment framework. These include horizon scanning, supervisory and market intelligence, and alternative approaches to risk identification that are cruder and/or less data-dependent – for example, exploring parallels between historical episodes of financial instability and present circumstances. Other approaches could include identifying rapid shifts in risks, activity or profitability in certain parts of the financial system, or tracking simple indicators like rapid growth in credit to the real economy.
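A crude indicator of the kind mentioned above can be mechanised very simply. The sketch below – with an entirely hypothetical credit series, window and threshold – flags periods in which cumulative credit growth over a trailing window exceeds a set rate:

```python
def flag_rapid_growth(series, window=4, threshold=0.10):
    """Flag periods where cumulative growth over `window` periods exceeds
    `threshold` (eg 10% over four quarters). Returns the indices of
    flagged periods. Window and threshold are illustrative, not calibrated.
    """
    flags = []
    for t in range(window, len(series)):
        growth = series[t] / series[t - window] - 1.0
        if growth > threshold:
            flags.append(t)
    return flags

# Hypothetical quarterly credit stock: steady growth, then a rapid expansion.
credit = [100, 101, 102, 103, 104, 110, 118, 127, 137]
flagged = flag_rapid_growth(credit)
```

The value of such an indicator lies in its transparency and low data requirements, which is exactly why it can serve as a cross-check on more sophisticated, model-dependent risk assessment.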

Ultimately, policymaking is a matter of informed judgement, given the inherent limitations of models. This means risk assessment should be done holistically, combining all relevant quantitative and qualitative information to guide judgement.

3.6: Models for policy analysis

The risk assessment frameworks discussed in preceding sections can be useful for assessing: i) candidate policy interventions and ii) the value of policy interventions in stresses, by identifying how the financial system propagates shocks.

But such analysis of shock propagation is less useful for evaluating the costs of policy outside of stresses.footnote [76] Further, the types of model best suited to system-wide scenario analysis – like network or semi-structural models – are typically less well-suited to policy analysis.footnote [77]

For these reasons, models explicitly designed for policy analysis are considered separately here.

The models discussed in this section should not be considered as anything like definitive guides for real-world policy, given their inherent simplifications. Rather, they serve as experimental tools to explore the rationale for, and potential effects of, interventions – for example, through counterfactual analyses of the impact of different policy interventions (or reaction functions) on the real economy and, ideally, economic welfare. Another important use of these models is to analyse how the appropriate policy response varies with different types of shock. In this way, model outputs can inform state-contingent policy design and cost-benefit assessments of interventions, and so are a valuable input to policy strategy.

When evaluating models for financial stability policy analysis, the following criteria matter:

  • General equilibrium. The model should be capable of tracing the effects of policy interventions through the financial system to the real economy, accounting for both direct and indirect effects.
  • System-wide. Models should encompass as many relevant parts of the financial system as possible, to account for interconnections, risk migration or ‘leakages’ between sectors.
  • Policy interactions. The model should incorporate other policies that affect (or are affected by) the intervention in question.
  • Micro-founded. The model should ideally be built up from ‘deep’ behavioural assumptions and associated parameters, ensuring that resulting decision rules are invariant to different policies (per the Lucas critique). The inclusion of utility-maximising agents would also allow for explicit welfare analysis.
  • Heterogeneity. As discussed in Section 2, financial policy interventions can have distributional effects; assessing these requires sufficiently granular models. Adequately capturing the propagation of shocks through the financial system to the real economy also requires some heterogeneity (eg because financial constraints bind on a subset of entities).
  • Empirical realism. Policy models should approximate relevant empirical features – such as observed effects of an unanticipated policy intervention.
  • Tractability. Model behaviour needs to be comprehensible and explicable (including to non-specialists as required). The model should be deployable quickly – including in terms of compute time – so it can be used to analyse ‘live’ policy questions.

As discussed in Section 3.3.2, model design involves practical trade-offs. For example, a model that performs well along the ‘general equilibrium’, ‘system-wide’, ‘policy interactions’, ‘micro-founded’ and ‘heterogeneity’ dimensions would perform less well on ‘tractability’. And while there is some continuity across model classes, key differences often make the choice of model and solution method discrete. Ultimately, the choice depends on the specific policy question; in some cases, using multiple approaches may be beneficial. Figure 6 offers a high-level RAG assessment of key model classes against several of the above criteria.footnote [78]

The use of near-representative agent, near-linear DSGE models is well-established in policy institutions. These models benefit from mature macrofinancial linkage modelling techniques and tractability. They enable the application of powerful toolkits for policy analysis (including those developed at the Bank – Harrison and Waldron (2021)). Although linear, these models can be applied in a piecewise manner – incorporating the effect of occasionally-binding constraints (under a perfect foresight assumption) – to generate non-linearity and asymmetry. They are also well-suited to analysing policies – and their interactions – where: i) theoretical foundations matter (eg for explicit welfare analysis), ii) rich heterogeneity is not critical, and iii) policy questions are more easily tackled in a linear setting (eg De Paoli and Paustian (2017), and Ferrero et al (2024)). Examples of such policy questions include analysis of the interaction between optimal financial stability and monetary policy with a zero lower bound (ZLB) constraint.
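The piecewise treatment of an occasionally-binding constraint can be illustrated with a deliberately stylised example (not a Bank model; all parameters below are invented). A toy linear inflation process under a Taylor-type rule becomes non-linear and asymmetric once the policy rate is floored at zero:

```python
def simulate(pi0, periods=12, zlb=True):
    """Simulate a toy linear inflation process under a Taylor-type rule,
    optionally imposing the zero lower bound. All parameters illustrative."""
    neutral, phi, rho, sigma = 2.0, 1.5, 0.8, 0.5
    pi, path = pi0, []
    for _ in range(periods):
        i = neutral + phi * pi
        if zlb:
            i = max(0.0, i)              # the occasionally binding constraint
        pi = rho * pi - sigma * (i - neutral)
        path.append(pi)
    return path

# Without the ZLB the model is linear, so responses to positive and negative
# shocks are mirror images. With the ZLB, a large deflationary shock is worked
# off more slowly than an equally sized inflationary one: asymmetry emerges.
down = simulate(-3.0)
up = simulate(3.0)
```

This is the mechanism in miniature: the `max` operator is the only non-linearity, yet it is enough to make outcomes depend on the sign and size of the shock, which is why piecewise solution methods are needed.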

Figure 6: Strengths and weaknesses of alternative model classes for policy analysis

Figure 6 offers a high-level assessment of key model classes against several of the above criteria: system-wide, policy interactions, theory-grounded, heterogeneity and tractability. It assesses four types of model. The Figure uses green, amber and red shading to denote good, reasonable and (relatively) poorer performance against the criteria respectively.
The first row shows that near representative-agent, near-linear DSGE models are considered to perform well against the policy interactions, theory-grounded and tractability criteria, and less well against the system-wide and heterogeneity criteria. Example models given are Ferrero et al (2024) and Clerc et al (2015).
The second row shows the performance of the spectrum of models between these two model types, that is, between representative-agent DSGE models and fully non-linear, heterogeneous agent models. This spectrum is considered to perform well against the theory-grounded criterion, reasonably against the policy interactions, heterogeneity and tractability criteria, and less well against the system-wide criterion.
The third row shows that non-linear heterogeneous agent models are considered to perform well against the theory-grounded and heterogeneity criteria, and less well against the system-wide, policy interactions and tractability criteria. Example models given are Kaplan et al (2020) and Coimbra and Rey (2024).
The fourth row shows that agent-based models are considered to perform well against the policy interactions and heterogeneity criteria, reasonably against the system-wide and tractability criteria, and less well against the theory-grounded criterion. Example models given are Bardoscia et al (2024) and Popoyan et al (2017).
The fifth row shows that semi-structural models are considered to perform well against the system-wide, policy interactions and tractability criteria, and less well against the theory-grounded and heterogeneity criteria. Example models given are Aikman et al (2024) and Budnik et al (2023).

Heterogeneous agent DSGE models are based on the same theoretical paradigm but recognise differences between agents.footnote [79] These models are inherently better suited to financial stability policy analysis than near-representative agent models for several reasons. First, they recognise market incompleteness as central to financial stability issues. Second, they capture the uneven distribution of leverage and other state variables that drive amplification, enabling it to be more accurately modelled. Third, they reflect distributional effects of financial stability policies more effectively. Overall, these models are preferable when: i) heterogeneity is crucial to the policy question, ii) theoretical foundations matter (eg for explicit welfare analysis), and iii) linearity does not. Examples of such policy questions include cost-benefit analyses of all types of financial stability policy.

However, heterogeneous agent DSGE models can be less tractable than representative agent models: compute time is expensive and solution methods are more model-specific. The scale of these costs – and these models’ ability to capture rich heterogeneity and non-linearity – depends heavily on how they are specified and solved. Fully non-linear models with high heterogeneity (eg Bewley-Aiyagari-type) suffer from the computational cost of the ‘curse of dimensionality’, which limits their size and so makes them ill-suited to system-wide policy analysis. But there is something of a continuum of model types between linear (or piecewise-linear) representative agent models and fully non-linear heterogeneous agent models (as captured in the second row of Figure 6). Some of the recent monetary policy-focused Heterogeneous Agent New Keynesian literature falls within this middle ground (eg Ravn and Sterk (2021)).footnote [80] This includes macrofinancial models (examples in Figure 6) but could usefully be developed to include explicit modelling of financial stability policy.

Agent-based models (ABMs) are an alternative class of micro-founded, heterogeneous-agent model that is not built around the utility maximisation and rationality assumptions typical of DSGE models. By dispensing with optimising behaviour, they break the curse of dimensionality, offering several advantages over fully non-linear heterogeneous agent DSGE models. They can be much larger, with rich heterogeneity across multiple dimensions (eg households and financial institutions). And they can incorporate more sophisticated and realistic interactions between agents (eg auction mechanisms) and multiple constraints (eg LTV, LTI and debt service constraints on household borrowing). Dispensing with optimisation and rational expectations also means that ABMs can generate a greater range of non-linear dynamics. These advantages make ABMs better suited to analysis that requires a high degree of heterogeneity and/or a fuller modelling of the financial system – for example, where policy questions are system-wide and/or involve policy interaction, or where ABMs are judged to be more realistic along some dimensions (eg where non-rationality adds empirically relevant dynamics, such as endogenous boom-bust financial cycles).footnote [81] Greater computational tractability also allows exploration of a broader range of scenarios, including counterfactual policy experiments.

That said, some of these advantages of ABMs trade off against one another: flexibility and the potential for complexity can lead to arbitrary behavioural rules that make overall model behaviour harder to understand.footnote [82] And while the absence of an explicit utility function precludes formal welfare analysis, ABMs – like heterogeneous-agent DSGE models – can still support broader policy evaluation such as cost-benefit analysis. ABMs have been applied to a diverse range of topics including interbank markets, housing, financial markets, climate change and payment systems (Borsos et al (2025)).
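To illustrate the kind of dynamics this class of model can generate, the following is a minimal fire-sale sketch – hypothetical throughout, with invented balance sheets and price-impact parameters – in which leverage-targeting funds respond to a price shock by selling assets, and those sales feed back into the price:

```python
def fire_sale(equities, holdings, target_leverage=10.0, impact=5e-4,
              shock=0.05, rounds=10):
    """Toy fire-sale dynamics: after an initial price fall, each fund sells
    assets to restore its leverage target (assets / equity); sales depress
    the price further, triggering more sales. Returns the final price
    (initial price normalised to 1). All parameters are illustrative."""
    price = 1.0 - shock
    equities = [e - h * shock for e, h in zip(equities, holdings)]
    holdings = list(holdings)
    for _ in range(rounds):
        total_sales = 0.0
        for k in range(len(holdings)):
            assets = holdings[k] * price
            if equities[k] <= 0:              # insolvent: liquidate everything
                sale = holdings[k]
            else:                              # sell just enough to deleverage
                excess = assets - target_leverage * equities[k]
                sale = max(0.0, excess / price)
            sale = min(sale, holdings[k])
            holdings[k] -= sale
            total_sales += sale
        new_price = max(0.01, price - impact * total_sales)
        # mark-to-market losses on remaining holdings feed the next round
        equities = [e - h * (price - new_price) for e, h in zip(equities, holdings)]
        price = new_price
    return price
```

Well-capitalised funds absorb the shock with no sales, so the price simply reflects the initial fall; highly levered funds sell into the falling market and amplify it. Even this crude sketch exhibits the threshold behaviour – stability up to a point, then abrupt amplification – that motivates ABM approaches.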

Semi-structural models cover a wide range of alternative models, including system-wide models of the sort described in Section 2 and DSGE variants (eg Aikman et al (2024)). Their main drawback for policy analysis is a lack of grounding in economic theory, which subjects them to the Lucas critique. And while they can be more empirically congruent than micro-founded models, semi-structural models are not without empirical issues. For example, they are prone to identification problems associated with their flexible structure, and larger variants suffer from practical estimation issues given the volume of data required to estimate a (typically) larger number of parameters. Semi-structural models are best suited to policy analysis where microfoundations and their associated advantages are not important but the flexibility or tractability of the model class is – for example, where system-wide analysis is needed, such as in understanding policy ‘leakages’, or where system-wide propagation is a key feature (eg in modelling the crystallisation of liquidity risk).

In all cases, there is a question of how the model should be estimated or calibrated. As well as the usual general considerations (not described here), there are some specific challenges for financial stability applications. In particular, time-series approaches risk overfitting to rare events like the GFC and Covid, which may not reflect current dynamics or future shocks.footnote [83] For models with heterogeneity, cross-sectional or panel data can offer alternative estimation or calibration strategies, though with their own limitations. Given these issues and broader model uncertainty, testing the sensitivity of model outputs to alternative parameterisations is desirable (as is the use of alternative models altogether where possible).

In summary, models of the type discussed in this section can play a valuable role in evaluating policy, including to: i) routinely examine interactions between monetary and macro-prudential policy; ii) enhance model-based analyses of policies in use; and iii) help explore emerging areas of policy development, such as policy for non-banks/MBF.

3.7: Bringing the proposals together

The frameworks underpinning UK financial policymaking aim to tackle several of the general challenges in risk assessment discussed in the preceding sections. However, as for other aspects of the analytical framework, there is always room for improvement, particularly as the nature of threats to UK financial stability and the analytical tools available to address them both evolve.

These general challenges, how the UK’s financial policy framework and supporting analytical framework have helped to address them – and how the analytical framework could be further enhanced – are summarised as follows.

General challenge 1: difficulties in systematically identifying vulnerabilities and amplification of shocks to real economic outcomes via financial service provision.

What the UK’s financial policy frameworks do to address this challenge.

As at other financial stability policy institutions, the UK’s risk assessment framework has focused on identifying vulnerabilities. It has broadened and deepened over time, shifting its focus towards building threats to stability, such as those posed by MBF and operational risks. And ongoing work is further developing understanding of how the financial system propagates shocks to and from the real economy.

This includes work (eg the FPC’s macroprudential approach to operational resilience) assessing operational risks to key nodes in systemically important markets and how these risks can be amplified through systemic vulnerabilities and transmission channels, ultimately affecting the delivery of vital services to the real economy. Market-wide mapping and scenario analysis can enhance understanding of how operational disruptions propagate, particularly when they occur alongside existing financial stress. This approach can reveal how and where operational disruption exacerbates financial instability, highlighting areas where regulatory or firm-level intervention may be necessary to mitigate risks.

Suggested enhancements to the analytical framework.

Construct a ‘map’ of the financial system that systematically identifies – and where possible quantifies – propagation of shocks to real economic outcomes. This work would include:

  • Compiling a taxonomy of information of the types set out in the annex tables. It is essential here to distinguish between effects driven by i) financial frictions and ii) fundamentals (as discussed earlier, along with work that could support this).
  • Building a library of estimates and ready reckoners that help size propagation mechanisms (and identify gaps).
  • Producing a system-wide dashboard of indicators for the vulnerabilities and propagation channels identified in the map, and collating estimates and data that help size them.
  • Analysing further the role of different financial services in real economic activity, and the impact of disruption to them.

General challenge 2: stress tests can be resource-intensive and may not fully capture feedback within and between the financial system and the real economy.

What the UK’s financial policy frameworks do to address this challenge.

Bank staff have developed models and top-down ‘ready reckoners’ that can be used, in conjunction with supervisory and other data, to help run desk-based variants of stress tests. Importantly, these draw on bottom-up industry exercises to fine-tune desk-based capabilities.

Suggested enhancements to the analytical framework.

1. Build on existing work to supplement the Bank Capital Stress Tests and exploratory exercises like the SWES with models that can be used for scenario analysis in a more nimble and flexible way:

  • Development of system-wide models for financial market stress scenarios: i) use of regulatory data, insights from the SWES, and ongoing contact with SWES participants to construct a system-wide model along the lines of Aikman et al (2019), but with heterogeneity within sectors following a variant of the Duffie (2011) ‘10-by-10-by-10’ proposal;footnote [84] ii) leveraging planned network modelling work to consider whether to build a network financial market simulation model or a demand system asset pricing model.
  • Exploration of modelling approaches that incorporate macrofinancial feedback mechanisms not captured by other proposals here: i) Continued development of an in-house semi-structural model to capture feedback between the banking sector and real economy in adverse macroeconomic scenarios. ii) Further investigation of the merits of extending the semi-structural model of Aikman et al (2024) to improve its empirical realism and mapping to the regular real economic risk assessment and/or investigation of the merits of using any model adopted for capital analysis.
  • A model or models for assessing, over time, the potential implications of macroeconomic and financial system stresses for bank capital buffers, as an input to the FPC's quarterly CCyB decisions. This could be done by extending Covi and Hüser (2024) and developing an in-house semi-structural model.footnote [85]
  • Further development and extension of Covi and Hüser (2024) to assess propagation from banks, insurers, and investment funds in macroeconomic recessionary scenarios.

2. Develop a modular approach to scenario analysis. This would build on the mapping, indicators and models discussed earlier to provide an end-to-end approach to scenario analysis – from the scenario through the system to real economy impacts. Depending on the risk being assessed, different inputs would be needed to simulate different parts of the propagation chain supplied by the mapping and indicators. The aim would be to standardise inputs as far as possible and – while recognising the caveats and qualifications – use a modular approach to combine them, to more fully simulate the propagation of shocks to the real economy. Scenarios for different parts of the system could also be connected using system-wide models or by joining the dots between existing different stress testing exercises – banking, insurance, central counterparties, and SWES scenarios.
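The modular idea can be caricatured in code: standardised stage functions – scenario, sector modules, real-economy impact – are composed into a pipeline, with modules swapped in or out depending on the risk being assessed. Everything below is an invented placeholder (the module names, the linear pass-through coefficients and the state keys are all hypothetical), intended only to show the plumbing:

```python
def macro_shock(severity):
    """Stage 1 (illustrative): translate scenario severity into a market
    price fall and a rise in corporate defaults."""
    return {"asset_price_fall": 0.1 * severity, "default_rate": 0.02 * severity}

def banking_module(state):
    """Stage 2a (illustrative): map defaults into bank capital depletion."""
    state["bank_capital_hit"] = 5.0 * state["default_rate"]
    return state

def fund_module(state):
    """Stage 2b (illustrative): map the price fall into forced fund sales."""
    state["forced_sales"] = 2.0 * state["asset_price_fall"]
    return state

def real_economy_module(state):
    """Stage 3 (illustrative): combine channels into a credit-supply
    impact on GDP."""
    state["gdp_impact"] = -(state["bank_capital_hit"] + 0.5 * state["forced_sales"])
    return state

def run_scenario(severity, modules):
    """Compose the stages; modules can be swapped depending on the risk."""
    state = macro_shock(severity)
    for module in modules:
        state = module(state)
    return state

result = run_scenario(1.0, [banking_module, fund_module, real_economy_module])
```

The design point is the standardised `state` interface between stages: once inputs and outputs are agreed, a banking module estimated one way and a fund module estimated another can be chained into a single end-to-end scenario run.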

General challenge 3: limits to mapping and modelling. Some may be possible to overcome in time, for example, some data and modelling gaps. But some are inherent, for example, models are necessarily always gross simplifications of the real world.

What the UK’s financial policy frameworks do to address this challenge.

Bank staff draw on a broad spectrum of data sources – including supervisory insights, horizon scanning, industry and market intelligence – to close data gaps and enhance quantitative analyses (see Section 4 for further discussion of staff work with data). The Bank also continuously advances its risk assessment framework by developing, extending, and integrating innovative modelling approaches, such as those proposed by Covi and Hüser (2024), ensuring that it keeps pace with the latest advances in modelling technology.

Suggested enhancements to the analytical framework.

  • To use SWES-type exercises to investigate risks or areas of the financial system that are not mapped or understood with enough clarity, but where previous events, intelligence or market features suggest potential systemic risks. Such exercises can shed light on vulnerabilities and behaviour, and then underpin subsequent desk-based stress tests.
  • To invest further in systemic risk (and more heterodox) indicators as robust cross-checks on the bottom-up, disaggregated risk assessment.

General challenge 4: the absence of a fully-specified comprehensive transmission framework means that separate models are needed for policy analysis (from risk assessment). And different models have different pros and cons – many are limited by computational and data needs, and constraints on specification (such as functional form). As for risk assessment models, some challenges may be overcome in time but some are inherent.

What the UK’s financial policy frameworks do to address this challenge.

Bank staff draw on a broad spectrum of data sources and modelling approaches including drawing on risk assessment models where appropriate. Several strands of ongoing research in the Bank have applicability to policy questions and some have already been used by Bank staff in this way.

Suggested enhancements to the analytical framework.

In addition to the proposals for risk assessment models – which can be used in a limited way for policy analysis given the drawbacks outlined above – the following are proposals for investment in models for policy analysis.

  • Explore the development of a set of models – built from a common ‘trunk’ – for analysing financial stability and monetary policy responses to different shocks under different assumptions.footnote [86] As well as informing on how policy should respond to different shocks, such models could also be used to assess the interaction between monetary and financial stability policy,footnote [87] implications of the ZLB, and the effects of unconventional monetary policy. The trunk would be built from the DSGE paradigm and would be solved with non-linear methods. It would feature a banking sector, non-banks which also lend to producers, housing financed by mortgages, and an interbank market (to allow for repo and associated central bank policies).
  • Continue to investigate and develop capabilities in modelling agent heterogeneity in support of analysis of a range of policy interventions. For example, an overlapping generations (OLG) model like that in Kaplan et al (2020) would be well-suited for analysis of mortgage market interventions. Alternatives to this include Greenwald (2016) and Garriga et al (2021); these offer some of the same capabilities (though arguably with less direct real-world applicability) in more tractable frameworks. Agent-based models – such as an extension of Bardoscia et al (2024) – could supplement this approach. More broadly, agent-based models could be used to assess a range of prudential policies and their interactions. This would expand the literature in this area.
  • Investigate the development of a heterogeneous agent model that studies the financial stability policy problem when financing constraints are linked to economic supply-side outcomes via the level and distribution of corporate sector capital (and, potentially, total factor productivity growth). As noted in Annex A, the literature has not studied the optimal policy response to externalities arising from supply-side frictions. Such frictions have the potential to create meaningful trade-offs for policy, including between growth and stability.
  • Explore development of structural modelling of the effects of policies that enhance resilience in core markets. For example, the gilt market, like other government bond markets globally, has experienced several financial shocks in recent years. A structural model of it, capturing the behaviour of its participants, both domestic and international, and, potentially, interlinkages with other markets (like the repo and interest-rate derivatives market) would be particularly useful at this juncture. Candidate starting points include the search-and-bargaining models of Duffie et al (2005), Uslu (2019) and Coen and Coen (2022).

The above list would ideally be supplemented with models capable of shedding light on areas of increasing policy interest, like operational risks and AI. In practice, however, the current state of the literature and available resources constrain what can be reasonably achieved, in the near-term at least. Development of models for policy analysis should be kept under review, including to reflect advances in the literature.

Box D: Systemic importance of financial services

Possible criteria for assessing systemic importance of financial services:

  • Key economic function(s). What economic activities rely on the service? For example, payment services are critical to the functioning of the entire economy.
  • Substitutability (related to market or product ‘concentration’). Can the service be effectively or readily substituted? For example, large companies may be able to substitute issuance of corporate bonds for bank debt, but: i) this may take time to do and ii) smaller companies may be unable to substitute financing in this way.
  • Interconnectedness and correlation. How widely would stress be propagated through the system given the service’s interconnectedness and correlation with other services? For example, disruptions to some markets may have limited direct impact on the real economy but can cause significant indirect effects if they input to services that do directly impact the real economy. Or if disruption to a service is correlated with disruption to a substitute service, this can reduce users’ ability to avoid disruption.

Box E: Financial system vulnerabilities

Microfinancial vulnerabilities are defined here as financial and operational susceptibility to shocks at an entity level. It is helpful to group these vulnerabilities into three broad (although interacting) buckets, in part reflecting the key types of friction that drive them:

  • Mismatches and exposures, such as leverage, liquidity or maturity mismatch, and credit or market risk exposures. Externalities are a key driver (and consequence) of these vulnerabilities, although other frictions and distortions like asymmetric information play a role too.
  • Financial and operational dependencies, such as reliance on third parties for critical services or liquidity. While externalities and asymmetric information play a role here, market or service concentration in the provision of third-party services is a key driver.
  • Vulnerabilities created by risk assessment and management flaws, such as model error/bias or inadequate hedging or pricing. These often arise from asymmetric/imperfect information and missing markets, such as gaps in the ability to hedge and diversify risks.

It is also helpful to think of these buckets as characterising different aspects of entities’ operations. They can be respectively thought of as being associated with: i) entities’ business models, ii) how they achieve those business models in practice and iii) unintended flaws in doing so.

Macrofinancial vulnerabilities are defined here as ‘topological’ features of the system and real economy that broaden the incidence and impact of microfinancial vulnerabilities. It is helpful to consider three broad types of macrofinancial vulnerability:

  • Correlation: similarity of microfinancial vulnerabilities – such as exposures or behaviour – to a given shock. For example, common portfolios or use of leverage can expose multiple firms to similar market and interest rate risks.
  • Interconnectedness: channels through which shocks can be transmitted, such as asset and collateral prices, counterparty exposures and behavioural channels like herding.
  • Concentration: the extent to which a single (or few) providers, assets, methods etc (and so shocks to them) dominate a particular service, market, practice etc.

Table B1.B and Table B1.C in the annex set out some more detailed examples of micro and macrofinancial vulnerabilities and how they might be mapped.
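Two of the three macrofinancial vulnerability types above have natural, if crude, quantitative proxies: a Herfindahl–Hirschman index for concentration, and portfolio-overlap similarity for correlation (interconnectedness is better captured by network measures of the kind discussed earlier). The sketch below uses entirely hypothetical market shares and holdings:

```python
import math

def herfindahl(shares):
    """Concentration: Herfindahl-Hirschman index of market shares, in (0, 1].
    Equals 1 for a monopoly provider and 1/n for n equal providers."""
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

def portfolio_overlap(a, b):
    """Correlation proxy: cosine similarity of two asset-holding vectors,
    a crude measure of common exposures (1 = identical portfolio weights)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical: three service providers with an 80/15/5 market split...
hhi = herfindahl([80, 15, 5])
# ...and two funds holding similar portfolios across four asset classes.
overlap = portfolio_overlap([40, 30, 20, 10], [35, 35, 20, 10])
```

Metrics like these are deliberately simple: they are the kind of dashboard inputs that could sit alongside the more detailed mapping in the annex tables, flagging where concentration or commonality warrants closer inspection.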

Box F: Example application of the mapping framework

Given the complexity and rapid evolution of the financial system, there will always be some parts of it or propagation mechanisms that warrant further understanding or more detailed mapping. But even when this is not possible – or is at earlier stages – using a consistent analytical framework remains valuable in risk assessment.

For example, the June 2024 Financial Stability Report included analysis of the financial stability risks from private equity for the first time. Table 1 summarises that analysis using the mapping approach, including analogous analysis of risks from LDI funds by way of comparison.

Table 1: Qualitative analysis of risks from private equity in comparison to LDI funds

The table compares the two cases – private equity (a) and LDI funds (b) – across six dimensions.

Frictions
  • Private equity: asymmetric information between funds and sponsored companies, and between funds and fund finance providers; possible search-for-yield in a low-interest-rate environment, which may have driven growth in the sector and excessive risk-taking.
  • LDI funds: accounting regulations for DB pension funds incentivising liability duration matching; insufficient supply of long-dated fixed-income assets for that liability duration matching (necessitating creation of synthetic duration); imperfect liquidity insurance.

Microfinancial vulnerabilities
  • Private equity: leverage in funds and PE/PC sponsored companies; risk management practices by funds and their financers.
  • LDI funds: size of contingent liquidity risk via gilt repo and interest rate derivative positions; operational frictions between DB and LDI funds (creating recapitalisation friction in the event of gilt price falls).

Macrofinancial vulnerabilities
  • Private equity: funds closed-ended, so mitigating redemption and fire sale risk; interconnection between PE/PC funds and the core of the financial system via debt finance of those (leveraged) funds; possible spillovers to other riskier credit markets in the event PE/PC funds collapse; potential for macrofinancial feedback focused on PE/PC-sponsored companies in the event of defaults or excessive contractions in the supply of riskier credit; PE/PC strategies are partially but not perfectly correlated; PE/PC fund financing is well spread across lenders.
  • LDI funds: fire sale amplification and externality due to state-contingent liquidity risk in LDI funds; LDI funds have very closely correlated strategies.

Relevant scenarios
  • Private equity: higher interest rates and/or a macroeconomic downturn resulting in PE/PC sponsored companies defaulting, an increase in the cost of funding for PE/PC funds and potential knock-on consequences for other forms of corporate finance.
  • LDI funds: sharp rise in long-term government bond yields prompting margin/collateral calls sufficiently large, given available liquidity, that fire sales become necessary.

Quantitative importance
  • Private equity: PE/PC sponsored companies account for around 10% of UK private-sector employment (so macro implications could be quantitatively important); extent and nature of PE/PC fund leverage unclear; potential for other forms of corporate finance to substitute for PE/PC if there are no spillovers to other markets.
  • LDI funds: LDI funds own about 10% of the gilt market, and the gilt market underpins pricing of many financial services; size of open positions large, with LDI funds typically operating at 2x leverage or more.

Examples of policy mitigants and developments
  • Private equity: FCA review of private asset valuation practices; PRA engagement with banks on risk management practices; monitoring role for financial stability policy, including the potential to gather further information to assess the risks.
  • LDI funds: LDI funds stress testing against gilt yield spikes; but the stress test did not account for the fire sale externality, so there is a role for financial stability policy in respecifying the stress test to account for it; role for microprudential policy in mitigating operational frictions between pension and LDI funds.

Footnotes

As noted in the July 2025 Financial Stability Report, further work is needed to address significant data gaps that hinder the ability of financial stability authorities to understand how private markets might operate after a shock, and how stress within private markets might interact with the wider financial system and potentially disrupt UK real economy financing. As part of this effort, the Bank intends to continue to undertake structured engagement with private market participants and key providers of capital to the sector. The objectives of this will be to: a) deepen understanding of how private markets finance the real economy and support growth, b) understand how the private markets ecosystem may operate during a downturn, including behavioural responses of investors to losses.footnote [88]
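The quantitative significance of LDI leverage cited in Table 1 can be illustrated with simple arithmetic. The figures below are illustrative assumptions, not Bank data:

```python
# Illustrative arithmetic (assumption-based, not Bank data): an LDI fund
# operating at 2x leverage holds, say, £200 of gilts against £100 of
# capital. A rise in yields that cuts gilt prices by 10% produces a £20
# loss – 20% of capital – and a matching collateral call on repo funding.
def loss_as_share_of_capital(leverage: float, price_fall: float) -> float:
    """Leverage = assets / capital; price_fall as a fraction of asset value."""
    return leverage * price_fall

print(loss_as_share_of_capital(2.0, 0.10))  # → 0.2
```

The general point is that leverage multiplies the hit to capital from any given asset price move, which is why the size of open positions matters alongside the leverage multiple itself.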

Box G: Scenario design

Ideally, scenario analysis would be subsumed into a full stochastic simulation that models the joint probability distributions of welfare (or welfare proxy) outcomes – and the contribution of financial propagation to those outcomes – by sampling from a distribution of shocks. Some of the models discussed in the main text follow this approach.

But, in practice, it is useful to examine a set of scenarios that effectively summarise the conditions under which financial propagation occurs. Such scenarios are designed to characterise specific parts of the probability distribution, typically focusing on the left tail (eg a 1-in-100 event), which is most relevant for systemic risk.

There are several practical reasons for using scenario analysis. First, stochastic simulations are often computationally intensive. Second, their usefulness depends on how well the joint density function of shocks (or residuals) describes the range of scenarios that could occur in the real world. In practice, this is difficult to achieve, in part because of the challenge of estimating probability density functions accurately. Third, it may be more straightforward to communicate scenarios than complete probability distributions.
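The relationship between a full stochastic simulation and tail-scenario analysis can be sketched with a toy model. Everything here – the loss function, the amplification mechanism and all parameters – is a hypothetical illustration, not any model used by the Bank:

```python
import random

random.seed(42)

# Toy macrofinancial loss model (purely illustrative): a macro shock with
# extra amplification once losses breach a threshold – a crude stand-in
# for financial propagation such as fire sales.
def system_loss(shock, amplification=2.0, threshold=2.0):
    base = max(-shock, 0.0)                      # losses only in downturns
    extra = amplification * max(base - threshold, 0.0)
    return base + extra

# Full stochastic simulation: sample many shocks, build the loss distribution.
shocks = [random.gauss(0.0, 1.5) for _ in range(100_000)]
losses = sorted(system_loss(s) for s in shocks)

# Scenario analysis: characterise one point in the left tail instead,
# eg a 1-in-100 shock, and examine propagation in that scenario alone.
shock_1_in_100 = sorted(shocks)[len(shocks) // 100]   # ~1st percentile shock
scenario_loss = system_loss(shock_1_in_100)

# Because losses fall monotonically in the shock, the 1-in-100 shock maps
# to (roughly) the 99th percentile of the loss distribution.
tail_loss = losses[len(losses) * 99 // 100]
print(round(scenario_loss, 2), round(tail_loss, 2))
```

The simulation characterises the whole distribution at the cost of 100,000 model runs and a maintained assumption about the shock density; the scenario approach evaluates propagation at a single, communicable point in the tail.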

There may also be value in separating scenarios into groups. If certain scenarios are largely independent, analysing them separately can reveal relevant insights into the probability density function without requiring a full stochastic simulation. Relatedly, it is neither feasible nor desirable to include all relevant amplification mechanisms in the same model. It may also be useful to separate scenarios if some matter more for certain propagation mechanisms than others.

There are three broad classes of scenario that such analysis would ideally include:footnote [89]

  • Macroeconomic recessionary: BCST-style scenarios in which there is an economic recession and accompanying falls in asset values. Variants could explore sensitivity to key elements of the scenarios (eg recessions driven by supply/demand shocks with monetary policy tightening/loosening, as per past BCSTs).
  • Financial market: Scenarios that explore crystallisation of risk in financial markets. Examples include rapid asset price corrections, or structural demand shifts that force the unwinding of large market positions (eg the basis trade).
  • Operational: Scenarios that explore the effects of operational incidents. This would include cyber attacks on or disruption to critical third parties that simultaneously affect multiple financial entities or critical financial market infrastructures.

In all three cases, scenarios can draw on hypothetical or real-world events. Examples include Covid, geopolitical shocks or market shocks such as the temporary gilt yield spike following the UK mini-budget in Autumn 2022.

These classes are intended to capture scenarios that always matter for financial stability. Other scenarios that are slower-moving – or manifest over longer horizons – may also be relevant, but the value of their routine quantification is likely to be lower.footnote [90]

In all cases, the purpose of routinely quantifying scenario impacts (and any counterfactual or sensitivity analysis) is to assess the causes, extent and real economic impact of financial propagation. But, as discussed in the main text, the state of modelling technology can constrain this aim.footnote [91]

Box H: Possible approaches to building macrofinancial propagation into modelling that could be explored and further developed

1. In-house modelling work is developing an approach to assessing macrofinancial feedback from contractions in bank credit supply to real economic activity (and back again). This combines a structural VAR (Barnett and Thomas (2014)) with microeconometric evidence of the response of key banking variables to determine macroeconomic feedback effects using the credit supply shock identified in the VAR.footnote [92]

2. Development of the semi-structural New-Keynesian model in Aikman et al. (2024) to incorporate i) unconventional monetary policy as an imperfect substitute for Bank Rate at the ZLB and ii) a more sophisticated credit amplification mechanism in the household sector. This would improve the model’s empirical congruence and better link the household block of the model to staff’s quarterly risk assessment.

3. The models discussed in Section 3.6 may also be suitable for scenario analysis of macrofinancial amplification since they would incorporate a macroeconomic block.
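The macrofinancial feedback loop in approach 1 – a credit supply contraction weighing on activity, which in turn feeds back onto credit supply – can be caricatured in a few lines. This is a stylised sketch with purely illustrative coefficients, not the structural VAR approach itself:

```python
# Stylised sketch (not the Bank's model): a credit-supply/activity
# feedback loop. All coefficients are illustrative assumptions.
def propagate(initial_credit_shock, beta=0.4, gamma=0.3, periods=20):
    """beta: effect of credit supply on output; gamma: feedback from
    output back onto credit supply. Returns paths of both variables."""
    credit, output = [initial_credit_shock], [0.0]
    for _ in range(periods - 1):
        new_output = beta * credit[-1] + 0.5 * output[-1]   # output persistence
        new_credit = 0.6 * credit[-1] + gamma * new_output  # feedback channel
        output.append(new_output)
        credit.append(new_credit)
    return credit, output

# A negative credit supply shock, with and without the feedback channel.
_, output_path = propagate(-1.0)
_, output_no_feedback = propagate(-1.0, gamma=0.0)

# With beta and gamma both positive, feedback deepens the output trough
# relative to the no-feedback counterfactual.
print(round(min(output_path), 3), round(min(output_no_feedback), 3))
```

Comparing the two paths isolates the contribution of the feedback mechanism, which is the counterfactual logic such scenario modelling aims to support.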

4: Data

Data is a critical input to most elements of the analytical framework supporting UK financial stability policy. The Bank has access to a wide range of data in pursuit of its financial stability objective. But, as for other similar policymaking institutions, maximising the potential of that data is a work in progress. This section briefly sets out the elements of the Bank’s data and analytics strategy most pertinent to the proposals made in Section 2 and Section 3.footnote [93]

4.1: Context and overview

The Bank published a data and analytics strategy in 2024 in response to an Independent Evaluation Office investigation into the Bank of England's use of data to support its policy objectives. The elements of this relevant to UK financial stability can be characterised by three pillars:

  1. Data: a strategy for improving the accessibility and use of existing data and, where desirable, filling data gaps.
  2. Architecture and tools: adoption of cloud compute, storage, and access for all data, alongside a programme of automation, including adoption of AI.
  3. Culture and skills: a strategic assessment of required data skills, associated target staff skills mix and training programme.

Delivery of all three pillars will support and enable the proposals for further work set out in this paper. Given the more direct link to other parts of this paper, the remainder of this section summarises the first ‘data’ pillar and second ‘architecture and tools’ pillar.

4.2: Data use, data gaps and data quality improvements

Bank staff already have access to granular bank, insurer, market infrastructure and markets data – for example, covering UK banks’ balance sheets, mortgage transactions and corporate accounts. For markets, data include high-frequency data covering gilt repo, derivatives and bond market transactions in sterling markets, as well as a range of survey and less structured data. As discussed in Section 3 and here, there is work in train – or being considered – that will enable Bank staff to make better use of the data already available.

In addition, there are several ways in which the Bank is working to improve the quantity and quality of data used for risk assessment:

  • Work is ongoing to identify and prioritise data gaps. The non-bank sector and MBF – and banks’ exposures to it – are a key focus. This aligns with the proposals in Section 3 to enhance risk assessment in these sectors and activities.
  • The PRA is reviewing the regulatory banking data landscape to better align data collections – at the lowest possible cost to industry – with the requirements of day-to-day supervision, to better integrate and streamline data collections, and to ensure that the data collected meets the current and anticipated future needs of policymaking.
  • The Bank engages with the ONS on data issues as they emerge, working collaboratively on possible solutions. The well-publicised problems with the quality of some UK statistics pose challenges, including for financial policymaking, underscoring the importance of improvements to address structural issues in how the country’s statistics are produced. Going forwards, there are specific and achievable changes that would support the analytical framework underpinning financial stability policymaking. Since 2005, the Bank has sought access to the Inter-Departmental Business Register (IDBR), which would plug a gap in the Bank’s ability to assess corporate risk; this requires legislative changes that the Bank hopes to work on with the ONS’s leadership. The ONS’s plans to improve survey data, including household surveys, are welcome and will be important for improving the information set available for household risk assessment; it is important that this is supported with the necessary resourcing. The ONS’s debt and equity statistics are also an area where enhancements would be desirable for financial policymaking: most pressingly, better measures of debt that account for market-based issuance, granular data on gilt holdings that would directly aid systemic risk monitoring, and more accurate information on ownership than the current equity series provides.

The mapping work discussed in Section 3 will help to identify further data gaps and to prioritise them over time. Given the increased emphasis on system-wide analysis and real economy linkages, it may reveal additional gaps in data in these areas.

Not all the data gaps identified will require new collections of data from financial entities. Some may be filled with data from existing collections by the FCA and other UK regulators, which would be desirable given the costs to financial entities of providing it and to the Bank of ingesting, storing and processing it.

One notable area of data gaps is in the UK financial system’s links with the global financial system. For example, the UK’s financial transaction level data collections focus on UK entities, with limited visibility of non-UK entities’ activity. Further thought should be given to the nature of these data gaps – including their likely importance to financial stability in the UK – and, if judged important, to exploring ways in which they might be filled.

4.3: Architecture and toolkit

A key element of the Bank’s data and analytics strategy relevant to financial stability policymaking is to adopt a cloud-first approach to its data architecture. The migration of existing data will significantly improve efficiency by centralising data, enhancing computing power, and enabling access to off-the-shelf tools for reproducing analysis and automated checking.

In addition, Bank staff will continue to take advantage of emerging tools to improve the quality and efficiency of its analysis. This includes further development of automated dashboards of key metrics, such as those discussed in Section 3. This will build on significant progress already made here, such as in collating corporate balance sheet information and linking it to financial markets.

Dashboards and automated tools like these, taking advantage of AI techniques where appropriate,footnote [94] have the potential to make regular monitoring of relevant metrics significantly more efficient.
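As a sketch of what such automated monitoring can look like in practice, the fragment below flags corporates whose leverage breaches a threshold. The record structure, field names and threshold are all hypothetical illustrations, not an actual Bank collection or metric definition:

```python
# Hypothetical monitoring-dashboard metric: flag corporates whose
# leverage (debt/assets) breaches a threshold. All names and values
# here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CorporateRecord:
    name: str
    debt: float
    assets: float

    @property
    def leverage(self) -> float:
        return self.debt / self.assets

def flag_high_leverage(records, threshold=0.6):
    """Return, sorted, the names of corporates whose leverage exceeds the threshold."""
    return sorted(r.name for r in records if r.leverage > threshold)

sample = [
    CorporateRecord("A", debt=80.0, assets=100.0),   # leverage 0.8
    CorporateRecord("B", debt=30.0, assets=100.0),   # leverage 0.3
    CorporateRecord("C", debt=65.0, assets=100.0),   # leverage 0.65
]
print(flag_high_leverage(sample))  # → ['A', 'C']
```

The value of automation comes from re-running checks like this on each data refresh, so that staff attention is directed only to entities whose metrics have moved.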

5: Conclusions and next steps

This paper has reviewed the analytical framework used for financial policy at the Bank of England. Financial stability has long been a key objective of the Bank of England, but it was only in 2013 that the FPC became a statutory policy committee at the Bank. The framework that was introduced in 2013 naturally reflected the legacy of the GFC with a focus on the banking sector.

Subsequently, the framework has evolved along practical lines as threats to stability have grown in other sectors and from other sources. Notably, the importance of – and risks in – non-banks and MBF have grown substantially. Beyond this shift to non-banks, the financial system – and so the focus of the FPC – has evolved in other ways too. For example, the financial system has become more digitised and interconnected, driven in part by greater reliance on technology. This has made operational resilience more important while – arguably – increasing risks to it at the same time.

Over the same period, there have been substantial advances in the availability of data pertinent to the assessment of financial stability, and techniques for analysing those data.

This paper takes stock of these developments and – in this context – suggests ways in which the analytical framework supporting financial policymaking can be enhanced. These can be summarised as: (i) further work on the objectives and instruments of financial stability policy; (ii) work to improve the risk and policy assessment frameworks, including a mapping agenda, investment in desk-based scenario modelling, a revamped suite of systemic risk indicators and investment in models for policy analysis – all with a particular emphasis on, and in support of, system-wide and scenario analysis; (iii) a data strategy, aimed at extracting more value from existing datasets and identifying key data gaps. Together, these proposals imply an evolution of the framework in the following ways:

  • Takes an (even) more system-wide approach to risk assessment, with an emphasis on scenario analysis and the interaction between the financial system and the real economy.
  • Identifies risks to the FPC’s objectives more systematically and comprehensively and enables a clearer comparison of those risks and policies to address them.
  • Integrates a more systematic analysis of frictions underlying financial system vulnerabilities and inefficiencies.

Although some of these proposals will take time to bear fruit, this shift in emphasis can begin to be embedded in the analysis that Bank staff provide to the FPC now. Staff will work over upcoming policy rounds to embed the general logic of the proposed approach into their analysis.

Annex A: Literature on welfare and optimal policy consequences of financial frictions

Table A1.A briefly summarises some of the benchmark macrofinancial papers in the literature that study the welfare and optimal policy consequences of specific financial frictions. The literature can be approximately divided into two: theoretical papers with more stylised models that elucidate the nature and implications of externalities caused by a particular (set of) financial friction(s); and applied papers with less stylised models that seek to understand the properties of optimal policy given a particular (set of) financial friction(s). A short summary is as follows:

  • In theory, inefficiency can manifest in sub-optimally high or sub-optimally low levels of risk-taking, debt and investment.
  • Pecuniary externalities (which typically arise from financial frictions) can be complicated in the sense that they can sometimes be decomposed (where analytical results are available) into several parts (eg distributive and collateral externalities). These parts are relatable to (though not the same as) familiar concepts like financial amplification and fire sales.
  • Since multiple externalities may arise from a single friction, whether optimal policy achieves constrained efficiency depends on the available instruments. Arbitrary agent- and state-specific taxes (and subsidies) can always recover the constrained efficient equilibrium. But, more realistically, where only specific instruments are available (eg leverage or LTV limits), optimal policy cannot always deliver the constrained efficient equilibrium.
  • In models at the less-stylised end of the spectrum, the general effect of these externalities is to make downturns larger, as occasionally-binding constraints become binding. For the same reason, debt levels and/or asset prices are typically sub-optimally high compared to the constrained efficient (second-best) outcome. In line with conventional wisdom, optimal policy (whether LTV limits or capital ratios or something else) acts countercyclically by leaning against debt and/or asset prices when the constraints are not binding and (potentially though not universally) loosening when the constraints become binding.
  • Financial stability policy is very effective in mitigating externalities arising from particular financial frictions (assuming an appropriate instrument is available), whereas monetary policy typically is not and may face sharp trade-offs between monetary and financial objectives. The literature supports a conclusion that financial stability policy can provide considerable societal benefits relative to a world with no financial stability policy and only monetary policy.
  • The aggregate demand externality from a monetary policy constraint (eg the ZLB) and a pecuniary externality from collateral constraints on borrowing reinforce one another, providing an even stronger motive for financial stability policy than in the presence of just one of those frictions.
  • In contrast to conventional monetary policy, central bank liquidity policy can be very effective in mitigating externalities, albeit at the cost of potentially higher distortionary taxes (eg, Bandera and Stevens (2024)). In general, the optimal policy mix contains both prudential and liquidity policies and the design of one affects the optimal design of the other.
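The pecuniary externality from a collateral constraint – the mechanism behind several of the bullets above and in papers such as Bianchi and Mendoza (2018) – can be sketched in stylised form. The notation below is illustrative rather than taken from any specific paper:

```latex
% Borrowing b_{t+1} is limited by the value of collateral at price q_t:
%   -b_{t+1} \le \kappa \, q_t k_{t+1}
% Private agents take q_t as given; the planner internalises that
% aggregate borrowing moves q_{t+1}, adding a wedge to the Euler equation:
\begin{align}
  u'(c_t) &= \beta R \, \mathbb{E}_t\!\left[ u'(c_{t+1}) \right] + \mu_t
  && \text{(private optimality)} \\
  u'(c_t) &= \beta R \, \mathbb{E}_t\!\left[ u'(c_{t+1})
      + \mu_{t+1} \, \kappa \, \frac{\partial q_{t+1}}{\partial b_{t+1}}
      \right] + \mu_t
  && \text{(planner optimality)}
\end{align}
% \mu_t is the multiplier on the collateral constraint. The extra planner
% term is the pecuniary externality: future borrowing capacity depends on
% a price no individual agent internalises, motivating a state-contingent
% tax on borrowing that decentralises the planner's allocation.
```

In normal times the constraint is slack (\(\mu_t = 0\)) and the two conditions coincide; the wedge, and hence the case for countercyclical intervention, appears only in states where the constraint may bind.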

While the literature is useful for building understanding of the implications of specific frictions for welfare and for optimal policy, including the interaction between financial stability and monetary policy, it does not provide anything like a definitive guide as to what financial stability policy should do in practice. In that respect, it is much less well-advanced than the equivalent for monetary policy, where the gap between theory and academia, on the one hand, and practice, on the other, is smaller. In particular, there appear to be several areas where the literature could usefully develop further:

  • The literature has focused on inefficiencies arising from either pecuniary externalities caused by collateral constraints on borrowing by real-economic agents or from frictions within financial intermediaries, but not both. As such, the literature has not studied the optimal interaction between borrower-side and lender-side externalities and the interaction between the instruments that might be used to mitigate them.
  • Related to that, very few papers have studied the implications of alternative instrument choices for welfare. Any given inefficiency may be mitigated to some extent by a range of alternative instruments.
  • The literature has not studied externalities arising from meaningful supply-side frictions that would impede the allocation of capital and the optimal policy response to that.footnote [95] Meaningful supply-side distortions would provide a further motive for financial stability policy and, depending on their nature, may create interesting trade-offs for the policymaker.
  • For understandable reasons, the overwhelming majority of papers in this literature assume rational expectations.footnote [96] The finance literature and practitioner observations highlight the role of non-rational beliefs in driving financial cycles. There is an interesting meta question about how non-rational expectations should affect optimal financial stability policy that the literature is only just starting to tackle.

Table A1.A: Financial frictions, inefficiencies and optimal policy in macrofinancial models – some benchmark papers in the literature

Bianchi and Mendoza (2018)

Modelling approach: Infinite-horizon, small open economy model with a representative firm-household subject to three aggregate shocks and a collateral constraint, solved using a global, non-linear method.

Frictions and externalities: Pecuniary externality from the collateral constraint (which can be micro-founded as an incentive compatibility mechanism given limited contract enforcement).

Main results: Social planner can deliver the constrained efficient equilibrium; it can be implemented using a state-contingent tax on borrowing; optimal policy is countercyclical.

Davila and Korinek (2018)

Modelling approach: Very general three-period model of heterogeneous agents with one-period endowment uncertainty, used to explore analytically the inefficiencies that come with borrowing constraints.

Frictions and externalities: Arbitrarily incomplete state-contingent securities come with incomplete risk sharing and potentially a collateral constraint, giving rise to a ‘distributive’ externality and a ‘collateral’ externality.

Main results: The collateral externality leads to over-borrowing, but the distributive externality can lead to over- or under-borrowing; policy can implement the constrained efficient equilibrium using state-contingent taxes.

Farhi and Werning (2016)

Modelling approach: Analytical treatment of a heterogeneous (preference) agent model with a complete set of state-contingent securities subject to arbitrary imperfections, sticky prices and an arbitrary constraint on monetary policy.

Frictions and externalities: Aggregate demand externality – households do not internalise that outcomes are better if wealth is transferred to households with higher marginal propensities to consume on goods in depressed states.

Main results: Optimal policy characterised by marginal utility and goods wedges; implementable via state-contingent, household-specific taxes on security pricing; the same wedges characterise optimal policy with a pecuniary externality.

Jeanne and Korinek (2020)

Modelling approach: Three-period model with bankers and depositors, risky endowment income in the second period and capital accumulation by bankers.

Frictions and externalities: Bankers subject to a collateral constraint with an associated pecuniary externality. Policy can use prudential and liquidity tools to reduce the resulting financial amplification. The liquidity tool incurs an ad-hoc social cost.

Main results: In general, the optimal policy mix contains both prudential and liquidity policies, and the design of one affects the optimal design of the other.

Korinek and Simsek (2016)

Modelling approach: Infinite-horizon two-agent New-Keynesian (TANK) model (with fixed prices), borrowing and a zero lower bound (ZLB) constraint on monetary policy, used to study a deterministic deleveraging episode; extension with a collateral constraint.

Frictions and externalities: Dynamic, macro aggregate demand externality – agents do not internalise the effect of their borrowing on aggregate outcomes when monetary policy becomes constrained.

Main results: The aggregate demand externality leads to over-borrowing; policy can achieve the constrained efficient allocation by reducing leverage; the pecuniary externality from the collateral constraint reinforces the AD externality.

Ferrero et al. (2024)

Modelling approach: Infinite-horizon TANK model with a housing sector and collateral constraint on borrowing, a ZLB constraint and a credit spreads shock, solved as a piece-wise linear and quadratic approximation to welfare.

Frictions and externalities: Pecuniary externality from the collateral constraint and aggregate demand externality from the ZLB constraint.

Main results: The loss function contains the usual monetary policy terms and ‘risk sharing’ terms in consumption gaps; optimal LTV limits are strongly countercyclical, improving risk sharing and reducing the cost of ZLB episodes.

Collard et al. (2017)

Modelling approach: Infinite-horizon representative-agent NK model solved to first and second order; banks provide loans to, and monitor risk in, capital-producing firms; bank equity is costlier than debt; deposits are insured; total factor productivity (TFP) and risky capital shocks.

Frictions and externalities: Limited liability and deposit insurance give rise to a risk-taking externality – banks and capital producers do not internalise the social cost of risky projects.

Main results: The regulator cannot perfectly observe loan portfolios – optimal policy is via a capital requirement; optimal capital responds only to capital risk shocks (not TFP shocks); monetary policy can perfectly stabilise inflation (no trade-off).

Van der Ghote (2021)

Modelling approach: Continuous-time NK model in the spirit of Brunnermeier and Sannikov (2014) with a TFP shock; financial intermediaries have a comparative advantage in providing capital and an incentive compatibility constraint.

Frictions and externalities: The efficiency of intermediary ownership of capital vis-à-vis that of households, combined with an incentive compatibility constraint à la Gertler and Kiyotaki (2010), gives rise to inefficiency and pecuniary externalities.

Main results: Optimal policy in the form of a state-contingent leverage constraint improves intermediary value throughout the cycle, relaxing constraints in busts; non-zero, but limited, benefits to policy co-ordination.

Di Tella (2019)

Modelling approach: Continuous-time real model in the spirit of Brunnermeier and Sannikov (2014) with TFP and idiosyncratic (to individual intermediaries) capital returns shocks.

Frictions and externalities: Intermediaries have an incentive to divert investment returns and their trades in the capital market are unobservable – this gives rise to a risk-taking pecuniary externality.

Main results: Intermediaries’ exposure to risk is inefficiently high; the social planner can implement the constrained efficient outcome with a tax on asset holdings that reduces asset prices (and investment).

Annex B: Detail on risk mapping

The following tables give more detail on the content and mapping of: key nodes (Table B1.A), microfinancial vulnerabilities (Table B1.B) and macrofinancial vulnerabilities (Table B1.C).

Table B1.A: Mapping key nodes

Markets (bring together constituents of other nodes to trade instruments)

Example constituents: Types of instruments traded – eg fixed income, equities, commodities and derivatives. Users: end-user service providers (eg banks), intermediate service providers (eg dealers), real economy entities (eg non-financial corporations (NFCs) issuing equity), governments (eg issuing debt).

Interconnections – service channels (often two-way): Supply chains – access to markets (eg via brokers), post-trade services (eg settlement), trading services (eg venues), pre-trade or supporting services (eg credit ratings), issuance services (eg underwriting).

Interconnections – financial channels:

  • Bilateral exposures: eg to CCPs, derivatives counterparties.
  • Multilateral exposures: eg to asset prices, liquidity supply.

Examples of data needed to measure these:

  • Market population data such as trading volume breakdowns, memberships of market infrastructures.
  • Mapping business models and use of markets, eg types of NFC funding (eg market versus private), how households access markets.

Intermediate services (support market functioning and provision of end-user services)

Example constituents: Types of service – liquidity supply (eg dealers), risk management and transfer (eg derivatives), financing (eg interbank loans), asset management, transactions and custody. Users: end-user service providers (eg insurers), other intermediate service providers.

Interconnections – service channels: Supply chains – various, but most financial firms will use financing services to fund themselves, some form of risk management service such as derivatives, and market access.

Interconnections – financial channels:

  • Bilateral exposures created through funding, trading or use of services, eg to CCPs, derivatives counterparties.
  • Multilateral exposures, eg to asset and funding markets.

Examples of data needed to measure these:

  • Population mapping, such as data on sectoral composition (eg concentration) and SIC constituent numbers (eg ONS).
  • Mapping business models and use of these services.

End-user services (provided to real economy users)

Example constituents: Types of service – financing (eg NFC equity issuance, government debt issuance), risk management and transfer (eg insurance), consumption smoothing (eg pensions, lending and deposit-taking), custody and transactions (eg payments). Users: NFCs, governments and households.

Interconnections – service channels: Supply chains – intermediate services and markets (as above).

Interconnections – financial channels:

  • Bilateral exposures to the real economy via eg lending, insurance and derivatives; exposures to financial firms through funding, trading or use of services, eg to CCPs, counterparties.
  • Multilateral exposures, eg via asset and funding markets.

Examples of data needed to measure these:

  • Population mapping, such as data on sectoral composition (eg concentration) and SIC constituent numbers (eg ONS).
  • Mapping business models and use of these services.

Real economy (NFCs and households)

Interconnections – service channels: Risks to end-user services (as above).

Interconnections – financial channels:

  • Bilateral exposures created through claims on insurers and pension funds, investments held in funds and at other firms like hedge funds, bank deposits.
  • Multilateral exposures – loss in asset value and liquidity.

Examples of data needed to measure these:

  • Mapping NFCs and households; map populations by cohort features (eg by IG, HY and leverage).

Table B1.B: Microfinancial vulnerabilities and their depth at entities and nodes

Vulnerability

Examples and sources

Examples of i) data needed to identify these and ii) additional information that may help inform behavioural models

Mismatches and exposures which create vulnerabilities of service providers and real economy to shocks (both by reducing the ability to absorb losses and via exposures to specific risks like interest rate risk or credit risk).

  • Leverage (including when combined with exposures and other mismatches). Drivers include externalities and asymmetric information (which incentivise excessive risk-taking and prevent users/lenders from disciplining against it respectively).(a)

i) Identifying vulnerabilities

  • Metrics on leverage, maturity mismatch and liquidity mismatch, net of mitigants. Can infer leverage and maturity mismatch from (detailed) asset and liability data. Estimation may be required for liquidity mismatches.
  • Metrics required on resilience. Further metrics relate to, for example, collateral values (repo) and variation margin calls (derivatives).

ii) Other behavioural information (in addition to (i))

  • Liquidity sentiment (eg deposits and fund redemptions) and rules/policy (eg any discretion over margin calls).
  • Maturity and liquidity mismatch at banks and some non-bank lenders (end-user services). Drivers include externalities and asymmetric information (which incentivise excessive risk-taking and prevent users/lenders from disciplining against it respectively).
  • Mismatch of liquidity terms by investment funds (end-user services). Drivers include asymmetric information (investors may not be able to evaluate liquidity risk) and externalities (do not fully internalise costs of redemptions).
  • Liquidity mismatches created by collateral calls (intermediate and end-user service providers), where obligations do not match the assets available to meet them. Drivers include asymmetric information (collateral is used to mitigate bilateral asymmetries) and externalities (counterparties do not factor in the impact of collateral calls or sales on the wider market).
  • Exposures to credit and market risk eg by banks, insurers and hedge funds (end-user services). Excessive exposures can be driven by frictions such as asymmetric information and externalities (mechanisms similar to those in preceding row).
  • Exposures to forex, commodity and other risks and risks associated with hedging them by financial firms and real economy entities.
  • Risk metrics required on granular basis to reflect primary risks such as ‘value risk’ and ‘credit default risk’.
  • Metrics required on loss-absorbing capital relative to risk exposures.

Dependencies which expose users to risks in the provision of services to them.

  • Exposures to refinancing risk for financial firms and real economy entities. For example, risks of increases in market-wide or firm-specific funding costs, including due to falls in the value of collateral used to secure lending.
  • Exposures to operational risk (intermediate and end-user services). For example, to provision of technology and infrastructure services.

In both cases, dependencies can be driven by concentration (of service, provider, funding type, maturity or collateral) and by market power. Asymmetric information may prevent users from pricing in the risks posed by dependencies.

  • Metrics on funding concentration eg of maturity and funding sources.
  • Metrics on direct and indirect exposures/losses from operational risks crystallising for key services (for example as provided by infrastructures), factoring in firm resilience and recovery capabilities, and levels of substitutability.

Risk assessment and management flaws which can drive inadvertent risk exposures and leave firms ill-prepared for their crystallisation.

  • Problems with risk identification, assessment, measurement eg use of flawed information and inadequate models, exposure to risks that are difficult to measure or identify.
  • Risk management flaws and mispricing, eg unintended/unpriced consequences such as in the use of collateral, insufficient hedging, diversification or pooling of risks, deliberately or negligently poor risk management, and systematic risk mispricing. Can be driven by missing/incomplete markets and exacerbated by asymmetric information (including by preventing users from fully screening the risk management of providers).
  • Key gaps in risk information for firms.
  • Identification of commonalities in internal modelling approaches and reliance on common data.
  • Metrics of risk (mis)pricing.
  • Known gaps in markets that enable risk hedging, diversification and correction of risk mispricing (eg limits to short-selling).
  • Identification of areas of wrong-way risk (eg from supervisory data).

Footnotes

  • (a) There is a distinction between i) ‘dry powder’ leverage capacity – which can help absorb shocks and ii) actual use of leverage – which can instead reduce entities’ ability to absorb shocks.

Table B1.C: Macrofinancial vulnerabilities and their width across entities and nodes

Vulnerability

Examples and sources

Examples of i) data needed to identify these and ii) additional information that may help inform behavioural models

Correlation increases the scale/scope of shock impacts across the system.

  • Common positions and incentives. Other than herding, can be driven by co-ordination failures, transactions costs and missing/incomplete markets. For example, limited availability of safe assets may increase commonality of government bond holdings. Common positions and incentives – such as for procyclicality – may also be driven or exacerbated by regulation, diversification, and behaviours such as short-termism.

i) Identifying vulnerabilities:

  • Correlation matrices for values of key assets and liabilities.
  • Identification of major areas of ‘wrong-way’ risk. Eg in collateral versus exposures or correlation of different vulnerabilities (eg where the likelihood of operational risks crystallising is positively correlated with wider market risks).
  • Identification of significant common or similar exposures across entities. Eg using asset holdings of key investors, common usage of benchmarks by funds, interest rate-sensitivity of leveraged institutions and common use of services/providers.

ii) Other behavioural information (in addition to (i))

  • Information that helps identify incentives for herding, eg benchmark use by fund managers.
  • Identification of drivers of common behaviour, eg similar mandates and regulation.

Interconnectedness transmits the impact of shocks across the system.

Disruption and losses (including the effects of procyclicality) can be transmitted through:

  • Service dependencies. Such as reliance on infrastructures and other intermediate service providers like inter-dealer brokers and reinsurers.
  • Financial exposures. Such as exposures to the risk of losses in asset values (eg because of procyclical selling by others), counterparty risk and the risk of liquidity demands like collateral calls. This channel could include behavioural interconnectedness such as herding.

i) Identifying vulnerabilities:

  • Metrics on liquidity and other exposures, and on reliance on services provided to the real economy and by other parts of the real economy and financial system, such as hedge funds and LDI funds (in their capacity as providers of services to pension funds).
  • Metrics on capacity of dealers and other intermediaries to meet demands for market services, including: underwriting, market-making, repo lending, derivatives, and cash liquidity services.

Mapping transmission channels can be guided by the connections shown in Figure 4 and Table B1.A.

ii) Other behavioural information (in addition to (i))

  • Evidence on likely entity behaviour in response to shocks. For example, estimates of redemption sensitivity to changes in asset prices, expectations of participation behaviour drawn from surveys/exercises such as the SWES.
  • Evidence of/drivers of appetite of dealers and other intermediaries to meet demands for market services.

Concentration of market share of a provider/asset/market means more of the system is affected by shocks to it.

  • Concentration of exposures to a particular asset or counterparty. Can be driven by missing or incomplete markets (eg a lack of diversity in long-dated safe assets can drive concentrated exposures of pension funds to government bonds). Or by asymmetric information (eg overexposure of prime-brokers to Archegos).
  • Concentration of service providers is typically driven by market power, eg critical third parties (CTPs).
  • Identification of major areas of concentration in key entities’ exposures, for example using large exposures data and asset holdings data for systemically important institutions.
  • Estimates of market/provider concentration such as market share, Herfindahl indices.
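As a point of reference for the concentration metrics listed above (and as a stylised definition rather than a statement of any particular supervisory methodology), the Herfindahl index is usually written as:

```latex
H \;=\; \sum_{i=1}^{N} s_i^{2}, \qquad \sum_{i=1}^{N} s_i = 1,
```

where $s_i$ is the market share of provider (or asset) $i$ and $N$ is the number of providers. $H$ ranges from $1/N$ (shares evenly spread) to $1$ (a single provider), so higher values signal greater concentration of the kind described in the table above.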

Bibliography

Acharya, Viral V., Pedersen, Lasse H., Philippon, Thomas, and Richardson, Matthew. 2017. Measuring Systemic Risk. The Review of Financial Studies, 30(1), 2–47.

Acharya, Viral V., Crosignani, Matteo, Eisert, Tim, and Steffen, Sascha. 2022. Zombie Lending: Theoretical, International, and Historical Perspectives. Annual Review of Financial Economics, 14(1), 21–38.

Acharya, Viral V., Brunnermeier, Markus K., and Pierret, Diane. 2024. Systemic Risk Measures: Taking Stock from 1927 to 2023. NBER working papers 33211. National Bureau of Economic Research, Inc.

Adeney, Rachel, Hitchens, Adrian, Lane, Claudia, Mehta, Harsh, and Quashie, Alison. 2024. Operational Resilience in a Macroprudential Framework. Financial Stability Paper No. 50, Bank of England.

Adrian, Tobias, Borowiecki, Karol Jan, and Tepper, Alexander. 2022. A leverage-based measure of financial stability. Journal of Financial Intermediation, 51(C).

Aikman, David, Chichkanov, Pavel, Douglas, Graeme, Georgiev, Yordan, Howat, James, and King, Benjamin. 2019. System-Wide Stress Simulation. Bank of England Working Papers No. 809, Bank of England.

Aikman, David, Beale, Daniel, Brinley-Codd, Adam, Covi, Giovanni, Hüser, Anne-Caroline, and Lepore, Caterina. 2023a. Macro-Prudential Stress Testing Models: A Survey. Bank of England Working Papers No. 1037, Bank of England.

Aikman, David, Giese, Julia, Kapadia, Sujit, and McLeay, Michael. 2023b. Targeting Financial stability: Macroprudential or Monetary Policy. International Journal of Central Banking, 19(1), 159–242.

Aikman, David, Bluwstein, Kristina, and Karmakar, Sudipto. 2024. A Tail of Three Occasionally Binding Constraints: A Modelling Approach to GDP-at-Risk. IMF Economic Review.

Ajello, Andrea, Boyarchenko, Nina, Gourio, François and Tambalotti, Andrea. 2022. Financial Stability Considerations for Monetary Policy: Theoretical Mechanisms. Staff Reports 1002, Federal Reserve Bank of New York.

Allen, William A., and Wood, Geoffrey. 2006. Defining and achieving financial stability. Journal of Financial Stability, 2(2), 152–172.

Bahaj, Saleem, and Foulis, Angus. 2017. Macroprudential Policy under Uncertainty. International Journal of Central Banking, 13(3), 119–154.

Bailey, Andrew. 2024a. Today’s challenges in financial stability: the new and the not so new. Speech given at Bloomberg Regulatory Forum, New York, 22 October 2024.

Bailey, Andrew. 2024b. The future of money and payments. Speech given at Group of Thirty’s 39th Annual International Banking Seminar, Washington, 26 October 2024.

Bandera, Nicolo, and Stevens, Jacob. 2024. Monetary Policy Consequences of Financial Stability Interventions: assessing the UK LDI crisis and the Central Bank Policy Response. Bank of England working papers 1070. Bank of England.

Bank of England. 2009. The Role of Macroprudential Policy. Bank of England Discussion Paper, Bank of England.

Bank of England. 2011. Instruments of Macroprudential Policy. Bank of England Discussion Paper, Bank of England.

Banks, Will, Khairnar, Kunal and Sian, Inderjit. 2024. Identifying (un)warranted tightening in credit supply, Bank of England Financial Stability Paper No. 51, Bank of England.

Bardoscia, Marco, Carro, Adrian, Hinterschweiger, Marc, Napoletano, Mauro, Popoyan, Lilit, Roventini, Andrea, and Uluc, Arzu. 2024. The impact of prudential regulations on the UK housing market and economy: insights from an agent-based model. Bank of England Working Papers No. 1066, Bank of England.

Barnett, Alina, Thomas, Ryland. 2014. Has weak lending and activity in the UK been driven by credit supply shocks? Manchester School, University of Manchester, 82(S1), 60–89.

Benjamin, Nathanaël. 2025. Picking what matters. Speech given at the Global Investment Management Summit, London.

Bennett, Will, Coppins, Geoff, McCloskey, Maighread and Walker, Danny. 2024a. Financial stability at the Bank of England. Bank of England Quarterly Bulletin, 64(3).

Bennett, Will, Coppins, Geoff, McCloskey, Maighread and Walker, Danny. 2024b. The contribution of the Financial Policy Committee to UK financial stability. Bank of England Quarterly Bulletin, 64(3).

Bianchi, Javier, and Mendoza, Enrique G. 2018. Optimal Time-Consistent Macroprudential Policy. Journal of Political Economy, 126(2), 588–634.

BIS. 2017. The international dimensions of macroprudential policies. BIS Working Papers No. 643. Bank for International Settlements.

Bluwstein, Kristina, Buckman, Marcus, Joseph, Andreas, Kang, Miao, Kapadia, Sujit, and Simsek, Ozgur. 2023. Credit Growth, the Yield Curve and Financial Crisis Prediction: Evidence from a Machine Learning Approach. Journal of International Economics, 145(C).

Borsos, András, Carro, Adrian, Hinterschweiger, Marc, Kaszowska-Mojsa, Jagoda, and Uluc, Arzu. 2025. Agent-based modelling at central banks: recent developments and new challenges. Bank of England Working Papers No. 1122, Bank of England.

Boyarchenko, Nina and Elias, Leonardo. 2024. The Global Credit Cycle. Staff Reports 1094, Federal Reserve Bank of New York.

Brainard, William C. 1967. Uncertainty and the Effectiveness of Policy. The American Economic Review, 57(2), 411–425.

Breeden, Sarah. 2024. Financial Stability at Your Service. Speech based on remarks given at Wharton-IMF Transatlantic Dialogue, Washington DC, 10 September 2024.

Brownlees, Christian, and Engle, Robert. 2017. SRISK: A Conditional Capital Shortfall Measure of Systemic Risk. The Review of Financial Studies, 30(1), 48–79.

Brunnermeier, Markus K., and Sannikov, Yuliy. 2014. A macroeconomic model with a financial sector. American Economic Review, 104(2), 379–421.

Budnik, Katarzyna, Groß, Johannes, Vagliano, Gianluca, Dimitrov, Ivan, Lampe, Max, Panos, Jiri, Velasco, Sofia, Boucherie, Louis, and Jancokova, Martina. 2023. BEAST: A model for the assessment of system-wide risks and macroprudential policies. Working Paper Series 2855. European Central Bank.

Caldara, Dario, Harrison, Richard and Lipinska, Anna. 2014. Practical tools for policy analysis in DSGE models with missing shocks. Journal of Applied Econometrics, 29(7), 1145–1163.

Catalan, Mario, and Hoffmaister, Alexander. 2022. When Banks Punch Back: Macrofinancial Feedback Loops in Stress Tests. Journal of International Money and Finance, 124(C).

Cesa-Bianchi, A., Dickinson, R., Kösem, S., Lloyd, S., and Manuel, E. 2021. No economy is an island: how foreign shocks affect UK macrofinancial stability. Bank of England Quarterly Bulletin, 2021 Q3.

Clerc, Laurent, Derviz, Alexis, Mendicino, Caterina, Moyen, Stephane, Nikolov, Kalin, Stracca, Livio, Suarez, Javier, and Vardoulakis, Alexandros P. 2015. Capital Regulation in a Macroeconomic Model with Three Layers of Default. International Journal of Central Banking, 11(3), 9–63.

Coen, Jamie and Coen, Patrick, 2022. A Structural Model of Liquidity in Over-The-Counter Markets. Bank of England Working Papers No. 979, Bank of England.

Coimbra, Nuno, and Rey, Helene. 2024. Financial cycles with heterogeneous intermediaries. The Review of Economic Studies, 91(2), 817–857.

Collard, Fabrice, Dellas, Harris, Diba, Behzad, and Loisel, Olivier. 2017. Optimal Monetary and Prudential Policies. American Economic Journal: Macroeconomics, 9(1), 40–87.

Covi, Giovanni, and Hüser, Anne-Caroline. 2024. Measuring Capital at Risk with financial contagion: two sector model with banks and insurers. Bank of England Working Papers No. 1081, Bank of England.

Davila, Eduardo, and Korinek, Anton. 2018. Pecuniary Externalities in Economies with Financial Frictions. Review of Economic Studies, 85(1), 352–395.

Davila, Eduardo, and Walther, Ansgar. 2023. Prudential Policy with Distorted Beliefs. American Economic Review, 113(7), 1967–2006.

De Paoli, Bianca, and Paustian, Matthias. 2017. Coordinating Monetary and Macroprudential Policies. Journal of Money, Credit and Banking, 49(2-3), 319–349.

Debortoli, Davide, Kim, Jinill, Linde, Jesper, and Nunes, Ricardo. 2019. Designing a Simple Loss Function for Central Banks: Does a Dual Mandate Make Sense? The Economic Journal, 129(621), 2010–2038.

Demekas, Dmitri. 2019. Building an Effective Financial Stability Policy Framework: Lessons from the Post-Crisis Decade. LSE Research Online Documents on Economics 100483.

Van der Ghote, Alejandro. 2021. Interactions and Coordination between Monetary and Macroprudential Policies. American Economic Journal: Macroeconomics, 13(1), 1–34.

Duffie, Darrell. 2011. Systemic Risk Exposures: A 10-by-10-by-10 Approach. NBER Working Papers 17281, National Bureau of Economic Research Inc.

Duffie, Darrell, Garleanu, Nicolae, and Pedersen, Lasse Heje. 2005. Over-The-Counter Markets. Econometrica, 73(6), 1815–1847.

Duncan, Alfred, and Nolan, Charles. 2018. Financial Frictions in Macroeconomic Models. Oxford Research Encyclopedia of Economics and Finance.

Edge, Rochelle M. 2003. A utility-based welfare criterion in a model with endogenous capital accumulation. Finance and Economics Discussion Series 2003-66. Board of Governors of the Federal Reserve System (U.S.).

Farhi, Emmanuel, and Werning, Ivan. 2016. A Theory of Macroprudential Policies in the Presence of Nominal Rigidities. Econometrica, 84(September), 1645–1704.

Ferrero, Andrea, Harrison, Richard, and Nelson, Benjamin. 2024. House price dynamics, optimal LTV limits and the liquidity trap. Review of Economic Studies, 91(2), 940–971.

Frost, Jon, Gambacorta, Leonardo, Huang, Li, Shin, Hyun Song, and Zbinden, Pablo. 2019. BigTech and the changing structure of financial intermediation. Economic Policy, 34(100), 761–799.

Garriga, Carlos, Kydland, Finn E., and Šustek, Roman. 2021. MoNK: Mortgages in a New-Keynesian model. Journal of Economic Dynamics and Control, 123(C).

Gertler, Mark, and Kiyotaki, Nobuhiro. 2010. Financial Intermediation and Credit Policy in Business Cycle Analysis. Chap. 11, pages 547–599 of: Friedman, Benjamin M., and Woodford, Michael (eds), Handbook of Monetary Economics. Handbook of Monetary Economics, vol. 3. Elsevier.

Gilchrist, Simon, and Zakrajsek, Egon. 2012. Credit spreads and business cycle fluctuations. American Economic Review, 102(4), 1692–1720.

Greenwald, Daniel. 2016. The Mortgage Credit Channel of Macroeconomic Transmission. 2016 Meeting Papers 1551. Society for Economic Dynamics.

Haldane, Andrew, and Turrell, Arthur. 2017. An interdisciplinary model for macroeconomics. Bank of England working papers 696. Bank of England.

Hansen, Lars Peter, and Sargent, Thomas J. 2001. Robust Control and Model Uncertainty. American Economic Review, 91(2), 60–66.

Hansen, Lars Peter, and Sargent, Thomas J. 2008. Robustness. Princeton University Press.

Harrison, Richard, and Waldron, Matt. 2021 (Feb.). Optimal policy with occasionally binding constraints: piecewise linear solution methods. Bank of England working papers 911. Bank of England.

Hauser, Andrew. 2023. A journey of 1000 miles begins with a single step: filling gaps in the central bank liquidity toolkit. Speech given at a Market News International Connect Event, London.

Hüser, Anne-Caroline and Lepore, Caterina and Veraart, Luitgard Anna Maria. 2024. How does the repo market behave under stress? Evidence from the COVID-19 crisis. Journal of Financial Stability, 70(C).

Jeanne, Olivier, and Korinek, Anton. 2020. Macroprudential Regulation versus mopping up after the crash. The Review of Economic Studies, 87(3), 1470–1497.

Jordà, Oscar, Schularick, Moritz, and Taylor, Alan M. 2013. When Credit Bites Back. Journal of Money, Credit and Banking, 45(s2), 3–28.

Jordà, Oscar, Schularick, Moritz, and Taylor, Alan M. 2015. Leveraged bubbles. Journal of Monetary Economics, 76(S), 1–20.

Kahneman, Daniel. 2002. Maps of Bounded Rationality. Nobel Prize in Economics Documents 2002-4, Nobel Prize Committee.

Kaplan, Greg, Mitman, Kurt, and Violante, Giovanni L. 2020. The Housing Boom and Bust: Model Meets Evidence. Journal of Political Economy, 128(9), 3285–3345.

Khan, Aubhik, and Thomas, Julia K. 2013. Credit Shocks and Aggregate Fluctuations in an Economy with Production Heterogeneity. Journal of Political Economy, 121(6), 1055–1107.

Kindleberger, Charles P. 1978. Manias, Panics, and Crashes: A History of Financial Crises. Macmillan.

Koijen, Ralph S.J., and Yogo, Motohiro. 2019. A Demand System Approach to Asset Pricing. Journal of Political Economy, 127(4), 1475–1515.

Korinek, Anton, and Simsek, Alp. 2016. Liquidity Trap and Excessive Leverage. American Economic Review, 106(3), 699–738.

Kremer, Manfred, Lo Duca, Marco, and Hollo, Daniel. 2012. CISS – a composite indicator of systemic stress in the financial system. Working Paper Series 1426. European Central Bank.

Lipsey, R. G., and Lancaster, Kelvin. 1956. The General Theory of Second Best. Review of Economic Studies, 24(1), 11–32.

Lipsey, Richard. 2007. Reflections on the general theory of second best at its golden jubilee. International Tax and Public Finance, 14(4), 349–364.

Liu, A., Paddrik, M., Yang, S. Y., and Zhang, X. 2020. Interbank contagion: An agent-based model approach to endogenously formed networks. Journal of Banking and Finance, 112(C).

Lloyd, Simon, Manuel, Ed, and Panchev, Konstantin. 2024. Foreign vulnerabilities, domestic risks: the global drivers of GDP-at-Risk. IMF Economic Review, 72(1), 335–392.

Lorenzoni, Guido. 2008. Inefficient Credit Booms. Review of Economic Studies, 75(3), 809–833.

Lucas Jr, Robert. 1987. Models of Business Cycles. Basil Blackwell, New York.

Martin, Ian, and Shi, Ran. 2023. Forecasting Crashes with a Smile. CEPR Discussion Papers 18524.

Mendicino, Caterina, Nikolov, Kalin, Suarez, Javier, and Supera, Dominik. 2018. Optimal Dynamic Capital Requirements. Journal of Money, Credit and Banking, 50(6), 1271–1297.

Mendicino, Caterina, Nikolov, Kalin, Suarez, Javier, and Supera, Dominik. 2020. Bank capital in the short and in the long run. Journal of Monetary Economics, 115(C), 64–79.

Minsky, Hyman P. 1977. The Financial Instability Hypothesis: An Interpretation of Keynes and an Alternative to ‘Standard’ Theory. Nebraska Journal of Economics and Business.

Monacelli, Tommaso, and Jamilov, Rustam. 2020. Bewley Banks. CEPR Discussion Papers No. 15428.

Munoz, Manuel, and Smets, Frank. 2024. The Positive Neutral Countercyclical Capital Buffer. CEPR Discussion Papers No. 19790.

Palley, Thomas. 2011. A Theory of Minsky Super-Cycles and Financial Crises. Contributions to Political Economy, Cambridge Political Society, 30(1), 31–46.

Pflueger, Carolin, Siriwardane, Emil, and Sunderam, Adi. 2020. Financial Market Risk Perceptions and the Macroeconomy. The Quarterly Journal of Economics, 135(3), 1443–1491.

Popoyan, Lilit, Napoletano, Mauro, and Roventini, Andrea. 2017. Taming macroeconomic instability: Monetary and macro-prudential policy interactions in an agent-based model. Journal of Economic Behavior & Organization, 134, 117–140.

Ravn, Morten, and Sterk, Vincent. 2021. Macroeconomic Fluctuations with HANK and SAM: an Analytical Approach. Journal of the European Economic Association, European Economic Association, 19(2), 1162–1202.

Reinhardt, Dennis, and Sowerbutts, Rhiannon. 2015. Regulatory arbitrage in action: Evidence from banking flows and macroprudential policy. Bank of England Working Paper No. 546.

Romer, Christina D., and Romer, David H. 2017. New Evidence on the Aftermath of Financial Crises in Advanced Countries. American Economic Review, 107(10), 3072–3118.

Schularick, Moritz, and Taylor, Alan M. 2012. Credit Booms Gone Bust: Monetary Policy, Leverage Cycles, and Financial Crises, 1870-2008. American Economic Review, 102(2), 1029–1061.

Smets, Frank, and Wouters, Rafael. 2007. Shocks and Frictions in US Business Cycles: A Bayesian DSGE Approach. American Economic Review, 97(3), 586–606.

Stiglitz, Joseph, Sen, Amartya, and Fitoussi, Jean-Paul. 2009. Report by the Commission on the Measurement of Economic Performance and Social Progress.

Stiglitz, Joseph E., and Weiss, Andrew. 1981. Credit Rationing in Markets with Imperfect Information. American Economic Review, 71(3), 393–410.

Sydow, Matthias, Schilte, Aurore, Covi, Giovanni, Deipenbrook, Marija, Del Vecchio, Leonardo, Fiedor, Pawel, Fukker, Gabor, Gehrend, Max, Gourdel, Regis, Grassi, Alberto, Hilberg, Bjorn, Kaijser, Michiel, Kaoudis, Georgios, Mingarelli, Luca, Montagna, Mattia, Piquard, Thibault, Salakhova, Dilyara, Tente, Natalia. 2024. Shock Amplification in an Interconnected Financial System of Banks and Investment Funds, Journal of Financial Stability, 71(C).

Tella, Sebastian Di. 2019. Optimal Regulation of Financial Intermediaries. American Economic Review, 109(1), 271–313.

Ulanowicz, Robert E. 2020. Quantifying sustainable balance in ecosystem configurations. Current Research in Environmental Sustainability, 1, 1–6.

Uslu, Semih. 2019. Pricing and Liquidity in Decentralized Asset Markets. Econometrica, 87(6), 2079–2140.

  1. The Bank of England would like to thank David Aikman, Dimitri Demekas, Charles Goodhart, Anil Kashyap, Michael Kiley, Anna Kovner and Hyun Song Shin for their comments on and input into this review.

  2. The authors would also like to thank Niki Anderson, James Benford, Nathanaël Benjamin, Kristina Bluwstein, Colette Bowe, Sarah Breeden, Geoff Coppins, Olga Filipenko, Lee Foulger, Julia Giese, Manuel Gloria, Daniel Gray, Bernat Gual-Ricart, Jon Hall, Rashmi Harimohan, Marc Hinterschweiger, Adrian Hitchens, Bonnie Howard, Simon Jurkatis, Sudipto Karmakar, Grellan McGrath, Liz Oakes, Will Parry, Huw Pill, Chiara Punzo, Dennis Reinhardt, Niamh Reynolds, Marek Rojicek, David Rule, Victoria Saporta, Sophie Stone, Arthur Turrell, Nicholas Vause, Benjamin Westwood, Jean-Charles Wijnandts and Carolyn Wilkins for comments and contributions.

  3. More formally, the Act establishing the FPC defines its (primary) objective as ‘contributing to the achievement by the Bank of the Financial Stability Objective’ and separately defines the FPC’s responsibility in this regard as that cited here. It further says that ‘the Act does not require or authorise the Committee to exercise its functions in a way that would in its opinion be likely to have a significant adverse effect on the capacity of the financial sector to contribute to the growth of the UK economy in the medium or long term’.

  4. Systemic risk is defined in the Bank of England Act 1998 (‘the Act’) as ‘a risk to the stability of the UK financial system as a whole or of a significant part of that system’.

  5. ‘Financial stability’ policy is used in preference to ‘macroprudential’ policy throughout this document. This is in recognition that interventions to support the objectives of macroprudential policy may not always be ‘prudential’ in the sense of always being actions that build resilience. For example, market operations, such as that deployed by the Bank in the wake of the LDI crisis, can help the FPC to meet its objectives, but are not prudential in the traditional sense of the word.

  6. Benjamin (2025) discusses the FPC’s operating modes further.

  7. There is a large literature considering the role of frictions in creating vulnerabilities. For example, several papers consider the role of pecuniary externalities linked to collateral constraints, including by interacting with aggregate demand externalities and/or monetary policy constraints, such as the zero lower bound (ZLB). Further details of this literature are in Annex Table A1.A. A growing literature also looks at identifying frictions that can drive both vulnerabilities and inefficiencies. For example, the literature surveyed by Duncan and Nolan (2018) and Ajello et al (2022) explores how frictions (eg asymmetric information), distortions (eg externalities), and non-rational behaviour (eg overconfidence) can drive financial and economic stresses (including through their role in credit cycles). Many of the papers surveyed by these authors also examine financial frictions in a broader macroeconomic context, and thus – implicitly or explicitly – consider the effect of frictions outside of stresses.

  8. An outcome is ‘Pareto efficient’ if no one can be made better off without someone else being made worse off. ‘Full’ or ‘perfect’ information here refers to a situation where agents have full and accurate knowledge of all relevant aspects including their own and other agents’ preferences, the good or service being traded, and all relevant prices.

  9. A related, important, point is that amplification is not always a sign of inefficiency. This is shown formally by Davila and Korinek (2018). In this vein, the existence of a Schumpeterian trade-off between short-term distress and long-term dynamism, whereby corporate bankruptcy generates short-term amplification costs but long-term resource reallocation efficiency benefits, illustrates that amplification may not always be (constrained) inefficient.

  10. A loss function is one way to mathematically formulate policymakers’ objective(s). For example, a common form of loss function used by monetary policymakers quantifies the cost of deviations of inflation from target and fluctuations in output from some reference level (ie an output gap). As this example demonstrates, the terms in a loss function can be interpreted as targets. Those targets can be operational targets in the sense of targets that have been assigned to the policymaker (as in inflation targets) or they can be targets that are viewed as being roughly consistent with a high-level objective that has been assigned to the policymaker (as in financial stability). In theoretical models, loss functions can be derived from household utility. These derivations typically proceed by taking a ‘quadratic’ approximation (for analytical tractability). In such cases, the minimisation of a loss function derived in this way can be thought of as being approximately consistent with welfare maximisation. In the real world, there are several challenges to specifying and measuring welfare. Even without these challenges and as discussed later in this section, the co-existence of multiple policymakers complicates the assignment of loss functions that would deliver overall welfare maximisation. Nevertheless, since economic welfare is what ultimately matters, it is desirable that any mathematical formalisation of a policymaker’s objective in the form of a loss function should have some relationship with the ultimate objective of welfare.
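As a purely stylised illustration of the dual-mandate example above (the symbols are ours, not a statement of any committee’s actual objective), such a loss function is often written as:

```latex
L_t \;=\; (\pi_t - \pi^{*})^{2} \;+\; \lambda\,(y_t - y_t^{*})^{2},
```

where $\pi_t$ is inflation, $\pi^{*}$ the inflation target, $y_t - y_t^{*}$ the output gap, and $\lambda$ the relative weight placed on output stabilisation. A welfare-consistent version would pin down $\lambda$ (and potentially add further terms) from the underlying model, which is precisely where the challenges described in this footnote arise.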

  11. An important methodological issue applies here. Ferrero et al (2024) and De Paoli and Paustian (2017) use a ‘linear-quadratic’ framework for policy analysis. But, given that financial frictions can involve significant non-linearities, their welfare effects may be better studied with non-linear methods. However, these methods do not allow for the derivation of a welfare-consistent quadratic loss function, implying an analytical trade-off.

  12. This is also true in models without financial frictions. Edge (2003) shows that even the seemingly small extension of the canonical New-Keynesian model to incorporate capital accumulation materially complicates the welfare-consistent loss function.

  13. For example, Debortoli et al (2019) study the performance of optimal policy using a simple dual monetary mandate loss function in approximating the welfare-consistent optimal policy in the Smets and Wouters (2007) model. They find that optimal policy based on a dual mandate loss function does a reasonable job of approximating welfare-consistent optimal policy provided that the weight on output in the dual mandate loss function is set appropriately.

  14. A Nash equilibrium is one in which no one can change strategy unilaterally and improve their payoff, given the strategies of others.

  15. Specifically, the losses from a lack of co-ordination are eliminated when the macroprudential authority acts as a within-period leader. They argue that this is likely to be a reasonable approximation of reality given the relatively lower frequency over which financial stability authorities tend to adjust their instruments.

  16. For example, Mendicino et al (2018, 2020) establish that imposing capital requirements on banks can be Pareto improving (ie improve all households’ welfare simultaneously) up to a certain point, but that raising capital requirements beyond that point will embody intertemporal and distributional trade-offs.

  17. More formally, the Tinbergen principle says that for each independent policy objective, there must be at least one independent policy instrument. Demekas (2019) argues persuasively that financial stability policy should be seen as a portfolio of instruments which are closely co-ordinated.

  18. For example, Gertler and Kiyotaki (2010).

  19. Other measures include reciprocity arrangements.

  20. This would also be consistent with standard formulations of utility functions that suggest that individuals suffer more from a given loss in consumption than they gain from an increase of the same amount. This property is often referred to as risk aversion and reflects declining marginal utility of consumption.
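  For any concave utility function – log utility is used here purely as an illustration, not as a claim about the paper's calibration – a loss of \(\Delta\) in consumption reduces utility by more than a gain of \(\Delta\) raises it:

```latex
u(c) = \ln c \implies
u(c) - u(c-\Delta) > u(c+\Delta) - u(c)
\quad \text{for } 0 < \Delta < c,
```

  which follows from diminishing marginal utility, \(u''(c) = -1/c^2 < 0\).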

  21. The Bank of England Act 1998 as amended by the Financial Services Act 2012.

  22. More formally, the FPC’s primary objective is contributing to the achievement by the Bank of its financial stability objective to protect and enhance the stability of the UK financial system. Less formally, the FPC’s role can be thought of – to a large extent – as reducing the financial system’s propagation of shocks to the real economy, ie reducing the size of the arrows and boxes in Figure 5 (eg vulnerabilities).

  23. The FPC must have regard to The Bank’s Financial Stability Strategy. Under the terms of the Act, the Bank’s Court of Directors has a duty to determine the strategy and to review it every three years, a responsibility which, in practice, it has delegated to the FPC.

  24. Vital services are defined in the Strategy as: the provision of payment and settlement services; intermediating between savers and borrowers, channelling savings into investment; and insuring against and dispersing risk.

  25. The FPC can also advise on financial stability relevant issues in relation to the MPC’s decisions.

  26. The UK’s securities settlement system and central securities depository for equities and gilts, now operated privately.

  27. Such as cross-authority working groups, cross-memberships of committees and memorandums of understanding outlining cooperation and information sharing.

  28. Bailey (2024a) and Breeden (2024) discuss challenges and the framework for financial stability policy respectively, with both emphasising the importance of a system-wide focus for financial stability policy. Also relevant are Bennett et al (2024a), who refresh the description of the Bank’s institutional arrangements for delivering its financial stability objective, and Bennett et al (2024b), who set out the specific role the FPC plays in that.

  29. CCyB (2023), SCR (2014), leverage ratio (2021 update), housing tools (2016).

  30. Stiglitz and Weiss (1981).

  31. More formally, a utility function is a mathematical description of how a household’s satisfaction depends on factors such as consumption. In this context, ‘welfare’ is measured by combining households’ utility functions in some way. For example, by simple aggregation (as in ‘Utilitarian’ welfare) or a more complex treatment, like defining welfare as the utility of the worst-off household (as in ‘Rawlsian’ welfare). This raises a critical question – not discussed here – of which method should be used to aggregate welfare across agents. Moreover, regardless of how aggregate welfare at any point in time is calculated, what typically matters for public policy is ‘total’ welfare over time and states, not just a ‘snapshot’ of it. Since utility – and therefore welfare – can take different values across time and different states of the world, these must be appropriately discounted and probability-weighted to derive ‘total’ aggregate welfare.
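  As a stylised illustration (the notation is ours, not drawn from the literature cited), with \(u_{i,t}\) denoting household \(i\)'s utility in period \(t\), the two aggregation approaches and the ‘total’ welfare criterion can be written:

```latex
W^{\text{Utilitarian}}_t = \sum_i u_{i,t},
\qquad
W^{\text{Rawlsian}}_t = \min_i \, u_{i,t},
\qquad
V_0 = \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t W_t,
```

  where \(\beta \in (0,1)\) discounts future periods and the expectation \(\mathbb{E}_0\) probability-weights welfare across states of the world.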

  32. Effects on those agents that are directly involved in the activities are not typically considered ‘externalities’, in that they are not ‘external’ effects.

  33. Footnote 7 sets out examples of other papers considering the role of pecuniary externalities linked to collateral constraints.

  34. For example, Acharya et al (2022).

  35. Rationality in this context can be described simply as decision-making by agents that maximises their utility, given available information and constraints.

  36. For example, Romer and Romer (2017), who estimate the effects of financial crises on output.

  37. Benchmark papers setting out optimal policy in the context of time-varying frictions and distortions include several that look at externalities associated with limits to risk sharing, incentive compatibility constraints and/or collateral constraints, finding that optimal policy is countercyclical. These include the use of state-contingent taxes on borrowing or capital requirements (Bianchi and Mendoza (2018), Davila and Korinek (2018), Collard et al (2017)), countercyclical LTV limits (Ferrero et al (2024)), state-contingent leverage constraints (Van der Ghote (2021)), and state-contingent taxes on securities (Farhi and Werning (2016)) and assets (Di Tella (2019)).

  38. Similarly, self-insurance may not completely disincentivise fire sales; in the continued presence of frictions and distortions such as externalities there may be socially sub-optimal incentives to avoid losses even if self-insurance means they can be borne without distress. For example, because of effects on the cost of firms’ capital.

  39. Some argue that ex-ante tools should be used to build resilience against more frequent or foreseeable stresses, while ex-post tools should address rare, extreme events that firms cannot feasibly self-insure against. See, for example, Hauser (2023) for a broader discussion (and references therein).

  40. Some papers find that central bank liquidity policy can be very effective in mitigating externalities in stresses. But this can come at the cost of potential distortions (eg Bandera and Stevens (2024)) and perverse incentives, if not carefully designed and targeted. See, for example, Jeanne and Korinek (2020), who find that, in general, the optimal policy mix contains both prudential and liquidity policies and the design of one affects the optimal design of the other.

  41. The main text sets out the full wording of the primary objective.

  42. This definition draws on Allen and Wood (2006).

  43. Both outright interruption of services and other deterioration in their availability and pricing (such as unduly procyclical provision).

  44. These are minor variations of the first three desirable properties in Allen and Wood (2006).

  45. At least up to a certain value, for those monies recognised by the state.

  46. Bailey (2024b).

  47. Financial Policy Summary and Record of the Financial Policy Committee meeting on 23 March 2023.

  48. Protecting public funds is a statutory objective for resolution policy, a component of financial stability policy.

  49. Although not included in the list of features of the ideal risk assessment framework, it would also be desirable to be able to use the risk assessment framework for aspects of policy analysis, including testing the efficacy of interventions in mitigating risks. This would help bring the risk assessment framework discussed in this section closer to the policy modelling discussed in Section 3.6.

  50. The FPC's June 2024 core indicators.

  51. For example, Schularick and Taylor (2012); Jordà et al (2013, 2015); Bluwstein et al (2023).

  52. The Bank has updated its approach to stress testing the UK banking system. This was published alongside the November 2024 Financial Stability Report. In the updated approach, stress tests with firm participation would ordinarily be expected to take place every other year, with desk-based stress tests in the intervening years, reflecting the costs of full bank participatory stress testing and the flexibility benefits of desk-based stress tests (eg by facilitating exploration of a wider set of scenarios).

  53. For insurers, this includes the Life Insurance Stress Test (LIST) and the Dynamic General Insurance Stress Test (DyGIST). LIST 2025 results will be published in Q4 this year; the exercise includes one core scenario (resilience to evolving financial markets stress) and two exploratory scenarios (developing capabilities towards asset-level stresses and funded reinsurance recapture). The DyGIST will be run in 2026. The Bank’s 2025 Stress Test of UK Central Counterparties (CCPs) is the Bank’s fourth such exercise. The Bank conducts regular stress tests of UK CCPs in order to assess their financial resilience, and to promote transparency and confidence in the UK clearing system. These exercises are exploratory rather than ‘pass-fail’ and the findings are used to identify potential areas of risk and to support and inform the Bank’s supervisory and regulatory activities. The 2025 CCP Stress Test will focus on assessing the resilience of UK CCPs to the default of two or more of their members during a severe market stress. This includes a core Credit Stress Test and additional reverse stress testing and sensitivity testing that explores how the results change under increasingly severe assumptions. This year’s exercise will not include a full Liquidity Stress Test, but we will explore liquidity risks in a more qualitative manner with firms and assess how risks have evolved since last tested. The Stress Test will also consider the impact on the wider financial system via initial margin and variation margin calls.

  54. The System-Wide Exploratory Scenario and November 2024 Financial Stability Report.

  55. System-wide dynamics can also be relevant for bank solvency tests. In some circumstances, wider system dynamics are likely to increase the capital drawdown by banks in adverse scenarios (eg Sydow et al (2024)).

  56. Note: some services – like payment systems – are used both by real economy end-users and as intermediate services by other financial service providers. For simplicity we distinguish services by their uses here (so payment services would appear in both the intermediate services and end-user service boxes, reflecting their importance throughout the financial system).

  57. Such losses could then lead to a (procyclical) increase in risk aversion by lenders which would feed back again to the real economy.

  58. The association between borrowing and such ‘aggregate demand’ externalities is discussed in some of the literature referred to in Annex A, such as Korinek and Simsek (2016).

  59. The component of corporate bond credit spreads that is not directly attributable to default risk, which Gilchrist and Zakrajšek (2012) argue provides an effective measure of investor sentiment or risk appetite in the corporate bond market (eg FEDS Notes, 2016).

  60. An alternative, simpler, binary approach (eg material versus not) might, however – in certain circumstances – lead to over-inclusion of risks due to framing effects.

  61. For example, the Stiglitz-Sen-Fitoussi Commission (2009) discusses limitations of GDP as a proxy for welfare.

  62. For instance, Lloyd et al (2024) find that faster foreign credit growth is linked to more severe domestic GDP tail risks. And Cesa-Bianchi et al (2021) find that around half of the variation in UK economic activity, and almost all the variation in a summary measure of UK financial market conditions, can be explained by global shocks over the period 1997–2019.

  63. The same approach could also be used to conduct limited forms of policy analysis – for example, how a policy that limits the leverage of, say, hedge funds would affect amplification in the system. As discussed in Section 3.6, this analysis is likely to be limited because it would not be designed to evaluate the costs and benefits of policy interventions. Depending on the model being used, it may also not be well-suited to assessing the system-wide implications of policy interventions (ie full general equilibrium effects, including leakages).

  64. As noted in the main text, there is a trade-off between coverage and granularity on the one hand and tractability on the other. There may also be diminishing returns to capturing increasing numbers of financial entities (once entities are ordered by size), especially in sectors dominated by a small number of firms. These observations drove the Duffie (2011) proposal for a ‘10-by-10-by-10’ network-based approach to monitoring systemic risks. The approach subjects a core group of 10 systemic entities to 10 stress scenarios for which they report their gains or losses and the 10 counterparties for which these gains or losses are the greatest in magnitude. This process allows the identification of the top systemic counterparties whose distress creates the greatest losses for reporting firms in a given scenario. Identified systemic entities can be added to the list of reporting entities and the process can be iterated over time.
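  The iterated monitoring process described above can be sketched in a few lines. This is an illustrative toy, not the official Duffie (2011) methodology or any Bank process: the entity names, loss figures and the `exposures` data structure are all hypothetical, chosen only to show how flagged counterparties are folded back into the reporting set until no new systemic entities emerge.

```python
def top_counterparties(exposures, reporters, scenario, n):
    """For each reporting entity, rank counterparties by the magnitude of
    gains/losses reported under the scenario; return the union of each
    reporter's top-n counterparties."""
    flagged = set()
    for r in reporters:
        losses = exposures.get((r, scenario), {})
        ranked = sorted(losses, key=lambda c: abs(losses[c]), reverse=True)
        flagged.update(ranked[:n])
    return flagged

def monitor(exposures, core, scenarios, n, rounds=3):
    """Iterate: counterparties flagged as systemic join the reporting set,
    stopping at a fixed point where no new entities are identified."""
    reporters = set(core)
    for _ in range(rounds):
        flagged = set()
        for s in scenarios:
            flagged |= top_counterparties(exposures, reporters, s, n)
        new = flagged - reporters
        if not new:  # fixed point: no new systemic entities found
            break
        reporters |= new
    return reporters

# Toy example: bank A's stress losses concentrate on dealer D,
# which is in turn exposed to fund F (all figures hypothetical).
exposures = {
    ("A", "rates_shock"): {"D": -120.0, "B": -10.0},
    ("D", "rates_shock"): {"F": -80.0, "A": -5.0},
}
reporters = monitor(exposures, core={"A"}, scenarios=["rates_shock"], n=1)
# Successive rounds pull D, then F, into the reporting set.
```

  In the full 10-by-10-by-10 proposal, `core` would contain 10 systemic entities, `scenarios` 10 stress scenarios and `n` would be 10.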

  65. There is a considerable grey area around the classification and application of network models. Depending on the behavioural assumptions used and the way networks are embedded into the broader model, they may be classified as either semi-structural or agent-based models. Put differently, by themselves, plain vanilla network models are not useful for financial stability scenario analysis because they do not contain (dynamic) behavioural assumptions.

  66. Eg overlapping portfolio holdings and solvency-liquidity feedback loops.

  67. Functionally, these extensions align the model with Sydow et al (2024), who develop a coherent stress testing framework capturing the interactions between banks and investment funds through a comprehensive set of channels. They apply it to a stress scenario calibrated to the Covid outbreak and granular Euro Area data to analyse the resilience of the financial sector.

  68. Footnote 64 sets out more detail.

  69. For example, in theory it would be possible to map bank and insurer non-financial company exposures as in Covi and Hüser (2024) into Companies House accounting data (and into the ONS Annual Business Survey) and add some behavioural assumptions (in the spirit of agent-based modelling) to model corporate spending responses in the scenario taking into account financial amplification in the financial sector block.

  70. While a micro-founded modelling of the production function of financial entities capable of capturing operational considerations is beyond the scope of the modelling investment proposed (and the state of the art in the literature), it is relevant for the calibration (and estimation) of financial amplification mechanisms (and so might be captured in a reduced-form sense).

  71. As exemplified in Annex Table B1.A, information collected/collated as part of the mapping exercise could also inform behavioural models.

  72. Introducing the Credit Market Sentiment Index.

  73. This is also a challenge that applies to some of the systemic risk indicators discussed in Section 3.4 that do not attempt to isolate financial amplification-driven movements in financial conditions.

  74. Using data from the Bank of England’s Credit Conditions Survey, the study examines the influence of macroeconomic indicators, bank balance sheet variables (excluding capital and liquidity), and financial risk measures on lending standards. Any unexplained variation is attributed to financial amplification effects. These effects can then be linked to economic outcomes via empirical estimates of credit supply shocks.
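  Schematically (the notation is ours, for illustration only), lending standards \(LS_t\) are regressed on observable drivers, and the residual is read as financial amplification:

```latex
LS_t = \alpha + \beta' X_t + \varepsilon_t,
```

  where \(X_t\) collects the macroeconomic indicators, bank balance sheet variables and financial risk measures, and \(\varepsilon_t\) captures the unexplained variation attributed to amplification effects.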

  75. This could draw on literature examining the predictive power of measures of funding market conditions for future real activity (such as Gilchrist & Zakrajšek (2012) and Boyarchenko & Elias (2024)).

  76. Additionally, those models do not contain an explicit account of welfare or real economic outcomes across different groups of the population.

  77. For example, those modelling approaches typically do not capture the underlying frictions that give rise to a motive for policy interventions, including those that drive externalities, thereby obscuring understanding of the ultimate rationale for policy interventions. Further, those models do not include an account of how a policy reaction function systematically affects agents’ decisions via their expectations, necessary for a more complete understanding of the effects of state-contingent financial stability policy. In recognition of these deficiencies, alternative models are required for policy analysis. The flipside is that models that are better designed for policy analysis are likely to be less empirically relevant in quantifying the effects of particular shocks or events, as compared to some of the modelling approaches discussed in the risk assessment parts of this section.

  78. General equilibrium has been excluded as a criterion because all these model classes are general equilibrium in the sense that they either explicitly solve, or can be set up to solve, for prices and quantities simultaneously in a way that clears markets and respects resource constraints. Empirical realism has been excluded because it is more subjective and its assessment would depend on the dimension of empirical realism being assessed, as well as the methodology used to estimate or calibrate the model. In general, semi-structural models are likely to embody more empirical realism than, say, representative agent DSGE models, but that is not universally true across comparisons of all models in these classes.

  79. For example, because of differences in preferences (ex ante) or in shock realisations and incomplete insurance (ex post).

  80. For example, special case assumptions can be made about preferences and technology to facilitate aggregation, aggregate shocks can be excluded leaving comparative statics (steady state) analysis or ‘zero-probability’ shocks with transition paths back to steady state, or dimensions of heterogeneity can be reduced.

  81. DSGE models also feature endogenous financial cycles, but these are typically tied very closely to business cycles (except where exogenous financial shocks drive a wedge between the two cycles).

  82. It is an empirical question as to whether heterogeneous-agent DSGE or agent-based models should be preferred. In the absence of thorough empirical evidence, the two model types should perhaps be seen as complementary. They can be used to analyse similar questions but do so in different ways.

  83. In both cases, the underlying problem is that the model is unlikely to control for all state variables relevant for how the economy and financial system responded. This means that these events will influence parameter estimates to some degree in a non-structural manner.

  84. Aikman et al (2019) develop a model for assessing how the UK’s MBF system might behave under stress. It features representative agents interacting in asset, funding and derivatives markets. The authors document how the range of solvency and liquidity constraints on these agents can lead to an amplification of adverse shocks through feedback loops.

  85. As part of the proposed work on furthering understanding of macrofinancial models, it may also be desirable to draw on models of bank capital from the literature. Candidates include the DSGE model of Mendicino et al (2018), which extends the Clerc et al (2015) ‘three layers of default’ model, or the similar but simpler macro-banking model of Munoz and Smets (2024). Alternatives include the heterogeneous agent models of Monacelli and Jamilov (2020) or Coimbra and Rey (2024), both of which explicitly model heterogeneity across banks. Complementary modelling approaches, such as ABMs, could be considered for this purpose as well.

  86. This agenda would form a small part of the Bank’s response to the Bernanke review of forecasting for monetary policy making and communication at the Bank of England. Recommendation 4 of that review included recommending that the revamped forecasting framework should include ‘detailed models of the financial sector’ among other things.

  87. Past internal work on the interaction between monetary and financial stability policy includes the two-period model of Aikman et al (2023b). As noted in the main text, the HANK literature likely has a lot of potential to be developed to include financial stability policy alongside monetary policy.

  88. July 2025 Financial Stability Report.

  89. It would also be of interest to combine two or more of these scenarios. For example, a macroeconomic recessionary scenario would often go together with financial market turmoil. As noted, in practice, the ability to do that depends on the modelling technology. The modelling proposals made here would not support combinations of these scenarios. Construction of models that combine amplification mechanisms relevant to more than one of these types of scenario is left to future work.

  90. For example, the Bank has in the past run a climate exploratory scenario to explore the implications of alternative climate scenarios, designed to explore both physical and transition (to net zero) risk. Results of the 2021 Climate Biennial Exploratory Scenario.

  91. This overarching objective is also relevant for scenario design. For example, scenarios designed based on observation of historical outcomes will incorporate financial amplification to the extent that that was a feature of the data on which the scenario is built. In particular, the tail of outcomes over recent history includes the GFC, an event that was largely driven by financial amplification. This matters for the interpretation of the scenario and potentially also for the resulting amplification in the model used for quantification.

  92. Used to match the projection for credit in the VAR block with the output of the microeconometric block in the spirit of the methodology described in Caldara et al (2014).

  93. The brevity of this section relative to the preceding sections reflects that the data strategy has already been fully scoped out and so less exploration of different approaches is needed (than for thinking about financial sector mapping and modelling for example).

  94. The precise ways in which AI will prove useful for risk assessment are currently unclear. Use of AI techniques in statistical modelling requires evaluation in the same way that any statistical modelling would, including back-testing and out-of-sample forecasting where appropriate. AI is likely to be particularly useful in revealing patterns or pattern changes in large datasets (such as in the planned automated tool to monitor hedge fund exposures) as a prompt for further (human-led) investigation. It is also likely to be particularly useful in analysis of large, unstructured data like company reports that would otherwise be impossible for a human to undertake at reasonable cost. In general, some experimenting is going to be necessary to learn how to make the most of AI for financial stability analysis. Future uses may include utilising AI approaches to behavioural modelling for risk assessment as part of the suite described in Section 4, especially in the event that routine use of AI becomes more widespread among financial entities in a way that effectively governs their behaviour relevant for risk assessment.

  95. Khan & Thomas (2013) examine the effect of credit supply shocks on productivity via capital misallocation in a heterogeneous agent model, but not the optimal response to that.

  96. See Davila and Walther (2023) for an example of a paper that studies the implications of non-rational (distorted) beliefs for prudential policy. The paper shows that the implications of distorted beliefs for policy depend on whose beliefs are distorted.