Innovations in PD Modeling for IFRS 9: Extending the Vasicek Framework

May 2025
7 min read

In the rapidly shifting landscape of financial risk management, accurate estimation of default probabilities is crucial for informed decision-making.


According to the IFRS 9 standards, financial institutions are required to model the probability of default (PD) using a Point-in-Time (PiT) measurement approach, one that reflects present macroeconomic conditions. In practice, PiT PD estimates are most often obtained by converting their Through-the-Cycle (TtC) counterparts. As the Vasicek model has long stood as the industry standard for this conversion, Zanders continuously drives model enhancements through novel research. This article delves into modern adaptations of that methodology. 

This article highlights collaborative research involving 17 students from Erasmus University Rotterdam, aiming to bring greater granularity to credit risk modeling. The research was conducted by four student teams in the form of a group seminar project; in addition, one student investigated the topic as part of her master thesis. By integrating advanced statistical and machine learning techniques, the research shows how modern adaptations can redefine the traditional Vasicek framework, offering deeper insights into PD conversion methodologies. These enhancements provide flexibility and interpretability and contribute to a more extensive modeling toolkit.   

Background 

Compliance with International Financial Reporting Standard 9 (IFRS 9) requires companies to obtain PiT PD estimates, which are influenced by macroeconomic variables. Banks reporting under IFRS 9 often use the TtC counterpart as a starting point, applying various conversion techniques to obtain the PiT PD (see also our previous blog post A comparison between Survival Analysis and Migration Matrix Models). The industry-standard methodology is conversion through the Vasicek framework. The TtC PD reflects the PD irrespective of systematic factors and therefore represents the long-term average of the PD. By contrast, the PiT PD reflects the probability that a party defaults at a specific point in the macroeconomic cycle, implying that PiT PD estimates fluctuate throughout that cycle. The mathematical technique introduced by Vasicek in 1977 and formalized in 2002 ​(Vasicek, An equilibrium characterization of the term structure, 1977; Vasicek, The distribution of loan portfolio value, 2002)​ serves as the industry-standard method for performing this conversion, integrating both systematic and idiosyncratic risks. 

Understanding the Vasicek Model 

Under the Vasicek model, the PiT PD can be derived from the TtC PD with the use of the Z-factor, which represents the state of the economy. The Z-factor corresponds to the systematic factor within the Vasicek framework and is modeled as a function of macroeconomic variables. Linear regression constitutes the benchmark for modeling the Z-factor within the IFRS 9 framework due to its simplicity and the interpretability of its predictions. Formally, the Vasicek model is denoted as: 

where PD_PiT,i,t represents the PiT PD of firm i at time t. The economic state at time t is denoted as Z_t, with ρ representing the correlation between firm i's asset returns and the economic state. The Vasicek model assumes normality in asset returns and integrates both systematic and idiosyncratic risk, making it suitable for a broad range of applications.  
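
As a rough numerical sketch of this conversion (the function name and parameter values are our own, and we assume the common sign convention where a high Z means a good economy; the relation itself is the standard single-factor Vasicek formula):

```python
from math import sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal CDF and inverse CDF

def pit_pd(ttc_pd, z, rho):
    """PD_PiT = Phi((Phi^-1(PD_TtC) - sqrt(rho) * Z_t) / sqrt(1 - rho)).
    With a high Z meaning a good economy, a negative Z (downturn) pushes
    the PiT PD above its TtC level."""
    return N.cdf((N.inv_cdf(ttc_pd) - sqrt(rho) * z) / sqrt(1 - rho))

# A 2% TtC PD in a downturn (Z = -2) versus a boom (Z = +2), with rho = 0.2:
print(pit_pd(0.02, -2.0, 0.2) > 0.02 > pit_pd(0.02, 2.0, 0.2))  # → True
```
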

Despite its simplicity and theoretical consistency, the model also faces critiques for its limitations under certain conditions ​(Basson & Van Vuuren, 2023)​, such as: 

  • Simplistic Linearity Assumptions 
  • Distributional Assumptions 
  • Static Correlation Structure  

These limitations can cause inaccurate PD estimations and, in turn, poor risk management, which can be detrimental for financial institutions. The industry-standard model often struggles with exactly the extreme economic scenarios or sector-specific variations that are most informative to predict. As the IFRS 9 principles allow considerable freedom in model choice (provided the models remain explainable), these limitations have led to an ongoing exploration of enhancements. 

Extending the Vasicek Model 

To improve the Vasicek model, three different approaches were considered. The first extends the Vasicek model to a non-linear model that does not rely on linearity assumptions. Secondly, granularity is added to the Vasicek model by considering a multitude of Copula functions. Finally, the correlation between a firm's asset returns and the economic state is made time- and industry-dependent in order to relax the assumption of a static correlation structure.  

Non-linear Techniques: Enhancing Z-Factor Modeling 

The Z-factor is heavily influenced by many interconnected variables, with regulations and policies leading to increasingly complex dynamics that are difficult to model accurately. As of now, the industry-standard method to model the Z-factor is linear regression. However, one could argue that the real-world state of the economy exhibits non-linear patterns.  

To introduce this non-linearity, many methodologies were considered, including statistical models such as regularized regressions and regime-switching models. Additionally, Machine Learning (ML) techniques, ranging from Gradient Boosting to Neural Network approaches, have been proposed to better capture the intricate relationships that cannot be captured by linear models. These techniques (partly) relax assumptions on the underlying data structures and help in understanding complex patterns in the data, offering improved estimation accuracy while minimizing overfitting risks. Such models are particularly beneficial when dealing with high-dimensional data where traditional approaches tend to underfit.  
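
To convey the intuition behind regime-dependent behavior, here is a deliberately simple, self-contained toy example (entirely our own construction, not one of the models tested in this research): a two-regime threshold fit captures a kink in a synthetic Z-factor series that a single linear regression cannot.

```python
import random

def ols(xs, ys):
    """Closed-form simple linear regression; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

def mse(xs, ys, predict):
    return sum((y - predict(x)) ** 2 for x, y in zip(xs, ys)) / len(xs)

random.seed(0)
xs = [random.uniform(-2, 2) for _ in range(500)]
# The synthetic Z-factor reacts twice as strongly in the "downturn" regime (x < 0).
zs = [(2.0 if x < 0 else 1.0) * x + random.gauss(0, 0.1) for x in xs]

a, b = ols(xs, zs)                                # single linear fit
down = [(x, z) for x, z in zip(xs, zs) if x < 0]  # downturn regime
up = [(x, z) for x, z in zip(xs, zs) if x >= 0]   # expansion regime
a1, b1 = ols(*zip(*down))
a2, b2 = ols(*zip(*up))

def linear_fit(x):
    return a + b * x

def regime_fit(x):
    return (a1 + b1 * x) if x < 0 else (a2 + b2 * x)

print(mse(xs, zs, linear_fit) > mse(xs, zs, regime_fit))  # → True
```

Real regime-switching models (e.g. Markov-switching regressions) infer the regimes from the data rather than imposing a known threshold, but the gain over a single straight line comes from the same mechanism.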

Our findings indicate that Z-factor estimation accuracy can be improved significantly using models such as the regime-switching model or the long short-term memory (LSTM) neural network, both in-sample and out-of-sample. The other ML models included in this research do not show a significant increase in prediction accuracy compared to the single-factor Vasicek model. Moreover, the use of these models adds an extra layer of complexity to the modeling approach. 

As IFRS 9 regulations require model predictions to be interpretable, frameworks such as SHapley Additive exPlanations (SHAP) can be introduced as a measurement tool. Although SHAP values do not amount to full explainability, they can be used to assess feature importance and provide general insight into the identity and magnitude of the macroeconomic variables driving Z-factor predictions. The increased complexity and decreased interpretability of the modeling process mean that additional academic and/or regulatory advances are needed before ML methods can be used within IFRS 9 frameworks. One could also question whether the additional complexity introduced by ML models justifies the marginal increase in estimation accuracy. 
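
SHAP itself requires a dedicated library; as a lightweight stand-in on synthetic data, permutation importance illustrates the same kind of question, namely how much prediction error grows when one macro variable is scrambled. All names and numbers below are made up for illustration.

```python
import random

random.seed(1)
# Three candidate macro drivers; only the first two actually drive the Z-factor.
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(400)]
z = [0.8 * row[0] - 0.5 * row[1] + random.gauss(0, 0.1) for row in X]

def model(row):
    """Stand-in for any fitted Z-factor model."""
    return 0.8 * row[0] - 0.5 * row[1]

def mse(data, target):
    return sum((t - model(r)) ** 2 for r, t in zip(data, target)) / len(data)

base = mse(X, z)
importance = []
for j in range(3):
    shuffled = [row[:] for row in X]
    column = [row[j] for row in shuffled]
    random.shuffle(column)            # destroy feature j's information
    for row, value in zip(shuffled, column):
        row[j] = value
    importance.append(mse(shuffled, z) - base)  # error growth per feature

print(importance[0] > importance[1] > importance[2])  # → True
```

Unlike SHAP, this gives only a global ranking of drivers, not per-observation attributions, but it conveys the same regulatory story: which macroeconomic variables the Z-factor prediction actually relies on.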

Copula Approach: Distributional Flexibility 

Risks are often aggregated over broad sectors or asset classes without considering nuances at more granular levels. Such aggregation may overlook specific risk drivers relevant to particular firms or industries, leading to less accurate PD estimates.  

The second research direction involves using Copula-based methodologies to inject granularity into PiT PD estimations. Within the Copula approach, dependencies between random macroeconomic variables can be captured independently of their respective distributions. This allows for a more accurate description of the behavior of the system, which consists of m macroeconomic variables and the Z-factor. Moreover, each (macroeconomic) variable can be modeled by its empirical CDF, avoiding the need for parametric assumptions. The option to avoid distributional assumptions makes the Copula approach very flexible.  
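
A minimal sketch of the first step of such an approach, using made-up series and our own function names: each macro series is mapped to pseudo-observations through its empirical CDF, after which a copula (Gaussian, t, ...) would be fitted to those uniforms alone.

```python
import math
import random

def ecdf_transform(xs):
    """Map each observation to its empirical CDF value, rank / (n + 1)."""
    n = len(xs)
    order = sorted(range(n), key=lambda i: xs[i])
    u = [0.0] * n
    for rank, i in enumerate(order, start=1):
        u[i] = rank / (n + 1)
    return u

random.seed(2)
# Two dependent series with very different marginals: one roughly normal,
# one heavily skewed, both driven by a common factor.
common = [random.gauss(0, 1) for _ in range(1000)]
gdp_growth = [c + random.gauss(0, 0.5) for c in common]
unemployment = [math.exp(-c + random.gauss(0, 0.5)) for c in common]

u1 = ecdf_transform(gdp_growth)
u2 = ecdf_transform(unemployment)
# The pseudo-observations are uniform on (0, 1) regardless of the marginals;
# the dependence structure (here: strongly negative) is preserved in (u1, u2).
print(all(0.0 < u < 1.0 for u in u1 + u2))  # → True
```

The point of the decomposition is exactly what the paragraph above describes: the marginal behavior of each variable and the joint dependence can be modeled separately, without forcing everything into a joint normal distribution.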

By allowing for more flexible dependency structures, Copula models can provide a better representation of tail risks. This is particularly relevant for IFRS 9 Stage 2 loans, which comprise financial instruments that have shown a significant increase in credit risk since initial recognition. In this research, a variety of conditional Copula models are considered and tested. The conditional Copula computes the distribution of the Z-factor conditional on the m macroeconomic variables in the system. Despite challenges like the Gaussian Copula's inability to model joint extreme events effectively, alternatives such as the t-Copula show a statistically significant improvement over the benchmark Vasicek model. In particular, the Copula models significantly reduce the degree of underestimation, a crucial advantage in the context of credit risk modeling.  

In conclusion, results indicate that this approach significantly improves PD estimation compared to the benchmark Vasicek model, while interpretability and marginal flexibility stay intact. However, it does introduce a higher degree of complexity than the linear benchmark model. Hence, we consider this method a promising area of future research for PD estimation.    

Time and Industry Varying Correlation Structure 

The industry-standard Vasicek model assumes a constant correlation across industries and time periods. In reality, correlations among default events can change over time and vary across industries due to economic cycles or industry-specific shocks. This presents an opportunity to enhance the model's practical applicability by incorporating sectoral dynamics and temporal variation into its correlation parameter, resulting in the following equation: 

where the correlation parameter ρ_i,t is unique to firm i at time t. This segmentation is not limited to industries and time periods; it could also be extended across regions or size classes, depending on the specific portfolio under consideration. The correlation parameter ρ_i,t is modeled with a beta distribution with time-varying mean and is defined on the interval between 0 and 1. The mean of the beta distribution is modeled via a logit link function driven by company-specific data ​(Ferrari & Cribari-Neto, 2004)​. This method thus allows the temporal dependency to change over time, acknowledging that the underlying relationships in the data do not remain identical across time periods. Implementing varying correlations allows PD models to adapt and reflect real-world scenarios more precisely, ultimately leading to more robust credit risk predictions.  
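
An illustrative sketch of this parameterization (the coefficients, covariates, and precision value below are hypothetical; the mean/precision beta parameterization follows the beta regression literature):

```python
import math
import random

def draw_rho(x, coeffs, precision=50.0):
    """Draw rho_{i,t} in (0, 1) from Beta(mu * phi, (1 - mu) * phi),
    where mu = logistic(x . coeffs) is the logit-linked mean."""
    eta = sum(xi * ci for xi, ci in zip(x, coeffs))
    mu = 1.0 / (1.0 + math.exp(-eta))       # logit link keeps the mean in (0, 1)
    return random.betavariate(mu * precision, (1.0 - mu) * precision)

random.seed(3)
coeffs = [-1.0, 0.8]  # hypothetical intercept and covariate effect
draws = [draw_rho([1.0, 0.5], coeffs) for _ in range(5000)]
mean_rho = sum(draws) / len(draws)
# The implied mean is logistic(-1.0 + 0.8 * 0.5) = logistic(-0.6), about 0.354,
# and every draw stays inside (0, 1) as the Vasicek correlation requires.
print(0.0 < min(draws) and max(draws) < 1.0)  # → True
```

Because the beta distribution is bounded on (0, 1), the drawn correlation can be plugged directly into the Vasicek formula for each firm and period without further truncation.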

Results indicate that this approach significantly improves PD estimation accuracy compared to the benchmark Vasicek model. Moreover, this improvement is realized while the interpretability and logic of the benchmark model stay intact. We therefore consider this extension a general improvement to existing Vasicek frameworks.  

Conclusion 

In this research, we have examined several improvements to PD modeling under IFRS 9 using the industry-standard Vasicek model. The first approach extended the Vasicek framework with non-linearity through advanced statistical and machine learning models. The regime-switching model and the long short-term memory neural network were found to significantly improve Z-factor prediction accuracy. However, the increased complexity and decreased interpretability of these models raise the question of whether the gains outweigh the additional effort in practical applications. 

Secondly, a conditional copula approach was introduced to capture the dependencies between macroeconomic variables and the Z-factor. The Copula models demonstrated exceptionally good relative performance in certain industries. Overall, the t-Copula proved to be the best Copula model in terms of overall predictive accuracy, significantly outperforming the standard Vasicek model. However, introducing a Copula model does lead to a higher degree of complexity within the model framework. 

Lastly, we have incorporated a time and industry varying correlation parameter into the standard Vasicek model, thereby relaxing the static assumption implied by the original model. The use of this approach shows promising results, with PD estimation accuracy increasing significantly. This methodology is a simple yet important extension of the Vasicek framework that improves estimation accuracy while maintaining a level of simplicity and interpretability.  

To conclude, we find that various methodologies can be introduced to challenge the existing Vasicek framework. Findings indicate that a number of models improve estimation accuracy; however, in some cases the marginal increase in accuracy does not outweigh the additional effort needed to use these models in practice. The methodology we would focus on in actual use cases is the inclusion of time- and industry-varying correlations, which has shown positive results that are theoretically consistent while remaining interpretable and compliant with IFRS 9 regulations. 

By extending the Vasicek model, Zanders continues to contribute valuable insights, supporting the financial industry's shift to a more comprehensive modeling toolkit. These advancements highlight our commitment to developing solutions that meet modern accounting and regulatory standards while providing financial risk managers with enhanced tools for risk estimation.  

Are you interested in how you could leverage these methodologies to enhance your PD modelling approach? Contact Kasper Wijshoff, Kyle Gartner or Mila van den Bergh for more information. 

​​Bibliography 

​​Basson, L. J., & Van Vuuren, G. (2023). Through-the-cycle to Point-in-time Probabilities of Default Conversion: Inconsistencies in the Vasicek Approach. International Journal of Economics, 13(6), 42-52. 

​Ferrari, S., & Cribari-Neto, F. (2004). Beta regression for modelling rates and proportions. Journal of Applied Statistics, 31(7), 799-815. 

​Vasicek, O. (1977). An equilibrium characterization of the term structure. Journal of Financial Economics, 5, 177-188. 

​Vasicek, O. (2002). The distribution of loan portfolio value. Risk, 160-162.

What Banks Need to Know from the EBA’s Fourth LCR & NSFR Monitoring Report

May 2025
4 min read

On May 14th 2025, the European Banking Authority (EBA) published the fourth report on the monitoring of the liquidity coverage ratio (LCR) and net stable funding ratio (NSFR) in the European Union. The report includes an assessment and updated guidance for four topics, which are discussed in this article.


Inflows from open reverse repos 

In May 2024 the EBA stated1 that inflows from open reverse repos cannot be recognised in LCR calculations unless the call option has already been exercised, or the institution can demonstrate that the repos would be called under specific circumstances. Given the ambiguity and interpretive room in the initial statement, the EBA now provides two approaches that institutions may use to substantiate when such repos would be called. 

  • The first approach is forward-looking: banks can include the inflows if they clearly specify events whose occurrence triggers the call of the repo's option. The deadline for calling the option must also fall within 30 days. 
  • The second approach is backward-looking and allows banks to use their historic data. If banks can show that certain events have historically led to calling the option within 30 days, the inflows from the repos can be included in LCR calculations. The EBA stresses that the historical observations must have occurred during a period of stress of at least 30 days to ensure that reputational issues under such scenario are taken into account.
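
A stylized illustration of the backward-looking test (the record layout and the per-record stress flag are our own simplification of the EBA's wording):

```python
from datetime import date

# (trigger_date, call_date, observed_during_30d_stress_period) — made-up records
history = [
    (date(2020, 3, 2), date(2020, 3, 20), True),
    (date(2022, 9, 28), date(2022, 10, 15), True),
]

def supports_inflow_recognition(records):
    """True only if every historical trigger event led to the repo being
    called within 30 days, observed during a stress period of at least 30 days."""
    return all(
        (called - triggered).days <= 30 and in_stress
        for triggered, called, in_stress in records
    )

print(supports_inflow_recognition(history))  # → True
```
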

Operational deposits and excess operational deposits 

Outflows of operational deposits, defined as deposits received for the purposes of obtaining clearing, custody, cash management or other comparable services, are significantly lower than those of non-operational deposits. Moreover, operational deposits are generally less vulnerable to significant withdrawals during a period of idiosyncratic or market-wide stress. As a result, operational deposits may be treated differently from non-operational deposits in LCR calculations. EBA’s first report (2019)2 on the monitoring of the LCR implementation already provided additional guidance on identifying operational deposits. However, based on a recent survey of competent authorities, the EBA found there may be divergence in the way institutions demonstrate the existence of legal or operational limitations that make significant withdrawals within 30 calendar days unlikely. Additionally, considerable heterogeneity was observed in the modelling techniques used to identify excess operational deposits. 

To promote a level playing field, the EBA now provides further regulatory guidance. It emphasizes that institutions should have a framework in place that provides a proper, evidence-based mapping of deposits to the different deposit types recognized in the LCR regulation (i.e. stable retail deposits, non-stable retail deposits and operational deposits). It also notes that the length of the trade cycle used to identify excess operational deposits should reflect the specific characteristics of the client’s business model. Finally, prudent statistical measures should be applied when analysing historical deposit balances or payment flows. 
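
One simple, purely illustrative way such a prudent statistical measure could look, treating a low quantile of historical balances as the stable operational part (the quantile choice and data are made up, not EBA-prescribed):

```python
def operational_split(balances, quantile=0.10):
    """Treat the q-th percentile of historical balances as the stable
    operational floor; anything above it today counts as excess."""
    ordered = sorted(balances)
    idx = max(0, int(quantile * len(ordered)) - 1)
    operational = ordered[idx]
    return operational, max(0, balances[-1] - operational)

history = [100, 120, 95, 130, 110, 90, 105, 140, 115, 125]  # made-up balances
operational, excess = operational_split(history)
print(operational, excess)  # → 90 35
```

A lower quantile (or a rolling-minimum over the client's trade cycle) would be more prudent, classifying a larger share of the current balance as excess.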

Retail deposits excluded from outflows 

Retail term deposits with a maturity beyond 30 days may be exempted from outflows in LCR calculations. Institutions may exclude such deposits if they can demonstrate, on economic or contractual grounds, that the deposits will mature beyond 30 days. EBA’s first report on the monitoring of the LCR implementation (2019) already provided guidance on how to assess whether deposits mature beyond 30 days for economic reasons. Given the significant increase in term deposits in recent years, the EBA has reviewed whether this guidance remains adequate. 

Overall, competent authorities report that the 2019 guidelines are properly implemented across the industry. However, the definition of a ‘material’ penalty for early termination under the LCR still leaves room for interpretation. This may lead to wrongly classifying term deposits as having a maturity beyond 30 days. To address this, the EBA now requires that historical data show a material penalty was applied to all early redemptions of exempted retail term deposits, even during periods of rising interest rates. 

Interdependent assets and liabilities in NSFR 

Under the CRR, banks can exclude assets and liabilities from NSFR calculations if they appear on the balance sheet only because the bank serves as a pass-through unit to channel the funding from the liability into the corresponding asset.3 Often, these are derivatives from clients that are passed through to Central Counterparties (CCPs) in order to give the client access to central clearing. 

In the report, the EBA highlighted some ambiguities around the eligibility of indirect client clearing activities where affiliated institutions are used to clear derivatives. To clarify when such activities are eligible for exclusion from NSFR calculations, the EBA suggested a slight adjustment to the regulatory CRR text. This has no practical implications for the requirements around the NSFR. 

Conclusion 

The EBA’s fourth report on the monitoring of the LCR and NSFR provides additional guidance on four regulatory ambiguities. For three of the topics, all related to the LCR, there is an impact on the implementation of the liquidity metric: 

1- Banks may include inflows from open reverse repos when they demonstrate that the repos would be called within 30 days in a period of stress. This can be done by clearly stating this in the liquidity risk management policy or by showing that this has historically been the case. 

2- Banks should have proper frameworks in place to map deposits to different deposit types, should ensure that the length of the trade cycle used to identify excess operational deposits reflects the client’s business model and should ensure that prudent statistical measures are applied when analyzing historical deposit balances. 

3- Banks have to show with historical data that material penalties were applied to prematurely terminated retail term deposits before the term deposits can be excluded from LCR outflows. 

The EBA provided clear instructions for the inflows from open reverse repos and the outflows from retail term deposits, but the additional guidance for (excess) operational deposits still leaves some ambiguity. In particular, the statement that institutions should apply ‘prudent statistical measures’ leaves room for interpretation. Nevertheless, the EBA’s additional remarks indicate that competent authorities are likely to increase their focus on the modelling of operational deposits and on the calculation of LCRs in general. 

Reach out to Erik Vijlbrief or Jelle Thijssen for more information. 

  1. See Q&A 2024_7053 for more information. ↩︎
  2. EBA reports on the monitoring of the LCR implementation in the EU | European Banking Authority ↩︎
  3. The conditions set out in Article 428f of the CRR must be met. ↩︎

Ensuring Robust Controls and Checks in SAP TRM with a Kaizen Approach

May 2025
3 min read

This article explores SAP Treasury and Risk Management (TRM) controls and automation frameworks, with a focus on design and implementation challenges, as well as on the ongoing validation of the controls to ensure they are efficient, seamless and reliable.


This article is intended for finance, risk, and compliance professionals with business and system integration knowledge of SAP, but also includes contextual guidance for broader audiences.

1. Introduction

SAP Treasury and Risk Management (SAP TRM) provides a robust framework to meet the need for enhanced transparency, mitigate financial risks, and ensure regulatory compliance, but its effectiveness depends on the correct implementation of control mechanisms and automated checks. 

In Japan, the approach to financial and treasury management is influenced by principles of Kaizen (continuous improvement) and high-quality process control. The core emphasis is on minimizing errors, enhancing efficiency, and ensuring the reliability of Treasury management systems.  
The approach aligns with the necessity of seamless internal controls in SAP TRM, where even minor inconsistencies can lead to significant financial risks and losses. 

This article outlines key controls within SAP TRM, provides practical implementation steps, and explores AI-driven enhancements that improve compliance, security, and operational efficiency. 

2. SAP TRM: Key Areas Requiring Controls

SAP TRM covers a broad range of treasury functions, each requiring specific control mechanisms:

  • Transaction Management: Handling financial instruments such as FX, money markets, securities, derivatives, etc.
  • Risk Management: Identifying and mitigating market and credit risks.
  • Cash and Liquidity Management: Real-time cash positioning and cash flow forecasting, and optimization.
  • Hedge Management and Accounting: Compliance with IFRS 9 and other standards.

3. How SAP TRM Supports Real-Time Risk Control

A solid control framework requires detailed analysis, alignment, and structured implementation. Below are some key controls that should be assessed and implemented within transaction management of SAP TRM. In the picture, the areas where possible checks can be applied are highlighted:

Deal Release (Approval) Control

Ensuring proper authorization and validation before a deal is released (approved internally) is essential to prevent unauthorized or erroneous transactions from being booked, modified, and processed in SAP.

Proposed configuration objects to consider:

  • Business Rule Framework Plus (BRFplus): Define validation checks, such as ensuring that only authorized users can release deals above a certain value. 

Deal Settlement Control 

Settlement is the SAP term for deal confirmation with counterparties. The deal status changes to "Settlement" immediately after confirmation. 

Accurate deal confirmation is crucial to avoid financial mismatches or discrepancies in payment details. The settlement process is required for all external deals and may also be needed for intercompany deals. Settlement is required for every key lifecycle step of a deal: creation, NDF fixing, rollover, option exercise, etc. Settlement can be done manually or automatically upon counter-confirmation of the correspondence (MT300, MT320, MT305, or deal confirmation messages). 

Proposed configuration objects to consider:  

  • Set up automated reports to track unsettled deals within a specific timeframe (as of today or as of the current week).    
  • The SAP Fiori app “Display Treasury Alerts” is an effective solution for this. 

Ensuring Transaction Integrity: Key Checks in SAP 

Automating the matching process minimizes manual errors and improves operational efficiency, reducing the time required for this process. 

Proposed configuration objects to consider: 

  • Define a report to show all pending unmatched correspondences. This is often proposed as a control over the matching process. It can be implemented using the Correspondence Monitor or the Fiori app "Display Treasury Alerts." 

Restricted Deal Modifications 

To prevent unauthorized modifications of treasury transactions that could lead to fraudulent or erroneous results, deal changes should be strictly controlled. We do not recommend allowing deals in status Settlement (i.e. after the confirmation matching process) to be changed, especially key matching fields such as amounts, value dates, and currencies. However, modifications may be required from time to time. In such cases, they need to be controlled and may require another round of verification and counter-confirmation matching. 

Proposed configuration objects to consider:  

  • Enable Change Log Activation: Use SAP Audit Information System (AIS) to track modifications and maintain audit trails for TRM. 
  • Implement SAP TRM Deal Approval Workflow: Require validation and approval for all deal amendments and reversals. 
  • Restrict critical field modifications: Configure SAP field selection variant to lock key fields of the deals from being altered after deal approval/settlement. 
  • Enforce Segregation of Duties (SoD): Ensure different users handle deal creation, approval, settlement processing, and modifications to prevent errors or fraud. 

Deal Payment Controls Using BCM Approvers 

Ensuring that deal payments go through the correct approval process reduces financial risk and enhances security. It is important to define unusual treasury payment scenarios, such as when cash flow processing occurs outside normal business hours. 

Proposed configuration objects to consider:  

  • Implement a Payment Block Mechanism: Ensure that payments are blocked unless predefined criteria are met, or review and approval is done/granted. 
  • Automated Exception Handling: Configure Bank Communication Management (BCM) rules to flag transactions that deviate from standard payment behavior, requiring manual review before processing (e.g., payments to a new business partner or deals posted outside normal business hours). 
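
The exception-handling rule logic could be sketched as follows (purely illustrative Python with made-up field names; in practice this lives in SAP BCM rule configuration, not in custom code):

```python
def needs_manual_review(payment, known_partners, business_hours=(8, 18)):
    """Return the list of exception reasons; an empty list means the payment
    may proceed without manual review."""
    reasons = []
    if payment["partner"] not in known_partners:
        reasons.append("new business partner")
    if not business_hours[0] <= payment["posted_hour"] < business_hours[1]:
        reasons.append("posted outside business hours")
    return reasons

flags = needs_manual_review(
    {"partner": "NEWCO", "posted_hour": 22},
    known_partners={"ACME", "GLOBEX"},
)
print(flags)  # → ['new business partner', 'posted outside business hours']
```
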

4. AI-Driven Enhancements for Treasury Control

AI is transforming treasury processes by automating risk detection, optimizing forecasting, and enhancing decision-making. Key AI-powered enhancements in SAP TRM include:

  • Fraud Detection: Machine learning models in SAP Business Technology Platform (BTP) to detect suspicious transactions.
  • SAP Business Integrity Screening: AI-driven software from SAP that improves anomaly detection, risk identification, and compliance checks.
  • Automated Matching: AI-powered bots in SAP Intelligent RPA match transactions (especially when non-SWIFT correspondence is used), helping eliminate manual errors.

5. Best Practices for SAP TRM Control Optimization

The following best practices are recommended as part of the business-as-usual process:

  • Conduct regular Role & Authorization reviews: Update SAP GRC (Governance, Risk, and Compliance) settings to align with changes in the Treasury team structure.
  • Leverage SAP Fiori apps for real-time analytics and visibility into treasury processes, ensuring that no payments are stuck and that all back-to-back and mirroring deals are in place and processed on time.
  • Create and integrate AI-powered tools: Deploy SAP AI solutions to enhance risk analysis, compliance monitoring and decision-making.
  • Adopt a Kaizen approach: Continuously monitor and refine TRM controls to improve efficiency and accuracy, as technology is rapidly evolving and controls must remain valid and preventive.

6. Conclusion

Implementing strong controls in SAP TRM is critical for maintaining compliance, minimizing financial risks, and improving operational efficiency.

By leveraging SAP’s built-in functionalities, along with AI-driven enhancements, corporate treasurers can create a secure, transparent, and highly automated treasury management process. Integrating Kaizen principles and AI-driven automation turns treasury operations into a continuously improving, highly efficient Treasury Management System.

Zanders consultants are deeply involved in this crucial topic, which can be addressed as a standalone initiative to improve controls, as part of ongoing treasury support, or within an SAP TMS implementation project. We are ready and eager to help your company enhance system controls and checks in SAP TRM.

For more information, please contact Aleksei Abakumov.

Case Study: SAP TRM in Practice

A multinational, multibillion-dollar company operating across the globe, with its HQ in Tokyo, Japan, implemented SAP TRM. By using J-SOX controls (both business and system) in SAP TRM, the company managed to build world-class, secure control mechanisms that are used and applied throughout the entire organization, across all markets. The company regularly validates these controls, challenges them when they become obsolete, and adds new controls when changing external conditions require it.

Modernizing Payment Systems: An Approach to BACS AUDDIS Implementation with SAP S/4HANA

May 2025
2 min read

In today’s rapidly evolving financial landscape, organizations are seeking efficient and secure solutions for payment processing.


Our team at Zanders has been at the forefront of implementing BACS AUDDIS (Automated Direct Debit Instruction Service) with SAP S/4HANA, helping clients to streamline their direct debit management while ensuring regulatory compliance.

Why BACS AUDDIS Matters

BACS Direct Debit is the UK's electronic direct debit service, enabling organizations to collect payments directly from customers' bank accounts. This service offers numerous advantages, including predictable cash flow, reduced administrative overhead, and improved operational efficiency.

The implementation of AUDDIS enhances this process by allowing organizations to electronically send Direct Debit instructions to the banking system. This digitization eliminates paper-based processes, reduces operational costs, and improves overall processing speed.

The Advantage

Our approach to AUDDIS implementation with SAP S/4HANA provides a bespoke solution that supports AUDDIS requirements, built on top of standard SAP S/4HANA functionality. What sets our approach apart is our deep expertise in complex, mission-critical financial system implementations tailored for market-leading organizations.

The solution development leverages the full range of SAP's Direct Debit (DD) capabilities supplemented by our extensive experience in Treasury and Payment systems. This creates a seamless integration with existing SAP finance processes while ensuring full compliance with BACS standards.

The Solution

The solution developed supports the full cycle of BACS Direct Debit (DD) mandate management in SAP, from new mandate registration to mandate amendments, cancellation, and dormancy assessment.

Mandate information and amendments are automatically interfaced in XML format (PAIN.008) in compliance with BACS requirements, transmitting from the SAP Payment Hub to BACS through a BACSTEL-IP service provider.
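To make the message structure tangible, the sketch below assembles a heavily abbreviated pain.008 document carrying a mandate reference. The element names follow the ISO 20022 pain.008 schema, but the message is deliberately incomplete (no debtor, creditor, or account details) and all identifiers are placeholders; this is an illustration, not the interface specification of any client solution.

```python
# Minimal sketch of a pain.008 customer direct debit initiation message.
# Element names follow ISO 20022; identifiers and values are placeholders.
import xml.etree.ElementTree as ET

NS = "urn:iso:std:iso:20022:tech:xsd:pain.008.001.02"

def el(parent, tag, text=None, **attrs):
    node = ET.SubElement(parent, f"{{{NS}}}{tag}", attrs)
    if text is not None:
        node.text = text
    return node

def build_pain008(msg_id, mandate_id, amount):
    ET.register_namespace("", NS)
    doc = ET.Element(f"{{{NS}}}Document")
    init = ET.SubElement(doc, f"{{{NS}}}CstmrDrctDbtInitn")
    # Group header: message identification and creation timestamp
    grp = el(init, "GrpHdr")
    el(grp, "MsgId", msg_id)
    el(grp, "CreDtTm", "2025-05-01T09:00:00")
    el(grp, "NbOfTxs", "1")
    # One payment information block with a single direct debit transaction
    pmt = el(init, "PmtInf")
    el(pmt, "PmtInfId", f"{msg_id}-P1")
    el(pmt, "PmtMtd", "DD")
    tx = el(pmt, "DrctDbtTxInf")
    el(tx, "InstdAmt", amount, Ccy="GBP")
    # Mandate-related information carries the mandate reference required by BACS
    mndt = el(el(tx, "DrctDbtTx"), "MndtRltdInf")
    el(mndt, "MndtId", mandate_id)
    return ET.tostring(doc, encoding="unicode")

xml_out = build_pain008("MSG001", "MANDATE-42", "150.00")
```

In a live implementation, the SAP Payment Hub populates the full schema and the BACSTEL-IP provider handles transmission; the sketch only shows where the mandate reference sits inside the message.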

Furthermore, to ensure a smooth and streamlined direct debit collection process, the custom solution enhances SAP’s automatic payment program with additional features.

These include verifying that the customer's (or alternative payee's) mandate status is valid at the payment proposal stage, taking DD collection lead periods into account.

This ensures that only customers with an active mandate status are included in the payment run. In addition, SAP’s Data Medium Exchange (DME) engine has been extended to include the mandatory mandate reference attributes in the bank interface.
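The proposal-stage check described above can be sketched as follows. The status values, field names, and three-day lead period are illustrative assumptions for the sketch, not SAP configuration:

```python
# Hypothetical sketch: only payers with an active mandate, registered long
# enough ago to respect the DD collection lead period, enter the payment run.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Mandate:
    mandate_id: str
    status: str              # e.g. "ACTIVE", "CANCELLED", "DORMANT"
    activation_date: date

def eligible_for_run(mandate: Mandate, collection_date: date, lead_days: int = 3) -> bool:
    """True if the mandate may be collected on collection_date."""
    if mandate.status != "ACTIVE":
        return False
    # The mandate must have been lodged at least lead_days before collection
    return mandate.activation_date + timedelta(days=lead_days) <= collection_date

run_date = date(2025, 5, 12)
proposal = [
    Mandate("M1", "ACTIVE", date(2025, 5, 1)),
    Mandate("M2", "CANCELLED", date(2025, 4, 1)),
    Mandate("M3", "ACTIVE", date(2025, 5, 11)),   # still inside the lead period
]
included = [m.mandate_id for m in proposal if eligible_for_run(m, run_date)]
```

Here only M1 survives the check: M2 fails on status and M3 on the lead period.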

Our Four-Phase Implementation Approach

Every AUDDIS implementation follows four key stages:

1 - Application: The corporate entity applies to its sponsoring payment service provider for AUDDIS approval.

2 - Preparation: After approval, the corporate prepares its internal systems, ensuring software, processes, and infrastructure are in place for AUDDIS integration.

3 - Testing: The corporate submits test files to BACS to validate both compliance and successful transmission.

4 - Go-Live: The corporate transitions to using the AUDDIS service, either as a ‘live’ user from the start or by migrating first. What this final stage entails depends on whether the corporate joins AUDDIS with a new Service User Number or with an existing one; in the latter case, a migration must be completed to onboard existing customer mandates to AUDDIS.

AUDDIS implementation projects require ongoing coordination with banking partners, system integrators, and business process teams. This includes creating BACS Service User Numbers and extensively testing bank interfaces for AUDDIS and Direct Debit Instruction files. At go-live, detailed coordination is required to support penny testing.

The Results After Implementation

Our clients report significant benefits from implementing BACS AUDDIS with our support, including:

  • Enhanced management of Direct Debit (DD) mandates
  • Reduced processing costs and improved operational efficiency
  • Ensured compliance with banking and regulatory requirements
  • Accelerated payment cycles through streamlined processes

By partnering with Zanders, organizations gain a trusted advisor in BACS AUDDIS implementations. Our expertise ensures smooth transitions while maintaining business continuity throughout the process.

For more information on how we can support your organization's transition to BACS AUDDIS with SAP S/4HANA, please contact Eliane Eysackers or Nadezda Zanevskaja.

Thailand’s WHT Digital Transformation: A New Era for Corporate Treasury

May 2025
2 min read

Electronic Withholding Tax System Modernizes Payment Processes.


Thailand's e-Withholding Tax (e-WHT) system officially launched on October 27, 2020, in collaboration with 11 banks, marking a significant digital transformation with far-reaching benefits for corporate treasurers. This system was introduced to streamline the process of withholding tax by allowing taxpayers to remit taxes electronically through participating banks. The initiative aimed to enhance convenience for taxpayers and reduce the administrative burden associated with traditional tax filing methods. To encourage adoption, the government implemented measures such as reducing withholding tax rates for those utilizing the e-WHT system.

Thailand's Digital Tax Evolution

The e-WHT system aligns with Thailand's digital transformation strategy, offering corporate treasurers an opportunity to modernize tax processes and enhance cash flow management. This initiative reflects Thailand's commitment to creating a more efficient business environment through technology.

What Corporates Need to Know

Companies conducting business in Thailand are required to withhold tax on certain payments, including:

  • Service fees
  • Professional fees
  • Royalties
  • Interest
  • Dividends
  • Rent/Lease

Once the tax is withheld, businesses must submit the amount to The Revenue Department, part of the Thai Ministry of Finance. This requirement applies to both domestic and foreign companies conducting business in Thailand. 

e-WHT only applies to domestic electronic payments. Paper cheques must continue to follow the traditional paper WHT model.

Key Benefits for Treasury Departments

The adoption of e-WHT offers multiple strategic advantages:

  • Improved cash flow visibility – provides real-time status of tax-related movements, enabling more efficient WHT monitoring
  • Simplified documentation – reduces the risk of human error in tax calculations and documentation procedures
  • Accelerated processing times – shortens the tax filing and refund cycles
  • Enhanced compliance – reduces the risk of non-compliance
  • Better reconciliation – enables real-time verification of taxes withheld and credits

Implementation Insights

Our treasury technology specialists have assisted companies operating in Thailand with the successful implementation of e-WHT submissions, covering domestic payment types such as ACH and RTGS via Standard Chartered Bank (SCB).

Key insights from these implementations include the importance of establishing dedicated interfaces from SAP S/4HANA systems to the banking file gateway linked to Thailand's Revenue Department for seamless e-WHT processing.

Impact on Corporate Operations and User Experience

The introduction of e-WHT has streamlined tax filing for companies in Thailand, marking a significant shift toward digital transformation and compliance.

From a user experience perspective, companies can expect the following:

  • No need to issue or store paper-based withholding tax certificates
  • No manual month-end tax submissions to the Revenue Department
  • Reduced document handling and storage costs
  • Verification of withholding tax records by beneficiaries via the Revenue Department’s website
  • Reduced withholding tax rates for eligible businesses

Looking Ahead: Strategic Implementation Considerations

When planning for e-WHT implementation, treasurers should keep the following in mind:

  • Errors in tax rates cannot be corrected once submitted
  • Faster tax payment cycles may require adjustments in cash flow planning
  • Managing both paper and electronic WHT processes requires additional oversight and potential training for finance teams

For corporate treasurers, e-WHT represents an opportunity to modernize tax procedures, improve processing efficiency, and embrace digital transformation. Companies that properly plan the adoption of this initiative will align with Thailand's digital tax strategy while gaining operational benefits.

For more information on navigating Thailand's e-WHT implementation for your organization, please contact our Treasury Technology specialists.

Reinforcing Financial Resilience: Adopting a Holistic FRM Approach 

May 2025
3 min read

Heightened market uncertainty means that Financial Risk Management remains a key focus for multinational corporations.


In today’s rapidly evolving financial landscape, fortifying the Financial Risk Management (FRM) function remains a top priority for CFOs. Zanders has identified a growing trend among corporations striving to modernize their current FRM practices to achieve comprehensive risk visibility, advanced risk quantification, and a holistic, proactive and integrated approach to risk management. These efforts aim ultimately to boost shareholder value. The Zanders Financial Risk Management Framework offers multinational corporations a tested and structured methodology to meet these objectives. In this article, we delve into this approach and explore the benefits it can bring to your organization. 

Financial Risks as a Collateral Effect

To better understand the role of FRM in a corporate setting, it's helpful to define its scope. FRM involves identifying, measuring, and managing financial risks such as market risk (including foreign exchange, interest rate, and commodity risk), liquidity risk, and credit risk. These risks often arise as side effects of inherent business risks. Business risks originate from core activities like international trade, working capital investments, capital expenditures, and supply chain design. Shareholders typically invest in companies to gain exposure to and be compensated for these specific business risks, while often viewing financial risks as unfavorable side effects. Therefore, aligning a company’s FRM practices with shareholders’ expectations is crucial. 

Increased market volatility often triggers FRM initiatives. Additionally, several internal and external factors prompt companies to reevaluate their FRM frameworks, including: 

  • Geopolitical instability 
  • MADS events (Mergers, Acquisitions, Divestitures, Spin-offs) 
  • Organic growth, especially expansion into emerging markets 
  • Changes in the regulatory landscape 
  • Aspirations to adopt innovative, best-in-class practices 

A well-defined FRM framework is important for any organization, as it provides clarity on how financial risks are managed. By maintaining an up-to-date and well-structured FRM framework, companies can respond more proactively to changing market conditions and ensure better alignment with shareholder goals. 

Holistic vs. Integrated Financial Risk Management 

The landscape of FRM is evolving, with multinational corporations (MNCs) shifting from a silo-based approach to a more holistic strategy. In this modernized approach, the various market risks, along with their connections to liquidity risk and the company's credit profile, are analyzed comprehensively. The new standard for treasurers increasingly involves quantitative methodologies to measure the potential impact of financial risks on key financial metrics—such as earnings, net income, and financial covenants—using "at-risk" quantification measures like earnings-at-risk (EaR). These "at-risk" methodologies are increasingly used to inform strategic decisions and to reduce hedging costs. One illustration is the use of efficient frontier analysis to minimize hedging costs for a given "at-risk" level. 
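To make the "at-risk" idea concrete, the sketch below simulates FX-driven earnings under Monte Carlo scenarios and scans hedge ratios to trace a cost-versus-EaR frontier. The exposure, volatility, and hedging cost figures are illustrative assumptions, not client data or a Zanders calibration:

```python
# Stylized earnings-at-risk (EaR) frontier: simulate FX scenarios, then scan
# hedge ratios to see how much EaR each level of hedging cost removes.
import numpy as np

rng = np.random.default_rng(0)
exposure = 100.0                              # FX-denominated earnings, in millions
fx_returns = rng.normal(0.0, 0.10, 50_000)    # simulated FX return scenarios
cost_per_unit = 0.5                           # cost of a full hedge, in millions

def earnings_at_risk(hedge_ratio, confidence=0.95):
    """EaR = expected earnings minus the (1 - confidence) earnings quantile."""
    earnings = (exposure * (1 + (1 - hedge_ratio) * fx_returns)
                - cost_per_unit * hedge_ratio)
    return float(np.mean(earnings) - np.quantile(earnings, 1 - confidence))

# Trace the frontier: hedging cost versus residual earnings-at-risk
frontier = [(h, cost_per_unit * h, earnings_at_risk(h))
            for h in np.linspace(0.0, 1.0, 11)]
```

Plotting cost against EaR for each hedge ratio yields the efficient frontier from which a target "at-risk" level can be chosen at minimal cost.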

In addition to adopting a holistic FRM approach, Zanders has observed a growing trend towards greater business integration. Effective FRM requires seamless integration between the treasury function and the broader business organization. By embedding treasury within the business processes, companies can add significant value through the early identification of financial exposures and by anticipating the financial risk implications of business decisions at their inception. Tools such as Risk Adjusted Return on Sales (RAROS) exemplify this integrated approach. With RAROS, FRM becomes an integral part of the commercial process, where the financial risks of specific transactions are quantified to determine their true economic value-add. 

The shift towards holistic and integrated FRM empowers organizations to not only manage risks more effectively but also to drive value creation and enhance financial resilience. 

Zanders’ Risk Management Framework

As financial risk managers, it's important to take a holistic approach to FRM. At Zanders, we suggest that our clients use a structured 5-step FRM framework. 

The framework is applied as follows:

1- Identification: Establish the risk exposure profile by identifying all potential sources of financial exposure, classifying their likelihood and impact on the organization, and prioritizing them accordingly.

2- Measurement: Risk quantification involves a detailed quantitative analysis of the exposure profile. This includes assessing the probability of market events and quantifying their potential impact on financial parameters using techniques such as sensitivity analysis, scenario analysis, and simulation analysis (for example, cash flow at risk, value at risk).

3- Strategy & Policy: With a clear understanding of the existing risk profile, the objectives of the risk management framework are defined, considering the specific goals of various departments such as finance, operations, and procurement. A hedging strategy is then developed in alignment with these established financial risk management objectives.

4- Process & Execution: This phase follows the development of the hedging strategy, where the toolbox for hedging is defined, and roles and responsibilities are clearly allocated within the organization.

5- Monitoring & Reporting: All activities should be supported by consistent monitoring and reporting, with exception handling capabilities and risk assessments shared across departments.
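As an illustration of the Measurement step above, scenario analysis can be as simple as applying deterministic market shocks to a projected cash flow through linear sensitivities. The scenarios and sensitivity figures below are purely illustrative assumptions:

```python
# Deterministic scenario analysis: shock FX and interest rates and report the
# impact on a projected net cash flow via illustrative linear sensitivities.
base_cash_flow = 250.0    # projected net cash flow, in millions
fx_sensitivity = 1.2      # millions of cash flow per +1% move in the FX rate
ir_sensitivity = -0.8     # millions of cash flow per +100bp parallel rate shift

def scenario_cash_flow(fx_pct, ir_bp):
    """Projected cash flow after applying deterministic market shocks."""
    return base_cash_flow + fx_sensitivity * fx_pct + ir_sensitivity * (ir_bp / 100)

scenarios = {
    "base":            {"fx_pct": 0.0,   "ir_bp": 0},
    "fx_down_10pct":   {"fx_pct": -10.0, "ir_bp": 0},
    "rates_up_100bp":  {"fx_pct": 0.0,   "ir_bp": 100},
    "combined_stress": {"fx_pct": -10.0, "ir_bp": 100},
}
results = {name: scenario_cash_flow(**shock) for name, shock in scenarios.items()}
```

Simulation-based measures such as cash flow at risk extend the same logic by drawing the shocks from a distribution rather than a fixed scenario list.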

    Conclusion

    Amid today's unpredictable financial markets, organizations are placing greater emphasis on FRM. Establishing a future-proof FRM framework is necessary not only for safeguarding key financial parameters and enhancing risk visibility, but also for supporting long-term business sustainability. A robust FRM framework empowers companies to communicate more effectively with debt and equity investors, as well as other stakeholders, about their financial risks. Additionally, effective financial risk control can lead to better credit ratings, thereby improving access to liquidity under more favorable conditions.

    To achieve a comprehensive and successful transformation of all key areas in FRM, a structured and proven project approach is indispensable.

    Why Choose Zanders?

    • Comprehensive, Integrated, and Strategic Approach: Our focus is on managing financial risk to enhance shareholder value.
    • Advanced Tools: We utilize extensive proprietary benchmarking and advanced risk modeling tools.
    • Expertise and Experience: With over 500 treasury and risk professionals, we cover the full spectrum of FRM with deep subject matter expertise.
    • Proven Track Record: Excellent references and client testimonials underscore our extensive knowledge base and successful track record in Financial Risk Management.
    • Strategic Insight and Implementation Capability: We combine strategic knowledge of treasury and risk best practices with the ability to implement these solutions effectively.

    In an ever-changing financial landscape, Zanders provides the expertise and structured approach necessary to build a resilient FRM framework, ensuring your organization is well-equipped to navigate future challenges while enhancing financial stability and shareholder value. To learn more, contact us.

    Revealing New Insights Through Machine Learning: An Application in Prepayment Modelling

    April 2025
    6 min read

    Emergence of Artificial Intelligence and Machine Learning 

    The rise of ChatGPT has brought generative artificial intelligence (GenAI) into the mainstream, accelerating adoption across industries ranging from healthcare to banking. The pace of (Gen)AI adoption is outstripping that of prior technological advances, putting pressure on individuals and companies to adapt their ways of working to this new technology. While GenAI uses data to create new content, traditional AI is typically designed to perform specific tasks such as making predictions or classifications. Both approaches are built on complex machine learning (ML) models which, when applied correctly, can be highly effective even on a stand-alone basis.

    Figure 1: Model development steps

    Though ML techniques are known for their accuracy, a major challenge lies in their complexity and limited interpretability. Unlocking the full potential of ML requires not only technical expertise, but also deep domain knowledge. Asset and Liability Management (ALM) departments can benefit from ML, for example in the area of behavioral modelling. In this article, we explore the application of ML in prepayment modelling (data processing, segmentation, estimation) based on research conducted at a Dutch bank. The findings demonstrate how ML can improve the accuracy of prepayment models, leading to better cashflow forecasts and, consequently, a more accurate hedge. By building ML capabilities in this context, ALM teams can play a key role in shaping the future of behavioral modelling throughout the whole model development process.

    Prepayment Modelling and Machine Learning

    Prepayment risk is a critical concern for financial institutions, particularly in the mortgage sector, where borrowers have the option to repay (a part of) their loans earlier than contractually agreed. While prepayments can be beneficial for borrowers (allowing them to refinance at lower interest rates or reduce their debt obligations) they present several challenges for financial institutions. Uncertainty in prepayment behaviour makes it harder to predict the duration mismatch and the corresponding interest rate hedge.

    Accurately forecasting borrower behaviour through effective prepayment modelling is crucial for financial institutions seeking to manage interest rate risk. Improved forecasting enables institutions to better anticipate cash flow fluctuations and implement more robust hedging strategies. To facilitate the modelling, data is segmented based on similar prepayment characteristics. This segmentation is often based on expert judgment and extensive data analysis, accounting for factors like loan age, interest rate, loan type, and borrower characteristics. Each segment is then analyzed through tailored prepayment models, such as logistic regression or survival models.1

    ML techniques offer significant potential to enhance segmentation and estimation in prepayment modelling. In rapidly changing interest rate environments, traditional models often struggle to accurately capture borrower behaviour that deviates from conventional financial logic. In contrast, ML models can detect complex, non-linear patterns and adapt to changing behaviour, improving predictive accuracy by uncovering hidden relationships. Investigating such relationships becomes particularly relevant when borrower actions undermine traditional assumptions, as was the case in early 2021, when interest rates began to rise but prepayment rates did not decline immediately.

    Real-world application

    In collaboration with a Dutch bank, we conducted research on the application of ML in prepayment modelling within the Dutch mortgage market. The applications include data processing, segmentation, and estimation followed by an interpretation of the results with the use of ML specific interpretability metrics. Despite being constrained by limited computational power, the ML-based approaches outperformed the traditional methods, demonstrating superior predictive accuracy and stronger ability to capture complex patterns in the data. The specific applications are highlighted below.

    Data processing

    One of the first steps in model development is ensuring that the data is fit for use. A commonly applied ML technique for outlier detection is the DBSCAN algorithm. This clustering method relies on the concept of distance to identify groups of observations, flagging those that do not fit well into any cluster as potential outliers. Because DBSCAN lets the user define its key parameters (the neighbourhood radius and the minimum number of neighbours), it can be tuned to detect outliers robustly across a wide range of datasets.
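The core of the DBSCAN noise rule can be stripped down to a few lines: a point whose neighbourhood is too sparse is flagged as a potential outlier. This is a didactic simplification (a full DBSCAN implementation also propagates cluster membership from core points), and the data and parameters are illustrative:

```python
# Simplified DBSCAN-style noise flagging: a point is a potential outlier when
# fewer than min_samples points (itself included) lie within distance eps.
import numpy as np

def dbscan_noise_flags(X, eps, min_samples):
    """True where a point has fewer than min_samples neighbours within eps."""
    # Pairwise Euclidean distances (fine for small samples)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neighbour_counts = (d <= eps).sum(axis=1)   # each point counts itself
    return neighbour_counts < min_samples

rng = np.random.default_rng(1)
cluster = rng.normal(0.0, 0.5, size=(50, 2))     # dense cluster of "normal" loans
outliers = np.array([[5.0, 5.0], [-6.0, 4.0]])   # two isolated observations
X = np.vstack([cluster, outliers])
flags = dbscan_noise_flags(X, eps=1.0, min_samples=5)
```

The two isolated points are flagged while the bulk of the cluster is not; in practice the flagged observations would then be reviewed rather than dropped automatically.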

    Another example is an isolation forest algorithm. It detects outliers by randomly splitting the data and measuring how quickly a point becomes isolated. Outliers tend to be separated faster, since they share fewer similarities with the rest of the data. The model assigns an anomaly score based on how few splits were needed to isolate each point, where fewer splits suggest a higher likelihood of being an outlier. The isolation forest method is computationally efficient, performs well with large datasets, and does not require labelled data.
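The isolation idea can be illustrated with a toy implementation: split the data at random thresholds and count how many splits are needed before a point stands alone. This is a didactic sketch with synthetic data, not a full isolation forest implementation:

```python
# Toy isolation forest: outliers are isolated by random splits in fewer steps.
import numpy as np

def isolation_depth(x, X, rng, depth=0, max_depth=20):
    """Number of random splits needed to isolate observation x within sample X."""
    if len(X) <= 1 or depth >= max_depth:
        return depth
    feature = rng.integers(X.shape[1])
    lo, hi = X[:, feature].min(), X[:, feature].max()
    if lo == hi:
        return depth
    threshold = rng.uniform(lo, hi)
    # Keep only the side of the split that contains x, then recurse
    keep = X[:, feature] <= threshold if x[feature] <= threshold else X[:, feature] > threshold
    return isolation_depth(x, X[keep], rng, depth + 1, max_depth)

def anomaly_score(x, X, rng, n_trees=100):
    """Average isolation depth over many random trees (lower = more anomalous)."""
    return float(np.mean([isolation_depth(x, X, rng) for _ in range(n_trees)]))

rng = np.random.default_rng(2)
normal = rng.normal(0.0, 1.0, size=(200, 2))
X = np.vstack([normal, [[8.0, 8.0]]])            # one clear outlier
outlier_score = anomaly_score(np.array([8.0, 8.0]), X, rng)
typical_score = anomaly_score(normal[0], X, rng)
```

The outlier is isolated in markedly fewer splits on average than a typical point, which is exactly the signal the anomaly score captures.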

    Segmentation

    Following the data processing step, where outliers are identified, evaluated, and treated appropriately, the next phase in model development involves analyzing the dataset to define economically meaningful segments. ML-based clustering techniques are well-suited for deriving segments from data. It is important to note that mortgage data is generally high-dimensional and contains a large number of observations. As a result, clustering techniques must be carefully selected to ensure they can handle high-volume data efficiently within reasonable timelines. Two effective techniques for this purpose are K-means clustering and decision trees.  

    K-means clustering is an ML algorithm used to partition data into distinct segments based on similarity. Data points that are close to each other in a multi-dimensional space are grouped together, as illustrated in Figure 2. In the context of mortgage portfolio segmentation, K-means enables the grouping of loans with similar characteristics, making the segmentation process data-driven rather than based on predefined rules.  

    Figure 2: K-means concept in a 2-dimensional space. Before clustering, the dataset appears unstructured; K-means reveals three distinct segments in the data
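The two-step loop behind Figure 2 can be sketched in a few lines of Lloyd's algorithm. The blobs below are synthetic stand-ins for loan segments; a production segmentation would use a library implementation with multiple restarts and careful feature scaling:

```python
# Minimal K-means (Lloyd's algorithm): assign each point to its nearest
# centroid, recompute centroids as cluster means, repeat until stable.
import numpy as np

def kmeans(X, k, rng, n_iter=50):
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: label each point with its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=-1)
        labels = d.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

rng = np.random.default_rng(3)
# Three well-separated blobs standing in for three mortgage segments
X = np.vstack([rng.normal(c, 0.3, size=(40, 2)) for c in ([0, 0], [4, 0], [2, 4])])
labels, centroids = kmeans(X, k=3, rng=rng)
```

On loan-level data the columns would be features such as loan age and interest rate, and the resulting labels define the data-driven segments.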

    Another ML technique useful for segmentation is the decision tree. This method involves splitting the dataset based on certain variables in a way that optimizes a predefined objective. A key advantage of tree-based methods is their interpretability: it is easy to see which variables drive the splits and to assess whether those splits make economic sense. Variable importance measures, like Information Gain, help interpret the decision tree by showing how much each split reduces the entropy (uncertainty). Lower entropy means the data is more organized, allowing for clearer and more meaningful segments to be created.    

    Estimation

    Once the segments are defined, the final step involves applying prediction models to each segment. While the segments resulting from ML models can be used in traditional estimation models such as a logistic regression or a survival model, ML-based estimation models can also be used. An example of such an ML estimation technique is XGBoost. The method combines many small decision trees, each one learning from the errors of its predecessors, continuously improving the overall predictions. Applying this estimation method in combination with ML-based segments outperformed traditional methods on the dataset used. 
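The boosting mechanic can be reduced to its core: fit a shallow tree (here a one-split "stump"), take the residual errors, fit the next stump to those residuals, and sum the predictions. This is a didactic sketch of the idea behind XGBoost, not the library itself, which adds regularization, second-order gradients, and many refinements; the step-shaped prepayment pattern is illustrative:

```python
# Gradient boosting in miniature: each stump is fitted to the residuals of
# the model built so far, and predictions are the (scaled) sum of all stumps.
import numpy as np

def fit_stump(x, residual):
    """Best single split on x minimizing squared error against the residual."""
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda z: np.where(z <= t, lv, rv)

def boost(x, y, n_rounds=20, learning_rate=0.3):
    prediction = np.zeros_like(y)
    stumps = []
    for _ in range(n_rounds):
        stump = fit_stump(x, y - prediction)   # fit to current residuals
        prediction = prediction + learning_rate * stump(x)
        stumps.append(stump)
    return lambda z: learning_rate * sum(s(z) for s in stumps)

# Toy pattern: the prepayment rate steps up with loan age
x = np.arange(20, dtype=float)
y = np.where(x < 10, 0.02, 0.08)
model = boost(x, y)
fitted = model(x)
```

After a handful of rounds the summed stumps reproduce the step in the prepayment rate, illustrating how boosting builds a strong predictor from weak ones.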

    Interpretability

    Though these techniques show added value, a significant drawback of using ML models for both segmentation and estimation is their tendency to be perceived as black boxes. A lack of transparency can be problematic for financial institutions, where interpretability is crucial to ensure compliance with regulatory and internal requirements. The SHapley Additive exPlanations (SHAP) method offers an insightful way of explaining predictions made by ML models: it attributes each prediction to the model's input features, showing the average contribution of each feature across many predictions. SHAP values highlight which features are most important to the model and how they affect the model outcome. This makes SHAP a powerful tool for explaining complex models, enhancing interpretability and enabling practical use in regulated industries and decision-making processes.

    Figure 3 presents an illustrative SHAP plot, showing how different features (variables) influence the ML model’s prepayment rate predictions on a per-observation basis. The features are listed on the y-axis (eight in this illustration). Each dot represents a prepayment observation, with its position on the x-axis indicating the SHAP value, i.e., the impact of that feature on the predicted prepayment rate. Positive values on the x-axis indicate that the feature increased the prediction, while negative values indicate a decreasing effect. For example, the bottom feature shows that high feature values have a positive effect on the prepayment prediction, although the bottom two features have the smallest overall impact on the model output. This type of plot helps identify the key drivers of prepayment estimates within the model. It also supports stakeholder communication of model results, offering an additional layer of evaluation beyond the conventional in-sample and out-of-sample performance metrics.

    Figure 3: Illustrative SHAP plot
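What SHAP approximates at scale can be computed exactly for a tiny model: a feature's Shapley value is its average marginal contribution over all orderings in which features "enter" the prediction, with absent features replaced by a background value. This is a didactic sketch; real applications rely on the shap library's efficient estimators. For a linear model the result can be verified by hand:

```python
# Exact Shapley values by enumerating all feature orderings.
import math
from itertools import permutations
import numpy as np

def shapley_values(predict, x, background):
    """Exact Shapley attribution of predict(x) relative to predict(background)."""
    n = len(x)
    phi = np.zeros(n)
    for order in permutations(range(n)):
        z = background.astype(float)         # start from the background point
        prev = predict(z)
        for i in order:
            z[i] = x[i]                      # feature i joins the coalition
            cur = predict(z)
            phi[i] += cur - prev             # its marginal contribution here
            prev = cur
    return phi / math.factorial(n)

# A linear "model" makes the result easy to verify: phi = w * (x - background)
w = np.array([0.5, -1.0, 2.0])
predict = lambda z: float(w @ z)

x = np.array([1.0, 2.0, 3.0])
background = np.array([0.0, 0.0, 1.0])       # stand-in for the average observation
phi = shapley_values(predict, x, background)
```

The values sum to the difference between the prediction for this observation and the background prediction, which is the additivity property that makes SHAP plots like Figure 3 interpretable.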

    Conclusion

    ML techniques can improve prepayment modelling throughout various stages of the model development process, specifically in data processing, segmentation and estimation. By harnessing the full potential of ML in these facets, future cashflows can be estimated more precisely, resulting in a more accurate hedge. However, the trade-off between interpretability and accuracy remains an important consideration. Traditional methods offer high transparency and ease of implementation, which is particularly valuable in a heavily regulated financial sector, whereas ML models can be perceived as black boxes. Explainability techniques such as SHAP help bridge this gap, providing financial institutions with insight into ML model decisions and ensuring compliance with internal and regulatory expectations for model transparency.

    In the coming years, (Gen)AI and ML are expected to continue expanding their presence across the financial industry. This creates a growing need to explore opportunities for enhancing model performance, interpretability, and decision-making using these technologies. Beyond prepayment modelling, (Gen)AI and ML techniques are increasingly being applied in areas such as credit risk modelling, fraud detection, stress testing, and treasury analytics.

    Zanders has extensive experience in applying advanced analytics across a wide range of financial domains, e.g.:

    • Development of a standardized GenAI validation policy for foundational models (i.e., large, general-purpose AI models), ensuring responsible, explainable, and compliant use of GenAI technologies across the organization.
    • Application of ML to distinguish between stable and non-stable portions of deposit balances, supporting improved behavioural assumptions for liquidity and interest rate risk management.
    • Use of ML in credit risk to monitor the performance and stability of the production Probability of Default (PD) model, enabling early detection of model drift or degradation.
    • Deployment of ML to enhance the efficiency and effectiveness of Financial Crime Prevention, including anomaly detection, transaction monitoring, and prioritization of investigative efforts.  

    Please contact Erik Vijlbrief or Siska van Hees for more information.

    FINMA Circular on Nature-related Financial Risks 

    April 2025
    6 min read

    In December 2024, FINMA published a new circular on nature-related financial risks. Read our main take-aways.


    Introduction

    In December 2024, FINMA published a new circular on nature-related financial (NRF) risks. Our main take-aways: 

    • NRF risks comprise not only climate-related risks, but also other nature-related risks (such as loss of biodiversity, invasive species and degradation in the quality of air, water and soil). However, risks other than climate-related risks only need to be covered by 2028, whereas climate-related risks need to be covered by 2026 (for large institutions) and 2027 (for small institutions). 
    • All institutions independent of size need to perform a risk identification and materiality assessment of NRF risks. 
    • As part of the materiality assessment, banks need to perform scenario analysis. Small institutions may limit themselves to qualitative scenario analysis, whereas large banks need to perform quantitative scenario analysis. 

    In this blog we summarize the contents of the circular as applicable to banks, including the additional guidance provided by FINMA (“Erläuterungen”). 

    FINMA states the following aims for publishing the circular:  

    • Clarify expectations about the management of NRF risks by supervised institutions, based on existing laws and regulations. 
    • Support supervised institutions to adequately identify, assess, limit and monitor these risks. 
    • Be lean, principle-based, proportional, technology-neutral, and internationally aligned1

    The circular applies to both banks and insurance companies in Switzerland, including branches of foreign institutions.  

    The scope of application depends on the bank category: 

    • Category 4 and 5 banks that are very well capitalized and very liquid (‘Kleinbankenregime’) are exempted from implementation of the circular. 
    • Other category 4 and 5 banks are exempted from quantitative scenario analysis, whereas category 3 banks only need to perform quantitative scenario analysis for portfolios with heightened exposure to NRF risks. 
    • Category 3, 4 and 5 banks are exempted from consideration of material NRF risks in stress tests. 

    The circular has to be implemented in full by January 1, 2028, for all banks in scope. However, implementation for climate-related risks needs to be completed earlier, as outlined in the table below, reflecting the greater maturity in the assessment of climate-related risks:  

    FINMA emphasizes that banks will need to have completed their risk identification & materiality assessment well before the overall implementation timeline to allow sufficient time to embed material NRF risks in the overall risk management framework in line with the other parts of the circular.  

    Definitions

    FINMA distinguishes the following types and examples of NRF risks: 

    The circular does not adopt a ‘double materiality’ perspective but focuses on the potential financial impact of NRF risks on a financial institution. However, FINMA emphasizes that the impact of an institution on its environment (e.g., through the lending and investment activities) can influence the relevance of NRF risks for the institution. 

    General expectations 

    The circular emphasizes that all banks need to perform a risk identification and materiality assessment of NRF risks, independent of size and bank category. Implementation of the other parts of the circular depends on whether and which material NRF risks have been identified. 

    Governance

    As all banks need to perform a risk identification and materiality assessment of NRF risks, all banks also need to set up an appropriate governance under which this takes place, including definition and documentation of tasks, competencies and responsibilities. This needs to cover the management and supervisory boards as well as the independent control functions. For management and supervisory boards, it is specifically important to reflect material NRF risks in the business and risk strategy. The nature of the governance arrangements can reflect the size and complexity of the institution (proportionality). 

    Risk identification and materiality assessment 

    Each bank needs to identify all NRF risks that may impact the institution’s risk profile and assess their potential financial materiality. This should include: 

    • the potential strategic impact, driven by changing expectations from the public and authorities and consequential changes in markets and technologies, as well as  
    • potential legal and reputational risks through lawsuits against the bank’s counterparties or the bank itself as well as through increasing regulation 

    The risk identification needs to be performed on a gross (inherent) basis. A net (residual) risk can be considered in addition if the effectiveness of risk mitigation measures can be substantiated. 

    FINMA emphasizes that NRF risks need to be considered as risk drivers of existing risk types, rather than as new stand-alone risks. For the existing risk types, at least credit, market, liquidity, operational, compliance, legal and reputational risk need to be considered. Moreover, concentration risks driven by NRF risks need to be considered both within and across the existing risk types. For example, transition risks can simultaneously lower the creditworthiness of counterparties (credit risk), decrease the value of investment positions (market risk) and affect the reputation of the institution (reputation risk). Hence, significant exposure to sectors that are sensitive to transition risk, such as fossil fuel and transport, can lead to additional concentration risk. 

    To assess the financial materiality, institutions are expected to understand the transmission channels through which NRF risks can materialize in financial risks. The following chart provides an illustration of possible transmission channels.  

    Source: Adapted from Figure 2 in NGFS, Nature-related Financial Risks: a Conceptual Framework to guide Action by Central Banks and Supervisors, July 2024.

    In performing the materiality assessment, the institution should consider all relevant internal and external information, consider the indirect impact of NRF risks through clients and related third parties and pay attention to its exposure to sectors, regions and jurisdictions with heightened NRF risks.  

    The process and results of the risk identification and materiality assessment need to be clearly documented, including: 

    • Criteria and assumptions used, such as scenarios and time horizons considered 
    • Physical and transition risks considered and their impact on traditional risk types 
    • The applicable time horizon for the financial materiality 
    • NRF risks that were assessed as non-material 

    The risk identification and materiality assessment needs to be updated regularly, for which FINMA suggests linking it to the annual business planning process. 

    FINMA emphasizes the central role that scenario analysis plays in the materiality assessment. In this respect it expects banks to 

    • Use at least qualitative considerations of how adverse scenarios could impact the business model 
    • Consider multiple scenarios, including those with a low probability and possibly large impact 
    • Consider direct impacts and indirect impacts (e.g., on clients and suppliers and their supply chains) from NRF risks 
    • Use multiple relevant time horizons (short, medium and long term) 

    In the additional guidance, FINMA indicates that publicly available scenarios can be used (such as those from the NGFS) but they may need to be tailored to the characteristics of the institution.  

    FINMA expects all banks to perform qualitative scenario analyses, but expects quantitative scenario analysis only at larger institutions, as summarized in the following table. 

    In addition to scenario analysis, FINMA expects banks to use other quantitative approaches (such as sector exposures) to substantiate the materiality assessment. 

    Risk Management

    Material NRF risks need to be integrated in the existing processes for the management, monitoring, controlling and reporting of existing risk types. Risk tolerances for exposure to material NRF risks need to be reflected in risk indicators with warning levels and limits and include forward-looking indicators. For example, risk tolerance for transition risk can be expressed in terms of the nominal exposure and/or financed CO2 emissions in sectors that are sensitive to transition risks, including targets for a reduction in the exposure over time.  
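As an illustration of such a forward-looking risk indicator, the sketch below computes financed CO2 emissions per sector and checks them against warning levels and limits. The portfolio data, sector thresholds and the exposure-over-enterprise-value attribution factor are all hypothetical assumptions (loosely inspired by PCAF-style attribution), not FINMA prescriptions.

```python
# Illustrative transition-risk KRI: financed CO2 emissions per sector with
# warning levels and limits. All data and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Exposure:
    sector: str
    amount: float                   # nominal exposure (EUR m)
    counterparty_emissions: float   # counterparty emissions (ktCO2/yr)
    counterparty_value: float       # counterparty enterprise value (EUR m)

def financed_emissions(exposures):
    """Attribute counterparty emissions pro rata to the bank's exposure
    (simplified attribution factor: exposure / enterprise value)."""
    totals = {}
    for e in exposures:
        share = e.amount / e.counterparty_value
        totals[e.sector] = totals.get(e.sector, 0.0) + share * e.counterparty_emissions
    return totals

def kri_status(value, warning, limit):
    if value >= limit:
        return "LIMIT BREACH"
    if value >= warning:
        return "WARNING"
    return "OK"

portfolio = [
    Exposure("fossil fuel", 100.0, 500.0, 1000.0),
    Exposure("transport",    50.0, 200.0,  400.0),
    Exposure("services",     80.0,  10.0,  800.0),
]
thresholds = {"fossil fuel": (40.0, 60.0), "transport": (20.0, 30.0), "services": (5.0, 10.0)}

for sector, value in financed_emissions(portfolio).items():
    warn, lim = thresholds[sector]
    print(f"{sector}: {value:.1f} ktCO2/yr -> {kri_status(value, warn, lim)}")
```

A reduction target over time can then be expressed as a declining path for the per-sector limits.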

    To account for the large uncertainty in existing methods and data, institutions are expected to apply a margin of conservatism in the risk tolerances (“Vorsichtsprinzip”). Furthermore, they are expected to regularly evaluate, and when necessary, amend, the data, methods and processes needed to manage material NRF risks. This evaluation needs to take national and international developments into account. In the additional guidance, FINMA emphasizes the importance of describing in the existing documentation of the risk management and control processes how material NRF risks are managed, including required data such as transition plans and physical locations of counterparties as well as assumptions, approximations and estimates used when proper data is still lacking.  

    In addition, firms are expected to verify regularly whether their publicly disclosed sustainability objectives are aligned with their business strategy, risk tolerance, risk management and legal requirements, such as national or international commitments to reduce emissions and protect biodiversity. Any such misalignments would increase legal and reputational risks. To reduce the risk of such misalignments, FINMA suggests including the publicly stated objectives in the annual targets for business lines and employees and embedding them in the internal control reviews. 

    Stress testing 

    Category 1 and 2 banks with material NRF risks need to integrate these in their stress test and the internal capital adequacy assessment. To define stress tests, scenarios that have been used for the materiality assessment can be used as basis, but they may need to be broadened in scope and made more severe. 

    Expectations per risk type 

    The FINMA expectations per risk type below apply to material NRF risks only. 

    Credit risk 

    NRF risks that are assessed as material in relation to a bank’s credit risk need to be monitored throughout the full lifecycle of credit risk positions. The bank is expected to consider measures to control or reduce the exposure to these NRF risks, for example by 

    • Adjusting the lending criteria and, if applicable, acceptable collateral. As part of this, the bank can consider providing incentives to counterparties to reduce exposure to NRF risks. 
    • Adjusting client or transaction ratings. 
    • Lending restrictions, such as shorter maturities, lower lending limits and discounted asset values for clients materially exposed to NRF risks. 
    • Client engagement, encouraging sustainable business practices and enhanced external disclosures about exposures to NRF risks and transition plans. 
    • Setting thresholds or other risk mitigation techniques for activities, counterparties, sectors and regions which are not in line with the risk tolerance. For example, in highly sensitive sectors, the bank may restrict exposures to those counterparties with credible transition plans. 

    Market risk 

    Institutions with material NRF risk exposure in their market risk positions need to assess the loss potential and the impact of increased volatility in relation to the potential materialization of NRF risks. Category 1 to 3 banks with material NRF risks need to do so regularly. The market risk positions cover both trading book positions (bonds, equities, FX, commodities) and banking book investments.  

    The loss potential can be assessed using scenario analyses and stress tests that 

    • Are forward looking and also cover medium and long-term horizons 
    • Consider the impact of a sudden shock on the value of financial instruments 
    • Reflect dependencies between market variables 
    • Embed forward-looking assumptions rather than historical distributions 
    • Take into account the prices and availability of hedges under different scenarios (e.g., in a ‘disorderly transition’ scenario) 

    FINMA notes that the impact of NRF risks on managed investments can lead to business risk (lower revenues) when other institutions are better managing them for their clients. 

    Liquidity risk 

    Examples of the potential impact of NRF risks on a bank’s liquidity position are clients hoarding cash ahead of, or withdrawing funds for repairs after, a natural disaster. FINMA stipulates that banks with material exposure to NRF risks need to evaluate the impact in normal and adverse situations. Material impacts need to be controlled or mitigated.  

    To assess the potential impact, FINMA suggests that banks can consider a combined market-wide and idiosyncratic stress situation in combination with the occurrence of a natural disaster (e.g., a flooding). Not only cash outflows need to be considered, but also the impact on the value of financial instruments that are part of the stock of high-quality liquid assets (HQLA). Any material impact needs to be considered in the calibration of the necessary amount of HQLA and the management of liquidity risks. 
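A minimal sketch of such a combined assessment, under purely illustrative assumptions: a market-wide/idiosyncratic outflow stress is combined with a flood that adds disaster-driven outflows and marks down part of the HQLA stock. The figures, the haircut and the LCR-style coverage metric are all our own assumptions, not calibrated parameters.

```python
# Illustrative combined liquidity stress: HQLA is marked down and
# disaster-driven outflows are added on top of the stressed outflows.
def stressed_coverage(hqla, outflows, inflows, hqla_haircut, disaster_outflow):
    """LCR-style coverage ratio under a combined stress scenario."""
    stressed_hqla = hqla * (1.0 - hqla_haircut)
    net_outflows = max(outflows + disaster_outflow - inflows, 0.0)
    return stressed_hqla / net_outflows

# Baseline vs combined stress with a flooding event (illustrative EUR m figures)
baseline = stressed_coverage(hqla=120.0, outflows=100.0, inflows=20.0,
                             hqla_haircut=0.0, disaster_outflow=0.0)
stressed = stressed_coverage(hqla=120.0, outflows=100.0, inflows=20.0,
                             hqla_haircut=0.10, disaster_outflow=15.0)
print(f"coverage baseline: {baseline:.2f}, combined stress: {stressed:.2f}")
```

If the combined-stress coverage falls below the bank's tolerance, the required HQLA buffer would need to be recalibrated upward.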

    Operational risk 

    Banks with material NRF risks in relation to operational risks need to consider these in risk and control assessments (RCA) for operational risk and in the management of operational risks, where appropriate. This is aligned with the expectations in the FINMA Circular 2023/1 “Operational risks and resilience – Banks”. 

    Category 1 to 3 institutions that perform a systematic collection and analysis of loss data according to FINMA Circular 2023/1 need to clearly show losses and events in relation to NRF risks in relevant reports. 

    Furthermore, material NRF risks in relation to operational risks that may impair critical functions need to be documented and considered in the operational resilience of the institution as reflected in business-continuity plans and disaster-recovery plans. An example could be the unavailability of a data center due to a natural disaster such as a flooding or severe storm. 

    Compliance, legal and reputational risk 

    FINMA sees heightened compliance, legal and reputation risks in relation to NRF risks due to high expectations from society and the government for banks to contribute to achieving society’s sustainability goals. For example, the Swiss government is obliged by law to ensure that the Swiss Financial Sector contributes to a low-emission and climate-resilient development. FINMA expects banks to explicitly assess the potential impact of NRF risks on legal and compliance costs as well as the bank’s reputation. Any such material risks need to be embedded in relevant processes and controls. 

    As potential sources of reputation risks for banks, FINMA mentions the nature of investments and lending, the composition of investment portfolios, project financing, client advisory and marketing campaigns. Reputation risk will have a financial impact when clients and/or the public in general lose trust and stop doing business with the institution, leading to lower revenues. Reputation risk should therefore also be considered in new product development and go-to-market, new business initiatives and new marketing campaigns. FINMA also expects institutions to elaborate on the impact of NRF risks on reputation risk in their external disclosure.  

    Implementation 

    To prepare for the implementation of the new FINMA circular, we advise banks to take the steps as outlined below.  

    1 - Design a process to perform a risk identification and materiality assessment of NRF risks, including the universe of NRF risks to be considered

    2 - Execute the process and document the results. This needs to include:

    • Identify possible transmission channels for each combination of NRF risk and existing risk type
    • Assess potential financial materiality for all transmission channels, using internal and external expertise as well as scenario analysis. For this purpose, suitable scenarios need to be identified.

    3 - Decide on metrics (KRIs) for the material NRF risks and identify sources of required data

    4 - Agree on risk tolerances for exposure to NRF risks, including KRIs

    5 - Embed material NRF risks in internal risk management, monitoring and reporting processes for all relevant risk types (credit, market, liquidity, operational, legal, compliance and reputation risk)

    6 - Collect required data and include KRIs in relevant reports

    7 - Prepare external disclosure

    Based on our experience, completing these steps may well take up to a year, including the time needed for internal discussion and decision-making. Although this can be shortened if the bank has already taken initial steps, we advise banks to start the implementation of the circular in a timely manner.  

    If you would like to discuss our proposed approach or are looking for an experienced partner to support you in this process, please reach out to Pieter Klaassen.  

    1. Regarding international alignment, for banks FINMA specifically refers to the BCBS “Principles for the effective management and supervision of climate-related financial risks” (link) and the NGFS paper on “Nature-related risks: A conceptual framework to guide action by central banks and supervisors” (link).  ↩︎

    Transforming Treasury: Unlocking Growth for Mid-Sized Corporations 

    April 2025
    3 min read

    This article outlines key focus areas for an evolving treasury function, detailing how it can effectively support the organization’s growth trajectory and ensure long-term financial resilience.


    As mid-sized corporations expand, enhancing their Treasury function becomes essential. International growth, exposure to multiple currencies, evolving regulatory requirements, and increased working capital demands are key indicators of the need for a well-structured Treasury function. These factors heighten the risk of challenges such as limited cash visibility, foreign exchange fluctuations, and a greater need for centralization and diverse financing sources—making a solid policy framework essential. A Treasury function built around a clear Target Operating Model (TOM) is critical for managing this complexity and enabling sustainable growth. 

    Zanders has deep experience in helping companies of all sizes define and optimize their Treasury TOM—from small businesses to global multinationals. Across industries from pharmaceuticals to manufacturing, we support organizations at every stage of treasury maturity. A well-designed TOM gives you a roadmap to transform Treasury into a strategic, scalable capability. 

    Benchmarking your Treasury Performance: Know Where You Stand 

    A treasury benchmark study provides a clear and objective view of how your treasury function compares to peers and industry best practices. It helps to identify strengths, spot inefficiencies, and uncover opportunities to enhance performance and resilience. 

    This kind of assessment is especially valuable during periods of rapid growth, when your treasury must adapt to increasing complexity. It also proves to be critical during major events such as mergers, acquisitions, or shifts in the market, where quick adaptation is key. 

    Benchmarking is more than a comparison exercise. It delivers clarity on your current state and defines what’s needed to evolve. That insight becomes the foundation for targeted improvements, stronger risk management, and the development of a TOM that aligns with your organization’s goals. 

    In the sections below, we outline two key areas to consider and how benchmarking provides the insights needed to build your optimal treasury roadmap. 

    Strategic Alignment in Action: Optimizing Organizational Structures for Sustainable Growth 

    As organizations grow, it becomes increasingly important to clearly define treasury responsibilities separately from those of the broader finance function. At the same time, integrating Treasury into the interconnected structure of the Office of the CFO helps build a stronger and more resilient finance organization. A common challenge in change management arises when legacy definitions of roles and responsibilities remain unaddressed. For example, certain processes—such as reconciliation—may continue to be performed within the ERP system simply because “that’s how it’s always been done,” even if it’s no longer the most efficient approach. In some cases, the accounting team may lack the capacity to take ownership of such tasks, resulting in inefficiencies and blurred accountability. 

    A TOM review creates the opportunity to redefine treasury roles, policies, and processes. This should be revisited regularly to keep it aligned with the organization's structure and goals. 

    Key Opportunities for Policy and Procedure Optimization: 

    • Consolidation and Standardization: Implementing unified policies and procedures can enhance efficiency and consistency across the organization. This involves consolidating knowledge, standardizing processes, and ensuring that all departments operate in alignment with organizational goals. 
    • Enhancing Segregation of Duties: Reviewing and refining the organizational structure can better support the segregation of duties, reducing risks and improving operational integrity. This involves defining clear roles and responsibilities to ensure effective internal controls. 
    • Streamlining Operations: Centralizing certain activities currently performed locally can lead to streamlined operations, improved efficiency, and reduced costs. Centralization allows for standardized procedures, clearer decision-making processes, and improved resource utilization. 
    • Strategic Resource Realignment: Redirecting treasury resources from routine operational tasks to strategic initiatives can significantly enhance the treasury's value proposition. By automating non-core activities, the treasury can focus on high-impact projects and internal consulting roles, driving business growth and strategic alignment. 

    Alongside a review and assessment of the organizational structure, a deep dive into the treasury technology landscape of an organization is another key aspect to consider when transforming a treasury. 

    Technology as a tool 

    While large organizations typically have an ERP or TMS in place, many small to mid-sized companies have not yet reached that level of maturity. In these organizations, treasury functions often rely heavily on spreadsheets, which can be cumbersome and prone to error. Additionally, treasurers must log into multiple bank portals to gather essential data for reconciliation and forecasting. As the company grows, the time spent on these manual processes increases, along with the risk of mistakes and inefficiencies. 

    Recognizing the need for modernization, treasurers are increasingly focusing on upgrading treasury technology. As mid-sized corporations scale and face greater financial complexity, the reliance on outdated, custom-developed solutions and manual processes becomes more problematic. This makes the need for an enhanced treasury management system even more critical to efficiently manage financial operations and reduce operational risks. 

    Key opportunities for a Treasury Technology Upgrade: 

    • Enhanced Flexibility and Scalability: Upgrading to cloud-based treasury management systems (TMS) can provide greater flexibility, scalability, and cost efficiency compared to traditional onsite systems. This allows for easier access and management of treasury operations across different locations and teams. 
    • Advanced Reporting and Real-Time Financial Insights: Implementing a more advanced TMS can address reporting challenges by providing accurate, real-time financial insights. This capability enables treasurers to make timely and informed decisions, enhancing overall financial management and strategic planning. 
    • Streamlined Operations: Upgrading to a system with seamless integration capabilities can automate processes, reduce operational delays, and minimize errors. This integration ensures that all systems communicate effectively, fostering a more efficient and cohesive treasury environment. 
    • Efficient Cost Management and Regulatory Compliance: A modern TMS can help achieve significant cost savings and improve compliance by supporting segregation of duties and multi-level approval processes. This ensures adherence to regulatory requirements while optimizing operational expenses. 
    • Improved Cash Management: Access to real-time data, such as global cash positions, is crucial for effective decision-making and cash management. Upgrading to a system that provides these insights can enhance treasury operations by allowing for proactive management of liquidity and financial risks. 

    Organizations should carefully evaluate when and how to upgrade their TMS, recognizing signs of inadequacy, understanding the benefits, and identifying essential features. A suitable TMS will not only optimize treasury operations but also provide the necessary tools for effective financial management, maintaining a competitive edge in an evolving landscape.  

    Build a Strategic Roadmap 

    With the insights gained from benchmarking, organizations can define their long-term target state and build a tailored solution design. This becomes the foundation for a strategic roadmap that outlines the initiatives needed to elevate your treasury function. Whether the focus is on technology upgrades, process improvements, or resource realignment, each initiative should shape a treasury function that is agile, efficient, and growth-ready. A fit-for-purpose treasury is not just a support function; it is a strategic asset that underpins long-term performance and resilience. 

    If you wish to learn more about how we can support the growth of your organization through the treasury function, please contact Ernest Huizing or Vincent Casterman.

    Using Capital Attribution to Understand Your FRTB Capital Requirements

    April 2025

    As FRTB tightens the screws on capital requirements, banks must get smart about capital attribution.


    Industry surveys show that FRTB may lead to a 60% increase in regulatory market risk capital requirements, placing significant pressure on banks. As regulatory market risk capital requirements rise, it is imperative that banks employ robust techniques to effectively understand and manage the drivers of capital. However, isolating these drivers can be challenging and time-consuming, often relying on inefficient and manual techniques. Capital attribution techniques provide banks with a solution by automating the analysis and understanding of capital drivers, enhancing their efficiency and effectiveness in managing capital requirements.

    In this article, we share our insights on capital attribution techniques and use a simulated example to compare the performance of several approaches.

    The benefits of capital attribution

    FRTB capital calculations require large amounts of data which can be difficult to verify. Banks often use manual processes to find the drivers of the capital, which can be inefficient and inaccurate. Capital attribution provides a quantification of risk drivers, attributing how each sub-portfolio contributes to the total capital charge. The ability to quantify capital to various sub-portfolios is important for several reasons:

    An overview of approaches

    There are several existing capital attribution approaches that can be used. For banks to select the best approach for their individual circumstances and requirements, the following factors should be considered:

    • Full Allocation: The sum of individual capital attributions should equal the total capital requirements,
    • Accounts for Diversification: The interactions with other sub-portfolios should be accounted for,
    • Intuitive Results: The results should be easy to understand and explain.

    In Table 1, we summarize the above factors for the most common attribution methodologies and provide our insights on each methodology.

    Table 1: Comparison of common capital attribution methodologies.

    Comparison of approaches: A simulated example

    To demonstrate the different performance characteristics of each of the allocation methodologies, we present a simulated example using three sub-portfolios and VaR as the capital measure. In this example, although each of the sub-portfolios has the same distribution of P&Ls, they have different correlations:

    • Sub-portfolio B has a low positive correlation with A and a low negative correlation with C,
    • Sub-portfolios A and C are negatively correlated with each other.

    These correlations can be seen in Figure 1, which shows the simulated P&Ls for the three sub-portfolios.

    Figure 1: Simulated P&L for the three simulated sub-portfolios: A, B and C.
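The setup above can be sketched in a few lines of Python. The specific correlation values, the confidence level, and the use of normally distributed P&Ls are our own assumptions for illustration; the article's actual simulation parameters are not disclosed.

```python
# Illustrative reconstruction of the simulated example: three sub-portfolios
# (A, B, C) with identical marginal P&L distributions but different
# correlations. All parameter values are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(42)

# Order: A, B, C. B is weakly positive with A and weakly negative with C;
# A and C are negatively correlated with each other.
corr = np.array([[ 1.0,  0.2, -0.6],
                 [ 0.2,  1.0, -0.2],
                 [-0.6, -0.2,  1.0]])
pnl = rng.multivariate_normal(mean=np.zeros(3), cov=corr, size=100_000)

def var(pnl_series, alpha=0.99):
    """VaR as the alpha-quantile of the loss (negative P&L) distribution."""
    return np.quantile(-pnl_series, alpha)

standalone = [var(pnl[:, i]) for i in range(3)]
total = var(pnl.sum(axis=1))
print("Standalone VaR (A, B, C):", np.round(standalone, 2))
print("Sum of standalone VaRs  :", round(sum(standalone), 2))
print("Portfolio VaR           :", round(total, 2))
```

Because A and C partly hedge each other, the portfolio VaR comes out well below the sum of the standalone VaRs; this diversification effect is exactly what the allocation methodologies treat differently.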

    The capital allocation results are shown below in Figure 2. Each approach produces an estimate for the individual sub-portfolio capital allocations and the sum of the sub-portfolio capitals. The dotted line indicates the total capital requirement for the entire portfolio.

    Figure 2: Comparison of capital allocation methodologies for the three simulated sub-portfolios: A, B and C. The total capital requirement for the entire portfolio is given by the dotted line.

    Zanders’ verdict

    From Figure 2, we see that several methodologies do not show the expected attribution profile. For the Standalone and Scaled Standalone approaches, the capital is attributed approximately equally between the sub-portfolios. The Marginal and Scaled Marginal approaches include some estimates with negative capital attribution. In some cases, we also see that the estimate for the sum of the capital attributions does not equal the total portfolio capital.

    The Shapley method is the only method that attributes capital exactly as expected. The Euler method generates very similar results; however, it allocates almost identical capital to sub-portfolios A and C.  

    In practice, the choice of methodology depends on the number of sub-portfolios. For a small number of sub-portfolios (e.g., attribution at the level of business areas), the Shapley method yields the most intuitive and accurate results. For a large number of sub-portfolios (e.g., attribution at the trade level), the Shapley method may prove computationally expensive. As such, for FRTB calculations, we recommend the Euler method as a good compromise between accuracy and computational cost.
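A minimal sketch of how the two recommended allocations could be estimated on simulated P&Ls. The quantile-window estimator for Euler contributions, the window width and the data are our own assumptions; production implementations typically use smoothed or importance-sampled estimators.

```python
# Sketch of Euler and Shapley capital attribution for a simulation-based VaR.
# Euler contributions are estimated as the expected sub-portfolio loss in a
# small quantile window around the portfolio VaR; Shapley averages each
# sub-portfolio's marginal VaR contribution over all orderings.
from itertools import permutations
from math import factorial
import numpy as np

def var(pnl_series, alpha=0.99):
    return np.quantile(-pnl_series, alpha)

def euler_allocation(pnl, alpha=0.99, window=0.002):
    """pnl: (n_scenarios, n_subportfolios). Contributions sum approximately
    to the portfolio VaR (full allocation up to estimation noise)."""
    total_loss = -pnl.sum(axis=1)
    lo, hi = np.quantile(total_loss, [alpha - window, alpha + window])
    mask = (total_loss >= lo) & (total_loss <= hi)
    return (-pnl[mask]).mean(axis=0)  # E[loss_i | total loss near VaR]

def shapley_allocation(pnl, alpha=0.99):
    """Exact Shapley values; cost grows factorially with the number of
    sub-portfolios, hence only practical for small n."""
    n = pnl.shape[1]
    def v(coalition):                 # VaR of a coalition of sub-portfolios
        return var(pnl[:, coalition].sum(axis=1), alpha) if coalition else 0.0
    alloc = np.zeros(n)
    for order in permutations(range(n)):
        seen = []
        for i in order:
            alloc[i] += v(seen + [i]) - v(seen)
            seen.append(i)
    return alloc / factorial(n)

rng = np.random.default_rng(0)
corr = np.array([[1.0, 0.2, -0.6], [0.2, 1.0, -0.2], [-0.6, -0.2, 1.0]])
pnl = rng.multivariate_normal(np.zeros(3), corr, size=100_000)
print("Portfolio VaR:", round(var(pnl.sum(axis=1)), 3))
print("Euler        :", np.round(euler_allocation(pnl), 3))
print("Shapley      :", np.round(shapley_allocation(pnl), 3))
```

By construction the Shapley contributions sum exactly to the portfolio VaR (the marginal contributions telescope along each ordering), whereas the Euler estimate satisfies full allocation only approximately because of the finite quantile window.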

    Conclusion

    Understanding and implementing effective capital attribution methodologies is crucial for banks, particularly given the increased future capital requirements brought about by FRTB. Implementing a robust capital attribution methodology enhances a bank's overall risk management framework and supports both regulatory compliance and strategic planning. Using our simulated example, we have demonstrated that the Euler method is the most practical approach for FRTB calculations. Banks should anticipate capital attribution issues due to FRTB’s capital increases and develop reliable attribution engines to ensure future financial stability.

    For banks looking to anticipate capital attribution issues and potentially mitigate FRTB’s capital increases, Zanders can help develop reliable attribution engines to ensure future financial stability. Please contact Dilbagh Kalsi (Partner) or Robert Pullman (Senior Manager) for more information.
