Mastering Pre-Hedge Strategies: Data-Driven FX and Risk Management for 2025

A recent webinar outlined strategies for optimizing corporate treasury FX programs, addressing recent risk events, potential 2025 challenges, and the importance of strong risk management policies.
Recently, Zanders' own Sander de Vries (Director and Head of Zanders’ Financial Risk Management Advisory Practice) and Nick Gage (Senior VP: FX Solutions at Kyriba) hosted a webinar outlining strategies for optimizing corporate treasury FX programs. The duo analyzed risk-increasing events from recent years, identified potential challenges that 2025 may pose, and discussed how a strong risk management policy can address these issues.
Analyzing 2024 Events
The webinar began with a review of 2024's key financial events, particularly the Nikkei shock. During this period, the Japanese Yen experienced significant appreciation against the USD, driven by concerns over U.S. economic projections and overvalued tech stocks. This sharp rally in the JPY led to a wave of unwinding carry trades, forcing investors to sell assets, including stocks, to cover their positions. Additionally, western central banks continued their gradual reduction of interest rates throughout 2024, further influencing market dynamics. The webinar explored the correlation between these economic shocks and anticipated events, particularly their impact on EUR/USD rate fluctuations. By examining how past events shaped market volatility, risk managers can better prepare for potential future disruptions.
Coincidentally, the webinar was held on November 5, 2024, the same day as the U.S. presidential election—a key topic of discussion among the hosts. The election outcome was expected to have a significant impact on markets, increasing both volatility and complexity for corporate risk managers. Shortly after the session, another Trump victory was announced, leading to a strengthening of the USD against the EUR, even as the Federal Reserve reduced interest rates further in the following days. In addition to the election, rising geopolitical tensions and ongoing reductions in base interest rates were highlighted as potential catalysts for heightened market volatility.
Challenges and Opportunities in 2025
By reflecting on past challenges and looking ahead, risk managers can optimize their policies to better mitigate market shocks while protecting P&L statements and balance sheets. Effective risk management begins with accurately identifying and measuring exposures. Without this foundation, FX risk management efforts often fail—commonly referred to as “Garbage In, Garbage Out.” A complete, measurable picture of exposures enables risk managers to select optimal responses and allocate resources efficiently.
During the webinar, a poll revealed that gathering accurate exposure data is the biggest challenge in FX risk management. Common issues include fragmented system landscapes, incomplete data, and delays in data registration. Tools designed for FX risk planning and exposure analysis can address these gaps by verifying data accuracy and ensuring completeness.
A sound financial risk management strategy considers three core drivers:
1- External Factors: These include the ability to pass FX or commodity rate changes to customers and suppliers, as well as regulatory constraints faced by corporate treasuries.
2- Business Characteristics: Shareholder expectations, business margins (high or low), financial leverage, and debt covenants shape this driver.
3- Risk Management Parameters: These encompass a company’s risk-bearing capacity (how much risk it can absorb) and its risk appetite (how much risk it is willing to take).
By incorporating these drivers into their approach, risk managers can design more effective and strategic responses, ensuring resilience in the face of uncertainty.
Understanding these core risk drivers enables risk managers to derive a response better suited to their risk profile. To design an optimal hedging strategy, treasurers need to consider various risk responses, which include:
- Risk Acceptance
- Risk Transfer
- Minimization of Risk
- Avoidance of Risk
- Hedging of Risk
Treasury should serve as an advisory function, ensuring other departments contribute to mitigating risks across the organization. While creating an initial risk management policy is critical, continuous review is equally important to ensure the strategy delivers the desired results. To validate the effectiveness of a financial risk management (FRM) strategy, treasurers must regularly measure risks using tools like sensitivity analysis, scenario analysis, and at-risk analysis.
- Sensitivity analysis and scenario analysis evaluate how market shifts could impact the portfolio, though they do not account for the probability of these shifts.
- At-risk analysis combines the impact of changes with their likelihood, providing a more holistic view of risk. However, these models often rely on historical correlations and volatility data; during periods of sharp market movement, those historical assumptions may no longer hold, which can undermine the reliability of results.
We recommend a combined approach: use at-risk analysis to understand typical market conditions and scenario analysis to model the impact of worst-case scenarios on financial metrics.
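As a rough illustration of this combined approach, the sketch below computes a historical-simulation Value-at-Risk alongside a deterministic stress loss for a single FX exposure. All figures (the EUR/USD volatility, the USD 10 million exposure, the 5% shock) are hypothetical assumptions, not market data:

```python
import random

random.seed(42)

# Hypothetical daily EUR/USD log returns; in practice these come from market data.
returns = [random.gauss(0.0, 0.006) for _ in range(1000)]

exposure_usd = 10_000_000  # illustrative net exposure

# At-risk analysis: 1-day 95% historical-simulation Value-at-Risk,
# i.e. the loss exceeded on only 5% of simulated days.
pnl = sorted(exposure_usd * r for r in returns)
var_95 = -pnl[int(0.05 * len(pnl))]

# Scenario analysis: a deterministic worst-case shock (a 5% adverse move),
# evaluated without any probability weighting.
stress_loss = exposure_usd * 0.05

print(f"1-day 95% VaR:  {var_95:,.0f} USD")
print(f"5% stress loss: {stress_loss:,.0f} USD")
```

The VaR figure describes typical conditions; the stress loss bounds the worst case the policy must be able to absorb.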
To further enhance hedging strategies, some corporates are turning to advanced methods such as dynamic portfolio Value-at-Risk (VaR). This sophisticated approach improves risk simulation analysis by integrating constraints that maximize VaR reductions while minimizing hedging costs. It generates an efficient frontier of recommended hedges, offering the greatest risk reduction for a given cost.
Dynamic portfolio VaR requires substantial computing power to process a large number of scenarios, allowing for optimized hedging strategies that balance cost and risk reduction. With continuous backtesting, this method provides a robust framework for managing risks in volatile and complex environments, making it a valuable tool for proactive treasury teams.
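A heavily simplified sketch of the idea, assuming invented exposures, volatilities, and forward costs, might rank candidate hedges by VaR reduction per unit of hedging cost, which is the building block of such an efficient frontier:

```python
import random

random.seed(7)

def portfolio_var(exposures, hedge_ratios, n_sims=10_000, conf=0.95):
    """Monte Carlo VaR of net FX exposures after applying hedge ratios."""
    losses = []
    for _ in range(n_sims):
        loss = 0.0
        for ccy, (notional, vol) in exposures.items():
            shock = random.gauss(0.0, vol)
            net = notional * (1.0 - hedge_ratios.get(ccy, 0.0))
            loss -= net * shock
        losses.append(loss)
    losses.sort()
    return losses[int(conf * n_sims)]

# Illustrative exposures: currency -> (notional in EUR, daily volatility).
exposures = {"USD": (8_000_000, 0.006), "JPY": (3_000_000, 0.009)}
hedge_cost_bp = {"USD": 2.0, "JPY": 6.0}  # assumed forward cost, basis points

base_var = portfolio_var(exposures, {})
for ccy in exposures:
    hedged_var = portfolio_var(exposures, {ccy: 1.0})
    cost = exposures[ccy][0] * hedge_cost_bp[ccy] / 10_000
    reduction = base_var - hedged_var
    print(f"Hedge {ccy}: VaR reduction {reduction:,.0f} EUR "
          f"at cost {cost:,.0f} EUR (ratio {reduction / cost:.1f})")
```

A production system would evaluate partial hedge ratios and combinations across many more scenarios, but the cost-versus-risk-reduction trade-off above is the core of the frontier.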
Conclusion: Preparing for 2025
2024 brought many challenges for risk managers. The market uncertainty resulting from larger economic shocks, such as the U.S. election and multiple geopolitical tensions, made an efficient risk management policy more important than ever. Understanding your organization's risk appetite and risk-bearing capacity enables the selection of the optimal risk response, and methods such as dynamic portfolio VaR can take your risk management practices to the next level. 2025 looks set to pose many challenges of its own, so treasurers should stay vigilant and build robust risk management strategies to absorb any adverse shocks. How will you enhance your FX risk management approach in 2025?
You can view the recording of the webinar here. Contact us if you have any questions.
Hidden Savings: Why Payment Orchestration Deserves a Place on the Treasury Agenda

Mastering payment orchestration isn’t just a tech upgrade—it’s a strategic game-changer that can unlock savings, boost liquidity, and fuel global competitiveness in today’s complex, multi-channel marketplace.
The Right Payment Orchestration Strategy: A Critical Factor for Success
The digitalization and globalization of payment infrastructures have significantly impacted businesses in recent years. Financial departments of multinationals operating in a multi-channel environment are now required to manage a diverse range of payment methods, currencies, and interacting platforms, increasing complexity. This results in higher costs for companies, especially those with higher transaction volumes in B2C, diverse market operations, and global presence. These organizations are further required to integrate disparate payment solutions, including mobile payments, e-wallets, and physical channels. The global reach of these systems adds challenges in overseeing cross-border transactions and in managing a broad, diverse group of payment service providers.
Moreover, the regulatory environment in the payments sector is constantly evolving. New regulations such as PSD2 in Europe, stricter data protection guidelines like GDPR, and international standards like ISO 20022 are influencing how payments are processed. Organizations must continually adapt to these regulatory changes to ensure compliance while maintaining efficiency and security in their payment processes. These changes can significantly impact systems, processes, internal cost structures, and often require strategic realignment.
Under this scenario, can companies afford to leave millions in savings on the table?
Previously considered a purely operational task, payment transactions have gained strategic importance: they offer a way to unlock untapped value while navigating the complexities of global payments. Efficient payment management enhances a company's liquidity, minimizes risks, and reduces costs. By leveraging a modern payment orchestration platform, companies can centrally control their payment flows, gain real-time transparency over their financial positions, and make better decisions in cash and liquidity management. This not only contributes significantly to a company's competitiveness and financial stability, but can also become a critical success factor that directly influences liquidity, a central focus of any treasury function.
In other words, a well-designed payment orchestration strategy is far more than a technical enhancement. It is actually a vital component of a modern, efficient business strategy for corporations operating in B2C environments and those adapting to evolving business models in B2B markets.
Understanding the Use of Payment Orchestration
As customer demand for specific payment channels has grown, companies have increasingly relied on single payment service providers, especially for incoming payments, in order to offer global solutions. However, this dependence amplifies organizational risks, as any failure or disruption within the provider's system can severely impact operations and costs. The challenges and resource demands of switching providers, coupled with reliance on bespoke solutions, further entrench this dependency, leaving companies vulnerable to operational disruptions and potentially escalating indirect transaction costs over time.
Under this context, payment orchestration can mitigate these adverse effects by integrating multiple providers and using advanced technologies, like redundancy and dynamic routing, that improve success rates and minimize transaction costs, ensuring maximum efficiency for global operations.
An efficient payment orchestration strategy requires careful consideration of factors such as the organizational anchoring of the topic (with treasury in the lead), the business model’s geographic presence, customer-specific needs, and the impact of payment methods on revenue and customer acceptance. Optimizing cost structures across providers and channels demands a deep understanding of business models and their fee structures. On this point, payment orchestration expert Thomas Tittelbach, CPO of DINAPE Solutions GmbH, puts it as follows: “Corporates often struggle with the complexity of payment processing, non-transparent pricing, and provider lock-ins. With the growing success of payment orchestration offerings, Corporates get back in the driver seat for their business.”
Technical Optimization: Breaking It Down
The payment environment is undergoing significant changes, particularly in digital payment methods. Concurrently, there is a shift in demand across various channels, prompting providers to offer innovative solutions. Leading providers like Stripe, Adyen, and Computop rely on advanced platform technologies to ensure optimal payment execution. These providers simplify payment processing by consolidating multiple payment channels—such as Payment Service Providers (PSPs), gateways, and wallets—onto a single platform. This integration aims to reduce costs through intelligent strategies that consistently deliver the best solutions for the organization. To understand this better, it is crucial to examine how payment orchestration impacts cost factors:
- Service and Technological Capabilities: Features offered by providers to help reduce transaction costs.
- Fees Charged by Providers: The costs associated with the services they deliver.
- Post-Processing Services: Solutions that unify and automate data transfer to ERP and treasury systems.
Regarding service and technological capabilities, providers often align their marketing with buzzwords like “dynamic routing,” “multi-acquirer strategy,” and “interchange optimization.” Let us break these down further:
- Dynamic Routing: Dynamic routing allows payment orchestration platforms to optimize transactions by selecting the most cost-effective provider, reducing interchange and scheme fees, improving success rates, and minimizing currency conversion costs. This results in significant cost savings and increased operational efficiency for the organization.
For instance, routing a European customer's payment to a local acquirer instead of a global provider could generate savings of almost 1.5% per transaction by avoiding unnecessary cross-border fees.
- Multi-Acquirer Strategy: By integrating multiple acquirers, platforms leverage local providers for local transactions, create competition to lower fees, and ensure redundancy by rerouting payments if one provider fails.
- Authorization Optimization: Advanced platforms use machine learning to determine the optimal time for submission, select transaction currencies that issuers are most likely to approve, and account for issuer preferences in different regions.
- Interchange Optimization: By routing transactions through acquirers with the lowest interchange fees, particularly in international contexts where these fees can range from 1–2% or higher, companies can minimize costs.
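The capabilities above can be sketched in miniature. The example below routes a transaction to the cheapest eligible acquirer; the provider names, regions, and fee schedules are invented for illustration and do not reflect any actual pricing:

```python
# Hypothetical acquirer fee schedules: fixed fee in EUR plus a percentage fee.
# Provider names and rates are invented for illustration.
ACQUIRERS = {
    "local_eu": {"regions": {"EU"}, "fixed": 0.10, "pct": 0.009},
    "global":   {"regions": {"EU", "US", "APAC"}, "fixed": 0.25, "pct": 0.022},
}

def transaction_cost(acquirer, amount):
    return acquirer["fixed"] + acquirer["pct"] * amount

def route(amount, region):
    """Dynamic routing: pick the cheapest acquirer eligible for the region."""
    eligible = {n: a for n, a in ACQUIRERS.items() if region in a["regions"]}
    name = min(eligible, key=lambda n: transaction_cost(eligible[n], amount))
    return name, transaction_cost(eligible[name], amount)

name, fee = route(100.00, "EU")
global_fee = transaction_cost(ACQUIRERS["global"], 100.00)
print(f"Routed to {name}: {fee:.2f} EUR vs {global_fee:.2f} EUR via the global acquirer")
```

With these assumed rates, the routing saves roughly 1.45 EUR on a EUR 100 payment, in line with the per-transaction saving of almost 1.5% mentioned above; a multi-acquirer setup adds redundancy by rerouting when a provider fails.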
The Business Case: Numbers That Speak for Themselves
Assessing the effectiveness of technological capabilities can be complex, but comparing provider fees is much more straightforward. Transaction size and volume significantly influence overall costs, with volume-based discounts often playing a decisive role. Companies processing over one million transactions annually can negotiate processing fee reductions of up to 0.2%, resulting in substantial savings. Additionally, processing costs can be reduced by up to 0.5% through the addition of new payment methods via payment orchestration.

This case highlights the stark cost differences among providers, analyzing credit cards (EU and international), wallets, and direct debit payments. Providers such as Adyen, Stripe, Payone, Mollie, Checkout.com, Worldline, and Computop were included in the comparison, with a focus on European-based companies operating internationally.
While direct debit offers limited opportunities for price differentiation, credit cards and wallets reveal significant variations. For EU credit cards, annual fees can differ by a factor of 2.3 among providers. Wallet fees, while less variable, still show discrepancies of up to 20%. Consequently, companies could pay over EUR 1 million more in credit card fees alone when choosing a more expensive provider.
Not all fees are transparently structured, and acronyms are frequently used to describe cost components. However, the fees for each payment method can be categorized into the following structure:
| Fee Component | Description |
| --- | --- |
| Processing Fee | Fixed amount per transaction. |
| Interchange Fee | Often regulated (e.g., 0.2% for debit cards in the EU, approx. 2% for credit cards in the US). |
| Scheme Fee | Varies by card network (e.g., Visa or Mastercard). |
| Acquirer Margin | Percentage of the transaction value. |
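Putting the components together, a per-transaction cost for an EU credit-card payment might be computed as below. Every rate here is an assumption chosen to be plausible for a regulated EU environment, not a quoted price:

```python
# Illustrative per-transaction cost built from the four fee components above.
# All rates are assumptions for an EU credit-card payment, not quoted prices.
def card_fee(amount, processing=0.10, interchange=0.003,
             scheme=0.0015, acquirer_margin=0.0035):
    return processing + amount * (interchange + scheme + acquirer_margin)

amount = 80.00
fee = card_fee(amount)
print(f"Fee on a {amount:.0f} EUR payment: {fee:.2f} EUR ({fee / amount:.2%})")
```

Because three of the four components scale with transaction value while the processing fee is fixed, the effective percentage cost falls as ticket size rises, which is why volume and transaction size dominate the negotiation.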


Our analysis closely examined the official fee structures of the providers listed, and calculated costs based on specific parameters outlined in Figure 1. The scenario focuses heavily on EU credit cards and wallets, while chargeback costs, currency conversion fees, and other cost components were excluded for simplicity. Figure 2 provides an overview of the average costs across all providers. A key finding is that payment processing costs, particularly for credit cards and wallets, can accumulate significantly if the details are overlooked. For wallets specifically, fee structures show considerable variation between providers.
These differences are significant. In total, the gap between the most cost-effective and most expensive provider in our hypothetical analysis reached nearly EUR 5 million annually— a figure that one cannot neglect and would justify a closer look at payment orchestration.
Payment Orchestration as an Untapped Opportunity
The business case underscores the significant cost differences between providers and the key factors affecting payment efficiency, showing that payment orchestration can play a significant role in managing cost and liquidity. A well-executed payment orchestration strategy is therefore not only an operational improvement but also a strategic element of a company’s overall success. Consequently, selecting the right provider goes beyond operational considerations and demands a strategic evaluation of company goals, payment flows, and international presence.
By integrating payment orchestration into their cost and efficiency strategies, multinationals can achieve measurable financial and organizational benefits. With Zanders’ expertise, companies can align payment orchestration capabilities into their payment transactions’ framework to enhance operational transparency, optimize transaction approvals, achieve cost savings, and improve liquidity management. This can lead to increased profitability, enhanced liquidity, and improved compliance across different jurisdictions. Neglecting this potential means leaving substantial untapped value on the table—value that could otherwise drive growth and innovation.
If you would like to learn more about digital payment strategies in corporate treasury, please reach out to Sven Warnke or Gustavo Alves Caetano.
From Day 1 to Strategic Partner: Building a Treasury Function for a Carved-Out Business

With this article we delve into the key mechanics of creating a successful Treasury function for a carved-out business, starting with its strategic foundation – a tailored, well-defined Target Operating Model (TOM).
In our previous article 'Navigating the Financial Complexity of Carve-Outs: The Treasury Transformation Challenge and Zanders’ Expert Solution' we outlined that in a carve-out, the TOM for the new Treasury function must be tailored to the unique characteristics of the newly formed entity. This includes considerations such as operational scale, geographical footprint and risk profile, aligned with investor preferences. To achieve this, the TOM will outline how the Treasury function will operate, including its governance structure, team size and composition, service delivery model, and its evolving role within the future organization.
Hence, a clear and prioritized TOM provides a roadmap for Treasury, ensuring the function is ready for its new challenges and addressing the carved-out entity’s needs. Furthermore, it ensures that Treasury’s role as a value-adding partner is fully realized, helping to drive long-term success.
So, where do you start with right-sizing the new Treasury organization? Below, we outline the key strategic and transactional topics to consider.
Shaping the Future of Treasury: Priorities and Strategic Objectives
The TOM of a Treasury function serves three essential purposes:
- (1) Ensuring the minimum viable design is in place for seamless operations from day one.
- (2) Defining how the Treasury function will actively drive value and align with the business’s strategic goals.
- (3) Creating a roadmap for growth, laying the groundwork for Treasury’s future scalability and maturity as the business evolves.
From the outset, defining the TOM presents a key opportunity for Treasury to evolve from a purely operational support function to a strategic advisor. This transformation allows Treasury to play a pivotal role in decision-making, particularly in areas like working capital optimization, capital allocation, and future M&A activities. Moreover, it ensures that Treasury’s objectives are closely aligned with the broader goals of the organization, including growth, operational efficiency, and risk mitigation.
To achieve these objectives, a well-defined governance structure is essential. Governance provides the authority, alignment, and oversight necessary for effective risk management, cash flow optimization, and other core activities – transforming strategic aspirations into tangible results.
Governance Structure: Striking the Right Balance
Fundamentally, the goal of the Treasury function is to support the broader business in achieving financial stability, operational efficiency, and its strategic objectives. Achieving this requires a governance structure that balances centralized control with local autonomy, tailored to the organization’s global footprint. The TOM serves as a unifying framework, aligning processes not only across Treasury but also with other functions to ensure consistency and collaboration.
This can be done by deciding where Treasury activities will be performed and applying either a centralized model (managed at headquarters level) or a decentralized model (with regional or local autonomy under a defined framework). Agility makes this a balancing act: the function must meet immediate requirements while retaining the flexibility to scale as the business grows. In most carve-outs, therefore, a hybrid model is preferred initially, allowing centralized oversight of critical activities (e.g., funding and risk management) while empowering local teams to handle region-specific cash and banking needs.
To complement this structure, clearly defined roles and responsibilities are crucial. Treasury teams must have well-delineated duties across areas like cash management, risk management, funding, financial reporting, and technology operations, ensuring effective decision-making and accountability. Equally important is the design of Treasury processes. As a central pillar of the organizational landscape, Treasury processes should prioritize standardization across regions to streamline operations and mitigate risks. Finally, as a logical consequence of defined roles and well-designed processes, the new TOM offers an opportunity to clearly define and implement process and data interfaces between Treasury and other corporate functions, such as accounting and procurement, fostering seamless collaboration and enhancing overall efficiency.
Unveiling the Future: Day 1 Readiness and the Launch of a Bold Treasury Operating Model
As the governance structure is established, the next critical milestone in the carve-out process is ensuring readiness for Day 1 - when the responsibility officially shifts from the parent company to the newly independent entity. A well-designed TOM provides not only the governance framework but also the operational infrastructure needed for a seamless transition. This includes setting up the foundational capabilities that will allow the Treasury to operate without disruption, from basic cash visibility and banking relationships to staffing and technology setups.
- (1) Cash Visibility and Control with basic structures in place to ensure real-time visibility into cash positions across bank accounts, usually brought about by establishing processes for daily cash reconciliations, payment approvals and funding allocations.
- (2) Banking Relationships with the core banking partners onboarded and the account structures in place to manage inflows, outflows, and currency requirements. This may also include transferring the existing banking arrangements from the parent company or establishing new core banks.
- (3) Technology Setup comprising a basic Treasury Management System (TMS) or interim solution in place to support key processes.
- (4) An appropriate level of Staffing and Expertise with a lean but skilled Treasury team in place to handle immediate requirements, with contingency plans for scaling as the business grows.
Underpinning these areas is the formalisation of a Treasury Policy and Procedures including essentials such as policies for cash management, payment processing and risk management to ensure compliance and mitigation of operational risks.
A Roadmap for achieving Treasury Maturity
While the minimum viable design outlines Day 1 functionality, the TOM also provides a clear pathway for scaling and maturing the Treasury function as the business evolves. Viewing the TOM as a strategic concept enables the business to set a target trajectory from the outset and to identify and prioritise areas for long-term maturity. Designing a well-balanced function at this stage, one of appropriate complexity but with scope to grow, is a critical tactical decision, which Zanders supports by factoring in development trends using our proprietary Treasury Risk Maturity Model.
Following the carve-out, as the new entity stabilises, there will naturally be a move towards greater centralization and standardization of core Treasury functions, a transition that can be smoothed by addressing it in the TOM. In turn, centralization streamlines processes such as cash pooling, intercompany funding, and hedging activities, improving efficiency and reducing costs.
To support these optimization efforts after the transaction, an agile TOM enables Treasury both to scale operations seamlessly in response to business growth or market changes and to standardise by introducing consistent policies and processes. For example, these could cover cash management, risk mitigation, and compliance across all regions to enhance control and visibility across the function.
Ultimately, implementing a well-defined TOM from the outset is beneficial on both a functional and a wider level. Treating Treasury as an intrinsic need of the business, and therefore as a strategic function rather than a siloed, complementary one, strengthens the broader finance department’s contribution to the company’s strategic moves.
Strategic evolution to Treasury excellence
Building a Treasury function for a carved-out entity is a complex yet rewarding process that requires strategic planning across all Treasury and finance areas. A clear TOM not only ensures operational readiness from Day 1 but also provides a robust framework for Treasury’s evolution. By defining how the Treasury function will serve the business, establishing a minimum design for immediate needs, and outlining a roadmap for maturity, the TOM ensures alignment with the carved-out entity’s goals. Over time, this approach positions Treasury as a strategic enabler, driving financial stability, operational efficiency, and long-term growth.
Zanders’ experts for Treasury M&A can help you in assessing and designing fit-for-purpose TOM models and considering the relevant aspects of their implementation, thereby realizing the specific benefits identified in the design phase. We bring extensive experience in M&A transactions and will effectively navigate you through the process of establishing a new Treasury function. Reach out to discuss your case with us.
In the next edition of this series, we look at implementing effective Cash and Liquidity Management practices within the newly carved-out entity, key areas of focus and the risks involved.
If you would like to learn more about how to build a Treasury function for a Carved-Out Business - please reach out to our Director Stephan Plein.
Redefining Credit Portfolio Strategies: Balancing Risk & Reward in a Volatile Economy

This article delves into a three-step approach to portfolio optimization by harnessing the power of advanced data analytics and state-of-the-art quantitative models and tools.
In today's dynamic economic landscape, optimizing portfolio composition to fortify against challenges such as inflation, slower growth, and geopolitical tensions is ever more paramount. These factors can significantly influence consumer behavior and impact loan performance. Navigating this uncertain environment demands banks adeptly strike a delicate balance between managing credit risk and profitability.
Why does managing your risk reward matter?
Quantitative techniques are essential for effectively optimizing your portfolio’s risk-reward profile, an aspect that is often managed with inefficient approaches.
Existing models and procedures across the credit lifecycle, especially those relating to loan origination and account management, may not be optimized to accommodate current macro-economic challenges.

Figure 1: Credit lifecycle.
Current challenges facing banks
Some of the key challenges banks face when balancing credit risk and profitability include:

Our approach to optimizing your risk reward profile
Our optimization approach consists of a holistic three-step diagnosis of your current practices, to support your strategy and encourage alignment across business units and processes.
The initial step of the process involves understanding your current portfolio(s) by using a variety of segmentation methodologies and metrics. The second step implements the necessary changes once your primary target populations have been identified. This may include reassessing your models and strategies across the loan origination and account management processes. Finally, a new state-of-the-art Early Warning System (EWS) can be deployed to identify emerging risks and take pro-active action where necessary.

A closer look at redefining your target populations
With the proliferation of advanced data analytics, banks are now better positioned to identify profitable, low-risk segments. Machine Learning (ML) methodologies such as k-means clustering, neural networks, and Natural Language Processing (NLP) enable effective customer grouping, behavior forecasting, and market sentiment analysis.
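As a minimal sketch of the clustering idea, the example below runs a small k-means implementation on two synthetic customer features (average balance and utilisation); the data, features, and cluster count are illustrative assumptions, and a real segmentation would use a library implementation over many more features:

```python
import random

random.seed(1)

# Synthetic customer features: (avg monthly balance in EUR thousands, utilisation %).
customers = (
    [(random.gauss(25, 4), random.gauss(20, 5)) for _ in range(50)]   # affluent, low utilisation
    + [(random.gauss(5, 2), random.gauss(80, 8)) for _ in range(50)]  # stretched, high utilisation
)

def kmeans(points, k=2, iters=20):
    """Plain k-means: assign each point to its nearest centroid, then recentre."""
    centroids = random.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])))
            clusters[nearest].append(p)
        centroids = [tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

centroids, clusters = kmeans(customers)
for c, members in zip(centroids, clusters):
    print(f"Centroid (balance {c[0]:.1f}k EUR, utilisation {c[1]:.0f}%): {len(members)} customers")
```

The recovered centroids characterise each segment, which can then be assessed for risk and profitability before setting acquisition and pricing strategies.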
Risk-based pricing remains critical for acquisition strategies, assessing segment sensitivity to different pricing strategies, to maximize revenue and reduce credit losses.

Figure 2: The impact on earnings throughout the credit lifecycle driven by redefining the target populations and applying different pricing strategies.
In our simplified example, based on the RAROC metric applied to an unsecured loans portfolio, we take a 2-step approach:
1- Identify target populations by comparing RAROC across different combinations of credit scores and debt-to-income (DTI) ratios. This helps identify the most capital efficient segments to target.
2- Assess the sensitivity of RAROC to different pricing strategies to find the optimal price points to maximize profit over a select period - in this scenario we use a 5-year time horizon.
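The first step can be illustrated with a toy RAROC calculation; the segment definitions and all figures below are hypothetical:

```python
# Simplified RAROC: risk-adjusted return over allocated capital.
# Segment definitions and all figures (EUR millions) are hypothetical.
def raroc(income, expected_loss, costs, capital):
    return (income - expected_loss - costs) / capital

# Segments keyed by (credit score band, debt-to-income band).
segments = {
    ("high score", "low DTI"):  dict(income=12.0, expected_loss=1.0, costs=3.0, capital=40.0),
    ("high score", "high DTI"): dict(income=15.0, expected_loss=3.5, costs=3.5, capital=55.0),
    ("low score",  "low DTI"):  dict(income=18.0, expected_loss=7.0, costs=4.0, capital=70.0),
}

# Step 1: rank score/DTI combinations by RAROC to find capital-efficient targets.
for seg, p in sorted(segments.items(), key=lambda kv: -raroc(**kv[1])):
    print(f"{seg}: RAROC {raroc(**p):.1%}")

# Step 2 would re-run this with income adjusted for each candidate pricing
# strategy and compare cumulative profit over the chosen 5-year horizon.
```

Ranking by RAROC rather than raw income is what surfaces the segments where capital works hardest, even when their headline revenue is lower.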

Figure 3: The top table showcases the current portfolio mix and performance, while the bottom table illustrates the effects of adjusting the pricing and acquisition strategy. By redefining the target populations and changing the pricing strategy, it is possible to reallocate capital to the most profitable segments whilst remaining within credit risk appetite. For example, 60% of current lending goes to a mix of low- to high-RAROC segments, but under the proposed strategy, 70% of total capital is allocated to the highest-RAROC segments.
Uncovering risks and seizing opportunities
The current state of Early Warning Systems
Many organizations rely on regulatory models and standard risk triggers (e.g., number of customers 30 days past due, NPL ratio, etc.) to set their EWS thresholds. Whilst this may be a good starting point, traditional models and tools often miss timely deteriorations and valuable opportunities, as they typically use limited and/or outdated data features.
Target state of Early Warning Systems
Leveraging timely and relevant data, combined with next-generation AI and machine learning techniques, enables early identification of customer deterioration, resulting in prompt intervention and significantly lower impairment costs and NPL ratios.
Furthermore, an effective EWS framework empowers your organization to spot new growth areas, capitalize on cross-selling opportunities, and enhance existing strategies, driving significant benefits to your P&L.
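A minimal sketch of the idea, with hypothetical trigger names and thresholds: a modern EWS combines a traditional lagging trigger (days past due) with a timelier signal (here, a sustained drop in account inflows), so deterioration can be flagged before arrears ever appear.

```python
def early_warning(days_past_due, monthly_inflows, dpd_limit=30, drop_limit=0.4):
    """Return the list of warning signals that fire for one customer.

    days_past_due   -- current arrears in days (traditional, lagging trigger)
    monthly_inflows -- recent account inflows, oldest first (timely signal)
    """
    signals = []
    if days_past_due >= dpd_limit:
        signals.append("arrears")  # fires only after deterioration has occurred
    if len(monthly_inflows) >= 2 and monthly_inflows[0] > 0:
        drop = 1 - monthly_inflows[-1] / monthly_inflows[0]
        if drop >= drop_limit:
            signals.append("income_drop")  # can fire before any arrears exist
    return signals
```

For a customer not yet in arrears whose inflows fell from 3,000 to 1,500, `early_warning(0, [3000, 2800, 1500])` flags `income_drop` while a purely arrears-based trigger would stay silent.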


Figure 4: By updating the early warning triggers using new, timely data and advanced techniques, detection of customer deterioration can be greatly improved, enabling firms to proactively support clients and strengthen their financial position.
Discover the benefits of optimizing your portfolios
Discover the benefits of optimizing your portfolios' risk-reward profile using our comprehensive approach as we turn today's challenges into tomorrow's advantages. These benefits include:

Conclusion
In today's rapidly evolving market, the need for sophisticated credit risk portfolio management is ever more critical. With our comprehensive approach, banks are empowered to not merely weather economic uncertainties, but to thrive within them by striking the optimal risk-reward balance. Through leveraging advanced data analytics and deploying quantitative tools and models, we help institutions strategically position themselves for sustainable growth, and comply with increasing regulatory demands especially with the advent of Basel IV. Contact us to turn today’s challenges into tomorrow’s opportunities.
For more information on this topic, contact Martijn de Groot (Partner) or Paolo Vareschi (Director).
Optimizing Liquidity Management with SAP: Cash Flow Analyzer and Short-Term Cash Positioning

Mastering liquidity management is key to business success. With SAP’s Cash Flow Analyzer and Short-Term Cash Positioning apps, companies can streamline financial planning and make smarter, data-driven decisions.
Effective liquidity management is essential for businesses of all sizes, yet achieving it is often challenging. Many organizations face difficulties due to fragmented data, inconsistent reporting, and the complexity of managing cash flows across different time horizons. These challenges are amplified in companies with complex cash flow structures, where tailored configurations and precise tracking are crucial for accurate planning and decision-making.
SAP offers two tools to support liquidity management: the Cash Flow Analyzer and the Short-Term Cash Positioning apps. This article examines how these tools address different aspects of liquidity management, helping businesses navigate their financial challenges and make informed decisions.
Exploring the Cash Flow Analyzer
The Cash Flow Analyzer app offers a versatile approach to liquidity management. It enables companies to screen and analyze cash flows with filters, drill-down options, and flexible time horizons—whether for medium or long-term planning.
A key strength of this app lies in its integration across SAP modules like FI (Finance), MM (Materials Management), SD (Sales and Distribution), and TRM (Treasury and Risk Management). This ensures real-time updates through SAP’s One Exposure functionality, giving companies a single source of truth for cash flow data.
SAP continues to enhance this app with updates, such as the Liquidity Calculation: G/L Classification, available in the public cloud. This feature simplifies configurations by replacing the older Define Default Liquidity Items for G/L Accounts while also offering a migration option for existing setups. With this new feature, the document-chain-stopping functionality of the predecessor version is replaced by G/L account classification. You can also see the prioritization logic behind the liquidity item assignment when multiple derivations exist for the same G/L account. Although currently available only to public cloud users, SAP is expected to bring these enhancements to private cloud and on-premise systems in the future.
Query-Based Liquidity Derivations
For companies with complex cash flow structures, query-based derivations allow for tailored liquidity item configurations in addition to defining default liquidity items for G/L accounts. This feature lets you define rules for assigning liquidity items based on your unique needs. Conditions in a query can draw on different data sources, each captured by a different query origin, to derive the correct liquidity item.
These queries seamlessly integrate data from various SAP modules, such as Finance, Treasury, and Sales, ensuring comprehensive coverage of liquidity-related processes. Additionally, hierarchical rules can be applied to manage overlapping conditions, providing clear prioritization for liquidity item assignments. By accommodating both actual and forecasted cash flows, query-based derivations support precise tracking and planning across different time horizons. Furthermore, the approach enhances transparency by offering visibility into the logic behind liquidity item assignments, enabling finance teams to validate and adapt their configurations effectively.
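The prioritization logic can be pictured as a rule table evaluated in priority order. The sketch below is a plain-Python illustration of that logic only; the account numbers, origins, and liquidity item names are hypothetical, and this is not SAP's actual query configuration or API.

```python
# Each rule: (priority, condition, liquidity_item); a lower priority number wins.
RULES = [
    (1, lambda f: f["gl_account"] == "113100" and f["origin"] == "TRM", "LQ_FX_SETTLEMENT"),
    (2, lambda f: f["gl_account"] == "113100", "LQ_BANK_TRANSFER"),
    (9, lambda f: True, "LQ_UNCLASSIFIED"),  # catch-all default
]

def derive_liquidity_item(flow):
    """Return the liquidity item of the highest-priority matching rule."""
    for _, condition, item in sorted(RULES, key=lambda r: r[0]):
        if condition(flow):
            return item
```

Returning the matched priority alongside the item would give the kind of transparency into assignment logic described above.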
Cash Flow Analyzer in Action
The app's flexibility doesn't stop at data collection. It allows users to create custom layouts and views based on filters like liquidity items, company codes, bank accounts, or cash pools. By defining hierarchies, businesses can drill down into the details of their liquidity positions, analyze trends, and make data-driven decisions. Below is a sample screenshot from a demo system:

As shown in the screenshot above, the upper red rectangle highlights where you can set or remove a variety of filters as needed. The lower red rectangle marks another element that can be added to or removed from your layout. To see more details about cash flows, click the gear button and add or remove fields.
Why Use Short-Term Cash Positioning?
With the Cash Flow Analyzer’s extensive capabilities, you may wonder why SAP offers a separate app for short-term cash management. The answer lies in its focus.
Short-Term Cash Positioning (STCP) is an SAP Fiori app designed to give treasury teams a focused view of their immediate cash position. Unlike broader tools like the Cash Flow Analyzer, STCP homes in on short-term liquidity, offering a straightforward way to manage cash inflows and outflows over a defined horizon—usually within a few days or weeks.
At its core, STCP organizes and presents cash flow data based on planning levels and cash position profiles. These profiles help group bank accounts or cash pools, providing treasury teams with a clear snapshot of their short-term liquidity. With intuitive visuals and fast data processing, the app simplifies decision-making in fast-paced environments. While the Cash Flow Analyzer provides a broad, flexible view, the Short-Term Cash Positioning (STCP) app is purpose-built for immediate liquidity analysis.
Here are some reasons to consider using STCP:
1- Speed and Simplicity: Designed for quick, straightforward analysis, the STCP app helps companies handle large volumes of cash flow data without getting bogged down by complexity.
2- Focus on Planning Levels: This app emphasizes planning levels to categorize cash flows for short-term financial planning.
3- Customizable Cash Position Profiles: Users can define hierarchies based on bank account master data, existing bank account hierarchies, or cash pools. These profiles allow for real-time monitoring of cash flows within defined groupings.
4- Clear Visualization: By focusing on shorter time horizons, STCP provides an actionable snapshot of a company’s cash position. This can be helpful for businesses with frequent cash inflows and outflows.
5- Efficient Decision-Making: The STCP app offers a focused approach, helping decision-makers access relevant information to manage liquidity effectively while avoiding unnecessary complexities.
For businesses balancing long-term planning with immediate liquidity needs, the Short-Term Cash Positioning app can complement the Cash Flow Analyzer. Together, they provide a comprehensive set of tools that address various aspects of liquidity management. However, businesses should carefully assess their specific requirements to determine the most effective way to use these tools, ensuring they align with their operational priorities and evolving needs.
A Holistic Approach to Liquidity Management
The Cash Flow Analyzer provides broader liquidity insights, while the Short-Term Cash Positioning app focuses on immediate needs. Together, these tools can help businesses address liquidity challenges across different time horizons, supporting better visibility and informed decision-making in both the short and long term. Since these two tools have distinct capabilities, a thorough evaluation of these differences should be conducted to ensure the chosen solution aligns with the organization’s specific needs and priorities.
If you would like to learn more about the Cash Flow Analyzer and Short-Term Cash Positioning apps - or explore how to customize these tools for your business needs - please reach out to our Senior Consultant Merve Korukcu.
Strategic Insights on S/4HANA Treasury Innovations and Migration Options

On Thursday, November 14th, SAP Netherlands and Zanders hosted a roundtable focused on upgrading to S/4HANA. Nineteen participants representing nine companies actively engaged in the discussions. This article focuses on the specifics of those discussions.
Exploring S/4HANA Functionalities
The roundtable session started with a presentation by SAP on some of the new S/4HANA functionalities. New functionalities in the areas of Cash Management, Financial Risk Management, Working Capital Management, and Payments were presented and discussed. In the area of Cash Management, the main enhancements can be found in the management of bank relationships, cash operations, cash positioning, and liquidity forecasting and planning. These enhancements provide greater visibility into bank accounts and cash positions, a more controlled liquidity planning process across the organization, increased automation, and better execution of working capital strategies. In Financial Risk Management, the discussion highlighted S/4HANA's support for smart trading processes, built-in market data integration, and more advanced on-the-fly analysis capabilities, all providing companies with a more touchless, automated, straight-through risk management process. The session also covered Working Capital enhancements, including a presentation on the Taulia solution offered by SAP, which provides insights into supporting Payables and Receivables Financing. Finally, the session explored innovations in the Payments area, such as payment verification against sanction lists, format mapping tools, the SAP Digital Payments Add-on, and automated corporate-to-bank cloud connectivity.
Migration Strategies: Getting to S/4HANA
While the potential of S/4HANA was impressive, the focus shifted to migration strategies. Zanders presented various options for transitioning from an ECC setup to an S/4HANA environment, sparking a lively discussion. Four use cases were defined, reflecting the diverse architectural setups in companies. These setups include:
- An integrated architecture, where the SAP Treasury solution is embedded within the SAP ERP system
- A treasury sidecar approach, where the SAP Treasury solution operates on a separate box and needs to integrate with the SAP ERP system box
- A Treasury, Cash & Banking sidecar approach
- Leveraging Treasury on an S/4HANA Central Finance box
The discussion also covered two key migration strategies: the brownfield approach and the greenfield approach. In a brownfield approach, the existing system setup is technically upgraded to the new version, allowing companies to implement S/4HANA enhancements incrementally. In contrast, a greenfield approach involves building a new system from scratch. While companies can reuse elements of their ECC-based SAP Treasury implementation, starting fresh allows them to fully leverage S/4HANA’s standard functionalities without legacy constraints. However, the greenfield approach requires careful planning for data migration and testing, as legacy data must be transferred to the new environment.
Decoupling Treasury: The Sidecar Approach
The greenfield approach also raised the question of whether treasury activities should migrate to S/4HANA first using a sidecar system. This would involve decoupling treasury from the integrated ECC setup and transitioning to a dedicated S/4HANA sidecar system. This approach allows treasury to access new S/4HANA functionalities ahead of the rest of the organization, which can be beneficial if immediate enhancements are required. However, this setup comes with challenges, including increased system maintenance complexity, additional costs, and the need to establish new interfaces.
Companies need to weigh the benefits of an early treasury migration against these potential drawbacks as part of their overall S/4HANA strategy. With this consideration in mind, participants reflected on the broader lessons from companies already using S/4HANA.
Lessons from Early Adopters
Companies that have already migrated to S/4HANA emphasized two critical planning areas: testing and training. Extensive testing—ideally automated—should be prioritized, especially for diverse payment processes. Similarly, training is essential to ensure effective change management, reducing potential issues after migration.
These insights highlight the importance of preparation in achieving a smooth migration. As organizations transition to S/4HANA, another important consideration is the potential impact on the roles and responsibilities within treasury teams.
Impact on Treasury Roles
Participants discussed whether S/4HANA would alter roles and responsibilities within treasury departments. The consensus was that significant changes are unlikely, particularly in a brownfield approach. Even in a greenfield approach, roles and responsibilities are expected to remain largely unchanged.
Conclusion
The roundtable highlighted the significant value S/4HANA brings to treasury operations, particularly through enhanced functionalities in Cash Management, Financial Risk Management, Working Capital Management, and Payments.
Participants discussed the pros and cons of brownfield and greenfield migration strategies, with insights into the sidecar approach for treasury as a potential transitional strategy. Early adopters emphasized the critical importance of thorough testing and training for a successful migration, while noting that treasury roles and responsibilities are unlikely to see major changes.
If you would like to hear more about the details of the discussion, please reach out to Laura Koekkoek, Partner at Zanders, at l.koekkoek@zandersgroup.com.
The Benefits of Exposure Attribution in Counterparty Credit Risk

In an increasingly complex regulatory landscape, effective management of counterparty credit risk is crucial for maintaining financial stability and regulatory compliance.
Accurately attributing changes in counterparty credit exposures is essential for understanding risk profiles and making informed decisions. However, traditional approaches for exposure attribution often pose significant challenges, including labor-intensive manual processes, calculation uncertainties, and incomplete analyses.
In this article, we discuss the issues with existing exposure attribution techniques and explore Zanders’ automated approach, which reduces workloads and enhances the accuracy and comprehensiveness of the attribution.
Our approach to attributing changes in counterparty credit exposures
The attribution of daily exposure changes in counterparty credit risk often presents challenges that strain the resources of credit risk managers and quantitative analysts. To tackle this issue, Zanders has developed an attribution methodology that efficiently automates the attribution process, improving the efficiency, reactivity and coverage of exposure attribution.
Challenges in Exposure Attribution
Credit risk managers monitor the evolution of exposures over time to manage counterparty credit risk against the bank's risk appetite and limits. This frequently requires rapid analysis to attribute changes in exposures, which presents several challenges:

Zanders’ approach: an automated approach to exposure attribution
Our methodology resolves these problems with an analytics layer that interfaces with the risk engine to accelerate and automate the daily exposure attribution process. The results can also be accessed and explored via an interactive web portal, providing risk managers and senior management with the tools they need to rapidly analyze and understand their risk.

Key features and benefits of our approach
Zanders’ approach provides multiple improvements to the exposure attribution process. This reduces the workloads of key risk teams and increases risk coverage without additional overheads. Below, we describe the benefits of each of the main features of our approach.

Zanders Recommends
An automated attribution of exposures empowers bank teams to better understand and handle their counterparty credit risk. To make the best use of automated attribution techniques, Zanders recommends that banks:
- Increase risk scope: The increased efficiency of attribution should be used to provide a more comprehensive and granular coverage of the exposures of counterparties, sectors and regions.
- Reduce quant utilization: Risk managers should use automated dashboards and analytics to perform their own exposure investigations, reducing the workload of quantitative risk teams.
- Augment decision making: Risk managers should utilize dashboards and analytics to ensure they make more timely and informed decisions.
- Proactive monitoring: Automated reports and monitoring should be reviewed regularly to ensure risks are tackled in a proactive manner.
- Increase information transfer: Dashboards should be made available across teams to ensure that information is shared in a transparent, consistent and more timely manner.
Conclusion
The effective management of counterparty credit risk is a critical task for banks and financial institutions. However, the traditional approach of manual exposure attribution often results in inefficient processes, calculation uncertainties, and incomplete analyses. Zanders' innovative methodology for automating exposure attribution offers a comprehensive solution to these challenges and provides banks with a robust framework to navigate the complexities of exposure attribution. The approach is highly effective at improving the speed, coverage, and accuracy of exposure attribution, supporting risk managers and senior management to make informed and timely decisions.
For more information about how Zanders can support you with exposure attribution, please contact Dilbagh Kalsi (Partner) or Mark Baber (Senior Manager).
Converging on resilience: Integrating CCR, XVA, and real-time risk management

In a world where the Fundamental Review of the Trading Book (FRTB) commands much attention, it’s easy for counterparty credit risk (CCR) to slip under the radar.
However, CCR remains an essential element in banking risk management, particularly as it converges with valuation adjustments. These changes reflect growing regulatory expectations, which were further amplified by recent cases such as Archegos. Furthermore, regulatory focus seems to be shifting, particularly in the U.S., away from the Internal Model Method (IMM) and toward standardised approaches. This article provides strategic insights for senior executives navigating the evolving CCR framework and its regulatory landscape.
Evolving trends in CCR and XVA
Counterparty credit risk (CCR) has evolved significantly, with banks now adopting a closely integrated approach with valuation adjustments (XVA) — particularly Credit Valuation Adjustment (CVA), Funding Valuation Adjustment (FVA), and Capital Valuation Adjustment (KVA) — to fully account for risk and costs in trade pricing. This trend towards blending XVA into CCR has been driven by the desire for more accurate pricing and capital decisions that reflect the true risk profile of the underlying instruments and positions.
In addition, recent years have seen a marked increase in the use of collateral and initial margin as mitigants for CCR. While this approach is essential for managing credit exposures, it simultaneously shifts a portion of the risk profile into contingent market and liquidity risks, which, in turn, introduces requirements for real-time monitoring and enhanced data capabilities to capture both the credit and liquidity dimensions of CCR. Ultimately, this introduces additional risks and modelling challenges with respect to wrong way risk and clearing counterparty risk.
As banks continue to invest in advanced XVA models and supporting technologies, senior executives must ensure that systems are equipped to adapt to these new risk characteristics, as well as to meet growing regulatory scrutiny around collateral management and liquidity resilience.
The Internal Model Method (IMM) vs. SA-CCR
In terms of calculating CCR, approaches based on IMM and SA-CCR provide divergent paths. On one hand, IMM allows banks to tailor models to specific risks, potentially leading to capital efficiencies. SA-CCR, on the other hand, offers a standardised approach that’s straightforward yet conservative. Regulatory trends indicate a shift toward SA-CCR, especially in the U.S., where reliance on IMM is diminishing.
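At the top level, SA-CCR's standardised yet conservative character is visible in its aggregation formula, EAD = alpha x (RC + PFE) with a supervisory alpha of 1.4. The sketch below shows only this final aggregation step, taking the replacement cost and the PFE add-on as given; the full standard derives the PFE from asset-class add-ons and a multiplier, which is omitted here for brevity.

```python
ALPHA = 1.4  # supervisory alpha factor prescribed by SA-CCR

def sa_ccr_ead(replacement_cost, pfe_addon):
    """Top-level SA-CCR aggregation: EAD = alpha * (RC + PFE).

    replacement_cost -- current exposure of the netting set (floored at zero)
    pfe_addon        -- potential future exposure add-on, taken as given here
    """
    return ALPHA * (max(replacement_cost, 0.0) + pfe_addon)
```

Because alpha scales the whole exposure, even a netting set with zero current exposure carries an EAD 40% above its PFE add-on, which illustrates the conservatism banks weigh against IMM's model-based tailoring.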
As banks shift towards SA-CCR for regulatory capital while IMM is increasingly used for internal purposes, senior leaders may need to re-evaluate whether separate calibrations for CVA and IMM are warranted, or whether CVA data can inform IMM processes as well.
Regulatory focus on CCR: Real-time monitoring, stress testing, and resilience
Real-time monitoring and stress testing are taking centre stage following increased regulatory focus on resilience. Evolving guidelines, such as those from the Bank for International Settlements (BIS), emphasise a need for efficiency and convergence between trading and risk management systems. This means that banks must incorporate real-time risk data and dynamic monitoring to proactively manage CCR exposures and respond to changes in a timely manner.
CVA hedging and regulatory treatment under IMM
CVA hedging aims to mitigate counterparty credit spread volatility, which affects portfolio credit risk. However, current regulations limit offsetting CVA hedges against CCR exposures under IMM. This regulatory separation of capital for CVA and CCR leads to some inefficiencies, as institutions can’t fully leverage hedges to reduce overall exposure.
Ongoing BIS discussions suggest potential reforms for recognising CVA hedges within CCR frameworks, offering a chance for more dynamic risk management. Additionally, banks are exploring CCR capital management through LGD reductions using third-party financial guarantees, potentially allowing for more efficient capital use. For executives, tracking these regulatory developments could reveal opportunities for more comprehensive and capital-efficient approaches to CCR.
Leveraging advanced analytics and data integration for CCR
Emerging technologies in data analytics, artificial intelligence (AI), and scenario analysis are revolutionising CCR. Real-time data analytics provide insights into counterparty exposures but typically come at significant computational costs: high-performance computing can help mitigate this, and, if coupled with AI, enable predictive modelling and early warning systems. For senior leaders, integrating data from risk, finance, and treasury can optimise CCR insights and streamline decision-making, making risk management more responsive and aligned with compliance.
By leveraging advanced analytics, banks can respond proactively to potential CCR threats, particularly in scenarios where early intervention is critical. These technologies equip executives with the tools to not only mitigate CCR but also enhance overall risk and capital management strategies.
Strategic considerations for senior executives: Capital efficiency and resilience
Balancing capital efficiency with resilience requires careful alignment of CCR and XVA frameworks with governance and strategy. To meet both regulatory requirements and competitive pressures, executives should foster collaboration across risk, finance, and treasury functions. This alignment will enhance capital allocation, pricing strategies, and overall governance structures.
For banks facing capital constraints, third-party optimisation can be a viable strategy to manage the demands of SA-CCR. Executives should also consider refining data integration and analytics capabilities to support efficient, resilient risk management that is adaptable to regulatory shifts.
Conclusion
As counterparty credit risk re-emerges as a focal point for financial institutions, its integration with XVA, and the shifting emphasis from IMM to SA-CCR, underscore the need for proactive CCR management. For senior risk executives, adapting to this complex landscape requires striking a balance between resilience and efficiency. Embracing real-time monitoring, advanced analytics, and strategic cross-functional collaboration is crucial to building CCR frameworks that withstand regulatory scrutiny and position banks competitively.
In a financial landscape that is increasingly interconnected and volatile, an agile and resilient approach to CCR will serve as a foundation for long-term stability. At Zanders, we have significant experience implementing advanced analytics for CCR. By investing in robust CCR frameworks and staying attuned to evolving regulatory expectations, senior executives can prepare their institutions for the future of CCR and beyond, avoiding being left behind.
Confirmed Methodology for Credit Risk in EBA 2025 Stress Test

On November 12, 2024, the confirmed methodology for the EBA 2025 stress testing exercise was published on the EBA website. This is the final version of the draft for initial consultation that was published earlier.
The timelines for the entire exercise have been extended to accommodate the changes in scope:

| Milestone | Timing |
| --- | --- |
| Launch of exercise (macro scenarios) | Second half of January 2025 |
| First submission of results to the EBA | End of April 2025 |
| Second submission to the EBA | Early June 2025 |
| Final submission to the EBA | Early July 2025 |
| Publication of results | Beginning of August 2025 |
Below we share the most significant aspects for Credit Risk and related challenges. In the coming weeks we will share separate articles to cover areas related to Market Risk, Net Interest Income & Expenses and Operational Risk.
The final methodology, along with the requirements introduced by CRR3, poses significant challenges for the execution of credit risk stress testing. We previously provided details on this topic and its possible impacts on stress testing results in our article "Implications of CRR3 for the 2025 EU-wide stress test". Regarding the EBA 2025 stress test, we view the following five points as key areas of concern:
1- The EBA stress test requires different starting points: actual and restated CRR3 figures. This raises requirements for data management, reporting, and the implementation of related processes.
2- The EBA stress test requires banks to report both transitional and fully loaded results under CRR3; this requires the execution of additional calculations and implementation of supporting data processes.
3- The changes in classification of assets require targeted effort on the modelling side, stress test approach and related data structures.
4- The Standardized Approach output floor must be implemented as part of the stress test logic.
5- Additional effort is needed to correctly align Pillar 1 and Pillar 2 models, in terms of development, implementation and validation.
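The output floor mentioned in point 4 is conceptually simple even if its integration into stress test logic is not: total RWA may not fall below a fixed percentage of the RWA computed under the standardized approaches. A minimal sketch, using the fully loaded Basel floor of 72.5% (transitional percentages apply during the phase-in period):

```python
def floored_rwa(internal_model_rwa, standardised_rwa, floor=0.725):
    """Apply the output floor: total RWA cannot drop below floor * standardised RWA."""
    return max(internal_model_rwa, floor * standardised_rwa)
```

For a bank whose internal models produce RWA of 60 against a standardized figure of 100, the floored result is 72.5, so the floor rather than the internal model drives capital — exactly the interaction that must be captured in each projected stress test year.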
At Zanders, we specialize in risk advisory and our consultants have participated in every single EU wide stress testing exercise, as well as a few others going back to the initial stress tests in 2009 following the Great Financial Crisis. We can support you throughout all key stages of the stress testing exercise across all areas to ensure a successful submission of the final templates.
Based on the expertise in Stress Testing we have gained over the last 15 years, our clients benefit the most from our services in these areas:
- Full gap analysis against latest set of requirements
- Review, design and implementation of data processes & relevant data quality controls
- Alignment of Pillar 2 models to Pillar 1 (including CRR3 requirements)
- Design, implementation and execution of stress testing models
- Full automation of populating EBA templates including reconciliation and data quality checks.
Contact us for more information about how we can help make this your most successful run yet. Reach out to Martijn de Groot, Partner at Zanders.
Insights into cracking model risk for prepayment models

This article examines different methods for quantifying and forecasting model risk in prepayment models, highlighting their respective strengths and weaknesses.
Within the field of financial risk management, professionals strive to develop models to tackle the complexities in the financial domain. However, due to the ever-changing nature of financial variables, models only capture reality to a certain extent. Therefore, model risk - the potential loss a business could suffer due to an inaccurate model or incorrect use of a model - is a pressing concern. This article explores model risk in prepayment models, analyzing various approaches to quantify and forecast this risk.
There are numerous examples where model risk has not been properly accounted for, resulting in significant losses. For example, Long-Term Capital Management was a hedge fund that went bankrupt in the late 1990s because its model was never stress-tested for extreme market conditions. Similarly, in 2012, JP Morgan suffered a $6 billion loss and $920 million in fines due to flaws in its new value-at-risk model, in the incident known as the 'London Whale' trades.
Despite these prominent failures, and the requirements of CRD IV Article 85 for institutions to develop policies and processes for managing model risk,1 the quantification and forecasting of model risk has not been extensively covered in academic literature. This leaves a significant gap in the general understanding and ability to manage this risk. Adequate model risk management allows for optimized capital allocation, reduced risk-related losses, and a strengthened risk culture.
This article delves into model risk in prepayment models, examining different methods to quantify and predict this risk. The objective is to compare different approaches, highlighting their strengths and weaknesses.

Definition of Model Risk
Generally, model risk can be assessed using a bottom-up approach by analyzing individual model components, assumptions, and inputs for errors, or by using a top-down approach by evaluating the overall impact of model inaccuracies on broader financial outcomes. In the context of prepayments, this article adopts a bottom-up approach by using model error as a proxy for model risk, allowing for a quantifiable measure of this risk. Model error is the difference between the modelled prepayment rate and the actual prepayment rate. Model error occurs at an individual level when a prepayment model predicts a prepayment that does not happen, and vice versa. However, banks are more interested in model error at the portfolio level. A statistic often used by banks is the Single Monthly Mortality (SMM). The SMM is the monthly percentage of prepayments and can be calculated by dividing the amount of prepayments for a given month by the total amount of mortgages outstanding.
Using the SMM, we can define and calculate the model error for a given month t as the difference between the predicted SMM and the actual SMM:

Model error(t) = SMM_predicted(t) − SMM_actual(t)
When calculating valuation model risk, the European Banking Authority (EBA) requires financial institutions to set aside enough funds to be 90% confident that they can exit a position at the time of the assessment. Consequently, banks are concerned with the top 5% and bottom 5% of the model error distribution (EBA, 2015, 2016).2 Banks are therefore interested in the distribution of the model error as defined above, so that they can allocate capital for model risk in prepayment models optimally.
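As a minimal sketch of the definitions above, the snippet below computes the SMM, the resulting model errors, and the 5th/95th percentiles of the error distribution. All figures are illustrative stand-ins, not data from the article.

```python
import numpy as np

# Toy monthly data: outstanding mortgage balance at the start of each month,
# observed prepayments during the month, and the production model's
# predicted SMM (all values are purely illustrative).
outstanding = np.array([100_000_000.0, 98_500_000.0, 97_200_000.0, 95_800_000.0])
prepayments = np.array([1_500_000.0, 1_300_000.0, 1_400_000.0, 1_100_000.0])
predicted_smm = np.array([0.016, 0.014, 0.013, 0.012])

# Single Monthly Mortality: prepayments in a month divided by the
# outstanding mortgage balance.
actual_smm = prepayments / outstanding

# Model error per month: predicted SMM minus actual SMM.
model_error = predicted_smm - actual_smm

# Banks focus on the tails: the 5th and 95th percentiles of the
# model-error distribution (the 90% confidence band from the EBA guidance).
lower, upper = np.percentile(model_error, [5, 95])
```

In practice the error series would span many months and portfolios; the percentile step is where the 90% confidence requirement enters.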
Approaches to Forecasting Model Risk
By using model error as a proxy for model risk, we can leverage historical model errors to forecast future errors through time-series modelling. In this article, we explore three methods: the simple approach, the auto-regressive approach, and the machine learning challenger model approach.
Simple Approach
The first method proposed to forecast the expected value and the variance of the model errors is the simple approach. It is the most straightforward way to quantify and predict model risk: analyzing the mean and standard deviation of the historical model errors. The model itself introduces minimal uncertainty, as only two parameters have to be estimated, namely the intercept (the mean) and the standard deviation.
The disadvantage of the simple approach is that it is time-invariant. Consequently, even in extreme conditions, the expected value and the variance of model errors remain constant over time.
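The simple approach can be sketched in a few lines. The historical errors below are simulated placeholders, and the 90% band assumes normally distributed errors, which is an assumption of this sketch rather than a claim from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative history of monthly model errors (predicted minus actual SMM).
errors = rng.normal(loc=0.0005, scale=0.002, size=60)

# Simple approach: the forecast is time-invariant -- just the sample mean,
# with uncertainty given by the sample standard deviation.
mu = errors.mean()
sigma = errors.std(ddof=1)

# 90% band (5th/95th percentiles) under a normal assumption.
z = 1.6449  # ~95th percentile of the standard normal distribution
band = (mu - z * sigma, mu + z * sigma)
```

Because mu and sigma are constants, the forecast band is identical in every future month, which is exactly the time-invariance drawback noted above.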
Auto-Regressive Approach
The second approach to forecast the model errors of a prepayment model is the auto-regressive approach. Specifically, this approach utilizes an AR(1) model, which forecasts the model errors by leveraging their lagged values. The advantage of the auto-regressive approach is that it takes into account the dynamics of the historical model errors when forecasting them, making it more advanced than the simple approach.
The disadvantage of the auto-regressive approach is that it always lags behind and does not take into account the current state of the economy. For example, an increase in interest rates of 200 basis points would be expected to produce a higher model error immediately, whereas the auto-regressive approach will only pick up this increase one month later.
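A minimal AR(1) sketch, fitting the model by ordinary least squares on lagged errors; the error history is simulated for illustration and the coefficients (0.0002 drift, 0.6 persistence) are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulate an AR(1)-like history of monthly model errors (illustrative).
errors = np.empty(120)
errors[0] = 0.0
for t in range(1, 120):
    errors[t] = 0.0002 + 0.6 * errors[t - 1] + rng.normal(scale=0.001)

# Fit err_t = c + phi * err_{t-1} via ordinary least squares.
y, x = errors[1:], errors[:-1]
X = np.column_stack([np.ones_like(x), x])
c, phi = np.linalg.lstsq(X, y, rcond=None)[0]

# One-step-ahead forecast from the last observed error.
forecast = c + phi * errors[-1]
```

Note that the forecast depends only on last month's error, which illustrates the one-month lag: a shock this month only shows up in next month's forecast.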
Machine Learning Challenger Model Approach
The third approach to forecast the model errors involves incorporating a Machine Learning (ML) challenger model. In this article, we use an Artificial Neural Network (ANN). This ML challenger model can be more sophisticated than the production model, as its primary focus is on predictive accuracy rather than interpretability. This approach uses risk measures to compare the production model with a more advanced challenger model. A new variable is defined as the difference between the production model and the challenger model.
Similar to the approaches above, the expected value of the model errors is forecasted by estimating an intercept, the coefficient on this new variable, and the standard deviation. A forecast can then be made, with the difference between the production model and the ML challenger model serving as a proxy for future model risk.
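The regression step of the challenger approach can be sketched as below. For brevity, the challenger and production predictions are simulated stand-ins rather than an actual ANN and prepayment model, and the 0.8 relationship between the model gap and the realized error is an assumption of this sketch.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 120
# Illustrative monthly SMM predictions from a (hypothetical) ML challenger
# and the production model, plus the production model's realized errors.
challenger = rng.normal(0.012, 0.003, size=n)
production = challenger + rng.normal(0.0005, 0.001, size=n)
true_error = 0.8 * (production - challenger) + rng.normal(scale=0.0003, size=n)

# New explanatory variable: production prediction minus challenger prediction.
diff = production - challenger

# Fit error_t = alpha + beta * diff_t by ordinary least squares, so the
# production/challenger gap translates into a model-error forecast.
X = np.column_stack([np.ones(n), diff])
alpha, beta = np.linalg.lstsq(X, true_error, rcond=None)[0]

# Residual standard deviation, the third estimated parameter.
resid = true_error - X @ np.array([alpha, beta])
sigma = resid.std(ddof=2)

# Forecast the model error for a month where the models disagree by 50bp.
forecast = alpha + beta * 0.005
```

Because the forecast is driven by the models' current disagreement rather than by last month's error, this sketch is forward looking in the sense described above.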
The advantage of using the ML challenger model approach is that it is forward looking. This forward-looking method allows for reasonable estimates under both normal and extreme conditions, making it a reliable proxy for future model risk. In addition, when there are complex non-linear relationships between an independent variable and the prepayment rate, an ML challenger can be more accurate. Its complexity allows it to predict significant impacts better than a simpler, more interpretable production model. Consequently, employing an ML challenger model approach could effectively estimate model risk during substantial market changes.
A disadvantage of the machine learning approach is its complexity and lack of interpretability. Additionally, developing and maintaining these models often requires significant time, computational resources, and specialized expertise.
Conclusion
The various methods to estimate model risk are compared in a simulation study. The ML challenger model approach stands out as the most effective method for predicting model errors, offering increased accuracy in both normal and extreme conditions. Both the simple and the challenger model approaches effectively predict the variability of model errors, but the challenger model approach achieves a smaller standard deviation. In scenarios involving extreme interest rate changes, only the challenger model approach delivers reasonable estimates, highlighting its robustness. Therefore, the challenger model approach is the preferred choice for predicting model error under both normal and extreme conditions.
Ultimately, the optimal approach should align with the bank’s risk appetite, operational capabilities, and overall risk management framework. Zanders, with its extensive expertise in financial risk management, including multiple high-profile projects related to prepayments at G-SIBs as well as mid-size banks, can provide comprehensive support in navigating these challenges. See our expertise here.
Ready to take your IRRBB strategy to the next level?
Zanders is an expert on IRRBB-related topics. We enable banks to achieve both regulatory compliance and strategic risk goals by offering support from strategy to implementation. This includes risk identification, formulating a risk strategy, setting up an IRRBB governance and framework, and policy or risk appetite statements. Moreover, we have an extensive track record in IRRBB and behavioral models such as prepayment models, hedging strategies, and calculating risk metrics, both from model development and model validation perspectives.
Contact our experts today to discover how Zanders can help you transform risk management into a competitive advantage. Reach out to: Jaap Karelse, Erik Vijlbrief, Petra van Meel, or Martijn Wycisk to start your journey toward financial resilience.
1. CRD IV Article 85: “Competent authorities shall ensure that institutions implement policies and processes to evaluate and manage the exposures to operational risk, including model risk and risks resulting from outsourcing, and to cover low-frequency high-severity events. Institutions shall articulate what constitutes operational risk for the purposes of those policies and procedures.” https://www.eba.europa.eu/regulation-and-policy/single-rulebook/interactive-single-rulebook/11665
2. “Where possible, institutions shall calculate the model risk AVA by determining a range of plausible valuations produced from alternative appropriate modelling and calibration approaches. In this case, institutions shall estimate a point within the resulting range of valuations where they are 90% confident they could exit the valuation exposure at that price or better.” In this article, we generalize valuation model risk to model risk. https://extranet.eba.europa.eu/sites/default/documents/files/documents/10180/642449/1d93ef17-d7c5-47a6-bdbc-cfdb2cf1d072/EBA-RTS-2014-06%20RTS%20on%20Prudent%20Valuation.pdf?retry=1