IBOR reform: Enhancement of SAP Treasury

June 2021
7 min read

One of the key finance-related challenges that most corporates currently face is the IBOR reform.


How can SAP technology support organizations with the consequences of IBOR reform? In this article we outline the SAP enhancements relating to the IBOR reform, discuss how to implement these enhancements, and pinpoint specific areas of attention based on our most recent SAP projects on IBOR reform implementation.

SAP provides a roadmap to support the interest rate benchmark reforms. SAP has developed a standard solution to support daily compounding interest calculation with overnight risk-free rates, which have become the new interest rate benchmarks for currencies such as USD, GBP and CHF. This standard solution has been included in different versions of SAP, from ECC EHP8 to the most recent versions of S/4 HANA.

The SAP solution consists of a set of composite SAP notes that need to be installed; it is recommended to install them as part of a support package. There are three core SAP composite notes to install:

  • 2939657 (common basis for TRM and CML)
  • 2932789 (RFR in TRM)
  • 2880124 (for CML)

Additional SAP notes and/or activation of business functions may be required, especially if implementation takes place in SAP ECC. The implementation of the SAP notes is the first and the core step, but SAP also requires an update of the configuration objects.

Transaction management
The following five points should be followed for transaction management:

1) Activation of the parallel condition of the cash flow calculation.

This activation is done on the level of a respective product type. The following product categories are supported:

  • Money market deals (product category 550 and 580)
  • Bonds (product category 040)
  • SWAP (product category 620)

SAP recommends the creation of new product types. It is also possible to do the activation by extending the configuration of the existing product types. However, we would still recommend executing a proof of concept prior to amending existing product types, to ensure there is no regression effect or dumps in the specific SAP environment.

Activation of parallel conditions will enable the following enhancements in SAP:

  • New interest conditions (Compound interest calculation and Average compound interest calculation) with spread components that can be calculated linearly or compounded.
  • Updated interest cash flow calculation according to the new interest conditions (new FiMa).
  • Parallel interest conditions (spread is linear and maintained as separate flow in the deal).
  • New fields and functionalities in the deal maintenance (Weighting, Lookback, Lockout periods etc).

Please note that SAP may require extra business function activation to enable selection of cash flow calculation in SPRO (such as FIN_TRM_INS_LOCBR).
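As an illustration of the new compound interest conditions, the daily compounding in arrears with a linear spread component can be sketched as follows. This is a simplified model assuming an ACT/360 convention and one overnight rate per day; the function names are illustrative, not SAP's.

```python
# Sketch: RFR compounded in arrears with the spread accrued linearly as a
# separate component (the 'parallel condition' treatment). Illustrative only.

def compound_factor(daily_rates, day_count=360):
    """Compound one overnight rate per calendar day of the period."""
    factor = 1.0
    for rate in daily_rates:
        factor *= 1 + rate / day_count
    return factor

def compounded_interest(notional, daily_rates, spread=0.0, day_count=360):
    """Compounded RFR interest plus a linearly accrued (flat) spread."""
    rfr_part = notional * (compound_factor(daily_rates, day_count) - 1)
    spread_part = notional * spread * len(daily_rates) / day_count
    return rfr_part + spread_part
```

For a 30-day period at a flat 2% overnight rate on a 1,000,000 notional, the compounded amount slightly exceeds the simple-interest amount, which is the effect the new interest conditions capture.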

2) For interest rate swaps, the following notes should be installed:

  • 2971185 (Risk-Free Rates for Interest Rate Swaps: Collective Note)
  • 2973302 (Business Function: Interest Rate Swap Enhancements, FIN_TRM_IR)

It is strongly recommended to install and run the report ZSAP_FTR_IRSWAP_NOTES, which indicates whether any SAP note or business function is missing or invalid.

3) Amendment of the field selection for the respective condition types.

With these changes there are more fields to be maintained, so the field selection needs to be updated accordingly.

4) New fields are created in the deals, so you should consider whether changes are required to the field selection in the Transaction Manager.

5) Where mirror deals are configured, validate that the conditions (interest calculation type of the nominal interest and the date structure) in the mirror deals are properly mirrored. Activation of the business function FIN-TRM-MME may be required.

Yield Curve Framework
New yield curves should be created, with grid points representing the reference interest rates based on the overnight risk-free rates. The IBOR reform might require a corporate to change its methodology for mark-to-market calculations of financial instruments. New discounting yield curves will be required for the net present value calculation of FX contracts and interest rate derivatives. New yield curves might also be required for projecting future cash flows for interest rate derivatives in mark-to-market calculations.

Creation of new yield curves (and evaluation types) requires thorough attention and input from Treasury and Accounting teams and it should be signed off by the auditor. This configuration also requires good communication with market data providers in order to retrieve correct data for the additional data feed.
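As a simplified illustration of how an RFR-based yield curve feeds the NPV calculation, the sketch below interpolates zero rates from curve grid points and discounts cash flows. Linear interpolation and continuous compounding are assumptions for illustration; actual SAP evaluation types may use different conventions.

```python
import math

def zero_rate(curve, t):
    """Interpolate a zero rate from sorted (tenor_in_years, rate) grid
    points, with flat extrapolation outside the grid."""
    if t <= curve[0][0]:
        return curve[0][1]
    if t >= curve[-1][0]:
        return curve[-1][1]
    for (t0, r0), (t1, r1) in zip(curve, curve[1:]):
        if t0 <= t <= t1:
            return r0 + (t - t0) / (t1 - t0) * (r1 - r0)

def npv(cashflows, curve):
    """Present value of (time_in_years, amount) cash flows against the curve."""
    return sum(amt * math.exp(-zero_rate(curve, t) * t) for t, amt in cashflows)
```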

Market Risk Analyzer
Creation of new evaluation types. This is required so that cash flow discounting is based on the new yield curves. For audit purposes, it is not recommended to amend existing evaluation types.

Market data feed
New reference interest rates must be set up in SAP. It is essential to receive the new set of risk-free rates, as well as IBOR fallback rates and adjusted or term rates, from your market data provider. The scope here fully depends on the existing active deals, your migration approach, and alignment with your financial counterparties.

Treasury Accounting
Changes are needed in the assignment of the update types for accrual/deferral: an additional grouping term needs to be added for the respective update types.

In case there is a business need to aggregate daily cash flows, changes to the flow types for the nominal interest are required. With compound interest calculation, SAP calculates and posts interest cash flows daily, while interest settlement occurs based on the terms of the deal. Hence, daily flows (e.g. 30 flows for a monthly settlement) need to be reconciled with a single settlement at the settlement date. This daily cash flow maintenance and reconciliation may lead to extra workload for the back office/accounting team. Therefore, businesses might prefer to aggregate (net) these daily cash flows. A new flow type and derivation rule may be configured to support this requirement. This would be an update of the existing accounting, which represents a potential regression impact.
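The netting of daily flows into a single flow per settlement date can be sketched as follows; this is a simplified illustration of what an aggregation flow type with a derivation rule would achieve, not SAP's implementation.

```python
from collections import defaultdict

def aggregate_daily_flows(daily_flows):
    """daily_flows: iterable of (settlement_date, amount) pairs for each
    daily compounded interest flow; returns one net flow per settlement
    date, replacing e.g. 30 daily postings with a single settlement flow."""
    totals = defaultdict(float)
    for settlement_date, amount in daily_flows:
        totals[settlement_date] += amount
    return sorted(totals.items())
```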

SAP IHC interest conditions
Consider changes in the IHC interest conditions for debit/credit balances in case a variable reference interest rate based on an RFR is applied. Please note that SAP IHC does not support compounding of interest for daily balancing.

Reports
We also recommend executing a round of regression testing of SAP TRM. The new functionality can potentially impact your report variants and layouts, so regression updates may be required. Please validate that the fields showing variable interest rates are properly populated with data, especially in the collective processing report for OTC interest rate instruments (TI92).

Custom functionality and SAP queries
It is important to review your bespoke SAP functionalities related to TRM, MRA, the market data feed, etc. Additionally, it is worthwhile to validate your SAP queries for Treasury, especially the ones designed for month-end purposes: NPVs, accrual/deferral calculation, etc. Pay attention to table TRLIT_AD_TRANS and how the table is updated.

IBOR reform and Zanders

Zanders is closely following all IBOR reform related regulations and latest developments. We have also developed a proprietary methodology to support our clients in this regulatory transition, with several projects already successfully completed.

Given our expertise in treasury management, valuations and treasury technology, we are well equipped to support financial and non-financial organizations in the IBOR reform from both a functional and a technological perspective. We assist our clients throughout the entire project, from the impact assessment to roadmap definition, and finally the transition itself. Functional support could include definition of the new reference rates, definition of the new yield curves for discounting and projecting future cash flows, formulation of the business cutover plan, support in the new interest calculation methodologies and new market conventions, changing procedures, and many other activities.

To learn more about the IBOR reform, see our previously published insights:

  • https://zanders-migration.appealstaging.co.uk/en/latest-insights/preparing-your-treasury-management-system-for-the-ibor-replacement/
  • https://zanders-migration.appealstaging.co.uk/en/latest-insights/a-new-milestone-on-the-ibor-reform-road/
  • https://zanders-migration.appealstaging.co.uk/en/news/announcement-on-the-end-of-libor/
  • https://zanders-migration.appealstaging.co.uk/en/latest-insights/zanders-ibor-assessment/
  • https://zandersadvisory.com/latest-insights/sap-responds-to-ibor-reform/

In this article we do not cover the migration of existing deals or the impact on hedge accounting, as these will be explained in a separate article. Should you have any questions or queries regarding the SAP implementation of the IBOR reform, please feel free to contact Aleksei Abakumov or call +31 35 692 89 89.

FRTB: Improving the Modellability of Risk Factors

June 2021
5 min read

Explore key strategies to minimize non-modellable risk factors (NMRFs) under FRTB’s internal models approach, including enhancing data, creating proxies, and customizing bucketing, to manage your bank’s capital requirements more effectively.


Under the FRTB internal models approach (IMA), the capital calculation of risk factors is dependent on whether the risk factor is modellable. Insufficient data will result in more non-modellable risk factors (NMRFs), significantly increasing associated capital charges.

NMRFs

Risk factor modellability and NMRFs

The modellability of risk factors is a new concept which was introduced under FRTB and is based on the liquidity of each risk factor. Modellability is measured using the number of ‘real prices’ which are available for each risk factor. Real prices are transaction prices from the institution itself, verifiable prices for transactions between arm’s-length parties, prices from committed quotes, and prices from third-party vendors.

For a risk factor to be classed as modellable, it must have at least 24 real prices over the previous 12 months with no 90-day period containing fewer than four prices, or at least 100 real prices over the last 12 months (with a maximum of one real price counted per day). The Risk Factor Eligibility Test (RFET), outlined in FRTB, is the quarterly process which determines modellability. The results of the RFET determine, for each risk factor, whether the capital requirements are calculated by expected shortfall or by stressed scenarios.
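The RFET criteria can be sketched as a simple check on a risk factor's observation dates. This is a simplified illustration; an actual implementation must follow the regulator's precise counting rules.

```python
from datetime import date, timedelta

def is_modellable(observation_dates, as_of):
    """Sketch of the RFET: at most one real price counts per day; pass with
    100+ observations in the last 12 months, or 24+ observations with no
    90-day period containing fewer than four."""
    window_start = as_of - timedelta(days=365)
    obs = sorted({d for d in observation_dates if window_start <= d <= as_of})
    if len(obs) >= 100:
        return True
    if len(obs) < 24:
        return False
    # every rolling 90-day period in the window must hold at least 4 prices
    day = window_start
    while day + timedelta(days=90) <= as_of:
        in_window = sum(1 for d in obs if day <= d < day + timedelta(days=90))
        if in_window < 4:
            return False
        day += timedelta(days=1)
    return True
```

Note how a risk factor with 30 observations clustered in a single month fails despite exceeding the 24-price threshold, because earlier 90-day periods are empty.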

Consequences of NMRFs for banks

Modellable risk factors are capitalised via expected shortfall calculations which allow for diversification benefits. Conversely, capital for NMRFs is calculated via stressed scenarios which result in larger capital charges. This is due to longer liquidity horizons and more prudent assumptions used for aggregation. Although it is expected that a low proportion of risk factors will be classified as non-modellable, research shows that they can account for over 30% of total capital requirements. 

There are multiple techniques that banks can use to reduce the number and impact of NMRFs, including the use of external data, developing proxies, and modifying the parameterisation of risk factor curves and surfaces. As well as focusing on reducing the number of NMRFs, banks will also need to develop early warning systems and automated reporting infrastructures to monitor the modellability of risk factors. These tools help to track and predict modellability issues, reducing the likelihood that risk factors will fail the RFET and increase capital requirements.

Methods for reducing the number of NMRFs

Banks should focus on reducing their NMRFs as they are associated with significantly higher capital charges. There are multiple approaches which can be taken to increase the likelihood that a risk factor passes the RFET and is classed as modellable.

Enhancing internal data

The simplest way for banks to reduce NMRFs is by increasing the amount of data available to them. Augmenting internal data with external data increases the number of real prices available for the RFET and reduces the likelihood of NMRFs. Banks can purchase additional data from external data vendors and data pooling services to increase the size and quality of datasets.

It is important for banks to first investigate their internal data and understand where the gaps are. As data providers vary in the services and information they provide, banks should not focus only on the types and quantity of data available; they should also consider data integrity, user interfaces, governance, and security. Many data providers also offer FRTB-specific metadata, such as flags for RFET liquidity passes or fails.

Finally, once a data provider has been chosen, additional effort will be required to resolve discrepancies between internal and external data and ensure that the external data follows the same internal standards.

Creating risk factor proxies

Proxies can be developed to reduce the number or magnitude of NMRFs, however, regulation states that their use must be limited. Proxies are developed using either statistical or rules-based approaches.

Rules-based approaches are simplistic, yet generally less accurate. They find the “closest fit” modellable risk factor using more qualitative methods, e.g. using the closest tenor on the interest rate curve. Alternatively, more accurate approaches model the relationship between the NMRF and modellable risk factors using statistical methods. Once a proxy is determined, it is classified as modellable and only the basis between it and the NMRF is required to be capitalised using stressed scenarios.

Determining proxies can be time-consuming as it requires exploratory work with uncertain outcomes. Additional ongoing effort will also be required by validation and monitoring units to ensure the relationship holds and the regulator is satisfied.
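A minimal sketch of the statistical approach, assuming a single modellable factor and ordinary least squares, regresses the NMRF's moves on the modellable factor's moves; the residual is the basis that still has to be capitalised via stressed scenarios.

```python
def fit_proxy(modellable_moves, nmrf_moves):
    """One-factor least-squares proxy: nmrf ~ alpha + beta * modellable.
    Returns (alpha, beta, basis) where basis is the residual series."""
    n = len(modellable_moves)
    mx = sum(modellable_moves) / n
    my = sum(nmrf_moves) / n
    beta = (sum((x - mx) * (y - my)
                for x, y in zip(modellable_moves, nmrf_moves))
            / sum((x - mx) ** 2 for x in modellable_moves))
    alpha = my - beta * mx
    basis = [y - (alpha + beta * x)
             for x, y in zip(modellable_moves, nmrf_moves)]
    return alpha, beta, basis
```

In practice a multi-factor regression and formal goodness-of-fit evidence would be needed to satisfy validation units and the regulator.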

Developing own bucketing approach

Instead of using the prescribed bucketing approach, banks can use their own approach to maximise the number of real price observations for each risk factor.

For example, if a risk model requires a volatility surface to price, there are multiple ways this can be parametrised. One method could be to split the surface into a 5x5 grid, creating 25 buckets that would each require sufficient real price observations to be classified as modellable. Conversely, the bank could instead split the surface into a 2x2 grid, resulting in only four buckets. The same number of real price observations would then need to be allocated between significantly fewer buckets, decreasing the chances of a risk factor being an NMRF.
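The effect of bucketing granularity can be illustrated with a small sketch that allocates the same set of real price observations to grids of different sizes (coordinates are assumed to be scaled to the unit square for simplicity):

```python
def bucket_counts(observations, n_expiry, n_strike):
    """observations: (expiry, strike) pairs scaled to [0, 1); returns the
    number of real price observations landing in each grid bucket."""
    counts = {}
    for expiry, strike in observations:
        bucket = (int(expiry * n_expiry), int(strike * n_strike))
        counts[bucket] = counts.get(bucket, 0) + 1
    return counts
```

With 100 observations spread evenly over the surface, a 2x2 grid yields 25 observations per bucket, while a 5x5 grid yields only 4 per bucket, far closer to failing the RFET thresholds.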

It should be noted that the choice of bucketing approach affects other aspects of FRTB. Profit and Loss Attribution (PLA) uses the same buckets of risk factors as chosen for the RFET. Increasing the number of buckets may increase the chances of passing PLA, however, also increases the likelihood of risk factors failing the RFET and being classed as NMRFs.

Conclusion

In this article, we have described several potential methods for reducing the number of NMRFs. Although some of the suggested methods may be more cost effective or easier to implement than others, banks will most likely, in practice, need to implement a combination of these strategies in parallel. The modellability of risk factors is clearly an important part of the FRTB regulation for banks as it has a direct impact on required capital. Banks should begin to develop strategies for reducing the number of NMRFs as early as possible if they are to minimise the required capital when FRTB goes live.

SAP Advanced Payment Management

June 2021
6 min read

S/4 HANA Advanced Payment Management (APM) is SAP’s new solution for centralized payment hubs. Released in 2019, this solution operates as a centralized payment channel, consolidating payment flows from multiple payment sources. This article will serve to introduce its functionality and benefits.



Whilst over the previous years many corporates have endeavoured to move towards a single ERP system, many still operate in a multi-ERP landscape and will continue to do so. This is particularly the case amongst corporates who have grown rapidly, potentially through acquisitions, or who operate across different business areas. SAP’s Central Finance caters for centralized financial reporting for these multi-ERP businesses. SAP’s APM similarly caters for businesses with a range of payment sources, centralizing them into a single payment channel.

SAP APM acts as a central payment processing engine, connecting with SAP Bank Communication Management and Multi-Bank Connectivity for the sending of external payment instructions. For internal payments and payments-on-behalf-of, data is fed to SAP In-House Cash, while at the same time data is transmitted to S/4 HANA Cash Management to give centralized cash forecast data.

Figure 1 – SAP S/4 HANA Advanced Payment Management – Credit SAP

The framework of this product was built up as SAP Payment Engine, which is used for the processing of payment instructions at banking institutions. On this basis, it is a robust product, and will cater for the key requirements of corporate payment hubs, and much more beyond.

Building a business case

When building a business case for a centralized payment hub, it is important to look at the full range of the payment sources. This can include accounts payable/receivable (AP/AR) payments, but should also consider one-off (manual) payments, Treasury payments, as well as HR payments such as payroll. Whilst payroll is often outsourced, SAP APM can be a good opportunity to integrate payroll into a corporate’s own payment landscape (with the necessary controls of course!).

Using a centralized payment hub will help to reduce the implementation time for new payment sources, which may be different ERPs. In particular, the ability of SAP APM’s Input Manager to consume non-standard payment file formats helps to make this a smooth implementation process.

SAP APM applies a level of consistency across all payments and allows for a common payment control framework to be applied across the full range of payment sources.

A strength of the product is its flexible payment routing, which allows for payment routing to be adjusted according to the business need. This does not require specialist IT configuration or re-routing. It enables corporates to change their payment framework according to the need of the business, without the dependency on configuration and technology changes.
A central payment hub means no more direct bank integrations. This is particularly important for those businesses that operate in a multi-ERP environment, where the burden can be particularly heavy.

Lastly, as with most SAP products, this product benefits from native integration into modules that corporates may already be using. Payment data can be transferred directly into SAP In-House Cash using standard functionality in order to reflect intercompany positions. The richest level of data is presented to S/4 HANA Cash Management to provide accurate and up-to-date cash forecast data for Treasury front office.

Scenarios

SAP APM accommodates four different scenarios:

  • Internal transfer: payment from one subsidiary’s internal account to the internal account of another.
  • Payment on-behalf-of: payment to an external party from the internal account of a subsidiary.
  • Payment in-name-of: payment to an external party from the external account of a subsidiary. The derivation of the external account is performed in APM.
  • Payment in-name-of (forwarding only): payment to an external party from the external account of a subsidiary. The external account is pre-determined in the incoming payment instruction.

A Working Example – Payment-on-behalf-of

An ERP sends a payment instruction to the APM system via iDoc. This is consumed by the input manager, creating a payment order that is ready to be processed.

Figure 3 – Creation of Incoming Payment Order in APM

The payment order will normally be processed automatically, immediately upon receipt. First, the enrichment and validation checks are executed, which validate the integrity of the payment instruction.

The payment routing is then executed for each payment item, according to the source payment data. Importantly, the payment routing selects the appropriate house bank account for payment, and can be used to determine the prioritization of payments as well as the method of clearing.

In the case of a payment-on-behalf-of, an external route will be used for the credit payment item to the third party vendor, whilst an internal route will be used to update SAP In-House Cash for the intercompany position.

Figure 4 – Maintenance of Routes

Clearing can be executed in batches, via queues, or by individual processing. The internal clearing for the debit payment item must be executed into SAP In-House Cash in order to reflect the intercompany position built up. The internal clearing for the credit payment item can be fed into the general ledger of the paying entity.

Figure 5 – Update of In-House Cash for Payment-On-Behalf or Internal Transfer Scenarios

Outgoing payment orders are created once the routing & clearing is completed. At this stage, any further enrichment & validation can be executed and the data will be delivered to the output manager. The output manager has native integration with SAP’s DMEE Payment Engine, which can be used to produce an ISO20022 payment instruction file.

Figure 6 – Payment Instruction in SAP Bank Communication Management

The outgoing payment instruction is now visible in the centralized payment status monitor in SAP Bank Communication Management.

The full processing status of the payment is visible in SAP APM, including the points of data transfer.

Figure 7 – SAP APM Process Flow

Introduction to Functionality

SAP APM comprises four key functional areas:

  • Input manager & output manager
  • Enrichment and validation
  • Routing
  • Transaction clearing

Figure 2 – SAP Advanced Payment Management Framework – Credit SAP

Input Manager

The input manager can flexibly import payment instruction data into APM. Standard converters exist for iDoc payment instructions (PEXR2002/PEXR2003 PAYEXT), ISO 20022 (pain.001.001.03) and SWIFT MT101 messages. However, it is also possible to configure new input formats to cater for source systems that can only produce flat-file formats.
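As a simplified illustration of a custom input format, the sketch below converts one line of a hypothetical fixed-width flat file into a normalized payment structure. The field positions and names are assumptions for illustration, not an actual SAP format.

```python
def parse_flat_payment(line):
    """Hypothetical fixed-width flat-file line -> normalized payment dict.
    Positions are illustrative: debtor (10), creditor (10), currency (3),
    amount in minor units (15), value date YYYYMMDD (8)."""
    return {
        "debtor":     line[0:10].strip(),
        "creditor":   line[10:20].strip(),
        "currency":   line[20:23],
        "amount":     int(line[23:38]) / 100,  # minor units -> decimal
        "value_date": line[38:46],
    }
```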

Enrichment and Validation

Enrichment and validation can be used to perform integrity checks on payment items during processing through APM; these checks could include detection of duplicate payment instructions. This stage feeds an initial set of data to S/4 HANA Cash Management (prior to routing) and can be used to return payment status messages (pain.002) to the sending payment system.
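A duplicate-instruction check of this kind can be sketched as follows; the choice of key fields is an assumption for illustration.

```python
import hashlib

def find_duplicates(instructions):
    """Flag payment instructions whose key fields repeat. The key fields
    (debtor, creditor, currency, amount, value date) are illustrative."""
    seen, duplicates = set(), []
    for instr in instructions:
        key_fields = (instr["debtor"], instr["creditor"], instr["currency"],
                      instr["amount"], instr["value_date"])
        key = hashlib.sha256(repr(key_fields).encode()).hexdigest()
        if key in seen:
            duplicates.append(instr)
        seen.add(key)
    return duplicates
```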

Routing

Agreement-based routing is used to determine the selection of external accounts. This payment routing is highly flexible and permits the routing of payments according to criteria such as amount and beneficiary country. The routing incorporates cut-off time logic and determines the priority of the payment as well as the sending bank account. This stage is not used for “forwarding-only” scenarios, where there is no requirement to determine the subsidiary’s house bank account in the APM platform.
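Rule-based routing with cut-off logic can be sketched as a first-match-wins rule table. The rule table and field names below are illustrative assumptions, not actual APM configuration.

```python
from datetime import time

ROUTES = [  # first matching rule wins; None acts as a wildcard
    {"currency": "USD", "max_amount": 1_000_000, "cutoff": time(16, 0),
     "house_bank": "HB_US01"},
    {"currency": None, "max_amount": None, "cutoff": time(15, 0),
     "house_bank": "HB_DEFAULT"},
]

def route_payment(payment, submitted_at, routes=ROUTES):
    for rule in routes:
        if rule["currency"] is not None and rule["currency"] != payment["currency"]:
            continue
        if rule["max_amount"] is not None and payment["amount"] > rule["max_amount"]:
            continue
        if submitted_at > rule["cutoff"]:
            continue  # past cut-off for this route
        return rule["house_bank"]
    return None  # no route found: hold for manual handling / next value date
```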

Clearing

Clearing involves the sending of payment data after routing to S/4 HANA Cash Management, In-House Cash, and onto the general ledger. According to the selected route, payments can be cleared individually or grouped into batches.

Further enrichment & validation can be performed, and external payments are routed via the output manager, which can re-use DMEE payment engines to produce payment files. These payment files can be monitored in SAP Bank Communication Management and delivered to the bank via SAP Multi-Bank Connectivity.

Optimizing Trade Execution through SAP Trade Platform Integration

June 2021
7 min read



In any SAP Treasury implementation, conversation will eventually turn to the integration of external systems and providers, for the efficient straight-through processing of data and the additional controls it provides. SAP has introduced the Trade Platform Integration (TPI) functionality to manage one of the more challenging of these interfaces: the integration between SAP and external trade platforms.

The general outline for any trade integration solution would contain the following high-level components:

  • The ability to collect the trade requirements in the form of FX orders from all relevant sources.
  • A trade dashboard for the dealers to view and manage all requested orders.
  • The ability to send the FX orders to an external trade platform for trade execution.
  • Capturing of the executed trade in the treasury module, ensuring that the FX order data is also recorded to identify the purpose of the FX transaction.

These components have been built with many levels of complexity for different organizations in the past, from simple Excel templates to complex bespoke modules with live interfaces that manage the end-to-end trading needs.

The choice of how much an organization would want to invest in these complex solutions would depend on the volume, importance of the trading function, need for enhanced control around trading, and the level of enriched data to be recorded automatically on the deals. Now, with a standard alternative available from SAP, an extensive business case may no longer be necessary to incorporate the more complex of these requirements, as the improved controls and efficiency of processing data is available with less risk and investment than previously considered.

The solution can be broadly defined under the SAP S/4 HANA functionality and the SAP Cloud Platform (SCP) functionality as seen below.

Figure 1

SAP S/4 HANA – Trade Platform Integration
The S/4 HANA functionality covers the first of the components mentioned before. Here SAP has introduced an entirely new database in the SAP environment to manage and control Trade Requests – the SAP equivalents of FX orders.

These Trade Requests may be created automatically off the back of other SAP tools such as Cash Management, the Hedge Management Cockpit and balance sheet hedging, or simply from manual requests. The resulting Trade Request contains the same data categorizations that apply to a deal in TRM, such as portfolio, characteristics, internal references, and other fields normally found under Administration. All of this data, collected prior to trading, will be carried over to the actual deal once executed, ensuring that dealers are not responsible for accurately capturing information on the deal that may not be relevant to them but is necessary for further processing.

The clear benefit of this new integration is that it bridges the gap between the determination of trade requirements from Cash Management or FX risk reporting, and the dealers who are to execute and fulfil the trades. This allows the information related to the purpose of the trade (e.g. the portfolio, exposure position, profit center, etc.) to be allocated to the Trade Request, and subsequently to the executed trade, automatically without the need for any manual enrichment.

Especially within the context of the Hedge Management Cockpit, this is very useful for the further automatic assignment of trades to hedge relationships, as the purpose of the trade is carried throughout the process.

SAP Cloud Platform – Trade Platform Integration
While the database in S/4 HANA remains the central transaction data source throughout the process, the functionality in SCP provides a set of tools for the dealers to manage the trade requests as needed.

This begins with some business rules to help differentiate how the trades will be fulfilled, either directly externally with a bank counterparty or internally via the treasury center.

All external trades can now be found on the central Trade Request dashboard, “Manage Trade Requests”, which acts as an order book. It gives the dealers/front office a clear view of all deal requests that have been triggered from different areas of the organization. In addition to allowing all trade requests to be managed centrally, the status of each trade request is visible, ensuring no duplicate trading.

Figure 2

Figure 3

From the dashboard, a dealer can choose to group trades into a block, split trades and edit them as necessary; alternatively, Trade Requests may be cancelled or captured manually as a “Phone Trade”.
The Send function on the dashboard triggers the interface to the external trade platform for the selected trade requests, taking into account the split and block trade requirements. The requests are then executed and fulfilled on the external platform, where the executed trade details such as rate and counterparty are captured back in the application. This in turn triggers the automatic creation of the FX deal in SAP S/4 HANA. The executed trade details can then be displayed in the SCP application “Manage Trades”.

Figure 4

Figure 5

Internal trade requests can be automatically fulfilled at a price determined by the business rules defined by the users. This includes pricing based on the associated external deal rate (back-to-back scenario) with a commission, or pricing based on market data with a commission element.
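The back-to-back pricing with a commission element can be sketched as follows; this is a simplified illustration, and the commission treatment is an assumption rather than the actual business-rule engine.

```python
def internal_rate(external_rate, commission_bp, side):
    """Back-to-back internal pricing sketch: a commission in basis points
    is applied on top of the external deal rate, marked up for internal
    buys and down for internal sells so the treasury center earns the
    commission either way."""
    adjustment = external_rate * commission_bp / 10_000
    return external_rate + adjustment if side == "buy" else external_rate - adjustment
```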

The deals captured in SAP S/4 HANA whether internal or external, all contain the enriched data with all the originating information relating to the trade request, so that the FX deal itself accurately reflects the purpose of the position for further reporting.

Future SAP Roadmap
Although initially only the FX instruments were included in scope, SAP is now extending the ability to execute Money Market Fund purchases and sales through the platform including the associated dividend and rebate flows. This is another step to truly set up the TPI function as a central trade location for front office to operate from, covering not only FX risk requirements, but also the management of cash investment transacting.

Credit risk management is also now on the table: pre-trade Credit Risk Analyzer information is integrated into the TPI application so that counterparty limits can be checked before trading, giving the opportunity to exclude certain counterparties from quotation. This is certainly an improvement on the historical functionality of SAP TRM, where a breach would only be noted after the deal had already been executed.
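A pre-trade limit check of this kind can be sketched as follows: counterparties whose free limit cannot absorb the prospective exposure are excluded from quotation. The data structures are illustrative assumptions.

```python
def eligible_counterparties(limits, utilizations, trade_exposure):
    """Return the counterparties whose remaining limit (limit minus current
    utilization) can absorb the prospective trade exposure."""
    return sorted(
        cp for cp, limit in limits.items()
        if limit - utilizations.get(cp, 0.0) >= trade_exposure
    )
```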

Conclusion
The recent SAP advancements in the area of TPI provide many opportunities for an organization to add control, efficiency and transparency to the dealing process, not only for the front office but for the rest of the treasury team as well. Dealers benefit from a central platform from which they can best execute trades. Middle office gets immediate feedback on FX exposure positions, as executed deals immediately carry the correct characteristics. The cash management team benefits from a simple way to request and monitor the FX and investment decisions sent to the dealers. The accounting team stands to benefit greatly, as the accounting parameters on the deal are no longer the domain of a front office trader but are determined by the purpose of the original trade request, which dictates the accounting treatment, including the automatic assignment to hedge relationships.

The SAP TPI solution therefore optimizes not only the dealers’ execution role, but also ties the preceding and dependent processes together into one fully straight-through process that will benefit the whole treasury organization.

Sourcing Market Data

June 2021
7 min read



It is no longer just about the source of market data; questions such as integration, validation, storage, consistency and distribution within an organization also need to be considered. In this article we look at some of the considerations when deciding how to source market data and how built-in applications can reduce risk and cost while improving automation.

Which Market Data Vendor?
There are multiple market data vendors, either providing data directly or consolidating (normalizing) data from multiple sources before making it available to clients. To choose a market data vendor, an organization must first understand its requirements, and these cover not only Treasury but also wider business and IT needs:

  • What data and when should it be delivered?
  • IT capabilities to develop and maintain an interface or leverage inbuilt third party/core application capabilities
  • Data validation and distribution

Integration

Market data vendors can provide data using multiple methods, from Excel downloads and simple file transfers to integrated APIs that import data directly into the receiving applications. The level of integration is driven by the market data requirements: a few FX rates once a month will not justify anything beyond importing an Excel spreadsheet or even manually entering the rates. However, most organizations require large data sets, sourced on a timely basis and validated without the need for manual intervention.

The way an organization integrates market data will, in some way, depend on the IT strategy and in-house capabilities. Some IT functions have strong in-house development teams capable of building and maintaining APIs to retrieve and import the market data; others will prefer to have the market data integration managed by a third-party application. There are costs associated with both options, but leveraging the inbuilt capabilities of an application that is already part of the organization's IT landscape can reduce not only the complexity of loading market data but also the long-term costs of maintaining the solution.

SAP and some top-tier TMS applications can act as a market data vendor by providing an inbuilt market data interface. SAP’s Market Rates Management module provides standard integration to Refinitiv (formerly Thomson Reuters) as well as a more generic option for loading rates from other sources. The key benefit of SAP’s Market Rates Management is that it allows an organization to define its data requirements and import the data from a single source under a single contract, while reducing the IT overhead, as the module falls under existing SAP support structures.

Validation and Distribution
Having correct and precise market data is crucial in almost every treasury process, and business processes require a consistent data set across all platforms and operations. Market data validation has therefore grown increasingly important. Historically, manual, Excel-based or fully bespoke system processes have been used to validate market data, providing a very limited audit trail, introducing user errors and risking an impact on financial postings should an error go unnoticed. Automated data validation uses rules-based processes, executed once the market data has been received, that identify, remove or flag inaccurate or anomalous information, delivering a clean dataset and ensuring the market data in the receiving applications and systems is correct and identical.
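As a conceptual illustration of such a rules-based check (plain Python, not any vendor or SAP API; the rules, tolerances, and rate values are invented for the example), two simple rules already catch the most common anomalies: missing or non-positive quotes, and day-on-day moves beyond a tolerance.

```python
# Minimal sketch of rules-based market data validation: flag missing quotes
# and day-on-day moves beyond a tolerance, pass everything else through.

def validate_rates(today, yesterday, expected_pairs, max_move=0.05):
    """Return (clean, issues): validated quotes and flagged anomalies."""
    clean, issues = {}, []
    for pair in expected_pairs:
        rate = today.get(pair)
        if rate is None or rate <= 0:
            issues.append((pair, "missing or non-positive quote"))
            continue
        prev = yesterday.get(pair)
        if prev and abs(rate / prev - 1) > max_move:
            issues.append((pair, "move exceeds tolerance"))
            continue
        clean[pair] = rate
    return clean, issues

today = {"EURUSD": 1.21, "GBPUSD": 1.95, "USDCHF": 0.92}
prev = {"EURUSD": 1.20, "GBPUSD": 1.39, "USDCHF": 0.91}
clean, issues = validate_rates(today, prev, ["EURUSD", "GBPUSD", "USDCHF", "USDJPY"])
```

Here the implausible GBPUSD jump and the missing USDJPY quote end up in `issues` for review, while the clean subset can be distributed; a production rule set would of course be richer (stale-rate detection, cross-rate consistency, completeness per curve).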

The distribution of validated market data to all systems and applications that require it needs to be considered when selecting a market data provider and integration solution. There may be license implications in distributing data to multiple systems and applications, which can increase the recurring costs, while the options to distribute the data carry similar IT considerations to the initial integration, potentially on a larger scale, depending on how many different systems and applications require the data. As with the integration to the market data vendor, the ability to leverage third-party applications can reduce the costs and complexity of market data distribution.

We can support the validation and distribution process with a tool: Zanders Market Data Platform. This Zanders Inside solution, powered by Brisken, builds a bridge between the market databases and the enterprise application landscape of companies. In this way, the Market Data Platform takes away the operational risks of the market data process. The Market Data Platform runs on the SAP Cloud Platform infrastructure to ensure a secure cloud computing environment to integrate data and business processes to meet all your market data needs.

How does the Market Data Platform work?
The Market Data Platform has many functionalities. First, the platform retrieves the market data from the selected sources. The platform is also the source of truth for historical market data, and all activities are logged in the audit center. Subsequently, calculations and market data validations are performed. Finally, the hub distributes the market data across the company’s system landscape at the right time and in the right format. The platform can be linked directly to SAP through the cloud connector, and connections to other treasury management systems are also possible, for example with IT2 or with text files. The added value of the Market Data Platform versus other solutions such as SAP Market Rates Management is the additional validation of data, e.g. checking the completeness and accuracy of the received data on the platform before distributing it for use.

The Zanders Market Data Platform is the solution for your market data validation processes. Would you like to learn more on this new initiative or receive a free demo of our solution? Do not hesitate to reach out to us!

Targeted Review of Internal Models (TRIM): Review of observations and findings for Traded Risk

May 2021
5 min read

Discover the significant deficiencies uncovered by the ECB’s TRIM on-site inspections and how banks must swiftly address these to ensure compliance and mitigate risk.


The ECB has recently published the findings and observations from its TRIM on-site inspections. A significant number of deficiencies were identified and must be remediated by institutions in a timely fashion.

Since the Global Financial Crisis 2007-09, concerns have been raised regarding the complexity and variability of the models used by institutions to calculate their regulatory capital requirements. The lack of transparency behind the modelling approaches made it increasingly difficult for regulators to assess whether all risks had been appropriately and consistently captured.

The TRIM project was a large-scale multi-year supervisory initiative launched by the ECB at the beginning of 2016. The project aimed to confirm the adequacy and appropriateness of approved Pillar I internal models used by Significant Institutions (SIs) in euro area countries. This ensured their compliance with regulatory requirements and aimed to harmonise supervisory practices relating to internal models.

TRIM executed 200 on-site internal model investigations across 65 SIs from over 10 different countries. Over 5,800 deficiencies were identified. Findings were defined as deficiencies which required immediate supervisory attention. They were categorised depending on the actual or potential impact on the institution’s financial situation, the levels of own funds and own funds requirements, internal governance, risk control, and management.

The findings have been followed up with 253 binding supervisory decisions which request that the SIs mitigate these shortcomings in a timely fashion. Findings that were deemed to take a significant time to address required immediate action.

Assessment of Market Risk

TRIM assessed the VaR/sVaR models of 31 institutions. The majority of severe findings concerned the general features of the VaR and sVaR modelling methodology, such as data quality and risk factor modelling.

19 out of 31 institutions used historical simulation, seven used Monte Carlo, and the remainder used either a parametric or mixed approach. 17 of the historical simulation institutions, and five using Monte Carlo, used full revaluation for most instruments. Most other institutions used a sensitivities-based pricing approach.

VaR/sVaR Methodology

Data: Issues with data cleansing, processing and validation were seen in many institutions and, on many occasions, data processes were poorly documented.

Risk Factors: In many cases, risk factors were missing or inadequately modelled. There was also insufficient justification or assessment of assumptions related to risk factor modelling.

Pricing: Institutions frequently had inadequate pricing methods for particular products, leading to a failure for the internal model to adequately capture all material price risks. In several cases, validation activities regarding the adequacy of pricing methods in the VaR model were insufficient or missing.

RNIME: Approximately two-thirds of the institutions had an identification process for risks not in model engines (RNIMEs). For ten of these institutions, this directly led to an RNIME add-on to the VaR or to the capital requirements.

Regulatory Backtesting

Period and Business Days: There was a lack of clear definitions of business and non-business days at most institutions. In many cases, this meant that institutions were trading on local holidays without adequate risk monitoring and without considering those days in the P&L and/or the VaR.

APL: Many institutions had no clear definition of fees, commissions or net interest income (NII), which must be excluded from the actual P&L (APL). Several institutions had issues with the treatment of fair value or other adjustments, which were either not documented, not determined correctly, or were not properly considered in the APL. Incorrect treatment of CVAs and DVAs and inconsistent treatment of the passage of time (theta) effect were also seen.

HPL: An insufficient alignment of pricing functions, market data, and parametrisation between the economic P&L (EPL) and the hypothetical P&L (HPL), as well as inconsistent treatment of the theta effect in the HPL and the VaR, were seen in many institutions.

Internal Validation and Internal Backtesting

Methodology: In several cases, the internal backtesting methodology was considered inadequate or the levels of backtesting were not sufficient.

Hypothetical Backtesting: The required backtesting on hypothetical portfolios was either not carried out or only carried out to a very limited extent.

IRC Methodology

TRIM assessed the IRC models of 17 institutions, reviewing a total of 19 IRC models. A total of 120 findings were identified and over 80% of institutions that used IRC models received at least one high-severity finding in relation to their IRC model. All institutions used a Monte Carlo simulation method, with 82% applying a weekly calculation. Most institutions obtained rates from external rating agency data. Others estimated rates from IRB models or directly from their front office function. As IRC lacks a prescriptive approach, the choice of modelling approaches between institutions exhibited a variety of modelling assumptions, as illustrated below.

Recovery rates: The use of unjustified or inaccurate Recovery Rate (RR) and Probability of Default (PD) values was the cause of most findings. PDs close to or equal to zero without justification were a common issue, which typically arose in the modelling of sovereign obligors with high credit quality. 58% of models assumed PDs lower than one basis point, typically for sovereigns with very good ratings but sometimes also for corporates. The inconsistent assignment of PDs and RRs, and cases of manual assignment without a fully documented process, also contributed to common findings.

Modelling approach: The lack of adequate justification for modelling choices, including copula assumptions, risk factor choice and correlation assumptions, led to many findings. Poor quality data and a lack of sufficient validation raised many findings on the correlation calibration.

Assessment of Counterparty Credit Risk

Eight banks faced on-site inspections under TRIM for counterparty credit risk. Whilst the majority of investigations resulted in findings of low materiality, there were severe weaknesses identified within validation units and overall governance frameworks.

Conclusion

Based on the findings and responses, it is clear that TRIM has successfully highlighted several shortcomings across the banks. As is often the case, many issues seem to be somewhat systemic problems which are seen in a large number of the institutions. The issues and findings have ranged from fundamental problems, such as missing risk factors, to more complicated problems related to inadequate modelling methodologies. As such, the remediation of these findings will also range from low to high effort. The SIs will need to mitigate the shortcomings in a timely fashion, with some more complicated or impactful findings potentially taking a considerable time to remediate.

A new way to manage your house bank G/L accounts in SAP S/4HANA release 2009

March 2021
6 min read

S/4 HANA Advanced Payment Management (APM) is SAP’s new solution for centralized payment hubs. Released in 2019, this solution operates as a centralized payment channel, consolidating payment flows from multiple payment sources. This article will serve to introduce its functionality and benefits.


With the introduction of the new cash management in S/4HANA in 2016, SAP introduced the bank account management (BAM) functionality, which treats house bank accounts as master data. With this change of design, SAP aligned its approach with other treasury management systems on the market, moving bank account data ownership from IT to the treasury team.

But one stumbling block was left in the design: each bank account master requires a dedicated set of general ledger (G/L) accounts, on which the balances are reflected (the master account) and through which transactions are posted (clearing accounts). Very often organizations define a unique G/L account for each house bank account (alternatively, generic G/L accounts are sometimes used, like “USD bank account 1”), so the creation of a new bank account in the system involves coordination with two other teams:

  1. Financial master data team – managing the chart of accounts centrally, to create the new G/L accounts
  2. IT support – updating the usage of the new accounts in the system settings (clearing accounts)

Due to this dependency in the maintenance process, even with the new BAM, the creation of a new house bank account remained a tedious and lengthy process. Therefore, many organizations still keep house bank account management within their IT support process, even on S/4HANA releases, negating the very idea of BAM as master data.

To overcome this limitation and put every step of the bank account management life cycle fully in the ownership of the treasury team, SAP has introduced a new G/L account type, “Cash Account”, in the most recent S/4HANA release (2009). G/L accounts of this new type are used in the bank account master data in much the same way as the established reconciliation G/L accounts are used in customer and vendor master data. However, two new features had to be introduced to support the new approach:

  • Distinction between the Bank sub account (the master account) and the Bank reconciliation account (clearing account): this is reflected in the G/L account definition in the chart of accounts via a new attribute “G/L Account Subtype”.
  • In the bank determination (transaction FBZP), the reconciliation account is not directly assigned per house bank and payment method anymore. Instead, Account symbols (automatic bank statement posting settings) can be defined as SIP (self-initiated payment) relevant and these account symbols are available for assignment to payment methods in the bank country in a new customizing activity. This design finally harmonizes the account determination between the area of automatic payments and the area of automatic bank statement processing.
New G/L Account type in the G/L Account master data
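The harmonized determination logic described above can be sketched as follows. This is a conceptual illustration in plain Python, not SAP configuration or ABAP; the account symbols, bank account IDs, and G/L account numbers are all invented for the example.

```python
# Hypothetical sketch of the new account determination: an account symbol
# (from bank statement customizing) flagged as SIP-relevant resolves to the
# bank reconciliation G/L account held in the bank account master data,
# instead of a clearing account hard-coded per house bank in FBZP.

ACCOUNT_SYMBOLS = {
    "BANK": {"sip_relevant": False},
    "OUT-PAYMENT": {"sip_relevant": True},
}

# Per-bank-account master data (BAM) -- illustrative records only.
BANK_ACCOUNTS = {
    "DEUT-USD-01": {"subaccount_gl": "113100", "reconciliation_gl": "113109"},
}

def posting_account(bank_account_id, account_symbol):
    """Resolve the G/L account for a posting under the new logic."""
    bam = BANK_ACCOUNTS[bank_account_id]
    if ACCOUNT_SYMBOLS[account_symbol]["sip_relevant"]:
        return bam["reconciliation_gl"]  # clearing via reconciliation account
    return bam["subaccount_gl"]          # balance on the master (sub)account

print(posting_account("DEUT-USD-01", "OUT-PAYMENT"))  # 113109
```

The point of the design is visible in the lookup: the G/L accounts live on the bank account master record, so opening a new house bank account no longer requires new entries in the payment program configuration.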

The same release introduces two other features in bank account management:

  • Individual bank accounts can be opened or blocked for posting.
  • New authorization object F_BKPF_BEB is introduced, which enables assigning a bank account authorization group at the level of individual bank accounts in BAM. A user posting to the bank account has to be authorized for the respective authorization group.

The impact of this new design on treasury process efficiency probably already has you excited. So, what does it take to switch from the old to the new setup?

Luckily, the new approach can be activated at the level of each individual bank account in the bank account management master data, or not used at all. Related functionalities can follow the old and new approaches side by side, and you have time to switch the bank accounts to the new setup gradually. The G/L account type cannot be changed on a used account, so new G/L accounts have to be created and the balances moved in accounting on the cut-over date. However, this is necessary only for the G/L account masters. Outstanding payments do not prevent the switch, as the payment follows the new reconciliation account logic upon activation. Specific challenges exist in the cheque payment scenario, but here SAP offers a fallback clearing scenario feature to make sure the switch to the new design is smooth.

Centralized FX risk hedging to a base currency in SAP Treasury

March 2021
7 min read



Corporate treasuries have multiple strategic options to consider for managing FX risk positions, and SAP’s standard functionality efficiently supports activities such as balance sheet hedging and back-to-back economic hedging.

These requirements can be accommodated using applications such as “Generate Balance Sheet Exposure Hedge Requests” and the SAP Hedge Management Cockpit, which efficiently joins SAP Exposure Management 2.0, Transaction Management and Hedge Accounting functionality to create an end-to-end solution from exposure measurement to hedge relationship activation.

The common trait of these supported strategies is that external hedging is executed using the same currency pair as the underlying exposure currency and target currency. But this is not always the case.

Many multi-national corporations that apply a global centralized approach to FX risk management choose to prioritize the benefits of naturally offsetting and netting exposures over other considerations. One of the techniques frequently used is base currency hedging, where all FX exposures are hedged against one common currency, called the “base” currency. This allows the greatest level of position netting, as the total risk position is measured and aggregated along only one dimension: per currency. The organization then manages these individual currency risk positions as a portfolio and takes the necessary hedging actions against a single base currency determined by the treasury policy.
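The netting step itself reduces to a simple aggregation. The sketch below (plain Python, illustrative only; the data model of signed amounts per currency and the base currency choice are assumptions for the example) shows how offsetting exposures collapse into one net position per currency:

```python
# Minimal sketch of base currency position netting: every exposure is
# reduced to a signed amount per currency, so offsetting flows net out and
# only one aggregate position per currency is hedged against the base.

from collections import defaultdict

def net_positions(exposures, base_ccy="USD"):
    """exposures: list of (currency, signed_amount); returns net per currency."""
    positions = defaultdict(float)
    for ccy, amount in exposures:
        if ccy != base_ccy:  # the base currency leg itself is not hedged
            positions[ccy] += amount
    return dict(positions)

exposures = [("EUR", 10_000_000), ("EUR", -4_000_000),
             ("GBP", 2_500_000), ("USD", -1_000_000)]
print(net_positions(exposures))  # {'EUR': 6000000.0, 'GBP': 2500000.0}
```

The two EUR exposures offset to a single 6 million net position, which is then hedged against the base currency rather than pair by pair.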

For any exposure that is submitted by a subsidiary to the Treasury Center, there are two currency risk components: the exposure currency and the target currency. The value of the exposure currency is the “known” value, while the target currency value is “unknown”.

The immediate question that arises from this strategy is: how do we accurately record and estimate the target currency value to be hedged if the value is unknown?

To begin the journey, we first need to collect the exposures and then record them in a flexible database where we can further process the data later. Experience tells us that the collection of exposures is normally done outside of SAP in a purpose-built tool, a third-party tool, or simply an Excel collection template that is interfaced to SAP. However, after exposure collection, the SAP Exposure Management 2.0 functionality is capable of handling even the most advanced exposure attribute categorizations and aggregations, forming the database from which we can calculate our positions.

Importantly, at this step we need to record the exposure from the perspective of the subsidiary, capturing both the exposure currency and value and also the target currency of the exposure, which at this point is unknown in value.

Internal price estimation
For a centralized FX risk management strategy, the financial tool or contract that transfers the risk from the subsidiary to the Treasury Center is normally an internal FX forward or some variation of it. Since both the exposure currency and target currency values are fixed according to the deal rate, it is this same rate we can use to determine the forecasted target currency value based on the forecasted exposure currency value.

The method to find the internal dealing rate would be agreed between the subsidiary and the Treasury Center, in line with the treasury policy. Internal rate pricing strategies may use different sources of data, each presenting a different level of complexity:

  • Spot market rates
  • Budget or planning rates
  • Achieved external hedge rates from recent external trading
  • Other quantitative methods

Along with the question of how to calculate and determine the rate, we also need to address where this rate will be stored for easy access when estimating the target currency exposure value. In most cases it may be suitable to use SAP market data management tables, but a bespoke database table may be required if a more complex derivation of an already calculated rate is needed.

Although the complexity of the rate pricing tool may vary from picking the spot market rate on the day to calculating more complex values per subsidiary agreement, the objective remains the same: how do we calculate the rate, and where do we store this calculated rate for simple access when determining the position.
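Once the internal rate is stored, fixing the “unknown” target leg is a lookup and a multiplication. The sketch below is illustrative only (plain Python; the rate table, quotation convention, and function name are assumptions, and the 1.25 EUR/USD rate is a made-up example value):

```python
# Illustrative sketch: estimating the unknown target currency leg of a
# submitted exposure from the agreed internal dealing rate.

INTERNAL_RATES = {("EUR", "USD"): 1.25}  # price of 1 EUR in USD (example)

def estimate_target_value(exposure_ccy, target_ccy, exposure_amount):
    """Fix the target leg using the stored internal rate."""
    if (exposure_ccy, target_ccy) in INTERNAL_RATES:
        return exposure_amount * INTERNAL_RATES[(exposure_ccy, target_ccy)]
    # Fall back to the inverse quotation if only the reverse pair is stored.
    rate = INTERNAL_RATES[(target_ccy, exposure_ccy)]
    return exposure_amount / rate

print(estimate_target_value("EUR", "USD", 5_000_000))  # 6250000.0
print(estimate_target_value("USD", "EUR", 1_250_000))  # 1000000.0
```

A real implementation would also handle rate tenors, fixing dates, and the per-subsidiary pricing agreements described above, but the estimation step itself stays this simple.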

Position reporting
With exposures submitted and internal rate pricing calculated, we can now estimate our total positions for each participating currency. This entails accessing the existing exposure data to find the exposure currency values, and also estimating the target currency values based on the internal price estimation and fixing for each submitted exposure.

Although organizations may still differ vastly in how they eventually cover this risk and how they wish to visualize the position reports, the same fundamental inputs apply, and the hedging strategy will mostly define the layout and summarization level of the data that has already been calculated.

These layouts cannot be achieved through standard SAP reports; however, by approaching the challenge as shown above, the report is simply an aggregation of the already calculated data into a preferred layout for the business users.

As a final thought, the external FX trades in place can easily be integrated into the report as well, providing more detail on the live hedged and unhedged open positions. This even allows for automatic integration of trade orders into the SAP Trade Platform Integration (TPI) to hedge the open positions, providing a controlled end-to-end FX risk management solution with straight-through processing from exposure submission to trade execution.

SAP Trade Platform Integration (TPI)
The SAP TPI solution offers significant opportunities, not only for the base currency hedging approach, but also all other hedging strategies that would benefit from a more controlled and dynamic integration to external trade platforms. This topic deserves greater attention and will be discussed in the next edition of the SAP Treasury newsletter.

Conclusion
At first inspection, it may seem that the SAP TRM offering does not provide much assistance to implementing an efficient base currency hedging process. However, when we focus on these individual requirements listed above, we see that a robust solution can be built with highly effective straight through processing, while still benefiting from largely standard SAP capability.

The key is the knowledge of how these building blocks and foundations of the SAP TRM module can be used most effectively with the bespoke developments on internal pricing calculations and position reporting layouts to create a seamless integration between standard and bespoke activities.

Intercompany netting at Inmarsat

March 2021
6 min read



Inmarsat had one FTE spending 3-4 hours every month, including during the month-end close, manually allocating an excessive number of payments against open invoices on the customer ledger. This was time that should have been spent on value-add activities that could have resulted in closing the books earlier. How did this come about?

In the current setup, credit/debit balances build up on multiple intercompany payables/receivables accounts with the same entity, reflecting various business transactions (intercompany invoicing, cash concentration, POBO payments, intercompany settlement). This makes intercompany reconciliation more difficult and intercompany funding needs less transparent.

Searching for the solution

As part of the Zanders Treasury Technology Support contract, Inmarsat asked Zanders to define and implement a solution which would reduce the build-up of multiple intercompany receivables/payables from cash concentration and instead reflect these movements in the in-house bank accounts of the respective entity.

During the initial set-up of in-house cash (IHC), it was our understanding that all intercompany netting inflows should auto-match with open invoices if both the vendor and customer invoices carried the same reference. “Netting” in Inmarsat terms means a settlement of intercompany customer/vendor invoices through IHC.

Unfortunately, only a very small percentage of IHC intercompany inflows auto-matched with open customer invoices (14% in May 2020). The sample cases reviewed showed that automatic matching happened where the references on both vendor and customer invoices were the same. However, in most cases, even where the references were the same, no auto-matching happened.

The IHC Inter-Co Netting issue

In phase 1, the intercompany netting issues were addressed. Intercompany netting is an arrangement among subsidiaries in a corporate group where each subsidiary makes payments to, or receives payments from, a clearing house (the Netting Centre) for net obligations due from other subsidiaries in the group. This procedure, also known as multilateral netting or multilateral settlement, is used to reduce credit/settlement risk.

SAP standard system logic/process:

FINSTA bank statements are internal bank statements for internal in-house cash accounts. These statements post to the G/L and subledger of the participating company codes, so that the in-house cash transactions are reflected in the balance sheet.

Requirement:

Any intercompany transaction posted through the FINSTA bank statements should correctly identify the open items on the Accounts Receivable (AR) side, so that the correct line items are posted and cleared.

Root Cause Analysis:

We found that a payment advice segment present in the FINSTA statement was overriding the clearing information found per interpretation algorithm ‘021’, forcing the system to rely on the information in the payment advice notes to find a clearing criterion.

The documents should be cleared based on the information passed to the payment notes table FEBRE.

As a solution, we set the variable DELETE_ADVICE to ‘X’ in user exit EXIT_SAPLIEDP_203, so that SAP relied on the interpretation algorithm, via a search on the FEBRE table rather than the payment advice, to identify the documents uniquely and then clear them. Information from the FEBRE table, which includes the document reference, feeds into the interpretation algorithm to uniquely identify the AR open item to clear. This information is passed on to table FEBCL, which holds the criteria to be used for clearing.

With the above change maintained, SAP will always use the interpretation algorithm maintained in the posting rule for deriving the open items.
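Conceptually, what the interpretation algorithm then does can be sketched as follows. This is plain Python for illustration, not ABAP, and the function, references, and amounts are invented; it only mimics the matching of note-to-payee references (FEBRE) against open AR items to build clearing criteria (FEBCL):

```python
# Conceptual sketch of reference-based clearing: each document reference
# found in the statement's note-to-payee lines (FEBRE) is matched against
# the open AR items; matches become clearing criteria (FEBCL), the rest
# is left for manual post-processing.

def build_clearing_criteria(febre_references, open_items):
    """open_items: dict of document reference -> open amount."""
    cleared, unmatched = [], []
    for ref in febre_references:
        if ref in open_items:
            cleared.append((ref, open_items[ref]))  # unique open item found
        else:
            unmatched.append(ref)  # no open item carries this reference
    return cleared, unmatched

open_ar = {"INV-1001": 1200.0, "INV-1002": 550.0}
cleared, unmatched = build_clearing_criteria(["INV-1001", "INV-9999"], open_ar)
print(cleared)    # [('INV-1001', 1200.0)]
print(unmatched)  # ['INV-9999']
```

The fix described above simply ensures SAP always takes this reference-driven path instead of being diverted by the payment advice segment.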

Prior to the fix, the highest auto-match percentage in 2020 was 16%. After the fix, the auto-match rate increased to 85%.

Table 1: interpretation algorithm

Client’s testimonial

Christopher Killick, ERP Functional Consultant at Inmarsat, expressed his gratitude for the solution offered by our Treasury Technology Support services in a testimonial:

“In the autumn of 2019, Inmarsat was preparing for takeover by private equity. At the same time, our specialized treasury resources were stretched. Fortunately, Zanders stepped in to ensure that the myriad of complex changes required were in place on time. Among other things, Zanders was able to:

  • Make a number of general configuration improvements to our treasury management and in-house cash setup.
  • Educate us on deal management and business partner maintenance.
  • Update and vastly improve our Treasury Management User Guide.
  • Run a series of educational and analytical workshops.
  • Map out several future improvements that would be of great benefit to Inmarsat – some of which have now been implemented.
Without this support, it is likely that Inmarsat would no longer be using SAP TRM.

Inmarsat’s relationship with Zanders has continued through a Treasury Technology Support Contract, that is administered with the utmost professionalism and care. In the past six months or so, a large number of changes have been implemented. Most of these have been highly complex, requiring real expertise and this is where the true benefit of having an expert treasury service provider makes all the difference.”

Conclusions

Since the start of the TTS support contract, Zanders has been intimately engaged with Inmarsat to support and provide expert guidance on the usage and continuous improvement of the SAP solution. This is just a small step in optimising intercompany netting, but a big step towards automation of the core IHB processes.

If you want to know more about optimising in-house bank structures or inter-company netting then please get in contact with Warren Epstein.

SAP migration tools for treasury data

March 2021
6 min read



SAP's standard migration tools have certain limitations, and because of these, many implementation partners develop custom in-house solutions to address the requirements of their clients. SAP is constantly working on improving its standard tools through updates and new functionalities. This article provides insight into SAP’s standard data migration tools as well as Zanders’ approach and tools, which successfully help our clients with the migration of treasury data.

Data migration objects: master and transactional

Data migration is the process of transferring data from a source (e.g. a legacy system or other type of data storage) to the target system – SAP. However, data migration is not simply a ‘lift and shift’ exercise: the data must also be transformed and completed in order to efficiently support the required business operations in the new system.
Since the vast majority of business processes can be supported via SAP, the variety of required master data objects is extremely large. The SAP SCM (Supply Chain Management) module, for example, requires information about materials, production sequencing and routing schedules, while HCM (Human Capital Management) requires data on employees and the organizational structure. This article focuses in detail on the TRM (Treasury and Risk Management) module and the typical master data objects required for its successful operation.

Core treasury-related master data objects include, but are not limited to:

Business Partners:

Business partner data contains information about the trading counterparties with which a corporate conducts business. This data is very diverse, ranging from names, addresses and bank accounts to the types of approved transactions and the currencies in which they may take place. The business partner data is structured around several concepts that should be defined and populated with data:

  1. Business Partner Category: defines what kind of party the business partner is (private individual, subsidiary, external organization, etc.) and basic information such as name and address.
  2. Business Partner Role: defines the business classification of a business partner (“Employee”, “Ordering Party” or “Counterparty”). This determines which kinds of transactions can occur with this business partner.
  3. Business Partner Relationship: This represents the relationship between two business partners.
  4. Business Partner Group Hierarchy: The structure of a complex organization with many subsidiaries or office geographies can be defined here.

Figure 1: the organizational structure of a company with various branches, according to the region to which they belong. Source: SAP Help Portal
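The four business partner concepts above can be sketched as a simple data model. The following Python snippet is purely illustrative: the class and field names are hypothetical, not actual SAP tables or APIs.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical representation of SAP Business Partner concepts;
# field names are illustrative, not actual SAP structures.
@dataclass
class BusinessPartner:
    bp_id: str
    category: str                                   # e.g. "Organization" or "Person"
    name: str
    roles: List[str] = field(default_factory=list)  # e.g. "Counterparty"
    parent_bp: Optional[str] = None                 # link into the group hierarchy

    def can_trade(self) -> bool:
        # Only partners holding the "Counterparty" role may appear in deals
        return "Counterparty" in self.roles

hq = BusinessPartner("BP001", "Organization", "Group HQ")
bank = BusinessPartner("BP002", "Organization", "House Bank Ltd",
                       roles=["Counterparty"], parent_bp="BP001")
print(bank.can_trade(), hq.can_trade())  # True False
```

The role drives which transactions are allowed, while the `parent_bp` link mirrors the group hierarchy shown in Figure 1.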

House bank accounts:

This master data object contains information regarding the bank accounts at the house banks. It consists of both basic information, such as addresses, phone numbers and bank account numbers, and more complicated information, such as the assignment of which bank account should be used for transactions in certain currencies.
In-house cash (IHC):

IHC data includes:

  • Bank accounts
  • Conditions: interest, limits etc.

Another important part of data migration is transactional data, which includes financial transactions (deals), FX exposure figures, etc.

Financial transactions:

Transactional data includes active and expired deals that have been booked in the legacy system. The migration of such data may also require consolidating and enriching information from several sources, while maintaining its accuracy during the transfer. The volume of data is usually very large, adding another layer of complexity to the migration of transactional data.
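The consolidation step described above can be sketched in a few lines. This is a hypothetical Python illustration (deal IDs and field names are invented) of merging deal records from two legacy sources and flagging conflicts before load:

```python
# Merge deal records from two hypothetical legacy sources, keyed by deal ID,
# and flag conflicts that must be resolved before loading into SAP.
source_a = {"D001": {"notional": 1_000_000, "currency": "USD"},
            "D002": {"notional": 500_000, "currency": "EUR"}}
source_b = {"D002": {"notional": 500_000, "currency": "EUR"},
            "D003": {"notional": 250_000, "currency": "GBP"}}

merged, conflicts = {}, []
for deal_id in sorted(set(source_a) | set(source_b)):
    a, b = source_a.get(deal_id), source_b.get(deal_id)
    if a and b and a != b:
        conflicts.append(deal_id)   # same deal, diverging data between sources
    merged[deal_id] = a or b

print(len(merged), conflicts)  # 3 []
```

In practice each conflict would be routed back to the business for resolution before the load is attempted.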

The above examples of the master and transactional data objects relevant to SAP TRM give an insight into the complexity and volume of data required for a full and successful data migration. To execute such a task, there are a few approaches that can be utilized, which are supported by the data migration solutions discussed below.

Legacy Data migration solutions

At Zanders, we may propose different solutions for data migration, depending heavily on client-specific characteristics. The following factors are taken into account:

  • Specificity of the data migration object (complexity, scope)
  • Type and quantity of legacy and target systems (SAP R/3, ECC, HANA, non-SAP, Cloud or on premise etc.)
  • The frequency with which the migration solution is to be used (one-off or multiple times)
  • Ownership of the solution (IT support or business)

After analysis of the above factors, one of the following SAP standard solutions may be proposed.

SAP GUI Scripting is an automation interface to SAP GUI for Windows and Java. Users can automate manual tasks by recording a script for a specific manual process; given a complete and correct dataset, the script then creates the data objects for you. Scripting is usually used to support the business with specific parts of the data migration or enhancement, and is often developed and supported in-house for small and recurring migration activities.

SAP LSMW (Legacy System Migration Workbench) was the standard SAP data upload solution used in SAP ECC. It allowed the import of data, its required conversion and its export to the target SAP system. LSMW supports both batch and direct input methods. The former required the data to be formatted in a standardized way and stored in a file; this data was then uploaded automatically, with the downside of passing through the regular dialog process of transaction codes and processing screens. The latter used an ABAP program to upload the data directly into the relevant database tables, omitting the transaction codes and processing screens of the batch input method.
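As a simple illustration of the batch input idea, data can be prepared in a standardized delimited file before upload. The layout below is hypothetical; real LSMW field structures are defined per migration object in the workbench itself:

```python
import csv
import io

# Hypothetical standardized layout for a batch input file; the real field
# structure is defined per migration object in LSMW.
deals = [
    {"company_code": "1000", "product_type": "51A",
     "amount": "1000000.00", "currency": "USD"},
    {"company_code": "1000", "product_type": "51A",
     "amount": "750000.00", "currency": "EUR"},
]

buffer = io.StringIO()
writer = csv.DictWriter(
    buffer,
    fieldnames=["company_code", "product_type", "amount", "currency"],
    delimiter="\t",
)
writer.writeheader()
writer.writerows(deals)

lines = buffer.getvalue().splitlines()
print(len(lines))  # 3 (header + two data rows)
```

The upload program then reads each row and replays it through the recorded transaction, which is why a complete and correctly formatted file is essential.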

SAP S/4 HANA Migration Cockpit is the recommended standard data migration tool for SAP S/4 HANA. With this new iteration, the tool has become much more user-friendly and simpler to use. It supports the following migration approaches:

  1. Transfer data using files: SAP provides templates for the relevant objects.
  2. Transfer data using staging tables: staging tables are created automatically in the SAP HANA DB schema. Populate the tables with the business data and load it into SAP S/4 HANA.
  3. Transfer data directly from an SAP ERP system to SAP S/4 HANA (a new feature from SAP S/4 HANA 1909).
  4. From S/4 HANA 2020, an extra option: migrate data using staging tables that can be pre-populated with XML templates or SAP / third-party ETL (extract, transform, load) tools.
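The staging-table approach can be sketched as follows, with SQLite standing in for the SAP HANA DB schema. Table and column names are illustrative only; in a real project the Migration Cockpit generates the staging tables and their structure:

```python
import sqlite3

# SQLite stands in for the HANA staging schema; in a real project the
# Migration Cockpit generates the staging tables automatically.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE stg_business_partner (
                    bp_id    TEXT PRIMARY KEY,
                    name     TEXT NOT NULL,
                    category TEXT NOT NULL)""")

# Populate the staging table with the business data (e.g. via an ETL tool).
rows = [("BP001", "Group HQ", "Organization"),
        ("BP002", "House Bank Ltd", "Organization")]
conn.executemany("INSERT INTO stg_business_partner VALUES (?, ?, ?)", rows)
conn.commit()

# The Migration Cockpit would then read these rows and load them into S/4 HANA.
count = conn.execute("SELECT COUNT(*) FROM stg_business_partner").fetchone()[0]
print(count)  # 2
```

The appeal of this approach is the clean separation of concerns: the business (or an ETL tool) owns filling the staging tables, while the cockpit owns validation and the actual load.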

From S/4 HANA 2020, SAP enhanced the solution with:

  • One harmonized application in Fiori
  • A transport concept: data can be released between SAP clients and systems
  • Copying of the migration projects

SAP provides a flexible solution to integrate custom objects and enhancements for data migration via the Migration object modeler.

The SAP Migration Cockpit provides a solid set of templates to migrate treasury deals. Currently, SAP supports the following financial transaction types: Guarantees, Cap/Floor, CPs, Deposits at notice, Facilities, Fixed-term Deposits, FX, FX Options, Interest Rate Instruments, IRS, LC, Security Bonds, Security Class, Stock.

Standard SAP tools are relatively capable solutions for data migration. However, due to the complexity and scope of TRM-related master data objects, they may not be sophisticated enough for certain clients. For example, they support a basic business partner setup, but most clients require the functionality to migrate complex business partner data. In many cases, implementation partners, including Zanders, develop their own in-house solutions to tackle various TRM master data migration issues.

Zanders pre-developed solution – BP upload tool

Within SAP Treasury and Risk Management, the business partner plays an important role in the administration. Unfortunately, in new SAP installations it is not possible to perform a mass, complete creation of the existing business partners with all the data required for treasury.

SAP standard tools require enhancements to accommodate the migration of the required business partner data, especially the creation of business partners and the assignment of finance-specific attributes and dependencies, which involves a substantial, time-consuming effort when performed manually.

Zanders acknowledges this issue and has developed a custom tool to mass create business partners within SAP. Our solution can be adjusted to different versions of SAP: from ECC to S/4 HANA 2020.

The tool consists of:

  1. A pre-defined Excel template with several tabs representing different parts of the BP master data: name, address, bank data, payment instructions, authorizations, etc.
  2. A custom program that can perform three actions: create basic data for a BP, and enhance/amend or delete existing BPs in SAP.
  3. Support for test and production runs, with a full application log available during the run. The log shows any errors in the BP creation.
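As an illustration of points 2 and 3, a mass-create run with an application log might look like the sketch below. This is a hypothetical Python rendering; the actual tool is a custom program running inside SAP, and the row fields are invented:

```python
# Hypothetical sketch of a BP upload run: validate each template row and
# collect an application log; the real tool runs inside SAP.
template_rows = [
    {"action": "CREATE", "bp_id": "BP100", "name": "Subsidiary A"},
    {"action": "AMEND",  "bp_id": "BP101", "name": "Subsidiary B"},
    {"action": "DELETE", "bp_id": "",      "name": ""},  # invalid: no BP ID
]

log = []
for i, row in enumerate(template_rows, start=1):
    if not row["bp_id"]:
        log.append(f"Row {i}: ERROR - missing business partner ID")
    elif row["action"] not in ("CREATE", "AMEND", "DELETE"):
        log.append(f"Row {i}: ERROR - unknown action {row['action']}")
    else:
        log.append(f"Row {i}: OK - {row['action']} {row['bp_id']}")

errors = [line for line in log if "ERROR" in line]
print(len(errors))  # 1
```

A test run would produce exactly this log without posting anything, so the business can correct the template before the production run.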

The migration of master and transactional data is a complex but vital process for any SAP implementation project. That being said, data migration (from planning to realization) should be viewed as a separate deliverable within a project.

Zanders has unique experience with treasury data transformation and migration, and we are keen to assist our clients in selecting the best migration approach and the best-fit standard SAP migration tool. We are also able to assist clients in developing their own in-house solutions, if required.

Should you have any questions, queries or interest in SAP projects, please contact Aleksei Abakumov or Ilya Seryshev.
