Intercompany netting at Inmarsat

March 2021
4 min read

At Inmarsat, intercompany netting is gross-amount based and settled via their SAP In-House Bank solution. However, the current setup made intercompany reconciliation difficult and intercompany funding needs less transparent. We offered a solution.


Inmarsat had one FTE spending 3-4 hours every month, including during the month-end close, manually allocating an excessive number of payments against open invoices on the customer ledger. This was time that should have been spent on value-add activities and that could have helped close the books earlier. How did this come about?

In the existing setup, credit and debit balances were building up on multiple intercompany payables/receivables accounts with the same entity, reflecting various business transactions (intercompany invoicing, cash concentration, POBO payments, intercompany settlement). This made intercompany reconciliation more difficult and intercompany funding needs less transparent.

Searching for the solution

As part of the Zanders Treasury Technology Support contract, Inmarsat asked Zanders to define and implement a solution that would reduce the build-up of multiple intercompany receivables/payables from cash concentration and instead reflect these movements in the in-house bank accounts of the respective entities.

During the initial set-up of in-house cash (IHC), it was our understanding that all intercompany netting inflows should auto-match with open invoices if both the vendor and customer invoices carried the same reference. “Netting” in Inmarsat terms means a settlement of intercompany customer/vendor invoices through IHC.

Unfortunately, only a very small percentage of IHC intercompany inflows auto-matched with open customer invoices (14% in May 2020). The sample cases reviewed showed that automatic matching happened where the references on both the vendor and customer invoices were the same; however, in most cases no auto-matching happened, even where the references were identical.

The IHC Inter-Co Netting issue

In phase 1, the intercompany netting issues were addressed. Intercompany netting is an arrangement among subsidiaries in a corporate group where each subsidiary makes payments to, or receives payments from, a clearing house (netting centre) for net obligations due from other subsidiaries in the group. The procedure is used to reduce credit/settlement risk and is also known as multilateral netting or multilateral settlement.

SAP standard system logic/process:

FINSTA bank statements are internal bank statements for the in-house cash accounts. These statements post to the general ledger (GL) and subledger of the participating company codes, so that the in-house cash transactions are reflected in the balance sheet.

Requirement:

Any intercompany transaction posted through the FINSTA bank statements should correctly identify the open items on the Accounts Receivable (AR) side, so that the correct line items are posted and cleared.

Root Cause Analysis:

We found that a payment advice segment present in the FINSTA statement was overriding the clearing information derived by interpretation algorithm ‘021’, forcing the system to rely on the information in the payment advice notes to find a clearing criterion.

Instead, the documents should be cleared based on the information passed to the payment notes table FEBRE.

As a solution, we set the variable DELETE_ADVICE to an ‘X’ value in user exit EXIT_SAPLIEDP_203, so that SAP relied on the interpretation algorithm (via a search on the FEBRE table rather than the payment advice) to uniquely identify the documents and then clear them. Information from the FEBRE table, including the document reference, feeds into the interpretation algorithm to uniquely identify the AR open item to clear. This information is then passed to table FEBCL, which holds the criteria used for clearing.

With the above change maintained, SAP will always use the interpretation algorithm maintained in the posting rule for deriving the open items.

Prior to the fix, the highest auto-match percentage in 2020 was 16%. After the fix, the auto-match rate increased to 85%.

Table 1: interpretation algorithm

Client’s testimonial

Christopher Killick, ERP Functional Consultant at Inmarsat, expressed his gratitude for the solution offered by our Treasury Technology Support services in a testimonial:

“In the autumn of 2019, Inmarsat was preparing for takeover by private equity. At the same time, our specialized treasury resources were stretched. Fortunately, Zanders stepped in to ensure that the myriad of complex changes required were in place on time.

  • Make a number of general configuration improvements to our treasury management and in-house cash setup.
  • Educate us on deal management and business partner maintenance.
  • Update and vastly improve our Treasury Management User Guide.
  • Run a series of educational and analytical workshops.
  • Map out several future improvements that would be of great benefit to Inmarsat – some of which have now been implemented.

Without this support it is likely that Inmarsat would no longer be using SAP TRM.

Inmarsat’s relationship with Zanders has continued through a Treasury Technology Support Contract, that is administered with the utmost professionalism and care. In the past six months or so, a large number of changes have been implemented. Most of these have been highly complex, requiring real expertise and this is where the true benefit of having an expert treasury service provider makes all the difference.”

Conclusions

Since the start of the TTS support contract, Zanders has been intimately engaged with Inmarsat to help support and provide expert guidance on the usage and continuous improvement of the SAP solution. This is just a small step in optimising the intercompany netting, but a big step towards automation of the core IHB processes.

If you want to know more about optimising in-house bank structures or inter-company netting then please get in contact with Warren Epstein.

SAP migration tools for treasury data

March 2021
4 min read



SAP’s standard data migration tools do not cover every client requirement, which is why many implementation partners develop custom in-house solutions. SAP is constantly working on improving its standard tools through updates and new functionalities. This article provides insight into SAP’s standard data migration tools, as well as Zanders’ approach and tools, which successfully help our clients with the migration of treasury data.

Data migration objects: master and transactional

Data migration is the process of transferring data from a source (e.g. a legacy system or another type of data storage) to the target system, SAP. However, data migration is not simply a ‘lift and shift’ exercise: the data must also be transformed and made complete in order to efficiently facilitate the required business operations in the new system.
Since the vast majority of business processes can be supported via SAP, the variety of master data objects that are required becomes extremely large. The SAP SCM (Supply Chain Management) module requires, for example, information about materials, production sequencing or routing schedules, while HCM (Human Capital Management) requires data regarding employees and the organizational structure. This article focuses in detail on the TRM (Treasury and Risk Management) module and the typical master data objects required for its successful operation.

Core treasury-related master data objects include, but are not limited to, the following:

Business Partners:

Business partner data contains information about the trading counterparties with which a corporate does business. This data is very diverse and includes everything from names, addresses and bank accounts to the types of approved transactions and the currencies in which they should take place. The business partner data is structured in a specific way; several concepts should be defined and populated with data:

  1. Business Partner Category: defines what kind of party the business partner is (private individual, subsidiary, external organization, etc.) and holds basic information such as name and address.
  2. Business Partner Role: defines the business classification of a business partner (“Employee”, “Ordering Party” or “Counterparty”). This determines which kinds of transactions can occur with this business partner.
  3. Business Partner Relationship: This represents the relationship between two business partners.
  4. Business Partner Group Hierarchy: The structure of a complex organization with many subsidiaries or office geographies can be defined here.

Figure 1: the organizational structure of a company with various branches, according to the region to which they belong. Source: SAP Help Portal

House bank accounts:

This master data object contains information regarding the bank accounts at the house banks. It consists of both basic information such as addresses, phone numbers and bank account numbers, as well as more complicated information, such as the assignment of which bank account should be used for transactions within certain currencies.

In-house cash (IHC):

IHC data includes:

  • Bank accounts
  • Conditions: interest, limits etc.

Another important part of data migration is transactional data, which includes financial transactions (deals), FX exposure figures, etc.

Financial transactions:

Transactional data includes active and expired deals which have been booked in the legacy system. The migration of such data may also require the consolidation of information from several sources and its enrichment, while maintaining accuracy during the transfer. The volume of data is usually very large, adding another layer of complexity to the migration of this data object.

The above examples of the master and transactional data objects relevant to SAP TRM give an insight into the complexity and volume of data required for a full and successful data migration. To execute such a task, there are a few approaches that can be utilized, which are supported by the data migration solutions discussed below.

Legacy Data migration solutions

At Zanders, we may propose different solutions for data migration, which depend heavily on client-specific characteristics. The following factors are taken into account:

  • Specificity of the data migration object (complexity, scope)
  • Type and quantity of legacy and target systems (SAP R/3, ECC, HANA, non-SAP, Cloud or on premise etc.)
  • Frequency with which the migration solution is to be used (one-off or multiple times)
  • The solution ownership (IT support or Business)

After analysing the above factors, one or more of the following SAP standard solutions may be proposed.

SAP GUI Scripting is an interface to SAP for Windows and Java. Users can automate manual tasks by recording a script for a specific manual process; with a complete and correct dataset, the script will create the data objects for you. Scripting is usually used to support the business with different parts of the data migration or enhancement, and is often developed and supported in-house for small, recurrent migration activities.

SAP LSMW (Legacy System Migration Workbench) was the standard SAP data upload solution in SAP ECC. It allowed the import of data, its required conversion and its export to the target SAP system. LSMW supported both batch and direct input methods. The former required the data to be formatted in a standardized way and stored in a file; this data was then uploaded automatically, with the downside of following a regular process involving transaction codes and processing screens. The latter required the use of an ABAP program, which uploads the data directly into the relevant data tables, omitting the need for the transaction codes and processing screens seen in the batch input method.

SAP S/4HANA Migration Cockpit is the recommended standard data migration tool for SAP S/4HANA. With this new iteration, the tool became much more user-friendly and simpler to use. It supports the following migration approaches:

  1. Transfer data using files: SAP provides templates for the relevant objects.
  2. Transfer data using staging tables: staging tables are created automatically in the SAP HANA DB schema. Populate the tables with the business data and load them into SAP S/4HANA.
  3. Transfer data directly from an SAP ERP system to SAP S/4HANA (a new feature from SAP S/4HANA 1909).
  • An extra option is available from S/4HANA 2020: migrate data using staging tables that can be pre-populated with XML templates or SAP/third-party ETL (extract, transform, load) tools.

From S/4HANA 2020, SAP enhanced the solution with:

  • One harmonized application in Fiori
  • A transport concept: data can be released between SAP clients and systems
  • Copying of the migration projects

SAP provides a flexible solution to integrate custom objects and enhancements for data migration via the Migration object modeler.

The SAP Migration Cockpit provides a solid set of templates to migrate treasury deals. Currently, SAP supports the following financial transaction types: Guarantees, Cap/Floor, CPs, Deposit at Notice, Facilities, Fixed Term Deposits, FX, FX Options, Interest Rate Instruments, IRS, LCs, Security Bonds, Security Classes, and Stock.

Standard SAP tools are relatively competent solutions for data migration. However, due to the complexity and scope of TRM-related master data objects, they prove not to be sophisticated enough for certain clients. For example, they support a basic business partner setup, but most clients require the functionality to migrate complex business partner data. In many cases, implementation partners, including Zanders, develop their own in-house solutions to tackle various TRM master data migration issues.

Zanders pre-developed solution – BP upload tool

Within SAP Treasury and Risk Management, the business partner plays an important role in the administration. Unfortunately, with all new SAP installations it is not possible to perform a mass, full creation of the existing business partners with the data required for treasury.

SAP standard tools require enhancements to accommodate the migration of the required business partner data, especially the creation of BPs and the assignment of finance-specific attributes and dependencies, which require substantial, time-consuming effort when performed manually.

Zanders acknowledges this issue and has developed a custom tool to mass create business partners within SAP. Our solution can be adjusted to different versions of SAP: from ECC to S/4 HANA 2020.

The tool consists of:

  1. A pre-defined Excel template with a few tabs representing the different parts of the BP master data: name, address, bank data, payment instructions, authorizations, etc.
  2. A custom program that can perform three actions: create basic data for a BP, or enhance/amend or delete existing BPs in SAP.
  3. Support for test and production runs, with the full application log available during the run. The log shows any errors in the BP creation.

The migration of master and transactional data is a complex but vital process for any SAP implementation project. That being said, the migration of the data (from planning to realization) should be viewed as a separate deliverable within a project.

Zanders has unique experience with treasury data transformation and migration, and we are keen to assist our clients in selecting the best migration approach and the best-fit migration tool available from the SAP standard. We are also able to assist clients in the development of their own in-house solution, if required.

Should you have any questions, queries or interest in SAP projects please contact Aleksei Abakumov or Ilya Seryshev.

FRTB: Harnessing Synergies Between Regulations

March 2021
5 min read

Discover how leveraging synergies across key regulatory frameworks like SIMM, BCBS 239, SA-CVA, and the IBOR transition can streamline your compliance efforts and ease the burden of FRTB implementation.


Regulatory Landscape

Despite a delay of one year, many banks are struggling to be ready for FRTB in January 2023. Alongside the FRTB timeline, banks are also preparing for other important regulatory requirements and deadlines which share commonalities in implementation. We introduce several of these below.

SIMM

Initial Margin (IM) is the value of collateral required to open a position with a bank, exchange or broker.  The Standard Initial Margin Model (SIMM), published by ISDA, sets a market standard for calculating IMs. SIMM provides margin requirements for financial firms when trading non-centrally cleared derivatives.

BCBS 239

BCBS 239, published by the Basel Committee on Banking Supervision, aims to enhance banks’ risk data aggregation capabilities and internal risk reporting practices. It focuses on areas such as data governance, accuracy, completeness and timeliness. The standard outlines 14 principles, although their high-level nature means that they are open to interpretation.

SA-CVA

Credit Valuation Adjustment (CVA) is a type of value adjustment and represents the market value of the counterparty credit risk for a transaction. FRTB splits CVA into two main approaches: BA-CVA, for smaller banks with less sophisticated trading activities, and SA-CVA, for larger banks with designated CVA risk management desks.

IBOR

Interbank Offered Rates (IBORs) are benchmark reference interest rates. As they have been subject to manipulation and due to a lack of liquidity, IBORs are being replaced by Alternative Reference Rates (ARRs). Unlike IBORs, ARRs are based on real transactions on liquid markets rather than subjective estimates.

Synergies With Current Regulation

Existing SIMM and BCBS 239 frameworks and processes can be readily leveraged to reduce efforts in implementing FRTB frameworks.

SIMM

The overarching process of SIMM is very similar to the FRTB Sensitivities-based Method (SbM), including the identification of risk factors, calculation of sensitivities and aggregation of results. The outputs of SbM and SIMM are both based on delta, vega and curvature sensitivities. SIMM and FRTB both share four risk classes (IR, FX, EQ, and CM). However, in SIMM, credit is split across two risk classes (qualifying and non-qualifying), whereas it is split across three in FRTB (non-securitisation, securitisation and correlation trading). For both SbM and SIMM, banks should be able to decompose indices into their individual constituents. 
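To make the overlap concrete: in simplified notation (a sketch; concentration, curvature and cross-bucket terms are omitted), both SIMM and SbM aggregate weighted sensitivities within a bucket in broadly the same form:

$$WS_k = RW_k \, s_k, \qquad K_b = \sqrt{\sum_k WS_k^2 + \sum_{k}\sum_{l \neq k} \rho_{kl}\, WS_k\, WS_l}$$

where $s_k$ is the net sensitivity to risk factor $k$, $RW_k$ the prescribed risk weight and $\rho_{kl}$ the prescribed correlation. The risk weights and correlations differ between the two frameworks, but the sensitivity inputs and the aggregation machinery can largely be shared.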

We recommend that banks leverage the existing sensitivities infrastructure from SIMM for SbM calculations, use a shared risk factor mapping methodology between SIMM and FRTB when there is considerable alignment in risk classes, and utilise a common index look-through procedure for both SIMM and SbM index decompositions.

BCBS 239

BCBS 239 requires banks to review IT infrastructure, governance, data quality, and aggregation policies and procedures. A similar review will be required in order to comply with the data standards of FRTB. The BCBS 239 principles are now in “Annex D” of the FRTB document, clearly showing the synergy between the two regulations. The quality, transparency, volume and consistency of data are important for both BCBS 239 and FRTB. Improving these factors allows banks to follow the BCBS 239 principles more easily and to decrease the capital charges for non-modellable risk factors. BCBS 239 principles, such as data completeness and timeliness, are also necessary for passing P&L attribution (PLA) under FRTB.

We recommend that banks use BCBS 239 principles when designing the necessary data frameworks for the FRTB Risk Factor Eligibility Test (RFET), support FRTB traceability requirements and supervisory approvals with existing BCBS 239 data lineage documentation, and produce market risk reporting for FRTB using the risk reporting infrastructure detailed in BCBS 239.

Synergies With Future Regulation

The IBOR transition and SA-CVA will become effective from 2023. Aligning the timelines and exploiting the similarities between FRTB, SA-CVA and the IBOR transition will help banks be ready for all three regulatory deadlines.

SA-CVA

Four of the six risk classes in SA-CVA (IR, FX, EQ, and CM) are identical to those in SbM. SA-CVA, however, uses a reduced granularity for risk factors compared to SbM. The SA-CVA capital calculation uses a similar methodology to SbM by combining sensitivities with risk weights. SA-CVA also incorporates the same trade population and metadata as SbM. SA-CVA capital requirements must be calculated and reported to the supervisor at the same monthly frequency as for the market risk standardised approach.

We recommend that banks combine SA-CVA and SbM risk factor bucketing tasks in a common methodology to reduce overall effort, isolate common components of both models as a feeder model, allowing a single stream for model development and validation, and develop a single system architecture which can be configured for either SbM or SA-CVA.

IBOR Transition

Although not a direct synergy, the transition from IBORs will have a direct impact on the Internal Models Approach (IMA) for FRTB and on the eligibility of risk factors. As the use of IBORs is discontinued, banks may observe a reduction in the number of real-price observations for associated risk factors due to a reduction in market liquidity. It is not certain whether these liquidity issues fall under the RFET exemptions for systemic circumstances, which apply to modellable risk factors that can no longer pass the test. It may be difficult for banks to obtain stress-period data for ARRs, which could lead to substantial efforts to produce and justify proxies. The transition may cause modifications to trading desk structure, the integration of external data providers, and enhanced operational requirements, all of which can affect FRTB.

We recommend that banks investigate how much data is available for ARRs (for both stress-period calculations and real-price observations), develop as soon as possible any proxies needed to overcome data availability issues, and calculate the IBOR capital consequences through the existing FRTB engine.

Conclusion

FRTB implementation is proving to be a considerable workload for banks, especially those considering opting for the IMA. Several FRTB requirements, such as PLA and RFET, are completely new requirements for banks. As we have shown in this article, there are several other important regulatory requirements which banks are currently working towards. As such, we recommend that banks should leverage the synergies which are seen across this regulatory landscape to reduce the complexity and workload of FRTB.

Zanders Project Management Framework

February 2021
7 min read

If you want to go fast, go alone. If you want to go far, go together.


At the birth of any project, it is crucial to determine the most suitable project management framework by which the treasury objectives can be achieved. Whether the focus is on TMS implementation, treasury transformation or risk management, the grand challenge remains: to ensure the highest quality of the delivered outcome while understanding the realistic timelines and resources. In this article, we shed light on the implications of project management methodologies and address their main concepts and viewpoints, accompanied by experiences from past treasury projects.

In recent years, big corporates have been strategically cherry-picking elements from various methodologies, as there is no one-size-fits-all. At Zanders, our treasury project experience has given us an in-depth knowledge in this area. Based on this knowledge, and depending on several variables – project complexity, resource maturity, culture, and scope – we advise our clients on the best project management methodology to apply to a specific treasury project.

We have observed that when it comes to choosing the project management methodology for a new treasury project, most corporates tend to choose what is applied internally or on previous projects. This leverages the internal skillsets and maturity around that framework. But is this really the right way to choose?

Shifting from traditional methodologies

As the environment that businesses operate in is undergoing rapid and profound change, the applicability and relevance of traditional project management methodologies have been called into question. In the spirit of becoming responsive to unforeseen events, companies have sensed the urgency to seek methods that are geared to rapid delivery and able to respond to change quickly.

Embracing agile

The agile management framework aims to enhance project delivery by maximizing team productivity, while minimizing the waste inherent in redundant meetings, repetitive planning or excessive documentation. Unlike the traditional command and control-style management, which follows a linear approach, the core of agile methodology lies in a continuous reaction to a change rather than following a fixed plan.

This type of framework is mostly applied in an environment where the problem to be solved is complex, its solution is non-linear as it has many unknowns, and the project requirements will most likely change during the lifetime of the project as the target is on a constant move.

The illustration of an agile process (figure above) portrays certain similarities to the waterfall approach, in the sense of breaking the entire project into several phases. However, while these phases are sequential in the waterfall approach, the activities in the agile methodology can run in parallel.

Agile principles promote changing requirements and sustainable development, and deliver working software frequently, which can add value sooner. From a treasury perspective, however, you often cannot go live in pieces/functionalities, since this increases risk; and when a requirement comes late in the process, teams might not have the resources or availability to support it, creating delivery risk.

Evolving Agile and its forms

Having described the key principles of agile methodology, it is vital to state that over the years it has become a rather broad umbrella-term that covers various concepts that abide by the main agile values and principles.

One of the most popular agile forms is the Kanban approach, the uniqueness of which lies in the visualization of the workflow by building a so-called (digital) Kanban board. Scrum is another project management framework that can be used to manage iterative and incremental projects of all types. The Product Owner works with the team to identify and prioritize system functionality by creating a Product Backlog, with an estimation of software delivery by the functional teams. Once a Sprint has been delivered, the Product Backlog is analyzed and reprioritized, and the next set of deliverables is selected for the next Sprint. The Lean framework focuses on delivering value to the customer through effective value-added analysis. Lean development eliminates waste by asking users to select only the truly valuable features for a system, prioritize the features selected, and then work on delivering them in small batches.

Waterfall methodologies – old but good

Even though agile methodologies are now widely accepted and rising in popularity, certain types of projects benefit from highly planned and predictive frameworks. The core of this management style lies in its sequential design process, meaning that an upcoming phase cannot begin until the previous one is formally closed. Waterfall methodologies are characterized by a high level of governance, where documentation plays a crucial role. This makes it easier to track progress and manage the project scope in general. Projects that benefit most from this methodology can define their fixed end requirements up-front and are relatively small in size. For a project to move to the next phase, all current documentation must be approved by all the involved project managers. The extensive documentation ensures that the team members are familiar with the requirements of the coming phase.

Depending on the scope of the project, this progressive method breaks down the workload into several discrete steps, as shown here:

Project Team Structures

There are also differences between the project structures and the roles used in the two presented frameworks.

In waterfall, the common roles outside of the delivery or functional team that support and monitor the project plan are the project manager (depending on the size of the project there can be one or many, creating a project management office (PMO) structure) and a program director. In agile, the role structure is more intricate and complex. Again, this depends on the size of the treasury project.

As stated previously, agile project management relies heavily on collaborative processes. In this sense, a project manager is not required to have central control, but rather to appoint the right people to the right tasks, increase cross-functional collaboration, and remove impediments to progress. The main roles differ from the waterfall approach and can be labelled as Scrum master, Agile coach and Product owner.

Whatever the chosen approach is for a treasury project, one structure is normally seen in both – the steering committee. In more complex and bigger treasury projects (with greater impact and risk to the organization) sometimes a second structure or layer on top of the steering committee (called governance board) is needed. The objective of each one differs.

The Project Steering Committee is a decision-making structure within the project governance structure that consists of top managers (for example, the leads of each treasury area involved directly in the project) and decision makers who provide strategic direction and policy guidance to the project team and other stakeholders. They also:

  • Monitor progress against the project management plan.
  • Define, review and monitor value delivered to the business and business case.
  • Review and approve changes made to the project resource plan, schedules, and scope. This normally depends on the materiality of the changes.
  • Review and approve project deliverables.
  • Resolve conflicts between stakeholders.

The Governance Board, when needed, is more strategic by nature. For example, in treasury projects it is normally represented by the treasurer, CFO, and CEO. Some of its responsibilities are to:

  • Monitor and help unblock major risks and potential project challenges.
  • Keep updated and understand broader impacts coming out from the project delivery.
  • Provide insights and solutions around external factors that might impact the treasury project (e.g. business strategic changes, regulatory frameworks, resourcing changes).

Other structures might need to be designed or implemented to support project delivery; more focused groups require different knowledge and expertise. Again, no one solution fits all, and it depends on the scope and complexity of the treasury project.

The key decision factors that should be considered when selecting the project structure are:

Roles and responsibilities: Clearly define all roles and responsibilities for each project structure. That will drive planning and will clearly define who should do what. A lack of clarity will create project risks.

Size and expertise: Based on roles and responsibilities, and using a clear RAPID or RACI matrix, define the composition of these structures. There should not be a lot of overlap in terms of people in the structures. In most cases, ‘less is more’ if expertise and experience are ensured.

The treasury project scope, complexity and deliverables should drive these structures. As in the organizational structure of a company, a project should follow the same principles: a pyramid structure should be applied (not an inverted one), in which the functional (hands-on) team is bigger than the other structures.

Is a hybrid model desirable? Our conclusion

While it is known that all methodologies ultimately accomplish the same goal, choosing the most suitable framework is a critical success factor as it determines how the objectives are accomplished. Nowadays, we see that a lot of organizations are embracing a hybrid approach instead of putting all their hopes into one method.

Depending on the circumstances of the treasury project, you might find yourself in a situation where you employ the waterfall approach at the very beginning of the project. This creates a better structure for planning, ensures a common understanding of the project objectives and creates a reasonable timeline for the project. When it comes to the execution of the project, however, it may become apparent that there needs to be space for some flexibility and early business engagement, as the project happens to be in a dynamic environment. Hence, it becomes beneficial to leverage an agile approach. Such a project adopts a “structured agile” methodology, where the planning is done in the traditional way, while the execution implements some agile practices.

Machine learning in risk management

February 2021
4 min read



The current trend of operating a ‘data-driven business’, and the fact that regulators are increasingly focused on data quality and data availability, could give an extra impulse to the use of machine learning (ML) models in risk management.

ML models

ML models study a dataset and use the knowledge gained to make predictions for other datapoints. An ML model consists of an ML algorithm and one or more hyperparameters. ML algorithms study a dataset to make predictions, where hyperparameters determine the settings of the ML algorithm. The studying of a dataset is known as the training of the ML algorithm. Most ML algorithms have hyperparameters that need to be set by the user prior to the training. The trained algorithm, together with the calibrated set of hyperparameters, form the ML model.

ML models have different forms and shapes, and even more purposes. For selecting an appropriate ML model, a deeper understanding of the various types of ML that are available and how they work is required. Three types of ML can be distinguished:

  • Supervised learning.
  • Unsupervised learning.
  • Semi-supervised learning.

The main difference between these types is the data that is required and the purpose of the model. The data that is fed into an ML model is split into two categories: the features (independent variables) and the labels/targets (dependent variables; for example, to predict a person’s height, the label/target, it could be useful to look at the features age, sex, and weight). Some types of machine learning models need both as an input, while others only require features. Each of the three types of machine learning is briefly introduced below.

Supervised learning

Supervised learning is the training of an ML algorithm on a dataset where both the features and the labels are available. The ML algorithm uses the features and the labels as an input to map the connection between features and labels. When the model is trained, labels can be generated by the model by only providing the features. A mapping function is used to provide the label belonging to the features. The performance of the model is assessed by comparing the label that the model provides with the actual label.
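As a minimal illustration of this flow, consider the following sketch using scikit-learn and synthetic data (the library, algorithm choice and numbers are assumptions for illustration, not tied to any specific risk model):

```python
# Minimal supervised-learning sketch: train on features and labels, then predict.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                  # features (independent variables)
y = (X[:, 0] + X[:, 1] > 0).astype(int)        # labels (dependent variable)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)  # hyperparameters set before training
model.fit(X_train, y_train)                    # training: learn the feature-to-label mapping
y_pred = model.predict(X_test)                 # generate labels from features only
print(accuracy_score(y_test, y_pred))          # performance: predicted vs actual labels
```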

Unsupervised learning

In unsupervised learning there is no dependent variable (or label) in the dataset. Unsupervised ML algorithms search for patterns within a dataset. The algorithm links certain observations to others by looking at similar features. This makes an unsupervised learning algorithm suitable for, among other tasks, clustering (i.e. the task of dividing a dataset into subsets). This is done in such a manner that an observation within a group is more like other observations within the subset than an observation that is not in the same group. A disadvantage of unsupervised learning is that the model is (often) a black box.
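A minimal clustering sketch (again scikit-learn with synthetic data, purely for illustration) shows the difference: no labels are supplied, and the algorithm groups observations by feature similarity alone:

```python
# Minimal unsupervised-learning sketch: k-means clustering without labels.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Two synthetic groups of observations; the model never sees group membership.
X = np.vstack([rng.normal(0, 1, size=(100, 2)), rng.normal(5, 1, size=(100, 2))])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=1).fit(X)
print(kmeans.labels_[:5])   # cluster assigned to each observation, derived from features only
```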

Semi-supervised learning

Semi-supervised learning uses a combination of labeled and unlabeled data. It is common that the dataset used for semi-supervised learning consists of mostly unlabeled data. Manually labeling all the data within a dataset can be very time consuming, and semi-supervised learning offers a solution for this problem. With semi-supervised learning, a small labeled subset is used to make a better prediction for the complete dataset.

The training of a semi-supervised learning algorithm consists of two steps. To label the unlabeled observations from the original dataset, the complete set is first clustered using unsupervised learning. The clusters that are formed are then labeled by the algorithm, based on their originally labeled parts. The resulting fully labeled data set is used to train a supervised ML algorithm. The downside of semi-supervised learning is that it is not certain the labels are 100% correct.
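The two-step idea can be sketched with scikit-learn's self-training wrapper, where -1 marks the unlabeled majority of the data (a synthetic example; the choice of base estimator is an assumption):

```python
# Minimal semi-supervised sketch: a small labeled subset guides labeling of the rest.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 2))
y = (X[:, 0] > 0).astype(int)

y_partial = y.copy()
y_partial[30:] = -1                     # scikit-learn convention: -1 = unlabeled observation

model = SelfTrainingClassifier(LogisticRegression())
model.fit(X, y_partial)                 # labels are iteratively propagated to unlabeled observations
print(model.predict(X[:5]))             # note: propagated labels are not guaranteed to be correct
```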

Setting up the model

In most ML implementations, the data gathering, integration and pre-processing usually takes more time than the actual training of the algorithm. It is an iterative process of training a model, evaluating the results, modifying hyperparameters and repeating, rather than just a single process of data preparation and training. After the training is performed and the hyperparameters have been calibrated, the ML model is ready to make predictions.
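The iterative train-evaluate-modify loop is often automated with a hyperparameter search; a minimal sketch (scikit-learn, synthetic data, and an illustrative grid of candidates):

```python
# Minimal hyperparameter-calibration sketch: cross-validated grid search.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=400, random_state=5)
grid = {"n_estimators": [50, 100], "max_depth": [3, None]}   # candidate hyperparameter settings
search = GridSearchCV(RandomForestClassifier(random_state=5), grid, cv=5)
search.fit(X, y)                 # trains and evaluates a model per candidate setting
print(search.best_params_)       # the calibrated hyperparameters
```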

Machine learning in financial risk management

ML can add value to financial risk management applications, but the type of model should suit the problem and the available data. For some applications, like challenger models, it is not required to completely explain the model you are using. This makes, for example, an unsupervised black box model suitable as a challenger model. In other cases, explainability of model results is a critical condition while choosing an ML model. Here, it might not be suitable to use a black box model.

In the next section we present some examples where ML models can be of added value in financial risk management.

Data quality analysis

All modeling challenges start with data. In line with the ‘garbage in, garbage out’ maxim, if the quality of a dataset is insufficient then an ML model will also not perform well. It is quite common that during the development of an ML model, a lot of time is spent on improving the data quality. As ML algorithms learn directly from the data, the performance of the resulting model will increase if the data quality increases. ML can be used to improve data quality before this data is used for modeling. For example, the data quality can be improved by removing/replacing outliers and replacing missing values with likely alternatives.

An example of insufficient data quality is the presence of large or numerous outliers. An outlier is an observation that significantly deviates from the other observations in the data, which might indicate it is incorrect. Outlier detection can easily be performed by a data scientist for univariate outliers, but multivariate outliers are a lot harder to identify. When outliers have been detected, or if there are missing values in a dataset, it might be useful to substitute some of these outliers or impute for missing values. Popular imputation methods are the mean, median or most frequent methods. Another option is to look for more suitable values; and ML techniques could help to improve the data quality here.

Multiple ML models can be combined to improve data quality. First, an ML model can be used to detect outliers, then another model can be used to impute missing data or substitute outliers by a more likely value. The outlier detection can either be done using clustering algorithms or by specialized outlier detection techniques.
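A sketch of this two-model idea follows (scikit-learn, synthetic data; the choice of IsolationForest and median imputation is illustrative, not prescriptive):

```python
# Sketch: detect multivariate outliers, then replace them like missing values.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.impute import SimpleImputer

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 3))
X[:5] += 8.0                                   # inject a few multivariate outliers

flags = IsolationForest(random_state=3).fit_predict(X)   # -1 = outlier, 1 = inlier
X_clean = X.copy()
X_clean[flags == -1] = np.nan                  # treat detected outliers as missing values

X_imputed = SimpleImputer(strategy="median").fit_transform(X_clean)  # substitute likely values
```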

Loan approval

A bank’s core business is lending money to consumers and companies. The biggest risk for a bank is the credit risk that a borrower will not be able to fully repay the borrowed amount. Adequate loan approval can minimize this credit risk. To determine whether a bank should provide a loan, it is important to estimate the probability of default for that new loan application.

Established banks already have an extensive record of loans and defaults at their disposal. Together with contract details, this can form a valuable basis for an ML-based loan approval model. Here, the contract characteristics are the features, and the label is the variable indicating if the consumer/company defaulted or not. The features could be extended with other sources of information regarding the borrower.

Supervised learning algorithms can be used to classify the application of the potential borrower as either approved or rejected, based on their probability of a future default on the loan. One of the suitable ML model types would be classification algorithms, which split the dataset into either the ‘default’ or ‘non-default’ category, based on their features.
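A minimal loan-approval sketch along these lines (scikit-learn, synthetic contract features; the cut-off is purely illustrative and not a credit policy):

```python
# Sketch: classify a new application by its predicted probability of default.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
X = rng.normal(size=(1000, 3))                  # hypothetical contract features
y = (rng.random(1000) < 0.1).astype(int)        # label: 1 = defaulted, 0 = did not default

model = LogisticRegression().fit(X, y)          # trained on the historical loan book
new_application = rng.normal(size=(1, 3))
pd_estimate = model.predict_proba(new_application)[0, 1]   # probability of default
print("reject" if pd_estimate > 0.05 else "approve")       # illustrative 5% cut-off
```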

Challenger models

When there is already a model in place, it can be helpful to challenge this model. The model in use can be compared to a challenger model to evaluate differences in performance. Furthermore, the challenger model can identify possible effects in the data that are not captured yet in the model in use. Such analysis can be performed as a review of the model in use or before taking the model into production as a part of a model validation.

The aim of a challenger model is to challenge the model in use. As it is usually not feasible to design another equally sophisticated model, simpler models are mostly selected as challenger models. ML models can be useful to create more advanced challenger models within a relatively limited amount of time.

Challenger models do not necessarily have to be explainable, as they will not be used in practice, but only as a comparison for the model in use. This makes all ML models suitable as challenger models, even black box models such as neural networks.

Segmentation

Segmentation concerns dividing a full data set into subsets based on certain characteristics. These subsets are also referred to as segments. Often segmentation is performed to create a model per segment to better capture the segment’s specific behavior. Creating a model per segment can lower the error of the estimations and increase the overall model accuracy, compared to a single model for all segments combined.

Segmentation can, among other uses, be applied in credit rating models, prepayment models and marketing. For these purposes, segmentation is sometimes based on expert judgement and not on a data-driven model. ML models could help to change this and provide quantitative evidence for a segmentation.

There are two approaches in which ML models can be used to create a data-driven segmentation.  One approach is that observations can be placed into a certain segment with similar observations based on their features, for example by applying a clustering or classification algorithm. Another approach to segment observations is to evaluate the output of a target variable or label. This approach assumes that observations in the same segment have the same kind of behavior regarding this target variable or label.

In the latter approach, creating a segment itself is not the goal, but optimizing the estimation of the target variable or classifying the right label is. For example, all clients in a segment ‘A’ could be modeled by function ‘a’, where clients in segment ‘B’ would be modeled by function ‘b’. Functions ‘a’ and ‘b’ could be regression models based on the features of the individual clients and/or macro variables that give a prediction for the actual target variable.
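Both approaches can be sketched in a few lines: cluster the observations on their features, then fit a separate model per segment (scikit-learn, synthetic data in which the two segments genuinely behave differently; the functions 'a' and 'b' from the example above become one fitted model per segment):

```python
# Sketch: data-driven segmentation followed by one model per segment.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
X = np.vstack([rng.normal(0, 1, size=(150, 2)), rng.normal(4, 1, size=(150, 2))])
y = np.concatenate([2.0 * X[:150, 0], -3.0 * X[150:, 0]])   # segment-specific behavior

segments = KMeans(n_clusters=2, n_init=10, random_state=6).fit_predict(X)
models = {s: LinearRegression().fit(X[segments == s], y[segments == s])
          for s in np.unique(segments)}          # one fitted model per segment
```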

Credit scoring

Companies and/or debt instruments can receive a credit rating from a credit rating agency. There are a few well-known rating agencies providing these credit ratings, which reflect their assessment of the probability of default of the company or debt instrument. Besides these rating agencies, financial institutions also use internal credit scoring models to determine a credit score. Credit scores also provide an expectation of the creditworthiness of a company, debt instrument or individual.

Supervised ML models are suitable for credit scoring, as the training of the ML model can be done on historical data. For historical data, the label (‘defaulted’ or ‘not defaulted’) can be observed and extensive financial data (the features) is mostly available. Supervised ML models can be used to determine reliable credit scores in a transparent way as an alternative to traditional credit scoring models. Alternatively, credit scoring models based on ML can also act as challenger models for traditional credit scoring models. In this case, explainability is not a key requirement for the selected ML model.

Conclusion

ML can add value to, or replace, models applied in financial risk management. It can be used in many different model types and in many different manners. A few examples have been provided in this article, but there are many more.

ML models learn directly from the data, but there are still some choices to be made by the model user. The user can select the model type and must determine how to calibrate the hyperparameters. There is no ‘one size fits all’ solution to calibrate an ML model. Therefore, ML is sometimes referred to as an art rather than a science.

When applying ML models, one should always be careful and understand what is happening ‘under the hood’. As with all modeling activities, every method has its pitfalls. Most ML models will come up with a solution, even if it is suboptimal. Common sense is always required when modeling. In the right hands though, ML can be a powerful tool to improve modeling in financial risk management.

Working with ML models has given us valuable insights (see the box below). Every application of ML led to valuable lessons on what to expect from ML models, when to use them and what the pitfalls are.

Machine learning and Zanders

Zanders has already encountered several projects and research questions where ML could be applied. In some cases, the use of ML was indeed beneficial; in other cases, traditional models turned out to be the better solution.

During these projects, most time was spent on data collection and data pre-processing. Based on these experiences, an ML based dataset validation tool was developed. In another case, a model was adapted to handle missing data by using an alternative available feature of the observation.

ML was also used to challenge a Zanders internal credit rating model. This resulted in useful insights on potential model improvements. For example, the ML model provided more insight into variable importance and segmentation. These insights are useful for the further development of Zanders’ credit rating models. Besides the insights into what could be done better, the ML model also emphasized the advantages of the classical models over the ML-based versions. The ML model was not able to provide more sensible ratings than the traditional credit rating model.

In another case, we investigated whether it would be sensible and feasible to use ML for transaction screening and anomaly detection. The outcome of this project once more highlighted that data is key for ML models. The available data was plentiful, but of low quality. Therefore, the ML models used were not able to provide helpful insight into the payments, or to consistently detect divergent payment behavior at a large scale.

Besides the projects where ML was used to deliver a solution, we investigated the explainability of several ML models. During this process we gained knowledge on techniques to provide more insights into otherwise hardly understandable (black box) models.

Corrections and reversals in SAP Treasury

December 2020
4 min read



As part of an SAP Treasury system implementation or enhancement, we review existing business processes, define bottlenecks and issues, and propose (further) enhancements. Once we have applied these enhancements in your SAP system, we create a series of trainings and user manuals which lay out the business process actions needed to correctly use the system.

“It’s only those who do nothing that make no mistakes, I suppose”

Joseph Conrad


This legendary saying of Joseph Conrad is still very valid today, as everyone makes mistakes. Therefore, we help our clients define smooth, seamless and futureproof processes which consider the possibility of mistakes or requirements for correction, and include actions to correct them.

Some common reasons why treasury payments require corrections are:

  • No need for a cash management transfer between house banks anymore
  • Incorrect house/beneficiary bank details were chosen
  • Wrong currency / amount / value date / payment details
  • Incorrect payment method

One of our practices is to first define a flowchart structure in the form of a decision tree, where each node represents either a treasury process (e.g. bank-to-bank transfer, FX deal, MM deal, securities, etc.), a transaction status in SAP, or an outcome which represents a solution scenario.

We must therefore identify the scope of the manual process, which depends on the complexity of the business case. At each stage of the transaction life cycle, we must identify whether it may get stuck and how it can be rectified or reversed.

Each scenario will bring a different set of t-codes to be used in SAP, and a different number of objects to be touched.

Below is an example of a bank-to-bank cash management transfer which is to be cancelled in SAP.

Figure 1: Bank-to-bank payment reversal

Scenario 2: A single payment request is created via t-code FRFT_B and an automatic payment run is executed (F111); BCM is used, but the payment batching (FBPM1) is not yet executed.

Step 1: Identify the accounting document to be reversed

T-code F111, choose the payment run created (one of the options) -> go to Menu -> Edit -> Payments -> Display log (display list) -> note the document number posted in the payment run.

Step 2: Reverse the payment document

T-code FB08: Enter the document number identified in step 1, choose company code, fiscal year and reversal reason, and click POST/SAVE.

SAP creates the corresponding offsetting accounting document.

Step 3: Reverse clearing of the payment request

T-code F8BW: Enter the document number identified in step 1, choose company code and fiscal year, and click EXECUTE.

The result is that the payment request is uncleared.

Step 4: Reverse the payment request

T-code F8BV: enter the payment request (taken from FRFT_B, F111 or F8BT) and press REVERSE.

This step will reverse the payment request itself. Also, you may skip this step if you tick “Mark for cancellation” in STEP 3.

Step 5: Optional step, depending on the client setup of OBPM4 (selection variants)

Delete entries in tables: REGUVM and REGUHM. This is required to disable FBPM1 payment batching in SAP BCM for the payment run which is cancelled. The execution of this step depends on the client setup.

Call function module (SE37) FIBL_PAYMENT_RUN_MERGE_DELETE with:

  • I_LAUFD : Date of the payment run as in F111
  • I_LAUFI : Identification of the payment run as in F111
  • I_XVORL : empty/blank

The number of nodes and branches comprising the decision tree may vary based on the business case of a client. Multiple correction actions may also be possible, meaning there is no single set of correction steps applicable to all corporates.

If you are interested in a review of your SAP Treasury processes, their possible enhancements and the corresponding business user manuals, please feel free to reach out to us. We are here to support you!

Average Rate FX Forwards and their processing in SAP

December 2020
4 min read



The observation period for the average rate calculation of an average rate forward (ARF) is usually long and can be defined flexibly with daily, weekly or monthly periodicity. Though this type of contract is always settled in cash as a non-delivery forward, it is a suitable hedging instrument in certain business scenarios, especially when the underlying FX exposure amount cannot be attributed to a single agreed payment date. For currencies and periods with high volatility, an ARF reduces the risk of hitting an extreme reading of the spot rate.
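In simplified notation (a sketch, ignoring discounting and assuming equally weighted fixings), the cash settlement of an ARF with notional $N$, contracted average rate $K$ and fixings $S_{t_1}, \dots, S_{t_n}$ over the observation period is:

$$\bar{S} = \frac{1}{n}\sum_{i=1}^{n} S_{t_i}, \qquad \text{Settlement} = N \times (\bar{S} - K)$$

from the perspective of the party receiving the average rate; sign conventions vary per confirmation. Averaging over many fixings is what dampens the impact of any single extreme spot reading.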

Business margin protection

An ARF can be a very efficient hedging instrument when the business margin needs to be protected, namely in the following business scenarios:

  • Budgeted sales revenue or budgeted costs of goods sold are incurred with reliable regularity and spread evenly in time. This exposure needs to be hedged against the functional currency.
  • The business is run in separate books with different functional currencies; FX exposure is determined and hedged against the respective functional currency of these books. The resulting margin can be budgeted with a high degree of reliability and stability, is relatively small, and needs to be hedged from the currency of the respective business book into the functional currency of the reporting entity.

Increased complexity

Hedging such FX exposure with conventional FX forwards would lead to a very high number of transactions, as well as a large volume of data on the side of the underlying FX exposure determination, resulting in a data flood and high administrative effort. Hedge accounting according to the IFRS 9 rules is almost impossible due to the high number of hedge relationships to manage. The complexity increases even more if treasury operations are centralized and the FX exposure has to be concentrated via intercompany FX transactions in the group treasury first.

If ARF instruments are not directly supported by the treasury management system (TMS) in use, users have to resort to replicating the single external ARF deal with a series of conventional FX forwards, creating an individual FX forward for each fixation date of the observation period. As observation periods are usually long (at least 30 days) and the rate fixation periodicity is usually daily, this workaround leads to a high count of fictitious deals with relatively small nominals, leading to the administrative burden described above. Moreover, this workaround prevents the automated creation of deals via an interface from a trading platform and automated correspondence exchange based on SWIFT MT3xx messages, resulting in a low automation level of treasury operations.

Add-on for SAP TRM

Currently, ARF instruments are not supported in the SAP Treasury and Risk Management system (SAP TRM). In order to bridge the gap and to help centralized treasury organizations further streamline their operations, Zanders has developed an add-on for SAP TRM to manage the fixing of the average rate over the observation period, as well as to correctly calculate the fair value of deals with a partially fixed average rate.

The solution consists of a dedicated average rate FX forward collective processing report, covering:

  • Particular information related to ARF deals, including the start and end of the fixation period, the currently fixed average rate, the fixed portion (percentage), and the locked-in result for the fixed portion of the deal in the settlement currency.
  • Specific functions needed to manage this type of deal: creation, change and display of the rate fixation schedule, as well as the final fixation of the FX deal once the average rate has been fully calculated over the observation period.

Figure 1: Zanders FX Average Rate Forwards Cockpit and the ARF-specific key figures

The solution builds on the standard SAP functionality available for FX deal management, meaning all other proven functionalities are available, such as payments, posting via the treasury accounting subledger, correspondence, EMIR reporting, and calculation of fair value for month-end evaluation and reporting. Through an enhancement, the solution is fully integrated into market risk, credit risk and, if needed, the portfolio analyser too. Therefore, a correct mark-to-market is always calculated for both the fixed and unfixed portions of the deal.

Figure 2: Integration of the Zanders ARF solution into the SAP Treasury Transaction Manager process flow

Zanders can support you with the integration of ARF instruments into your FX exposure management process. For more information, do not hesitate to contact Michal Šárnik.

Managing Virtual Accounts using SAP In-House Cash

December 2020
4 min read

How to set up virtual accounts in SAP, part III. In the previous part of this series on 'How to set up virtual accounts in SAP', we delved into the details of a scenario where virtual accounts are managed at GL account level using the SAP FI module only. This article investigates how the SAP In-House Cash (SAP IHC) module can be used to manage virtual accounts in your ERP.


SAP IHC is a module that facilitates a full suite of payment factory processes. It can be seen as an intercompany position subledger with a set of advanced features, such as POBO payment routing, bank statement allocation, arm's length intercompany interest calculations, and out-of-the-box payment and bank statement interfaces with the participants (OpCo's).

The process where virtual accounts are managed in IHC is depicted below:

In this process, we rely on a simple set of building blocks:

  • In-house cash accounts to manage intercompany positions between Treasury and the OpCo's;
  • GL accounts to represent external cash and the IC positions;
  • Processing of external bank statements;
  • Distribution of internal bank statements from IHC to the OpCo's ERP systems;
  • On the external bank statement for the master account, an identifier that conveys to which virtual account the actual collection was originally credited. This identifier ultimately tells us which OpCo the funds originally belong to and which IHC account to credit.

The idea here is that Treasury receives the external bank statement and automatically posts the receipts to the correct IHC accounts using this identifier. Posting items on the IHC accounts updates the intercompany positions. At the end of the day, a set of internal bank statements is generated in IHC and sent through an interface to the OpCo's ERP. The OpCo's ERP processes these statements, clears the customer invoices and updates the IC position with Treasury.
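This daily cycle can be summarized in a compact, purely illustrative sketch (simple data structures standing in for the EBS and IHC processing described in the remainder of this article):

# Illustrative sketch of the daily cycle: external statement lines
# carry a virtual account identifier, Treasury credits the matching
# IHC account, and one internal statement per account goes out at EOD.
statement_lines = [
    {"amount": 12_500.00, "notes": "From VA 54353 INV-4711"},
    {"amount": 8_000.00,  "notes": "From VA 54353 INV-4712"},
]
va_to_ihc = {"VA 54353": "F4000EUR01"}   # identifier -> IHC account

ihc_turnover = {}
for line in statement_lines:
    account = next(acct for va, acct in va_to_ihc.items()
                   if va in line["notes"])
    ihc_turnover.setdefault(account, []).append(line["amount"])

# End of day: one internal (FINSTA-like) statement per IHC account;
# the OpCo's ERP processes it and updates the IC position with Treasury.
for account, amounts in ihc_turnover.items():
    print(f"internal statement {account}: {len(amounts)} credits, "
          f"total {sum(amounts):,.2f}")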

The two major benefits of using IHC over the solution as described in the previous articles of this series are:

  1. The OpCo's do not require any direct integration with the bank and can rely on internal interfacing with Treasury. Especially in companies with a fragmented ERP landscape, this can be a valuable proposition.
  2. IHC can very aptly integrate virtual account management processes with internal netting payments, payments on behalf of (POBO) and payments in name of processes.

Implementing virtual accounts in SAP

In the explanation below, we assume that the basic FI-CO settings for the company codes, among others, are already in place. This is by no means a complete inventory of all the settings required to get IHC up and running; it focuses on the configuration steps that specifically cater for the virtual account (VA) requirements.

Master data – general ledger accounts

Three sets of GL accounts need to be created: balance sheet accounts to represent the intercompany positions, a set of clearing accounts for virtual account clearing between the EBS and the IHC accounting process, and a GL account to represent the cash position with the external bank. These GL accounts need to be assigned to the appropriate company codes and can then be used in the bank statement import process and the IHC accounting process.

In the Treasury entity, we should create a single GL account (per position currency) representing the IC position with all OpCo's, because the granularity of the IC position per OpCo is managed in the IHC subledger. This approach limits the growth of the chart of accounts.

Transaction code FS00

House bank and bank account maintenance

To be able to process bank statements and generate GL postings in your SAP system, the house bank data needs to be maintained first. A house bank entry comprises the following information, which needs to be maintained carefully:

  1. The house bank identifier: a 5-character identifier that clearly identifies the bank branch.
  2. Bank country: the ISO country code of the country where the bank branch is located.
  3. Bank key: a separate bank identifier that contains information like the SWIFT BIC, local routing code and address-related data of your house bank.

Transaction code FI12

Secondly, under the house bank entry, the bank accounts can be created, including:

  1. The account identifier: a 5-character identifier that clearly identifies the bank account.
  2. Bank account number and IBAN: the bank account number as assigned to you by the bank.
  3. Currency: the currency of the bank account.
  4. G/L account: the general ledger account used to represent the balance sheet position of this bank account or, for the OpCo's, the IC position with Treasury.

Transaction code FI12 in SAP ECC or NWBC in S/4 HANA

The idea here is that we maintain one house bank and bank account in the Treasury company code, representing the master account held with your house bank. This house bank will have the G/L account assigned that represents the external cash position with the house bank.

In each of the OpCo's company codes, we maintain one house bank and bank account representing that OpCo's IHC bank account held with the treasury center. This house bank will have the G/L account assigned that represents the intercompany position with the Treasury entity.

Electronic bank statement settings

The electronic bank statement (EBS) settings ensure that, based on the information present on the bank statement, SAP is capable of posting the items to the general ledger or subledgers according to the requirements. A few steps in the configuration process are important for this to work:

1) Posting rule construction

Posting rule construction starts with setting up account symbols and assigning GL accounts to them. The idea here is to define two account symbols: one to represent the external cash position (BANK), and one for the virtual account clearing between IHC and EBS (VACLR).

A separate account symbol for customers is not required in SAP.

For the BANK account symbol, we do not assign a GL account number directly in the settings; instead, we assign a so-called mask by entering the value "+++++++++". As a result, every time a posting rule posts to "BANK", SAP uses the GL account assigned in the house bank account settings (the FI12 or NWBC setting above).

For the VACLR account symbol, we assign a dedicated open item (O/I) clearing GL account that is used to clear the EBS posting against the IHC posting (more on that later). These GL accounts should already have been created in the first step (FS00).

Now that the account symbols are prepared, we can tie them together into posting rules. We need to create two posting rules.

Posting rule 1 debits the BANK symbol and credits the VACLR symbol.

Posting rule 2 debits the BANK symbol and credits a BLANK symbol. The posting type, however, is set to value 8, "Clear Credit Subledger Account". This setting attempts to clear any open item sitting in the customer subledger using algorithms; we explain more on these algorithms below.

As you can imagine, posting rule 1 is applicable to the Treasury entity, while posting rule 2 is used in the OpCo's EBS process.
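To make the mechanics concrete, the account symbols and posting rules can be modeled as simple data, as in the sketch below. This is a conceptual model, not SAP's actual tables; the rule keys (Z001/Z002) and GL numbers are invented, and the "+++++++++" mask simply means "take the GL account from the house bank account settings":

# Conceptual model of the EBS posting rule setup; not SAP's tables.
account_symbols = {
    "BANK":  "+++++++++",   # mask: resolve to the house bank account GL
    "VACLR": "113100",      # hypothetical O/I clearing GL for VA clearing
}
posting_rules = {
    "Z001": {"debit": "BANK", "credit": "VACLR"},   # Treasury entity
    "Z002": {"debit": "BANK", "credit": None,       # blank symbol
             "posting_type": "8 - clear credit subledger account"},
}

def resolve_gl(symbol: str, house_bank_gl: str) -> str:
    """Apply the masking logic: an all-'+' mask takes the house bank GL."""
    assigned = account_symbols[symbol]
    return house_bank_gl if set(assigned) == {"+"} else assigned

print(resolve_gl("BANK", "102050"))    # -> "102050" (from FI12/NWBC)
print(resolve_gl("VACLR", "102050"))   # -> "113100" (dedicated clearing GL)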

Transaction code OT83

2) Posting rule assignment

In the next step, we assign the posting rules to the so-called "bank transaction codes" (BTC's, such as NTRF) that are typically observed in the body of bank statements to identify the nature of the transactions.

To understand under which bank transaction code these collections are reported on the statement, you typically need to carefully analyze some sample statement output, or check with your bank's implementation team.

It is important here to assign an algorithm to posting rule 2. This algorithm searches the payment notes of the bank statement for reference numbers with which it can trace back the original customer invoice open item. Once SAP has identified the correct outstanding invoice, it clears it and marks it as paid.
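In spirit, such an algorithm scans the payment notes for document numbers that fall within the reference number (XBLNR) interval configured in the import variant (see the FF.5 parameters later in this article). A rough, illustrative approximation:

import re

# Rough approximation of an EBS interpretation algorithm: find invoice
# reference numbers (XBLNR) in the payment notes that fall within a
# configured interval. The real SAP algorithms are more sophisticated.
xblnr_low, xblnr_high = 90_000_000, 99_999_999   # interval from the variant
open_invoices = {94000123: 12_500.00, 94000124: 8_000.00}  # XBLNR -> amount

payment_notes = "PAYMENT REF 94000123 FROM VA 54353"
candidates = [int(token) for token in re.findall(r"\d{8}", payment_notes)
              if xblnr_low <= int(token) <= xblnr_high]
matched = [c for c in candidates if c in open_invoices]
print(f"open item(s) to clear: {matched}")   # -> [94000123]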

If SAP is unable to identify the open item automatically, the statement item can be post-processed manually in FEBAN or FEB_BSPROC.

Transaction code OT83

3) Bank account assignment

In the last part, we assign the posting rule assignments to the bank accounts. This way, we can differentiate rule assignments per account if needed.

Transaction code OT83

4) Search strings

If the posting rule assignment needs more granularity than BTC level (step 2 above), we can set up search strings. Search strings can be configured to look at the payment notes section of the bank statement and find fixed text or text patterns. Based on such search strings, we can then modify the posting behavior, for instance by overruling the posting rule assignment defined in step 2.

Whether this is required depends on the level of information that is provided by the bank in its bank statements.

Transaction code OTPM

Prepare IHC to parallel post certain bank statement items into IHC accounts

In IHC, there are two ways to parallel post bank statement items into IHC accounts: as payment items or as payment orders.

This is controlled by assigning a specific function module to business transaction event (BTE) 2810. If we assign function module "BKK_IHB_BASTA_IN_POST", SAP posts an IHC payment item. If we assign "IHC_APPL_XBS_POST", SAP posts an IHC payment order.

Additional information can be found in SAP note 2370212.

In the remainder of this article, we assume the payment item logic is used.

Transaction BF42

IHC account determination from payment notes

In this section of the configuration, we determine which IHC account bank statement items should be posted to, using payment notes search strings.

For example, if the payment notes of a master account bank statement line for VA collections contain the string "From VA 54353", and we know this belongs to IHC account "F4000EUR01", we can set up a rule for that here. This ensures that all items on a bank statement containing this text string are posted to IHC account F4000EUR01.
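Functionally, such a determination rule behaves like a pattern-to-account lookup, as in this illustrative sketch (mimicking the behavior, not the actual TBKKIHB1 implementation):

import re
from typing import Optional

# Illustrative pattern-to-account lookup for IHC account determination.
determination_rules = [
    (re.compile(r"From VA 54353"), "F4000EUR01"),
    # one rule per virtual account / OpCo ...
]

def determine_ihc_account(payment_notes: str) -> Optional[str]:
    for pattern, ihc_account in determination_rules:
        if pattern.search(payment_notes):
            return ihc_account
    return None   # no rule matched: the item needs manual handling

print(determine_ihc_account("From VA 54353 INV-4711"))  # -> F4000EUR01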

Maintenance view TBKKIHB1

Assign external BTC to posting category

Here we identify the external bank's BTC codes (NTRF, NCMZ, among others) that are applicable to the VA movements to be posted into IHC. Secondly, we determine with which posting category to post them to the IHC accounts.

Once we have identified the BTC code related to our VA collections (e.g. NCMZ), we can link it to the correct posting categories here. You could use standard categories 90 (Balancing Ext. Acct (D)) for debits and 91 (Balancing Ext. Acct (C)) for credits.

Alternatively, you can set up and link your own custom posting categories here to control more precisely how the VA collections are posted into IHC; this is out of scope for this article.

Importing and processing bank statements

We should now be in good shape to import our first statements. We could download them from our electronic banking platform, or we may already receive them through an automated H2H interface or even through SWIFT. In any case, the statements need to be imported into SAP, which can be achieved through transaction code FF.5. The most important parameters to understand here are the following:

  1. File parameters: here we define the file name and storage path where our statement is saved. We also need to define the format of the file, i.e. MT940, CAMT.053 or one of the many other supported formats.
  2. Posting parameters: here we define whether the line items on the bank statements are posted to the general ledger or to the subledger.
  3. Algorithms: here we need to set the range of customer invoice reference numbers (XBLNR) within which the EBS algorithm searches the payment notes in a focused manner (see the sketch under 'Posting rule assignment' above). If these fields were left empty, the algorithm would not work properly and would not find any open invoices for automatic clearing.

Once these parameters are maintained in the import variant, the system will load the statements and generate the required postings.

Transaction code FF.5 / FEBP

Display IHC account statement

Now that we have successfully loaded an external bank statement, we can check whether the items are posted to the IHC accounts. This can be done via transaction code F9K3. For each IHC account, we can look at the "Account Turnover" and observe all the VA collections posted on the account.

Transaction code F9K3

Prepare the IHC account for FINSTA statement distribution

We need to enable the distribution of internal IHC statements to the OpCo's ERP on the IHC account master record. This can be achieved via F9K2. On the "Account Statement" tab, we set the statement format to "FINSTA" and the dispatch type to "ALE" to ensure FINSTA statements are sent over an ALE connection. This is the most common combination; other combinations can be configured and selected here as well.

Transaction code F9K2

Setting up ALE partner profiles

Finally, we configure the system to determine to which system the FINSTA's need to be sent. This can be done in WE20, under partner type GP (business partner).

Here we need to setup the outbound parameters for the FINSTA message type. An appropriate port needs to be selected that represents the ERP of the OpCo.

Transaction code WE20

Trigger the distribution of a FINSTA statement

Now that we have some transactions posted on the IHC account and the FINSTA settings enabled, we can trigger the system to send the FINSTA statements to the receiving ERP system. This can be done in F9N7.

Here we can select the correct IHC account and statement date and run the program to generate the FINSTA statement.

Once the FINSTA is generated and sent to the receiving ERP, it can be processed there via FEBP.

Transaction code F9N7

Closing remarks

This is the third part of a series on how to set up virtual accounts in SAP.

How to set cash pool and in-house bank interest rates

October 2020
4 min read


The pricing of intercompany treasury transactions is subject to transfer pricing regulation. In essence, treasury and tax professionals need to ensure that the pricing of these transactions is in line with market conditions, also known as the arm’s length principle, thereby avoiding unwarranted profit shifting.

We have been assisting dozens of multinationals on this topic through our Transfer Pricing Solution (TPS). The TPS enables them to set interest rates on intercompany transactions in a compliant and automated way. Since its go-live, clients have priced over 1,000 intercompany loans with a total notional of over EUR 60 billion using this self-service solution.

Cash Pooling Solution

In February 2020, the OECD published its first-ever international consensus on the transfer pricing of financial transactions. One of the key topics of the document is the determination of internal pooling interest rates. In response, Zanders launched a co-development initiative with key clients to design a Cash Pooling Solution that determines arm's length interest rates for physical cash pools, notional cash pools and in-house banks.

The goal of this new solution is to present treasury and tax professionals with a user-friendly workflow that incorporates all compliance areas as well as treasury insights into the pooling structure. The three main compliance areas for treasury professionals are:

  1. Ensuring that participants have a financial incentive to participate in the pooling structure. Entities participating in the pool should be 'better off' than if they went directly to a third-party bank; in other words, participants' pooled rates should be more favorable than their stand-alone rates. The OECD sets out a step-by-step approach to improve interest conditions for participating entities and distribute the synergies towards the participants. First, the total pooling benefit should be calculated: the financial advantage for the group compared to a non-pooled cash management set-up. The total pooling benefit can be broken down into a netting benefit and an interest rate benefit. The netting benefit arises from offsetting debit and credit balances; the interest rate benefit arises from more beneficial interest rate conditions on the cash pool or in-house bank position compared to stand-alone current accounts. (A simple numeric sketch of this calculation follows this list.)
    Once the total pooling benefit has been calculated, it should be allocated over the leader entity and the participating entities. To this end, a functional analysis of the pooling structure should be made to identify which entities contribute most in terms of their balances, creditworthiness and the administration of the pool. The allocated amount should be priced into the interest rates: a deposit rate will receive a pooling premium, while a withdrawal rate will incorporate a pooling discount.
  2. Ensuring the correct tax treatment of cash pool transactions. Pooling structures are primarily in place to optimize cash and liquidity management; tax authorities will therefore expect to see the balances of cash pool participants fluctuate around zero. Treasury professionals should monitor positions to prevent participants from having a structural balance in the pool. If a balance has a longer-term character, tax authorities can classify the pooling position as a longer-term intercompany loan. Consequently, monitoring structural balances can lower tax risk significantly.
  3. Ensuring that appropriate documentation is in place each time treasury determines the pooling interest rates. The documentation should include the methodology as well as all specifics of the transfer pricing analysis. Proper documentation enables the multinational to substantiate the interest rates during tax audits.
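As mentioned under point 1 above, the benefit decomposition can be made tangible with a small worked example. All balances and rates below are made up for illustration:

# Illustrative decomposition of the total pooling benefit into a
# netting benefit and an interest rate benefit (annual interest).
balances = {"OpCo A": 15_000_000, "OpCo B": -10_000_000}
standalone_deposit, standalone_overdraft = 0.001, 0.030  # stand-alone rates
pool_deposit = 0.005                  # bank rate on the pooled position

def interest(balance, deposit_rate, overdraft_rate):
    return balance * (deposit_rate if balance >= 0 else overdraft_rate)

standalone_total = sum(interest(b, standalone_deposit, standalone_overdraft)
                       for b in balances.values())          # -285,000
net_position = sum(balances.values())                       # +5,000,000
netted_at_standalone = interest(net_position, standalone_deposit,
                                standalone_overdraft)       # +5,000

netting_benefit = netted_at_standalone - standalone_total   # 290,000
pooled_total = interest(net_position, pool_deposit, standalone_overdraft)
interest_rate_benefit = pooled_total - netted_at_standalone # 20,000

print(f"total pooling benefit: {netting_benefit + interest_rate_benefit:,.0f}")
# -> 310,000 per year, to be allocated over the leader entity and participants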

Multinationals are confronted with a significant burden to comply with these new guidelines. Different hurdles can be identified, ranging from access to the appropriate market data to a considerable and recurring time investment in determining and documenting the internal deposit and withdrawal rates for each pooling structure.

It remains to be seen how auditors will treat these new guidelines, but the recent increased focus on transfer pricing suggests that this topic may need additional attention in the coming years.

Zanders Inside solutions

To support treasury and tax professionals in this area, Zanders Inside has launched its cloud-based Cash Pooling Solution. The solution focuses on each of the three compliance areas described above and leverages a high degree of automation to support the entire end-to-end process, offering a cost-effective alternative to the manual process that multinationals go through. Please watch our video showing how the Cash Pooling Solution tackles the challenge of OECD compliance.

Why treasury projects can fail

October 2020
7 min read

If you want to go fast, go alone. If you want to go far, go together


Even a small project, such as a SaaS cash flow forecasting system implementation, is complex when we take into account that the business, regulatory and technical landscape is constantly evolving, and that in most cases business resources cannot dedicate their time fully to projects. In today's world, project teams also need to respond to change quickly and deliver value as soon as possible, both to the project and to business stakeholders. The challenge is to do that while staying in control of timelines, budget and scope, and, most importantly, while creating quality and value for treasury and the business.

As described in project management theory, a project is typically constrained by three elements: scope, time, and cost. This is called the triple constraint, the iron triangle or the project triangle. The theory is methodology agnostic: it does not matter whether the project is delivered with a waterfall or an agile approach; if one of these elements changes, something else must change.

The theory of the project management triangle is true, but far too simple. Reality differs: working with only these three variables is a limitation when the aim is to deliver a project that also meets the quality criteria defined by the business. In that case, you also need to consider the other variables highlighted in this article, as these will impact delivered quality as well as scope, timelines, and cost.

Variables to consider for a successful project

Based on our experience, we know that all these variables will change and need to be adjusted during any project. However, these changes (to scope, timelines or cost) should come from evolving business requirements, a changing landscape and the additional value they can bring to the business, not from a wrong project approach or from variables, processes and structures that were incorrectly or insufficiently defined at the start of the project.

What we see is that most projects are not delivered within the defined scope, time, and cost due to incorrect project planning, a lack of the right project structure or skillsets, and an unclear approach, not because the business is evolving.

So, how can we use this to manage projects effectively? We know, after all, that conditions inevitably change.

In most cases, the variables in a treasury project don't change in the core functional area. Implementation teams (whether strategic, tactical, or operational) usually have a high level of treasury expertise. They speak the same language as the business, and that helps to structure the scope, the project, and the delivery. The right business engagement allows you to keep the effort, timelines, scope and cost under control.

Typically, it is the areas outside of the treasury functional delivery that undermine delivery within timelines, scope, and cost. There are several reasons for this, including these key factors:

  • Lack of a detailed approach defined at the start of the project
  • Lack of time and focus given to these areas throughout the project
  • Lack of the right skillset (both external and internal) allocated to the project and its specific areas

Support areas

Which areas undermine delivery within timelines, scope, and cost? We call these 'support areas' to the functional treasury delivery team. Precisely because they are considered support areas, there is often a lack of focus, or a lack of treasury-skilled people, when defining the right approach at an early stage.

These support areas are:

  • Project management
  • Business and change management
  • Data management
  • Testing
  • Infrastructure and integration

The way it should work, theoretically, is that these areas support the core treasury delivery. Apart from making sure that their own deliverables support the overall functional delivery, they should take workload and pressure away from the core team, leaving the business free to focus on priority items and risk areas.

What happens in practice is that, for the reasons mentioned above (i.e. not using the right skillsets and not defining the right detailed approach for each of these areas), instead of taking pressure and workload off the core team, these areas require more time and put more pressure on the core functional team, forcing key people to spread their time and focus across both high- and low-priority and risk areas.

Questions you need to ask

Are all the deliverables critical to a successful project? Project empire-building happens in some projects, so review and prioritize deliverables. Non-critical deliverables will impact cost, timelines, and quality, shifting project focus and resources.

Therefore, at the start of a project, we always need to ask ourselves:

  1. Do we have the right approach for each project area (core functional, project management, etc.)?
  2. Is the approach based on similar projects, with a similar landscape, scope and complexity?
  3. Is the approach defined by someone with treasury skills and experience?

An ERP implementation is not the same as a treasury system implementation, since the data required, the processes and the risk areas differ. Integration-wise, the counterparties involved in treasury system projects differ in terms of file formatting and content, where security and encryption are key. From a testing perspective: how can you prioritize test activities if you don't know the difference between trade settlement and trade capture?

The same applies to non-system-related projects. A treasury transformation project does not require data to be loaded into a new treasury management system, but the required reporting (project management dashboards) and the stakeholder mapping and management differ from those of a non-treasury, non-system-related project.

An agnostic approach and agnostic skillsets do not really work. The triangle comes into play when something affects one of its variables. For example, if we need to produce more reports for project management, or have more non-critical deliverables than the time frame allows us to deliver, the project will likely need more resources, or perhaps a scope reduction. Either way, it will certainly impact quality.

Why do projects succeed?

All successful projects have things in common, and these are incorporated in our implementation framework, covering areas such as data, testing, project management and business change.

This article is the first of a series on implementation projects. In the next parts, we will delve into each of the areas mentioned, sharing our insights and explaining the right approach, so that changes to costs, timelines and scope increase the delivered quality and value for the business and stem from evolving business requirements with a clear ROI.
