Integrating systems is an art form. There is no right or wrong way to do this, no standard mechanism that can be applied the world over due to different software, business models and master data architectures. This makes the process of integration inherently difficult in terms of establishing the technical, data and functional design that will deliver a reliable connection between a transactional system and the system of record.
So when tackling the integration of the most critical systems in a commodity trading company, to achieve seamless end-to-end flow, it's going to be an even greater test of patience, process, data quality and personalities.
So, what are a few good practices that can be applied to make an integration initiative more likely to succeed?
Have a workshop to agree why the integration is required. This does not have to be lengthy, but as with all software development, there must be a clear benefits case. This will help all involved to agree why the integration is being undertaken and help to set the initiative back on track if people lose sight of the goal.
Write it down! What do you want the interface to do? Write down the “strategy” of the interface: periodicity, data to be collected, mapping rules, what is allowed and not allowed, and how the transactions will be “treated” or enriched on arrival in the ERP.
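The strategy document tends to be easier to keep honest when its decisions are also captured in a machine-readable form. The sketch below shows one way to express such a strategy as configuration; every field name and value is an illustrative assumption, not a standard schema.

```python
# Illustrative interface "strategy" captured as configuration.
# All entity names, field names and values are assumptions for this sketch.
INTERFACE_STRATEGY = {
    "periodicity": "hourly",                       # how often the interface runs
    "source_entities": ["trades", "settlements"],  # data to be collected
    "mapping_rules": {
        # CTRM field -> ERP field
        "counterparty_code": "vendor_id",
        "commodity_grade": "material_code",
    },
    "allowed_transaction_types": ["PHYS_BUY", "PHYS_SELL"],
    "rejected_transaction_types": ["INTERNAL_TEST"],
    # enrichment applied on arrival in the ERP
    "enrichment": {"posting_period": "derive_from_trade_date"},
}
```

Keeping the written strategy and the configuration side by side makes any drift between the two visible at review time.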
Treat the integration activity as a software development project in its own right. By following a standard software development method, there will be a design, build, test and operational phase. Documenting each phase and explaining the process to all involved will ensure that at least the fundamental design decisions will be part of common understanding.
As part of the workshop and documentation, develop an understanding of the reporting that you intend to get from the system of record. If you are not sending the correct fields from the transaction system, the reports will not be able to detail or aggregate the data that you need for decision making or statutory reporting.
Designing the data flow can be challenging. There may be an enormous amount of data available to choose from, all of which could be carried across the interface. However, the likelihood is that only a subset will be required for reporting values, quantities, and associated categorization. The dimensions or data slicing required in the ERP should drive what flags are sent with the transaction, e.g., Counterparty, Trader, Country of Origin, Grade, etc.
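The "dimensions drive the payload" principle above can be sketched as a simple field filter: only the measures and dimension flags the ERP reports need are carried across, and a trade missing any of them fails loudly. The field names here are assumptions for illustration.

```python
# Hypothetical reporting dimensions and measures required in the ERP.
REPORTING_DIMENSIONS = ["counterparty", "trader", "country_of_origin", "grade"]
MEASURES = ["quantity", "value"]


def build_interface_payload(trade: dict) -> dict:
    """Keep only the fields the ERP's reporting dimensions require.

    Raises ValueError if the source trade lacks a required field, so that
    gaps are caught at the interface rather than discovered in a report.
    """
    wanted = REPORTING_DIMENSIONS + MEASURES
    missing = [f for f in wanted if f not in trade]
    if missing:
        raise ValueError(f"Trade {trade.get('id')} missing fields: {missing}")
    return {f: trade[f] for f in wanted}
```

A deliberate design choice in this sketch: the payload is built from an allow-list, not by stripping a deny-list, so adding a new CTRM field never leaks it into the ERP by accident.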
There is a data quality and behavioural aspect to integrating a set of systems on a company’s estate. Pre-integration, matching of records would be done by hand, allowing human “fuzzy logic” to assess and control how things were mapped. Integration cannot work well with fuzzy logic. Therefore, the quality of data in the origin system and in the target system has to be equal and mapped. This can be a major change to the way in which operational teams share, reference, and talk about data. The language across teams may need to change.
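In code terms, "no fuzzy logic" means an exact-match reference-data map that refuses to guess. A minimal sketch, with wholly invented counterparty names and ERP identifiers:

```python
# Hypothetical master-data map: CTRM counterparty name -> ERP vendor id.
COUNTERPARTY_MAP = {
    "ACME TRADING LTD": "V-10001",
    "GLOBEX COMMODITIES PTE": "V-10002",
}


def map_counterparty(ctrm_name: str) -> str:
    """Exact-match lookup; unmapped values fail loudly rather than guess."""
    try:
        return COUNTERPARTY_MAP[ctrm_name]
    except KeyError:
        raise KeyError(
            f"Unmapped counterparty {ctrm_name!r}: add it to the master data "
            "map; the interface must not fuzzy-match names."
        )
```

The failure path is the point: every rejected lookup is a prompt for the operational teams to align their master data, which is exactly the behavioural change the paragraph describes.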
Standard testing approaches will require the identification of “normal” transactions to be processed. The testing will quite often be slow initially, until the iterative process of mapping is complete. Once a vanilla transaction is processed for each area, the temptation is to stop. This is a mistake. The best testing is yet to come, i.e., that part of the exercise which assesses whether the integration can handle a variety of different scenarios, including those which contain a mistake. So once an erroneous transaction has been processed, can you achieve a reversal with ease? And having reversed it, you need to test the correction transaction as well.
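The error-reversal-correction cycle above can be expressed as a scenario test. The ledger model below is an illustrative assumption; the key idea it demonstrates is that a reversal posts equal-and-opposite entries rather than deleting anything, so the audit trail survives.

```python
# Sketch of a scenario test beyond the "vanilla" trade: an erroneous
# posting, its reversal, and the correction. Ledger model is hypothetical.


def post(ledger: list, trade_id: str, amount: float) -> None:
    """Append a journal-style entry to the ledger."""
    ledger.append({"trade_id": trade_id, "amount": amount})


def reverse(ledger: list, trade_id: str) -> None:
    """Reverse by posting equal-and-opposite entries, never by deleting."""
    for entry in [e for e in ledger if e["trade_id"] == trade_id]:
        ledger.append({"trade_id": trade_id, "amount": -entry["amount"]})


ledger = []
post(ledger, "T-100", 5000.0)   # error scenario: wrong amount keyed
reverse(ledger, "T-100")        # reversal nets the error to zero
post(ledger, "T-100", 500.0)    # correction transaction
```

After the correction, the net position for T-100 is the corrected amount, while the erroneous entry and its reversal both remain visible for audit.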
Finalizing the records for a period requires that a series of “closing” activities are performed, including payables, receivables, accruals, bank reconciliations and the trial balance. Having a calendar of events which clearly states who, what, where, when and how helps to increase visibility of the cycle of activities and the cross-team actions that must take place. The interface needs to be tested as part of this month-end cycle so that it performs its tasks in sequence, as if it had replaced the re-keying of the journal entries to the P&L.
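A month-end calendar of this kind can also be held as structured data, which makes the sequencing testable alongside the interface itself. Task names and owners below are assumptions for the sketch.

```python
# Hypothetical month-end close calendar: who does what, and in which order.
MONTH_END_CALENDAR = [
    {"step": 1, "task": "interface final run",  "owner": "IT"},
    {"step": 2, "task": "payables close",       "owner": "AP team"},
    {"step": 3, "task": "receivables close",    "owner": "AR team"},
    {"step": 4, "task": "accruals posted",      "owner": "Financial control"},
    {"step": 5, "task": "bank reconciliations", "owner": "Treasury"},
    {"step": 6, "task": "trial balance",        "owner": "Financial control"},
]


def close_sequence(calendar: list) -> list:
    """Return task names strictly in step order, however the list is stored."""
    return [t["task"] for t in sorted(calendar, key=lambda t: t["step"])]
```

Note that the interface run sits first in the sketch: the downstream closing activities depend on the transactional postings having landed, which is the sequencing point the month-end test needs to prove.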
This is a short list of the things that can be done to increase the chance of success when connecting a CTRM to an ERP. There is no substitute for experience in this process, especially when it comes to environment management during testing, change management for process re-design and the jargon-busting required to get the CTRM and ERP teams to talk the same language. Once the integration Rubicon has been crossed, there is no going back, as the productivity increases are hard to unlearn and the improvements in the ‘Record to Report’ process become an expected norm.