The Reconciliation Revolution - Gresham Technologies



The Reconciliation Revolution Part Two: Big Data-Driven Insight

June 2018

Bill Blythe Global Business Development Director

The Reconciliation Revolution series:

Part One—The New Era of Post-Trade Processing: The first section addresses the demands of massively increasing data volumes, regulatory requirements, and cost competitiveness, and explains why legacy technology can no longer be effective.

Part Two—Big Data-Driven Insight: Data is no longer a means to an end, but the centre of a firm’s value. That means data integrity must be at the forefront of design. As post-trade evolves, reconciliation must not only move with it, but feature design and implementation that are strategic and data-centric.

Part Three—The New Toys on the Block: Reconciliation may be viewed as “old fashioned”, but disruptive technologies are poised to influence this and other areas of post-trade significantly. In Part 3 we tackle NLP, AI, machine learning, and RPA, as well as preparing for blockchain.

Big Data-Driven Insight

The new post-trade landscape augurs nothing less than a paradigm shift for reconciliations. In the midst of this shift, banking institutions should reimagine their relationship with data—beginning with maximizing the value, and minimizing the cost, of their data estate. As discussed in Part 1, to do this firms must address new ways they consume, process, govern, and ultimately deploy their transaction data. Inevitably, reconciliation technology platforms must improve, as well.

Achieving ‘Data integrity and control’ is the natural next step.

A key part of this change is the way reconciliation is viewed functionally within the bank. Of course, traditional steps within reconciliation—harnessing the data, performing matching, identifying and resolving exceptions, compiling reports—are still present, and indeed necessary. But in today’s environment, simply running this process to a static end-result, and then moving on, is insufficient.

Instead, data integrity demands we rise to a higher standard. The concept requires a platform that processes data more quickly and consistently, with connectivity to a wider array of systems. It needs to do this while caching underlying computational elements, ensuring data lineage, and providing built-in analytics that allow personnel to flexibly slice and dice the results.

The platform needs to fit within the internal enterprise data management (EDM) frameworks many banks have now developed, and help meet global data governance standards like the BCBS 239 Principles.

Finally, it should be designed for the future, as we speak to in Part 3 of this guide.

Transforming reconciliations at a global bank is innately complex and at times chaotic. Wrangling rogue spreadsheets, rationalising legacy systems, building out a new tradebook, catching desks up with new regulation—often, a combination of these varied projects and tasks will be involved as firms embark upon this change. Robust technology combines with implementation expertise to achieve data integrity effectively.

The right platform will enable firms to rapidly and iteratively achieve control over their data with crucial benefits:

• regulatory compliance

• elimination of loss events

• match-rate efficiencies leading to operational savings

• precise visibility into trading across the bank

• readiness that enables scalability and faster innovation.

As we’ve seen in the past, without the right partner these efforts can prove extremely costly, create a new set of problems and set a bank back for years.

At Gresham, we believe this forward-thinking approach—one born of rich experience with tier-one banking clients, and which rightly perceives your data as the bedrock of the enterprise—is required today. Here is how we do it.



Orders of magnitude different than 10 years ago

Designing and implementing a new data integrity platform begins with a handful of priorities: speed to market, automation, risk reduction, and generation of new insights. Early on, the process must document and deliver on firm-specific objectives, adhere to quantitative performance measurables, and use innovation to avoid the pitfalls that have historically beset reconciliation systems.

Clareti was designed to be deployed rapidly with all of these needs in mind.

The universe of requirements is expanding as firms access more diverse data feeds, introduce more complex, unstructured datasets in trading applications, and see expectations for post-trade continue to build. A wider array of activities now demands data integrity. At tier-one institutions, the two most common and onerous applications for data integrity—multiple, or n-way, intersystem trade reconciliations, and proactive prevention of regulatory risk—are cases in point. In both areas, an order-of-magnitude difference in performance is typically required to justify spend and meet requirements. Today’s technology must therefore go well beyond ordinary expectations.

Each will require different elements and expertise in implementation, given its end-state objectives.

Take one illustration: a tier-one Swiss bank recently worked with Gresham on a replacement for an in-house, n-way reconciliation system that was buckling under the pressure of ingesting data from 4,000 feeds. The old platform limited controls onboarding capability to five to ten per month, provided no intraday matching, and had poor auditing capability.

End-of-life risks for the platform’s hardware and software created an immediate need to act. Implementing Clareti improved all of these areas, greatly increasing performance and volume capacity while decreasing operating costs to the tune of $9.25 million in savings. The bank can now complete 60 to 70 controls per month (with over 1,500 done to date) across a wide range of cash equities, exchange-traded derivatives (ETDs), foreign exchange and money market (FXMM) instruments, several over-the-counter (OTC) bilateral and centrally-cleared instruments, and numerous other data streams.


Tier-one Swiss Bank Use Case

Needed a replacement for an in-house, n-way reconciliation system

Existing system was buckling under the pressure of ingesting data from 4,000 feeds

Clareti was implemented, improving performance and volume capacity and decreasing operating costs by $9.25 million

“Gresham demonstrated an innovative and exciting technology approach and a clear vision for the future”

Anne Collard, Global Head of Product & Channel Management, Payments & Cash Management, ANZ Bank

Meanwhile, another client—a tier-one Dutch institution—was looking to rapidly improve its validation processes for ETD regulatory risk controls, involving multiple internal systems and dozens of exchange data feeds from London, New York, and Amsterdam. Its solution was unable to cope with high incoming data volumes and with non-standardised controls and formats for the ETDs. As a result, the firm had reached a standstill, taking more than seven months—or 220 days—on average to onboard a single exchange. By moving to a single system with Gresham, the bank greatly reduced its regulatory risk and can now onboard two exchanges per day, putting its full target of 85 exchange feeds within reach.

From ‘what’ to ‘how’

As these client cases show, reconciliation needs are more bespoke than ever. But as the stakes and scope of technology transformation grow larger, three implementation requirements are universally shared as banks seek to catch up to the new normal for post-trade:

• Automation-assisted design and configuration

• Rapid pre-production and deployment

• Immediately available analytics focused on both transactions and platform performance

It is worth speaking to each of these in turn, as together they form the foundation for a successful data integrity and control framework.

Design and configuration

To start with, integrating dozens of data feeds and datasets into a single reconciliation—the system configuration—cannot be done well without forethought in design. Likewise, it cannot be done efficiently without a high level of automation. This holds true from end to end: from initial schema analysis and mapping rules, through to user interface flexibility and documentation.

Figure 1 below shows a typical reconciliation deployment cycle on Clareti, prior to systems integration testing (SIT), user acceptance testing (UAT), and project-specific tasks required before onboarding the reconciliation into production.


Figure 1 – Clareti deployment cycle. Steps and indicative timings: data acquisition and ETL process (0 minutes); configure file upload (10 minutes); design and configure matching rules (5 minutes); user interface design (30 minutes); fine tuning of matching rules (1 hour); SIT (2 hours).

Tier-one Dutch Institution Use Case

Looking to improve its validation processes for ETD regulatory risk controls

Existing solution was unable to cope with incoming data volumes and non-standardised controls

With Gresham, the bank reduced regulatory risk and can now onboard two exchanges per day, toward its full target of 85 exchange feeds

Clareti Deployment

The nimble timing of these steps is down to proven, innovative automation—technologies known collectively as the Clareti Onboarding Accelerator. The time taken to import, map, enrich, and persist the data feeds, intelligently suggest the appropriate matching rules, and design the user interface (UI) is reduced significantly—often doing in a matter of hours what once took weeks. These steps to configuration include:

Eliminating ETL: History dictates that a bank must first quell the seemingly interminable technical headache of ETL.

Most traditional vendor solutions require manual creation of data schemas, which means normalising the data up front. This, in turn, requires lengthy manual analysis of the data and a business requirements document, produced to give the technical resources (who manually create the schemas) the information they need. Once the schema has been created, the data must be normalised for loading into it. This work is typically done by separate ETL teams within the bank—a costly and time-consuming requirement. Because ETL teams transform data for all systems across the bank, competing projects all hope to get their requirements to the top of the queue.

With Clareti, the Accelerator analyses the data, determines the data types, and automatically creates the data schemas for persistent storage. Simply put, it eliminates ETL as we know it.
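To make the principle concrete, here is a minimal sketch of automated schema inference—sampling a delimited feed and proposing a column-to-type mapping for persistence. It illustrates the idea only and is not Gresham’s implementation; the function names, sample size, and date format are assumptions.

```python
import csv
from datetime import datetime

def infer_type(values):
    """Infer the narrowest type that fits every sampled value in a column."""
    if not values:
        return "string"
    for cast, name in ((int, "integer"), (float, "decimal")):
        try:
            for v in values:
                cast(v)
            return name
        except ValueError:
            pass
    try:
        for v in values:
            datetime.strptime(v, "%Y-%m-%d")  # assumed date format
        return "date"
    except ValueError:
        return "string"

def infer_schema(feed_path, sample_rows=100):
    """Sample a delimited feed and propose a column -> type persistence schema."""
    with open(feed_path, newline="") as f:
        reader = csv.DictReader(f)
        rows = [row for _, row in zip(range(sample_rows), reader)]
        columns = reader.fieldnames or []
    return {col: infer_type([r[col] for r in rows if r[col]]) for col in columns}

# Example (hypothetical feed): infer_schema("trades_feed.csv")
# -> {"trade_id": "string", "trade_date": "date", "amount": "decimal", ...}
```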

Matching Rule Suggestion: Next, Clareti matching rules are automatically presented to the configurer for selection. Clareti’s heuristic engine scans the data, determining matching possibilities. Selected matching rules can then be applied to the data and the results viewed in the UI before committing to the configuration. The results are completed and available in real time.

By contrast, other vendor solutions require matching rules to be configured manually. Complex decisions must be made following intensive manual analysis of the data, and the results of the coded rules cannot be viewed until the configuration is promoted to live running and loaded with data. Even then, the match results cannot be viewed until the UI is configured.
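Clareti’s heuristic approach can be sketched in miniature: scan two feeds and propose the column pairs whose values overlap strongly as candidate match keys for the configurer to accept or reject. The toy below illustrates the idea under those assumptions; it is not Clareti’s engine.

```python
def suggest_match_keys(feed_a, feed_b, threshold=0.9):
    """Suggest column pairs whose value sets overlap strongly across two
    feeds -- candidate matching rules a configurer can accept or reject.

    feed_a / feed_b: lists of dicts, one dict per record.
    """
    suggestions = []
    for ca in feed_a[0].keys():
        vals_a = {rec[ca] for rec in feed_a}
        for cb in feed_b[0].keys():
            vals_b = {rec[cb] for rec in feed_b}
            overlap = len(vals_a & vals_b) / max(len(vals_a), 1)
            if overlap >= threshold:
                suggestions.append((ca, cb, round(overlap, 2)))
    # Highest-overlap pairs first: the most likely matching rules.
    return sorted(suggestions, key=lambda s: -s[2])

# Hypothetical ledger and statement feeds:
ledger = [{"trade_id": "T1", "amount": "100.00"},
          {"trade_id": "T2", "amount": "250.50"}]
statement = [{"ref": "T1", "value": "100.00"},
             {"ref": "T2", "value": "250.50"}]
print(suggest_match_keys(ledger, statement))
# [('trade_id', 'ref', 1.0), ('amount', 'value', 1.0)]
```

The proposals would then be applied to the data and confirmed or discarded in the UI before the configuration is committed.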

User Interface Creation: As a third step, the UI is automatically produced and displayed to the configurer, who can drag and drop columns into the order they require. The dynamic, modifiable UI can be changed by users at run time, with individual user preferences stored. This avoids delays and frequent revisiting by the reconciliation development team when subsequent UI changes are requested, and is ever more crucial as different personnel seek to access and analyse the data relevant to them on the platform.

Automated Documentation: The final step in Clareti’s automated reconciliation deployment is the production of the discrete reconciliation configuration. The configuration is produced as an XML-based document and is easily safe-stored in a source control system, from which full business documentation is readily derived as a human-readable configuration file. If changes are made, the documentation is automatically updated, ensuring it never goes stale. This functionality reflects the growing demand for comprehensive, accurate, and automatically updated audit control information.

This is a surprisingly problematic area for traditional providers. The configuration for reconciliations is buried deep within the product and is often difficult, if not impossible, to extract. Documentation must be written in advance of the configuration, and any changes to the specification of the reconciliation must be manually carried through to it. This area is often overlooked and poorly maintained, resulting in a mismatch between the actual configuration and the documentation that causes additional issues down the line.
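The principle—derive the human-readable documentation from the machine-readable configuration, never maintain the two separately—can be illustrated in a few lines. The XML element names below are invented for the example and are not Clareti’s actual configuration format.

```python
import xml.etree.ElementTree as ET

# Hypothetical shape of a discrete reconciliation configuration file.
CONFIG = """
<reconciliation name="cash-vs-ledger">
  <feed source="nostro_statement.csv"/>
  <feed source="internal_ledger.csv"/>
  <rule left="ref" right="trade_id" type="exact"/>
  <rule left="value" right="amount" type="tolerance" tolerance="0.01"/>
</reconciliation>
"""

def document(config_xml):
    """Derive a human-readable summary from the machine-readable config, so
    the documentation is regenerated on every change and never goes stale."""
    root = ET.fromstring(config_xml)
    lines = [f"Reconciliation: {root.get('name')}"]
    lines += [f"  Feed: {f.get('source')}" for f in root.findall("feed")]
    for r in root.findall("rule"):
        detail = f" (tolerance {r.get('tolerance')})" if r.get("tolerance") else ""
        lines.append(f"  Match {r.get('left')} to {r.get('right')}, "
                     f"{r.get('type')}{detail}")
    return "\n".join(lines)

print(document(CONFIG))
```

Because the documentation is a pure function of the configuration, a change to one can never silently diverge from the other.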



Rapid speed to market

If automation is the watchword for implementation design, then the next steps—pre-production testing and deployment—are all about efficiency.

Pre-Production: Often overlooked and underestimated, pre-production testing is frequently the cause of project delays and spiralling costs. In part because of prudent configuration (as above), Clareti vastly reduces the need for lengthy testing—enabling controls to be deployed more quickly and testing costs to be contained.

Traditional systems rely on the reuse of rules for efficiency. Down the line, however, this choice has a major knock-on effect: regression testing that can span many weeks. Because these rules are shared rather than self-contained, the testing of an individual reconciliation actually becomes the testing of all reconciliations that share its rules. As a result, unit testing is only the first step in determining whether the reconciliation operates as intended. Even after confirmation, it is still necessary to regression-test the entire reconciliation population, ensuring that changes made to support the latest reconciliation have not caused new problems with those that already existed. To minimise this cost, new reconciliations tend to be batched, leading to significant delays in deploying them and making them productive.

Clareti is designed to remove this full regression testing. All components of the reconciliation are self-contained (and automatically notated in the full configuration documentation), and reconciliations can be promoted with confidence, knowing that a newly deployed reconciliation cannot contaminate others. Ultimately, this saves both time and money.
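As a rough illustration of this design choice, imagine each reconciliation modelled as a self-contained object that owns its feeds and rules outright, so testing or deploying one can never touch another. The names here are hypothetical, not Clareti’s object model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Reconciliation:
    """A self-contained reconciliation: its feeds and rules travel together,
    so changes to this one can never affect any other reconciliation."""
    name: str
    feeds: tuple
    rules: tuple  # every rule belongs to this reconciliation alone

def test_in_isolation(recon, record_a, record_b):
    """Unit-test one reconciliation against sample records; with no shared
    rule library, there is nothing else to regression-test."""
    return all(rule(record_a, record_b) for rule in recon.rules)

# A hypothetical rule and reconciliation; testing it touches nothing else.
amount_matches = lambda a, b: a["amount"] == b["value"]
recon = Reconciliation("cash-vs-ledger", ("nostro", "ledger"), (amount_matches,))
print(test_in_isolation(recon, {"amount": "100.00"}, {"value": "100.00"}))  # True
```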

Deploying to Production: Having gone through configuration and pre-testing, the last thing a bank wants is more risk and cost as it takes a new platform live. Unfortunately, if your process is fraught with complexities from the beginning, these same complexities will also rear their heads at its end.

As mentioned earlier, because many legacy vendors focused on traditional Nostro and Depot reconciliations, they are limited by fixed and rigid data models designed for SWIFT messages, and their ability to react quickly and deploy new controls efficiently is severely constrained. They require complex exporting and importing of configuration, and the promotion of the reconciliation cannot be completed without a full system restart—which also results in downtime.

Clareti’s discrete configuration file, by contrast, can simply be copied into the new environment. The promotion of a reconciliation from final UAT to production is completed in under a minute, with no system downtime required.
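Conceptually, promotion then reduces to copying one artefact and asking the running engine to pick it up. A minimal sketch, assuming a hypothetical `load` hook exposed by the live platform:

```python
import shutil
from pathlib import Path

def promote(config_file, prod_dir, load):
    """Promote a reconciliation from UAT to production by copying its discrete
    configuration file and hot-loading it -- no system restart, no downtime.

    `load` stands in for whatever callable the running platform exposes to
    register a new configuration; it is an assumption of this sketch."""
    target = Path(prod_dir) / Path(config_file).name
    shutil.copy2(config_file, target)  # the config file is the whole artefact
    load(target)                       # picked up by the live system
    return target

# Usage (hypothetical engine object):
# promote("cash-vs-ledger.xml", "/opt/recon/prod/configs", engine.load_config)
```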


“Gresham’s Clareti solution really stood out for its ability to deploy new reconciliations quickly and cost effectively. The Gresham team understood our needs from the outset and is working closely with us to help us gain maximum benefit from the roll out of Clareti including enhanced governance and control”

David Worsfold, Head of Operations, CMC Markets


Predictive analytics

The last cornerstone of an effective data integrity platform is its analytics—the insights that create the real return on investment.

Today, it is simply not enough to invest, stand up a new reconciliation (or even an individual control), and call the job done. These tools are required, not preferred—and they should be available immediately. As market structure, trading patterns, and client needs evolve, measurable insight into post-trade processes becomes all the more crucial, and the potential to glean meaningful information from terabytes of transactional data increases significantly. Your reconciliation platform—pulling together enterprise-wide controls and processes, and potentially hundreds of millions of transactions per day—should be the first place to look.

These insights should reflect both business activities and the platform’s performance. Front-office personnel need advanced analytics that help users slice and dice the data output; for example, by country exposure (see figure 2) or to discover transactional and balance trends. At the same time, operations, finance, and compliance personnel will look for a view into the progress of the reconciliation itself, e.g. ageing and match rates (see figure 4); and business analysts will ultimately deploy this information to form predictions and make decisions going forward. Clareti Analytics provides real-time data visualisation and embedded tools, including an intuitive dashboard interface and generation of dynamic reports in seconds that can be read on tablet, phone, or desktop.
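To give a feel for the kind of slicing involved, the sketch below computes per-reconciliation match rates and an ageing profile of open exceptions from a hypothetical extract, using pandas; the column names and age buckets are assumptions for illustration, not Clareti’s schema.

```python
import pandas as pd

# Hypothetical exception extract: one row per reconciliation item.
items = pd.DataFrame({
    "recon":    ["cash", "cash", "depot", "depot", "depot"],
    "matched":  [True, False, True, True, False],
    "age_days": [0, 12, 1, 0, 35],
})

# Match rate per reconciliation (%) -- the operations view.
match_rate = items.groupby("recon")["matched"].mean().mul(100).round(1)

# Ageing profile of open exceptions -- where risk is accumulating.
buckets = pd.cut(items.loc[~items["matched"], "age_days"],
                 bins=[0, 7, 30, 365],
                 labels=["<1 week", "1-4 weeks", ">1 month"])
ageing = buckets.value_counts().sort_index()

print(match_rate)
print(ageing)
```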

Indeed, analytics and presentation will only grow more important as disruptive new technologies begin to augment post-trade, and reconciliation becomes less and less a “black box” buried within the firm. As we discuss in Part 3 of this guide, the ability to harness these news-making tools effectively—artificial intelligence (AI) applications, natural-language processing (NLP), blockchain, deep learning, and distributed ledgers, among others—comes back to a fundamental and decisive choice: to pursue data integrity.

Figure 2: Data output by Country

Sign up for The Recs Revolution Series

UK +44 (0)20 7653 0222

Europe +352 691 358 277

North America +1 646 943 5955

Asia Pacific - Singapore +65 6832 5166

Asia Pacific - Sydney +61 (0)2 8514 7007

greshamtech.com @greshamtech

Contact us to learn more.