By Peter Lawrey, CEO at Chronicle Software.
Regulated trading systems need to record the reasons behind every decision they make, with minimal impact on performance. Chronicle Software has worked with investment banks and hedge funds to deliver trading services with a 'record everything' model and low-microsecond latencies.
Today’s more stringent regulatory environment is placing greater emphasis on monitoring firms’ trading in global capital markets. Reporting alone is no longer enough; the emphasis is on capturing data pertaining to all aspects of the trading workflow.
This trend is presenting trading technologists and compliance teams with a major challenge. Given the vast volumes of data involved in today’s increasingly electronic markets, capturing, storing and easily querying all the data regulators may require is no trivial task. Recent regulations like the EU’s MiFID II have greatly expanded the requirement for data capture, while increasing the length of time firms need to store data in case of regulatory enquiry.
While regulators have been more prescriptive with respect to the trading data they require firms to capture and store, financial institutions have recognized the need to implement data monitoring and capture infrastructures that are capable of handling not only all expected requirements but also ‘unknown unknowns’. And they need to achieve this while ensuring traceability and reproducibility, all without any diminution in system performance. It’s no mean feat.
Best execution is one area where new regulatory requirements are forcing firms to upgrade their approach to data capture, storage and accessibility. The EU's MiFID II – which came into force in 2018 and expanded its predecessor's remit to include non-equity securities – and the US's Dodd-Frank Act of 2010 have each raised the bar with respect to trade data capture, requiring firms to up the ante on the range and granularity of data they handle.
MiFID II’s best execution requirement holds firms to far more rigorous standards relating to quality of execution than its predecessor, requiring firms to take ‘sufficient steps’ to ensure best execution for clients. This ‘sufficient steps’ requirement creates the obligation for firms to collect and make available to regulators any and all data pertaining to how a given trade was executed. Firms are required to maintain histories of this data going back as far as seven years.
The regulation specifies that firms must retain records relating to the entire trade lifecycle and be able to reconstruct transactions upon request. This drives the requirement for highly granular time-stamping of order and transaction data, as well as market data that may have impacted the decision on whether to make the trade. Along with MiFID II’s broader application to non-equity instruments – potentially adding a further 15 million OTC securities – these requirements present firms with the task of capturing a vast pool of information to cover all regulatory eventualities as well as ‘unknown unknowns’.
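As an illustration, the granular time-stamping described above might look like the following minimal Java sketch. The `TimestampedOrder` class and its fields are hypothetical, not Chronicle's API; the idea is simply that each order event carries a wall-clock timestamp held as nanoseconds since the epoch, so records can later be ordered and reconstructed at fine granularity.

```java
import java.time.Instant;

// Minimal sketch (hypothetical class, not a Chronicle API): each order event
// carries a wall-clock timestamp stored as nanoseconds since the epoch.
// Note that the resolution of Instant.now() is platform-dependent and is
// often microseconds rather than true nanoseconds.
public class TimestampedOrder {
    final String orderId;
    final long epochNanos;

    TimestampedOrder(String orderId, Instant when) {
        this.orderId = orderId;
        this.epochNanos = toEpochNanos(when);
    }

    static long toEpochNanos(Instant when) {
        return when.getEpochSecond() * 1_000_000_000L + when.getNano();
    }

    public static void main(String[] args) {
        TimestampedOrder order = new TimestampedOrder("ORD-1", Instant.now());
        System.out.println(order.orderId + " @ " + order.epochNanos + " ns");
    }
}
```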
This presents a raft of technological challenges if firms are to avoid damaging the performance of their highly tuned execution systems. In most cases, any attempt to log all trading messages can have a detrimental effect on trading system performance. As a result, developers of high-performance trading systems often turn off logging as it tends to slow things down.
Even where firms are logging, capturing and storing their trading messages – and monitoring the latency of their trading infrastructures to demonstrate compliance with MiFID II's system-performance requirements – existing solutions often need a special decoder to extract meaningful data. This makes responding to regulators' enquiries difficult and expensive; the decoding effort may cost more than the original data-capture system.
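One way around the decoder problem is to journal messages in a self-describing, human-readable form, so that an analyst years later can read a record without bespoke tooling. Here is a minimal sketch in Java; the `SelfDescribing` class and its output format are illustrative assumptions, not a description of any particular product, and a real platform would also offer an equivalent binary encoding for speed.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch (hypothetical class): messages are journaled as
// human-readable key/value text, so no special decoder is needed to read
// the audit trail years later.
public class SelfDescribing {
    static String encode(String event, Map<String, Object> fields) {
        StringBuilder sb = new StringBuilder(event).append(" {");
        fields.forEach((k, v) -> sb.append(' ').append(k).append(": ").append(v).append(','));
        if (!fields.isEmpty()) {
            sb.setLength(sb.length() - 1); // drop the trailing comma
        }
        return sb.append(" }").toString();
    }

    public static void main(String[] args) {
        Map<String, Object> fields = new LinkedHashMap<>();
        fields.put("symbol", "VOD.L");
        fields.put("qty", 250);
        System.out.println(encode("newOrder", fields));
    }
}
```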
In some extreme instances, it may be impossible to go back to older records for analysis, because firms cannot identify which subsets are needed or extract those data sets once identified. If you have been recording data for two years and no one has checked it, how do you know it's okay? How can you cut it down to make it manageable enough to analyse? In the worst case, these obstacles mean the proper analysis never gets done, placing the firm at risk of non-compliance.
To avoid all this, what's needed is a deterministic trade messaging platform that records every message entering and leaving the system. The platform should combine full audit capability with latency monitoring and robust failover.
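The 'record everything' pattern can be sketched as follows – a hypothetical `RecordingEngine` (not Chronicle's actual API) that journals every inbound message before the business logic sees it, and every outbound message as it is produced, so the journal alone forms a complete audit trail. A real system would persist the journal to memory-mapped files; a `List` stands in here.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Hypothetical sketch of the "record everything" pattern: every message
// entering or leaving the engine is appended to a journal, input before
// processing and output as it is produced. An in-memory List stands in
// for the durable journal a real platform would use.
public class RecordingEngine {
    final List<String> journal = new ArrayList<>();
    final Function<String, String> handler; // the business logic, e.g. order matching

    RecordingEngine(Function<String, String> handler) {
        this.handler = handler;
    }

    String onMessage(String inbound) {
        journal.add("IN:" + inbound);        // journal the input first
        String outbound = handler.apply(inbound);
        journal.add("OUT:" + outbound);      // journal the decision/output too
        return outbound;
    }

    public static void main(String[] args) {
        RecordingEngine engine = new RecordingEngine(msg -> "ACK(" + msg + ")");
        engine.onMessage("NewOrder:AAPL:100");
        System.out.println(engine.journal);
    }
}
```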
At the heart of this approach is traceability – the ability to record everything the trading application is doing. Most market solutions today record output only, and struggle to ensure that duplicated data does not affect the functionality of the application. By implementing a deterministic system that eliminates duplicates and reduces redundant information, firms can ensure complete auditability with no loss of performance.
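Determinism is what turns a message journal into reproducibility: if the handler has no hidden inputs (clocks, randomness, unlogged state), replaying the journaled inputs regenerates the original outputs exactly – the basis of after-the-fact trade reconstruction. A minimal illustrative sketch, with hypothetical names:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Illustrative sketch: running a deterministic handler over journaled
// inputs yields the same outputs every time, whether live today or in an
// audit months later.
public class Replay {
    static List<String> replay(List<String> journaledInputs, Function<String, String> handler) {
        List<String> outputs = new ArrayList<>();
        for (String in : journaledInputs) {
            outputs.add(handler.apply(in));
        }
        return outputs;
    }

    public static void main(String[] args) {
        Function<String, String> handler = msg -> "FILL(" + msg + ")";
        List<String> live = replay(List.of("Order-1", "Order-2"), handler);
        List<String> audit = replay(List.of("Order-1", "Order-2"), handler); // re-run later
        System.out.println(live.equals(audit)); // outputs match on replay
    }
}
```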
The benefits of taking this approach extend beyond traceability, reproducibility and auditability. These fundamental building blocks combine to slash time to market for firms seeking to address best execution or indeed any regulatory imperative involving data capture and audit.
Because this kind of platform won't function if data is missing, firms can have confidence in any data coming through the system, whatever the throughput. This inherent completeness means analysts never have to go back and fill gaps in the data – as they typically must with available commercial systems – an exercise that is expensive, difficult and in some cases impossible.
This deterministic approach allows firms to assess why trading decisions were taken at the time of execution. That requires detailed logging of the data relating to all possible scenarios, which can mean exhaustively examining a vast amount of information. Being able to show why particular courses of action were not pursued allows firms to demonstrate to regulators their intent to fulfil their best execution obligations.
Logging all data with confidence also helps future-proof against unforeseen circumstances. In essence, the deterministic approach offers insurance against the 'unknown unknowns' that go beyond any foreseeable regulatory query and could even pose existential threats to the firm. Finally, this confidence in the data translates into more confidence to pursue business opportunities that may previously have been deemed too risky to consider.
At Chronicle, we’ve tested our deterministic messaging platform with bursts of 128 GB of data at over 100 million messages per second. This gives our clients confidence not only in the traceability, reproducibility and auditability of their data – meeting MiFID II’s best execution obligations and other regulatory requirements – but also the peace of mind that their platforms will perform as advertised when the proverbial stuff hits the fan.
If you are dealing with millions of updates per second and need to get your systems in order to meet your best execution or other regulatory audit obligations, you should be talking to us.