" class="no-js "lang="en-US"> EXCLUSIVE: ‘A short history of financial time’ – James Beeken, Cisco; Leon Lobo, NPL and Hugh Cumberland, Txtsmarter in ‘The Fintech Magazine’ - Fintech Finance
Thursday, March 28, 2024

EXCLUSIVE: ‘A short history of financial time’ – James Beeken, Cisco; Leon Lobo, NPL and Hugh Cumberland, Txtsmarter in ‘The Fintech Magazine’

Regulators require timestamping of trading data. But how do we know it’s accurate? We asked three experienced clock watchers – Cisco’s James Beeken, Leon Lobo at NPL and Txtsmarter’s Hugh Cumberland

James Beeken, Cisco

Time might be a human construct – a convenient way for our brains to create a framework for our lives – and Einstein might have planted the notion that it’s entirely relative.

But when it comes to financial services, being able to track it accurately, down to a quadrillionth of a second, is both a regulatory requirement and, in parts of the industry, a distinct competitive advantage.

James Beeken, product specialist for the ultra-low-latency product range at Cisco, which provides the physical network architecture to facilitate trading through the world’s exchanges, explains: “You have two things going on in the market. One is the regulatory obligation to ensure the market operates fairly, by the rules. It achieves that by obliging all trading entities to have the ability to reference back to a universal clock source and time-stamp activity to a defined degree of accuracy. That data has to be stored for a considerable time. If a market event occurs that requires investigation – a flash crash, for example – the regulator can go to all the parties that were in and around that incident, access their information and recreate the scenario to understand exactly what happened.

“But trading organisations – our client base, including banks and high-frequency traders (HFTs) – that are looking to eke more margin out of market opportunity, also need to understand how their server infrastructure, strategy, and therefore their overall business, is performing. To do that, they need to monitor and analyse their network down to a picosecond level of detail. In recent years, the network has progressed from the millisecond to the microsecond, nanosecond and now picosecond realm of analytical requirement. Not only do we need to know exactly how our current live networks are performing, we also need to be able to understand the exact impact of change, of new bits of hardware, firmware and strategy.”
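
The kind of analysis Beeken describes starts with pairing ingress and egress timestamps captured in hardware and summarising the resulting latency distribution. Here is a minimal sketch in Python of that summarisation step, with hypothetical sample data standing in for a real capture feed (the names and figures are ours, not Cisco’s):

    # A minimal sketch: per-hop latency from matched hardware timestamps.
    # The (ingress, egress) pairs below are hypothetical sample data, in
    # picoseconds since an arbitrary epoch; a real capture feed delivers
    # millions of such pairs per second.
    from statistics import median, quantiles

    pairs = [
        (1_000_000, 1_000_412),
        (2_000_000, 2_000_398),
        (3_000_000, 3_000_455),
        (4_000_000, 4_000_401),
        (5_000_000, 5_000_389),
    ]

    latencies_ps = [egress - ingress for ingress, egress in pairs]
    p50 = median(latencies_ps)
    p99 = quantiles(latencies_ps, n=100)[98]  # 99th-percentile cut point

    print(f"hop latency: median {p50} ps, p99 {p99:.0f} ps")

Running the same summary before and after a hardware or firmware change is what lets a firm quantify the “exact impact of change” Beeken refers to.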

Being able to time-stamp a data exchange, with counterparties maybe many thousands of miles apart – indeed, in different time zones – presupposes both have accurate clocks with which to record it.

Leon Lobo, NPL

That’s where the UK’s National Physical Laboratory (NPL) and similar guardians of the international timescale (also known as the Co-ordinated Universal Time standard, or UTC) come in. The NPL is the keeper of UK time – arbiter of the country’s definitive second since UTC replaced Greenwich Mean Time (GMT) as the international standard of civil time in 1972.

The NPL uses caesium fountain atomic clocks and primary frequency standard apparatus to realise the internationally accepted scientific definition of a second.

“Currently, the caesium fountains are accurate and stable at the one part in 10 to the 16 level, so the 16th decimal place,” says Leon Lobo, head of the National Timing Centre at the NPL. “But we are developing the next generation of clocks, which are accurate and stable at one part in 10 to the 18, the 18th decimal place. That’s vital to develop commercial devices at picosecond or femtosecond [quadrillionth of a second] level.”
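
To put those figures in context, a back-of-the-envelope bound on daily drift (our arithmetic, not NPL’s): a clock with fractional frequency stability of one part in 10 to the 16 can drift by at most

    \Delta t_{\max} = 86\,400\ \mathrm{s} \times 10^{-16} \approx 8.6\ \mathrm{ps}

per day, while the next-generation clocks at one part in 10 to the 18 tighten that bound to roughly 86 femtoseconds per day.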

He’s acutely aware how the world’s financial system relies on keeping that UK second ticking in synchronicity with those of other UTC labs.

“MiFID II RTS 25 talks about time-stamp traceability for all reportable events to UTC, which is formulated from data submitted monthly by all UTC labs around the world,” explains Lobo. “UTC NPL is the UK’s national timescale, and UTC USNO is the US Naval Observatory’s, which feeds the GPS constellation for time and positioning. All of these national labs are delivering the time, whether directly over the internet, over RF broadcast, or via GNSS constellations, like GALILEO, GPS, GLONASS and BEIDOU. It’s not about how an organisation receives it, though – GPS receivers or direct feeds from a national lab – but about being able to demonstrate traceability of the time-stamp for regulatory compliance. It is incredibly important to consider the entire chain, back to source. And without a common source, it’s incredibly difficult for regulators to unpick who did what when.

“If you have an infrastructure in one datacentre, and an infrastructure in another datacentre, and you’re time-stamping all this activity, those time-stamps have to be relevant to each other,” he continues. “Organisations go to GPS signals, or direct feeds from organisations like NPL, to look at the whole business performance, across the estate, globally, and understand exactly what that performance is.”
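
The arithmetic underlying that cross-site comparison is the classic two-way time-transfer exchange on which protocols such as NTP and PTP are built. A minimal sketch, assuming software timestamps and hypothetical values; production set-ups use hardware timestamping against a traceable source such as NPL’s direct feed:

    # A minimal sketch of two-way time transfer between sites A and B.
    # t0: request leaves A (A's clock)   t1: request arrives at B (B's clock)
    # t2: reply leaves B (B's clock)     t3: reply arrives at A (A's clock)
    def clock_offset_and_delay(t0, t1, t2, t3):
        """Return (offset of B's clock relative to A's, round-trip delay)."""
        offset = ((t1 - t0) + (t2 - t3)) / 2
        delay = (t3 - t0) - (t2 - t1)
        return offset, delay

    # Hypothetical timestamps in seconds: B's clock runs ~50 us ahead of A's.
    offset, delay = clock_offset_and_delay(0.000000, 0.000150, 0.000170, 0.000220)
    print(f"offset {offset * 1e6:.0f} us, round-trip delay {delay * 1e6:.0f} us")

The offset estimate assumes the outbound and return paths are symmetric, which is exactly why calibration and monitoring of the distribution chain matter so much.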

“It’s also important to know if you are getting the signal you should be, at the right time. That’s where calibration and monitoring come in,” adds Lobo.

Accurate internal infrastructure, right through to the time-stamping engine, is what Cisco and other providers in this sphere, like Txtsmarter, strive for every day because, as Lobo points out, without it “you can lose all your traceability. Then your uncertainty balloons. You could be telling it to time-stamp a few milliseconds later, and, given the regulatory requirement for high-frequency trading, which is 100 microseconds, you’re not compliant.”

Hugh Cumberland, Txtsmarter

Not compliant and, potentially, out of business, says Hugh Cumberland, director for the UK and EMEA at Txtsmarter, an enterprise mobile-communications compliance management service. It addresses the requirement for companies to put in place uninterrupted retention of iMessage, Android, WhatsApp, SMS, MMS and social media communications.

“I know of one HFT that, during night trading, used an incorrect algorithm for 45 minutes, and that bankrupted it. So, there is a lot at stake.

“In capital markets and financial services, it’s all about risk and how much you could lose if you are working off an incorrect timescale,” says Cumberland. “But, more than that, if the regulator calls on you and says ‘show me your records’ and you produce a set that is time-stamped incorrectly, that could look like you were indulging in market abuse, or in insider trading. Failing to follow the regulations is sufficient to get a fine – and fines are running into billions of dollars annually. There’s also the reputational risk. You don’t want to be known as the firm that can’t keep its records in order to the standard required.”
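
In practice, that record check reduces to something like the sketch below. Only the 100-microsecond divergence limit for high-frequency trading comes from the article’s discussion of MiFID II RTS 25; the record layout and values are hypothetical:

    # A minimal sketch: flagging stored time-stamps whose measured divergence
    # from traceable UTC exceeds the 100-microsecond limit the article cites
    # for high-frequency trading. The records themselves are hypothetical.
    MAX_DIVERGENCE_S = 100e-6

    records = [
        ("order-001", 12e-6),
        ("order-002", 95e-6),
        ("order-003", 140e-6),  # out of tolerance
    ]

    for event_id, divergence in records:
        status = "OK" if abs(divergence) <= MAX_DIVERGENCE_S else "NON-COMPLIANT"
        print(f"{event_id}: {divergence * 1e6:.0f} us from UTC -> {status}")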

At Cisco, which works across industries, Beeken says it’s financial services that are driving the strongest demand.

“Not only have you got one eye on the obligation you have to the regulator, you’ve also got to have a very acute eye on whether your whole infrastructure is performing to its optimum,” he says.

“These firms are writing algorithms that trade electronically to the market, and running those algorithms through a whole network architecture designed to allow them to take in a price, analyse and understand that price, make a decision and issue an instruction back to the market to execute it,” he says. “Now, if something goes wrong in that whole process, and you’re not hitting the price your business strategy or algorithm intends, you’re not making the most of the opportunity on the market. If you’re in that situation, you need to be able to look at your entire trading environment, end-to-end, to try and very quickly understand where the problem is.

“Is it in the network? Is it in the switch? Is it in the algorithm? Or is the strategy itself at fault? The only way you can do that is if you’ve got highly accurate, detailed analytical information about how each part of your network is performing. If you have a problem, you will go immediately back to that analytical data to understand what’s going on.”
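
One concrete way to run that diagnosis is to decompose the tick-to-trade path into timestamped stages and see which one dominates. A hypothetical sketch; the stage names and figures are illustrative, not Cisco’s:

    # A minimal sketch: decomposing tick-to-trade latency into stages.
    # Stage boundaries are hypothetical hardware-capture timestamps in
    # nanoseconds from the moment the price update hits the wire.
    stages = {
        "wire -> NIC":          (0, 310),
        "NIC -> switch":        (310, 780),
        "switch -> strategy":   (780, 4_950),   # decision logic dominates
        "strategy -> wire out": (4_950, 5_420),
    }

    total_ns = max(end for _, end in stages.values())
    for name, (start, end) in stages.items():
        print(f"{name:<22} {end - start:>6} ns  ({(end - start) / total_ns:6.1%})")
    print(f"{'tick-to-trade total':<22} {total_ns:>6} ns")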

Cumberland agrees that ‘you need to design an infrastructure that will allow you to measure at that picosecond level of granularity and then monitor it with the appropriate devices and software’. But no matter how hard Lobo and his timekeeping colleagues work to shave another fraction of a second off the most accurate atomic clock, Cumberland is sceptical whether organisations will be able to take advantage of it.

“One of the biggest issues is the method of dissemination,” he says. “In the labs, we’re looking not just at caesium fountains, but also at strontium lattice and ytterbium trap clocks, which give two more orders of magnitude of accuracy than firms are currently able to achieve. The state of the art for dissemination is approaching the most accurate I think we will ever be able to work with. Even using dark fibre, there will be some loss and inaccuracy, and this means there’s a point at which, however accurate the clocks become, you can’t take advantage of that.”

Lobo stresses that latency – the limitation of physical infrastructure to transmit in real time, albeit a differential of picoseconds – should not be confused with time-stamping.

“The element of time not only applies to the regulatory piece around knowing what happened and when, relative to what we know as the global timescale, but also to making sure the operational capability is working optimally. That’s where the whole piece around minimising latency, the race to zero, comes in – and whether, at a network, system or even operating-system level, that latency will hit a physical limit.”

The issue of synchronisation is becoming ever-more pertinent, says Lobo – not just for internal trading mechanisms but in other areas of life that clients are connected to and have money invested in.

“When you look at phase synchronisation of the energy grid, or synchronisation of telecom and broadcast networks, timing is critical, and will become more so as we move to 5G, 6G and beyond,” says Lobo. “Whether smart cities, autonomous vehicle infrastructures, wide-area Internet of Things, or sensor networks, it is becoming incredibly important not just to have timing at the required level, but to ensure resilience – and that’s to do with understanding the metrics around availability, integrity, continuity and security, in order to operate infrastructure.

“There’s huge benefit to be gained from having the right time, at a level you never had it before,” he adds, “to achieve commercial gains like better data products, risk analytics, forensics and measurements as well as system implementations. After the picosecond, you need to be able to measure and develop systems that are typically orders of magnitude better than that.”

Back in 1884, passengers at York Station in northern England were just grateful they could rely on trains running to the newly-adopted national Greenwich Mean Time. Otherwise they’d have had to account for the fact that clocks in York were four minutes, 20 seconds slower than in London. The advent of a railway network made it a matter of urgency for everyone in the country to use the same time – for safety as much as convenience. Almost 140 years on (if you accept that time is linear) and the loss or gain of a quadrillionth of a second can have an equally big impact on the capital markets. What a difference ‘time’ makes, eh?


 

This article was published in The Fintech Magazine #19, Pages 65-66
