Tick Data and TCA Transformation: Erasing the Need to Move Mountains


Performing transaction cost analysis (TCA) means moving at the speed of sound, drawing on massive troves of tick history data.

In case you hadn’t heard, an enormous barrier to financial services IT modernization has virtually disappeared — and it all revolves around tick data. But first, a brief diversion through history…

The ticker tape machine was invented in 1867, revolutionizing markets as the world’s first electric financial communications medium and unleashing the capacity to forge wealth through analysis of steadily and readily available market data. The “tape” was a roll of paper spit out by a machine that reported, via telegraph, every single trade on the stock exchange as it happened; “ticker” came from the sound the machine made while printing. Incidentally, after use, firms would tear up their old ticker tapes for disposal, accumulating huge piles of paper ribbons. Lo and behold, in New York, when there was something to celebrate — like the dedication of the Statue of Liberty — Wall Street workers threw the ribbons out their windows over parade routes running through the Financial District, thus inventing the Ticker Tape Parade.

Here in the 21st Century, nobody uses ticker tape anymore. All of that time-stamped financial exchange information is still incredibly valuable, but it is now reported and recorded digitally. We do still call it “tick” data, however, in honor of its provenance.


Big (Really Really Big) Data

Tick data and tick histories are still essential for a lot of front, middle, and back office functions in financial services institutions. But just like those mounds of old-timey parade confetti, all that tick data really piles up. There are plenty of firms today storing petabytes of their tick data on-prem at great cost. Due to its sheer volume, the idea of modernizing and migrating it to the cloud is unfathomable for most organizations — like trying to move Mt. Everest.

But guess what? Most of the main tick data providers — FactSet, Bloomberg, LSEG (formerly known as Refinitiv), ICE, and BMLL, among others — have moved to the cloud. And they are already on Snowflake with marketplace listings and native applications readily available to other Snowflake customers. So, all of their enormous swaths of tick data — decades worth of information — can be seamlessly accessed, shared, and analyzed on the platform.

This means there is no longer any need for anyone to move their personal Everest. There is no need for a massive tick data “cloud migration.” Instead of having to migrate mountains of data, firms can just access it through marketplace sharing, saving upwards of 60% on storage, data center maintenance, and processing costs. This fact alone presents a huge opportunity, and while I’m not suggesting that every organization using tick data has to be on Snowflake, it is currently the only platform I know of where this capability has been made so readily available. I suspect others will follow suit in the months to come. After all, there are abundant market intelligence benefits to embracing the “Snowflaky” principle that the data shouldn’t move and that everything else should come to the data. This is especially true when it comes to finance functions, where decisions about where tick data resides and how it is accessed make an enormous difference in a firm’s ability to fully capture its inherent value.

TCA Transformation

To home in on a traditional tick data use case specifically applicable to capital markets or investment arms in banking, not to mention every trading firm on the planet, let’s look at transaction cost analysis (TCA).

TCA is key to how institutions determine the profit, loss, cost effectiveness, and overall performance of their trading strategies in order to refine and scale those strategies. At the scale at which banks trade in the stock market, the costs being analyzed across all of those trades add up to billions of dollars.

The key inputs for conducting TCA reside in tick histories: each record carries a time stamp indicating when a trade was executed, the price, the volume (number of shares involved), and a unique identifier. The firm committed to doing the trade uses this information to calculate and analyze its explicit costs (commissions, taxes, levies) and its implicit costs, such as the following (a short code sketch of these calculations follows the list):

  • The bid-ask spread: The difference between the price at which you can buy (the ask) and the price at which you can sell (the bid). The wider the spread, the higher the cost.
  • Market impact: A large order can move the market price of a security. So, if you buy a large number of shares, the price might actually rise before the entire order gets filled.
  • Timing and slippage: The difference between the price you expect when you place an order and the price at which it actually executes. Slippage is generally tied to market volatility and/or processing delays and is highly indicative of how the speed of data movement impacts how much money is made.
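To make those implicit costs concrete, here is a minimal sketch in Python, using invented quote and fill data and hypothetical field names, of how the half-spread cost and arrival-price slippage of a single buy order might be estimated from tick-level records. It is an illustration of the arithmetic, not any vendor’s implementation.

```python
# A minimal sketch of two implicit-cost estimates from tick-level data.
# All figures and field names are invented for illustration.
import pandas as pd

# Hypothetical quote history: timestamp, best bid, best ask.
quotes = pd.DataFrame({
    "ts":  pd.to_datetime(["2024-03-01 14:30:00.000001",
                           "2024-03-01 14:30:00.000250",
                           "2024-03-01 14:30:00.000900"]),
    "bid": [101.02, 101.03, 101.01],
    "ask": [101.05, 101.07, 101.04],
})

# Hypothetical execution record for a buy order.
fill = {"ts": pd.Timestamp("2024-03-01 14:30:00.000400"),
        "side": "buy", "price": 101.08, "shares": 5_000}

# Bid-ask spread cost: half the quoted spread at the time of the fill,
# taken from the most recent quote at or before the execution timestamp.
quote_at_fill = quotes[quotes["ts"] <= fill["ts"]].iloc[-1]
half_spread = (quote_at_fill["ask"] - quote_at_fill["bid"]) / 2
spread_cost = half_spread * fill["shares"]

# Slippage: difference between the arrival price (mid-quote when the order
# was sent, assumed here to be the first quote) and the actual fill price.
arrival_mid = (quotes.iloc[0]["bid"] + quotes.iloc[0]["ask"]) / 2
slippage_per_share = fill["price"] - arrival_mid   # positive = cost on a buy
slippage_cost = slippage_per_share * fill["shares"]

print(f"Half-spread cost: ${spread_cost:,.2f}")
print(f"Slippage vs. arrival mid: ${slippage_cost:,.2f}")
```

The same arithmetic, applied across millions of fills and years of tick history, is what makes the question of where that history lives and how fast it can be queried so consequential.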

All of these elements (in addition to opportunity cost) feed into TCA for:

  • Pre-trade analysis: Estimating potential costs and risks; assessing liquidity, volatility, and market depth; picking an execution strategy before a trade is made.
  • In-trade monitoring: Determining whether adjustments should be made to the strategy while trading in real time.
  • Post-trade analysis: Evaluating the actual costs incurred after the trade is completed, which involves comparing the execution price against volume-weighted averages and arrival prices and contextualizing the result against the chosen strategy (a simple benchmark sketch follows this list).
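As a companion to the post-trade bullet above, here is a hedged sketch, again with invented data and field names, of comparing an order’s average fill price against its arrival price and the interval VWAP, expressed in basis points.

```python
# A hedged sketch of post-trade benchmarking for a single buy order.
# Data and field names are invented; this is not a production TCA engine.
import pandas as pd

# Market trade tape over the order's lifetime (hypothetical).
tape = pd.DataFrame({
    "ts":     pd.to_datetime(["2024-03-01 14:30:01", "2024-03-01 14:30:05",
                              "2024-03-01 14:30:09", "2024-03-01 14:30:14"]),
    "price":  [101.05, 101.10, 101.12, 101.08],
    "volume": [2_000, 1_500, 3_000, 2_500],
})

# Our own executions for the same parent order (hypothetical).
fills = pd.DataFrame({"price": [101.09, 101.11], "shares": [3_000, 2_000]})

arrival_price = tape.iloc[0]["price"]   # price when the order arrived
vwap = (tape["price"] * tape["volume"]).sum() / tape["volume"].sum()
avg_fill = (fills["price"] * fills["shares"]).sum() / fills["shares"].sum()

# Implementation shortfall vs. arrival, and VWAP slippage, in basis points.
shortfall_bps = (avg_fill - arrival_price) / arrival_price * 10_000
vwap_slippage_bps = (avg_fill - vwap) / vwap * 10_000

print(f"Average fill: {avg_fill:.4f}, VWAP: {vwap:.4f}, arrival: {arrival_price:.2f}")
print(f"Shortfall vs. arrival: {shortfall_bps:.1f} bps; vs. VWAP: {vwap_slippage_bps:.1f} bps")
```

In practice these benchmarks are computed against full market tick histories rather than a four-row example, which is exactly where the data volumes described earlier come into play.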

All of this, meanwhile, has to be done within set limits and guidelines, and firms must be able to demonstrate to regulatory authorities that they are achieving the best execution price for their clients.

So, performing TCA means moving at the speed of sound, drawing on massive troves of data, with a whole host of no-nonsense regulators looking over your shoulder all the while.

But here in the 21st Century, all of that analysis no longer has to rely on mountains of tick data moving through a hodgepodge of data center farms and a patchwork of legacy systems, databases, proprietary formats, and applications. Now, firms can get all the benefits of the new wave of cloud data capabilities without missing a beat — effectively migrating without migrating. Some of those benefits include:

  • All that time-series data and all that trade history is available instantaneously, updated immediately and continuously from the providers — whether Refinitiv, FactSet, or Bloomberg. There is no lift-and-shift; it’s all already there.
  • There are no time penalties when trying to blend information together — there is no moving data around or waiting on API calls.
  • There are impressive market data optimization features at the ready, like native time series analysis features that “allow time stamp data to be easily aggregated and analyzed to the nanosecond,” according to Snowflake (a rough illustration of that kind of rollup follows below).
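I can’t reproduce Snowflake’s native time-series functions in a blog post, but as a rough stand-in, this pandas sketch (with invented data) shows the kind of rollup being described: nanosecond-stamped ticks aggregated into one-second bars without losing the original precision.

```python
# A rough stand-in (pandas, not Snowflake's native functions) for aggregating
# nanosecond-stamped ticks into one-second OHLCV bars. Data is invented.
import pandas as pd

ticks = pd.DataFrame({
    "ts": pd.to_datetime([
        "2024-03-01 14:30:00.000000150",
        "2024-03-01 14:30:00.250000075",
        "2024-03-01 14:30:00.900000001",
        "2024-03-01 14:30:01.100000300",
    ]),
    "price":  [101.05, 101.07, 101.04, 101.06],
    "volume": [200, 500, 300, 400],
}).set_index("ts")

# pandas timestamps carry nanosecond precision, so nothing is lost on load;
# here we roll the ticks up into one-second bars for analysis.
bars = ticks["price"].resample("1s").ohlc()
bars["volume"] = ticks["volume"].resample("1s").sum()
print(bars)
```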

And I haven’t even touched on access to new AI components for simulation, anomaly detection, or RAG models for trade strategy guidance and efficacy measurement.

Put all of that together, and TCA is absolutely transformed. And TCA is just one of dozens of areas where tick data makes a huge impact on how banks function.

You don’t have to move the tick data mountain; you can just go to it. And the impact on cost efficiencies, risk management, transparency, and capability when all that legacy chaos is replaced with an instantaneously accessible, compliant, single source of truth is every bit as revolutionary as the ticker tape machine.


About Anand Pandya

Anand Pandya is Global Head of Financial Services at Hakkōda, a data engineering consultancy specializing in Snowflake. He works with companies in the finance sector to deliver data-driven innovation. With 20+ years of experience as a leader in data-driven solutions and business intelligence, Anand has been integral to driving innovative technology from an executive standpoint. His expertise lies in data management, focusing on organizational resource management and influencing business strategies. In his career, he’s learned that well-governed data is a blend of people, processes, technology, and data.
