I’ve written here before about the enormous growth in high-frequency equity trading that has taken place in the last few years, and about some of the side effects of that growth, such as the “Flash Crash” in May, 2010, or the trading disruptions on August 1 of this year. After these events, it is customary for government regulators to issue reports on what went wrong; the Securities and Exchange Commission (SEC) issued a report on the Flash Crash in October, 2010.
You might visualize these regulatory agencies working in a way similar to air traffic control, monitoring the activity and health of the financial markets continuously throughout the day. That picture is plausible enough, but it does not reflect reality, at least up to the present.
Earlier this week, The Washington Post published a report that the SEC was about to “go high tech” by installing a system, called MIDAS, that would, for the first time, provide the regulator with real-time market data. To date, the agency’s information systems have been left far behind by developments in the markets it is supposed to monitor.
As computing power and big data have revolutionized stock trading in recent years, one market player has lagged far behind: the Securities and Exchange Commission, whose job policing the markets has been hampered by a serious technology gap.
Although the amount of data to be handled has increased significantly in the recent past, the technology of digital, nearly real-time financial market data is not new. When I first began working in the industry, in the mid-1970s, market participants had this kind of data available, although often it was in the form of a video feed (essentially, a TV picture of a data display). Even before that technology, market data was distributed by electro-mechanical stock “tickers”. By the mid-1980s, there was a substantial movement toward digital distribution of market data; that change meant that the data could not only be viewed by traders, but also fed into spreadsheets and other applications. (I did some work on digital data distribution in the late 1980s and early 1990s.)
Today, of course, the high-frequency trading that has become so significant is entirely based on the rapid processing of real-time data; speed is of the essence, as I noted back in 2010:
The time frames used in these strategies are in some cases so short (measured in milliseconds) that firms aggressively bid for computer locations physically close to the exchange’s data center: the network propagation delay (at the speed of light!) has to be taken into account.
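To see why physical location matters, here is a back-of-envelope calculation (my own illustration, not from the original post): signals in optical fiber travel at roughly the vacuum speed of light divided by the fiber’s refractive index, about 1.47, so even a modest distance from the exchange adds delay measured in the same milliseconds these strategies compete over.

```python
# Back-of-envelope propagation delay in optical fiber.
C_VACUUM_KM_S = 299_792.458   # speed of light in vacuum, km/s
FIBER_INDEX = 1.47            # typical refractive index of optical fiber

FIBER_SPEED_KM_S = C_VACUUM_KM_S / FIBER_INDEX  # ~204,000 km/s

def one_way_delay_ms(distance_km: float) -> float:
    """One-way signal propagation delay over fiber, in milliseconds."""
    return distance_km / FIBER_SPEED_KM_S * 1000.0

# A trading firm 100 km from the exchange incurs roughly half a
# millisecond of delay each way -- an eternity at HFT time scales,
# which is why firms bid for co-location inside the data center.
print(f"{one_way_delay_ms(100):.3f} ms")
```

At 100 km the one-way delay is about 0.49 ms; co-located machines a few hundred meters away see only a couple of microseconds.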
Given how important this type of trading has become, it is somewhat surprising that the SEC has not had the capability to monitor it. Although it’s been a few years since I was actively involved in the industry, I was taken aback by the article, and apparently I am not alone:
“I scratch my head and say, ‘How could the SEC not have had this in place already?’ ” said Joseph C. Saluzzi, co-head of a brokerage firm called Themis Trading. “Why are they still playing catchup?”
Whatever the reasons for the delay, it is encouraging that the SEC is getting this facility in place; however, as the article points out, the system is just a starting point. There is still more work to be done to create a comprehensive market surveillance system.
But experts who track the agency say more needs to be done. They are eager for the launch of the “consolidated audit trail,” a system that would require broker-dealers to report all their activity to a central repository and track the identities of those dealers and their clients.
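To make the idea of the audit trail concrete, here is a minimal sketch of what a single reported event might look like. All of the field names here are my own hypothetical choices for illustration; the actual consolidated audit trail specification defines its own schema.

```python
from dataclasses import dataclass

@dataclass
class AuditTrailRecord:
    """Hypothetical sketch of one audit-trail event; field names
    are illustrative, not the official specification."""
    event_time_ns: int     # event timestamp, nanoseconds since epoch
    broker_dealer_id: str  # identity of the reporting broker-dealer
    client_id: str         # identity of the dealer's client
    symbol: str            # security traded
    side: str              # "buy" or "sell"
    quantity: int          # number of shares
    price: float           # execution or order price

# Example record as a broker-dealer might report it to the repository:
record = AuditTrailRecord(
    event_time_ns=1_350_000_000_000_000_000,
    broker_dealer_id="BD-0001",
    client_id="C-12345",
    symbol="XYZ",
    side="buy",
    quantity=100,
    price=25.50,
)
```

The key point the experts make is in the two identity fields: linking every event to both the dealer and the client is what would let regulators reconstruct who was behind a burst of activity, not just that it occurred.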
Most critically, the SEC needs to be able to recruit and retain staff members who can use the data effectively, and to cultivate an organizational culture that is less legally focused and more data-driven.