SEC Gets Real-Time Market Data

December 27, 2012

I’ve written here before about the enormous growth in high-frequency equity trading that has taken place in the last few years, and about some of the side effects of that growth, such as the “Flash Crash” in May, 2010, or the trading disruptions on August 1 of this year.  After these events, it is customary for government regulators to issue reports on what went wrong; the Securities and Exchange Commission (SEC) issued a report on the Flash Crash in October, 2010.

You might visualize these regulatory agencies working in a way similar to air traffic control, monitoring the activity and health of the financial markets continuously throughout the day.  That picture is plausible enough, but it does not reflect reality, at least up to the present.

Earlier this week, The Washington Post published a report that the SEC was about to “go high tech” by installing a system, called MIDAS,  that would, for the first time, provide the regulator with real-time market data. To date, the agency’s information systems have been left far behind by developments in the markets it is supposed to monitor.

As computing power and big data have revolutionized stock trading in recent years, one market player has lagged far behind: the Securities and Exchange Commission, whose job policing the markets has been hampered by a serious technology gap.

Although the amount of data to be handled has increased significantly in the recent past, the technology of digital, nearly real-time financial market data is not new.   When I first began working in the industry, in the mid-1970s, market participants had this kind of data available, although often it was in the form of a video feed (essentially, a TV picture of a data display).  Even before that technology, market data was distributed by electro-mechanical stock “tickers”.   By the mid-1980s, there was a substantial movement toward digital distribution of market data; that change meant that the data could not only be looked at by traders, but also fed into spreadsheets and other applications.  (I did some work on digital data distribution in the late 1980s and early 1990s.)

Today, of course, the high-frequency trading that has become so significant is entirely based on the rapid processing of real-time data; speed is of the essence, as I noted back in 2010:

The time frames used in these strategies are in some  cases so short (measured in milliseconds) that firms aggressively bid for computer locations physically close to the exchange’s data center: the network propagation delay (at the speed of light!) has to be taken into account.
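To see why colocation matters, here is a back-of-the-envelope sketch of the propagation delay. The distances and the fiber refractive index are illustrative assumptions, not figures from the original post:

```python
# Back-of-the-envelope propagation delay; all numbers are illustrative.
C_VACUUM = 299_792_458.0        # speed of light in vacuum, m/s
C_FIBER = C_VACUUM / 1.5        # light in optical fiber travels at roughly 2/3 c

def one_way_delay_us(distance_km: float, speed: float = C_FIBER) -> float:
    """One-way propagation delay in microseconds over the given distance."""
    return distance_km * 1000.0 / speed * 1e6

# A server 100 km from the exchange vs. one co-located 100 m away:
print(f"100 km: {one_way_delay_us(100):.0f} microseconds")   # ~500 µs
print(f"100 m:  {one_way_delay_us(0.1):.1f} microseconds")   # ~0.5 µs
```

At strategy time scales measured in milliseconds, the roughly 500-microsecond round-trip penalty of being 100 km away is a decisive disadvantage, which is why firms pay up for rack space next to the exchange's matching engine.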

Given how important this type of trading has become, it is somewhat surprising that the SEC has not had the capability to monitor it.  Although it’s been a few years since I was actively involved in the industry, I was taken aback by the article, and apparently I am not alone:

“I scratch my head and say, ‘How could the SEC not have had this in place already?’ ” said Joseph C. Saluzzi, co-head of a brokerage firm called Themis Trading. “Why are they still playing catchup?”

Whatever the reasons for the delay, it is encouraging that the SEC is getting this facility in place; however, as the article points out, the system is just a starting point.  There is still more work to be done to create a comprehensive market surveillance system.

But experts who track the agency say more needs to be done. They are eager for the launch of the “consolidated audit trail,” a system that would require broker-dealers to report all their activity to a central repository and track the identities of those dealers and their clients.

Most critically, the SEC needs to be able to recruit and retain staff members that can use the data effectively, and to cultivate an organizational culture that is less legally-focused, and more data-driven.

Growth in High-Frequency Trading

August 9, 2012

In trying to examine the background and underlying causes of financial market events such as the disruption in equity trading on August 1, or the “flash crash” of May 6, 2010, the question of whether we are looking at some fundamental change in the way the markets work is frequently raised.  Trading volumes have certainly increased over the past few years, but firms engaged in high-frequency trading tend to be rather reticent about the exact nature and extent of their operations.

In one of my posts about the 2010 flash crash, I mentioned an analysis carried out by Nanex, a firm that specializes in systems and software to distribute and process market data, giving them a particularly good view of the overall market.

Our business is supplying a real-time data feed comprising trade and quote data for all US equity, option, and futures exchanges.

I have just come across a rather striking demonstration, put together by Nanex, of the growth of high-frequency trading (HFT) activity in US markets over the period January 2007 through January 2012.    The top graph on the page is an animated GIF graphic, which shows daily HFT activity for the various US exchanges (denoted by different colors).  At the beginning of the period, in 2007, the lines are relatively flat.  As time progresses, one begins to see more activity, first at the beginning and end of the trading day, then spreading throughout the day. Assuming Nanex’s data are correct (and I have no reason to doubt that), there has been a very substantial increase in HFT activity over the last few years.

The graphs on the lower part of the page show a more interesting aspect of this growth.  The graph on the left shows the growth, over time, of quote activity (that is, the posting of bid and offer prices for securities).  The graph on the right shows actual trades over the same period.  It is clear that most of the activity growth is in quotes, rather than trades.   There was some suspicion, in the 2010 flash crash, that some market participants might have engaged in “quote stuffing”, the generation of spurious quotes that would effectively clog up competitors’ systems.   There is certainly nothing in this evidence to disprove that suspicion.

Nanex also points out that spurious quote messages, like spam, are close to free for the sender, but not for other market participants.  As noted above, the large increase in quote activity does not correspond to an increase in actual trades, meaning that the cost per trade of processing quote messages has gone up substantially.

We think that a 10-fold increase in costs without any benefits would be considered “detrimental” by most business people. We think the regulators would agree with us as well.
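The arithmetic behind Nanex's point can be made concrete with a small sketch. All of the figures below are invented for illustration; the point is only that if quote traffic grows ten-fold while executed trades stay flat, the message-processing cost borne per actual trade also grows ten-fold:

```python
# Hypothetical illustration of the quote-stuffing cost argument.
# Every number here is made up; only the ratio matters.

def cost_per_trade(quotes: int, trades: int, cost_per_message: float) -> float:
    """Infrastructure cost of processing quote messages, per executed trade."""
    return quotes * cost_per_message / trades

# Same number of executed trades, ten times the quote traffic:
before = cost_per_trade(quotes=1_000_000,  trades=10_000, cost_per_message=0.0001)
after  = cost_per_trade(quotes=10_000_000, trades=10_000, cost_per_message=0.0001)
print(f"before: ${before:.2f} per trade; after: ${after:.2f} per trade")
```

Since every market participant must process the full quote stream to stay current, the sender of spurious quotes externalizes this cost onto everyone else, much as a spammer does.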

This analysis, together with other evidence, suggests that HFT activity is not as benign as some proponents would have us believe.   As Prof. Ed Felten at Princeton observed back in 2010, it can be difficult to disentangle just what happened after the fact, but I’ll try to fit some of the pieces together in future posts.

Speedy Trading — Again

August 2, 2012

On Wednesday, August 1, traders on the New York Stock Exchange [NYSE] and other US equity exchanges had a more than usually interesting morning.   Shortly after the market opened, unusually high trading volumes and large price swings occurred in more than 100 stocks, including well-known names like Coca-Cola, IBM, and McDonald’s, as well as smaller firms like Wizzard Software, Kronos Worldwide, and Trinity Industries.  The wild price gyrations in some stocks were reminiscent of the “flash crash” of May 6, 2010 (which I wrote about several times here), and of the more recent problems connected with Facebook’s initial public offering [IPO] in May of this year.  As in the 2010 incident, the problem appears to have been connected to an automated trading system; according to a report in the New York Times,

An automated stock trading program suddenly flooded the market with millions of trades Wednesday morning, spreading turmoil across Wall Street and drawing renewed attention to the fragility and instability of the nation’s stock markets.

The problem appears to have originated in the market-making unit of Knight Capital Group, a New Jersey broker that executes “algorithmic” trades on behalf of other firms.  According to a report from Bloomberg News Service, Knight said the problem was due to a “technology issue”:

Knight’s initial review indicated a “technology issue” occurred in the company’s market-making unit related to the routing of shares to the NYSE, according to an e-mail from spokeswoman Kara Fitzsimmons.

Trading in several issues was suspended for a time; after reviewing trades in over 100 stocks, the NYSE announced that it would void trades in six issues that exhibited particularly wide price swings.

Trades that occurred 30 percent above or below the opening price in Wizard Software Corp., China Cord Blood Corp. (CO), Reaves Utility Income Fund, E-House China Holdings Ltd., American Reprographics Co. and Quicksilver Resources Inc. will be voided, according to a statement on the NYSE website.

Knight Capital itself was badly affected; according to a report at the Washington Post, the “trading glitch” will cost the firm $440 million.

A technical problem that briefly threw dozens of stocks into chaos Wednesday will cost Knight Capital Group $440 million, the trading firm said Thursday. Knight’s stock plunged for a second day, erasing 70 percent of its value from two days ago.

That’s a hell of a glitch.  The situation is also embarrassing for the Knight Capital CEO, Thomas Joyce, who has been a vocal critic of NASDAQ for the problems it experienced during the Facebook IPO.

Apart from the immediate disruption and costs associated with these incidents, there is a concern that they are eroding ordinary investors’ confidence in equity markets, and that they may be a warning that those markets have lost sight of their primary purpose.  I’ll talk about some of those issues in future posts.

World Bank Research to be Open Access

April 14, 2012

I’ve written here before about the encouraging trend to make more scholarly research available online at no charge, including efforts by JStor, The Royal Society, and the National Academies Press.  Now, according to an article at Ars Technica, the World Bank has decided to make its research and knowledge products, as well as the data underlying them,  available free of  charge under a new Open Access Policy.

…  the Bank says it will apply to “manuscripts and all accompanying data sets… that result from research, analysis, economic and sector work, or development practice… that have undergone peer review or have been otherwise vetted and approved for release to the public.”

Most of the material will be made available under a liberal Creative Commons license [CC-BY].  The Bank has set up a new Web site, the Open Knowledge Repository, to make its work available for browsing and download.  (At the time I am writing this, there appears to be a problem with the site’s SSL certificate for secure [https:] access; you may get a security warning from your browser.)  There are currently more than 2,100 papers and books available in the Repository, and more will be added over the coming months.  Data sets will be available, too, and will probably be of considerable value to researchers, given the World Bank’s special insight into the process of economic development.

“Making our knowledge widely and readily available will empower others to come up with solutions to the world’s toughest problems,” World Bank Group President Robert B. Zoellick said in the Bank’s announcement.

It is great to see another significant institution move toward making information more widely and easily available.

Toxic Environments

March 18, 2012

I have finally had a chance to read Greg Smith’s letter, “Why I Am Leaving Goldman Sachs”, published as an Op-Ed this past week in the New York Times.  In it, Mr. Smith, an executive director of the US equity derivatives business at one of the world’s  leading merchant banks, says that he is resigning because, in his view, the culture of the firm has changed significantly for the worse since he joined it twelve years ago.

… I believe I have worked here long enough to understand the trajectory of its culture, its people and its identity. And I can honestly say that the environment now is as toxic and destructive as I have ever seen it.

Mr. Smith says that the culture of the firm has changed, from one which put the customer’s interest first, to one where making the maximum profit for the firm, at the customer’s expense if necessary, has become paramount.

I attend derivatives sales meetings where not one single minute is spent asking questions about how we can help clients. It’s purely about how we can make the most possible money off of them.

Much of the Wall Street reaction to Mr. Smith’s letter has been fairly predictable.  He has been pictured as a naive hypocrite, who never understood what the business was about, but was happy enough to deposit his bonus checks.  My own reaction, having worked for about thirty years in the financial services industry, is that no one there should be at all surprised by what Greg Smith said.  I have no specific knowledge of Goldman Sachs, but the scene he describes sounds all too familiar.

As William Cohan points out in an article in the Washington Post, the idea of Goldman Sachs, or any other investment bank, duping its clients is not exactly new.  He cites the example of Goldman’s role in and around the bankruptcy of Penn Central in 1970.  Goldman was the underwriter for Penn Central’s commercial paper.  Because of its relationship with the firm, Goldman was privy to information about Penn Central’s deteriorating liquidity position, information it did not share with its customers even as it continued to flog the commercial paper.  The SEC investigated following Penn Central’s bankruptcy.

According to the SEC, Goldman “gained possession of material adverse information, some from public sources and some from nonpublic sources indicating a continuing deterioration of the financial condition of the [railroad]. Goldman, Sachs did not communicate this information to its commercial paper customers, nor did it undertake a thorough investigation of the company. If Goldman, Sachs had heeded these warnings and undertaken a reevaluation of the company, it would have learned that its condition was substantially worse than had been publicly reported.”

The SEC sued Goldman, and the suit was settled within a short time.  Goldman was also sued by some of its customers.  Many of these suits were also settled, but some, for whatever reason, were allowed to proceed to a trial, which Goldman lost.

Incredibly, Goldman thought it could win the lawsuits and allowed them to go to trial, where much of the firm’s dirty laundry was aired. In the end, it lost the suit brought by the three companies and paid the plaintiffs 100 cents on the dollar, plus interest.

Cohan argues that, if Greg Smith had been paying attention, he could have figured out that Goldman’s actions did not always match its lofty principles.  At one level, it is hard to argue with this.  Certainly since I started work in the industry in the mid-1970s, there has never been any shortage of skunks and weasels on Wall Street.

On another level, though, I think Smith is right: the culture of Wall Street has gotten worse, and there are at least some identifiable reasons for this.  Once upon a time, firms like Goldman Sachs were partnerships, meaning that the money they were risking belonged to the partners that owned and managed the firm.  Now, most of these firms are public companies, whose (very highly paid) managers are risking the stockholders’ money; they have also been permitted to become bank holding companies, with access to lending from the Federal Reserve, meaning they can risk taxpayers’ money, too.   The rise of proprietary trading in ever more exotic and opaque financial instruments has made effective oversight more difficult.  The bonus system rewards those who produce short-term profits, even when those profits are based on theoretical valuations of long-term transactions.  (I’ve written about this before.  These are sometimes called “IBG” trades on the floor: “I’ll Be Gone” by the time the deal craters.)  It is hard, offhand, to think of a more complete collection of perverse incentives, to say nothing of agency problems and moral hazards.

Really, the only thing surprising about this is that anyone is surprised.

NASDAQ Hack, Revisited

October 23, 2011

Back in February, I posted a note here about a security breach that had been discovered in some computer networks owned by NASDAQ (originally, the National Association of Securities Dealers Automated Quotations system).    The NASDAQ Stock Market is the largest US trading platform for stocks not listed on the New York Stock Exchange [NYSE].  It is the largest screen-based trading exchange in the US, listing 2800+ issues, and the largest in the world by trading volume.  It did not appear that the attack had compromised the actual NASDAQ trading system, but the total scope of the attack was still being analyzed.  One system that was affected was Directors’ Desk, a sort of bulletin-board system for senior corporate managements.

According to a recent article at Reuters, it now appears that the attackers used their successful access to Directors’ Desk as a first step to facilitate snooping on corporate directors and others to obtain confidential information.

Hackers who infiltrated the Nasdaq’s computer systems last year installed malicious software that allowed them to spy on the directors of publicly held companies, according to two people familiar with an investigation into the matter.

The new details showed the cyber attack was more serious than previously thought, as Nasdaq OMX Group had said in February that there was no evidence the hackers accessed customer information.

The breach is suspected to have been part of what is sometimes called a “blended” attack: an initial target is compromised, which may not only  yield some confidential information itself, but also other information that may lead to breaking into other systems.  (For example, the attacker might get users’ personal information that would facilitate guessing poorly-chosen passwords.)

By infecting Directors Desk, the hackers were able to access confidential documents and the communications of board directors, said Kellermann, chief technology officer at security technology firm AirPatrol Corp.

It is still not clear exactly how long the security breach existed before it was detected, nor does anyone know exactly what information was compromised.  The investigation is continuing, with the assistance of the FBI and the NSA.


SEC Issues Attack Disclosure Guidelines

October 16, 2011

One of the things that can make assessing the overall state of system and network security difficult is the reluctance of some organizations to reveal that they have been attacked.  Sometimes, they prefer to keep the attack secret, or at least try to, presumably because they feel that disclosure would be embarrassing and damaging to their public image.  Some state laws require disclosure, especially in cases where personal data is exposed, but even in these cases there is a tendency to do the least disclosure possible.

Public corporations — those whose stock is publicly traded — have for many years had a duty, under US securities law and associated regulations, to disclose material events that might affect the firm’s business or prospects.  For example, if another firm  were to introduce an improved competing product, or if the corporation were sued on the grounds of patent infringement, a disclosure to investors would be required.

Now, according to an article at ThreatPost, the Kaspersky Lab security news service, the US Securities and Exchange Commission [SEC] has issued guidance that suggests circumstances under which corporations may need to disclose attacks, or potential attacks.

The Securities and Exchange Commission has issued new guidance to help public companies determine when they may need to disclose an attack–or even a potential attack–in order to make potential investors aware of possible risks to the company’s business.

The SEC has issued the material as guidance, not as a regulation.  It is still up to the companies themselves to determine exactly what they should disclose; but the publication of this guidance will probably motivate a bit more openness.  As the actual guidance document says, the disclosure determination is to be made within the framework of existing law and regulation.

Although no existing disclosure requirement explicitly refers to cybersecurity risks and cyber incidents, a number of disclosure requirements may impose an obligation on registrants to disclose such risks and incidents. In addition, material information regarding cybersecurity risks and cyber incidents is required to be disclosed when necessary in order to make other required disclosures, in light of the circumstances under which they are made, not misleading.

We live in an environment where people, and companies, are becoming more and more reliant on technology to carry out their everyday business; moreover, businesses in general actively promote conveniences made possible by technology.  So I think there can be little argument that a system security breach could potentially have a very material effect on a firm’s prospects, and I welcome this move by the SEC as a logical extension of the disclosure framework that has been in place for many years.
