Tuesday, June 19, 2012

Design of electronic stock exchanges (after the Facebook IPO)

The technical (as opposed to financial and disclosure) difficulties in the recent Facebook IPO have led to renewed discussion in the press about market design in a world of high-velocity trading. Here's one of the more thoughtful stories: Could Computers Protect the Market From Computers?

"The Securities and Exchange Commission, which oversees the capital markets, has proposed a "consolidated audit trail" requiring exchanges to report every trade to a central repository, where they could later be analyzed.

"The project is a "very high priority" for the SEC, says an official, but the agency doesn't know when the rules for it will be completed. The main obstacle: agreeing on how to standardize the various formats that brokers and exchanges use to gather trading data.

"It will cost money to improve and modernize market structure," says Bryan Harkins, chief operating officer at Direct Edge, the fourth-largest stock exchange. "But the short-term money pales in comparison to boosting investor confidence in the long term."

"The SEC has estimated that a centralized order-tracking system would cost approximately $4 billion to set up and $2.1 billion a year to maintain.

"Mr. Leinweber of Berkeley has a simpler, and probably cheaper, solution in mind. He proposes that supercomputers—like those at national laboratories such as Berkeley's—should track every trade in real time. If volume began surging dangerously, the system would flash a "yellow light." Regulators or stock exchanges could then slow trading down, giving the market time to clear and potentially averting a crisis."

4 comments:

dWj said...

"The main obstacle: agreeing on how to standardize the various formats that brokers and exchanges use to gather trading data. "

This surprises me. For at least ten years, FIX (http://fixprotocol.com/) has been supported by all the big ECNs, and the CME has used what it claims to be FIX, though it violates the protocol just enough to be a nuisance to programmers. There are ways in which it might not be the full or optimal answer to "how should we store data", but I would think it wouldn't be too hard to reach an agreement built around it, with not too many details left to be hammered out.
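(For context on what the commenter means by FIX: a FIX message is a flat sequence of numeric tag=value fields separated by the SOH byte, 0x01, which is part of why it is a plausible common denominator for a reporting format. The sketch below parses a simplified message; the handful of tag names shown is only a tiny slice of the full FIX dictionary, and real feed handlers also validate sequencing and the checksum.)

SOH = "\x01"

# A few well-known FIX tag numbers; a real parser uses a complete data dictionary.
TAG_NAMES = {
    "8": "BeginString",
    "35": "MsgType",
    "55": "Symbol",
    "38": "OrderQty",
    "44": "Price",
    "10": "CheckSum",
}

def parse_fix(raw):
    """Split a FIX message into a {field name: value} dict (repeating groups ignored)."""
    fields = {}
    for pair in raw.strip(SOH).split(SOH):
        tag, _, value = pair.partition("=")
        fields[TAG_NAMES.get(tag, tag)] = value
    return fields

# Example: a simplified new-order message (MsgType "D") for 100 shares of FB at 38.00.
msg = SOH.join(["8=FIX.4.2", "35=D", "55=FB", "38=100", "44=38.00", "10=128"])
print(parse_fix(msg))
# {'BeginString': 'FIX.4.2', 'MsgType': 'D', 'Symbol': 'FB', 'OrderQty': '100',
#  'Price': '38.00', 'CheckSum': '128'}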

dWj said...

Incidentally, the Turing tests at the bottom of the comment submission form are getting harder and harder. I'm starting to worry I'm not actually human.

dWj said...

Two more comments:

1) One of those "big ECNs" that has supported FIX for a long time is Archipelago, which at this point is NYSE's primary electronic platform. Nasdaq's primary order entry system is very old, and as of ten years ago wasn't even running on TCP/IP (not that network layer is important here; you would presumably just log transport layer information), and it's possible NASDAQ is still not well set up for FIX.

2) In a situation with an obvious Schelling point, coordination failure can probably be taken as a revealed preference.

stock market said...

There are a lot of articles on the web about this, but I like yours more, although I found one that's more descriptive.