Happy Anniversary

November 25, 2009

Yesterday marked the 150th anniversary of the first publication of Charles Darwin’s historic book, On the Origin of Species by Means of Natural Selection.  (Fairness dictates that I mention that Alfred Russel Wallace independently developed much the same idea.)  It’s hard to overestimate the importance of this work; the theory, along with work in genetics by Mendel and the much later discovery of the structure of DNA by Watson and Crick, changed biology from a glorified form of stamp collecting into a real science.  Today, the theory, in its developed form, is really the foundation of all life sciences.  The Origin, like Newton’s Principia, ranks as a landmark in the development of modern science.


There’s Always Time to Do It Over

November 24, 2009

Some of us who have worked in technology for a few years, or more than a few, sometimes think that the expression “penny wise and pound foolish” must have been invented just for our line of work.  It is all too easy for the allure of some near-term cost reduction to override sensible planning and design work that would make a system considerably more valuable over its lifetime.  (A related phenomenon is captured by the expression, “There’s never time to do it right, but there’s always time to do it over.”)  Fred Brooks wrote about this back in the 1970s in his classic book on project management, The Mythical Man-Month, so it is hardly news.

A recent story in the Richmond [Virginia] Times-Dispatch indicates that the problem is alive and well.  It seems that, in 2005,  Virginia entered into a service agreement with Northrop Grumman to run many of the state’s computer systems:

In a unique public-private venture, Virginia agreed in 2005 to let the giant defense and information contractor Northrop Grumman run nearly all the state’s IT systems.

The 10-year, $2.3 billion project aims to modernize 85 state government agencies’ computer networks, PCs, phones, servers and e-mail systems, while holding down costs.

Now there is nothing wrong with this kind of deal in principle, and it is a good thing that the state government is considering creative ways to provide the services that it requires.

The actual deal that Virginia got, however, seems to leave something to be desired.  Part of the system, used by the state Department of Transportation [DOT] and the Division of Motor Vehicles [DMV, responsible for licensing drivers and vehicles], relies on a computer network linking offices all over the state, including many in relatively rural southern and western Virginia.  Unfortunately, the new agreement apparently did not provide for any redundancy in communications facilities, even though the previous facilities did have such redundancy, and network outages have been a big problem:

In just five weeks this fall, the Virginia Department of Motor Vehicles suffered 12 computer system outages, putting individual offices out of business for a total of more than 100 hours. One outage lasted 29 hours, another 17.

“The problem of no-redundancy . . . accounts for 90 percent of our outages,” said David W. Burhop, the DMV’s chief information officer.

During the first six months of the year, state Department of Transportation workers faced 101 significant IT outages totaling 4,677 hours: an average of more than 46 hours per outage. One took 360 hours to fix.

Given that six months is roughly 180 days, or about 4,320 hours, and that the outages totaled 4,677 hours, this means that, to a first approximation, there was on average a service outage somewhere all the time.  That this interferes with the state’s employees’ work is obvious; and going to the DMV is enough of a hassle for the average citizen without adding computer-induced problems to the mix.
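The arithmetic behind that claim is simple enough to check; here is a back-of-the-envelope sketch using only the figures quoted from the Times-Dispatch story:

```python
# Figures quoted from the Times-Dispatch story on VDOT outages.
outages = 101
total_outage_hours = 4677
period_hours = 180 * 24  # six months, taken as roughly 180 days

avg_hours = total_outage_hours / outages
utilization = total_outage_hours / period_hours

print(f"average outage length: {avg_hours:.1f} hours")   # a bit over 46 hours
print(f"outage-hours per calendar-hour: {utilization:.2f}")  # slightly above 1.0
```

Since the total outage-hours exceed the number of hours in the period, on average at least one office was down at any given moment.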

The problem here is not that a function previously run by the state was outsourced to a private-sector firm, but that a cost saving was apparently achieved, at least in part, by providing a service of considerably lower quality.  Buying a complex technological product like a state computer system is a different sort of problem than buying a loaf of bread.  Comparing the costs is the easy part; making sure that you understand and have properly specified what you are buying is what makes the difference.


Microsoft Advisory on IE

November 23, 2009

Microsoft has now issued a Security Advisory (977981) about the Internet Explorer vulnerability I mentioned yesterday.  The advisory does not supply a patch for the flaw, or a projected timetable for a patch.  It does, however, provide some suggested mitigation techniques.


Regulating Derivatives, Part 1

November 23, 2009

Recently, The Economist had a briefing article on the financial derivatives markets and the challenges involved in regulating them appropriately.  The issue has, of course, been brought to the fore by the recent crash in the financial sector, in which excessively risky derivative positions played a large part.  (I have previously written about some of this in a series of posts: “Formulas for Disaster, Parts 1, 2, 3, and 4“.)   The Economist article is well worth reading, and I will be writing, in a later segment of this article, about some of its recommendations.  But first I want to talk a bit about some of the institutional aspects of the derivatives markets; they vary considerably among those markets with respect to some key characteristics, and I think understanding some of those variations is crucial to understanding why derivatives became such a problem.

As the article points out, derivatives have been around for a long time:

Derivatives have a long history, stretching back thousands of years. In the 17th century the Japanese traded simple rice futures in Osaka and the Dutch bought and sold derivatives in Amsterdam.

Derivatives, as their name suggests, derive their value by reference to another asset or financial indicator (sometimes called the underlying asset, or simply the underlying).  A relatively simple example is an option on a traded common stock.  A (hypothetical) three-month call option on Microsoft at $20.00 would give the holder the right, but not the obligation, to buy a specified quantity (typically, 100 shares) of Microsoft stock at any time within the next three months.  If the option were not exercised by then, it would expire.  Clearly, the value of this option would depend on the price of Microsoft stock.  If it were $40 per share today, one could exercise the option, buy 100 shares for $2,000, and then immediately sell them for $4,000, making a tidy profit.  If the price were $10 per share today, the option would probably not be worth very much.
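The payoff logic of that example can be sketched in a few lines.  (The numbers are the hypothetical ones from the example above; this computes only the value of exercising immediately, and ignores the premium paid for the option, transaction costs, and the time value that a real option would carry.)

```python
def call_exercise_value(spot: float, strike: float, shares: int = 100) -> float:
    """Value of exercising a call option right now: zero if out of the money."""
    return max(spot - strike, 0.0) * shares

# Hypothetical $20 call on Microsoft, contract for 100 shares:
print(call_exercise_value(40.0, 20.0))  # stock at $40: buy for $2,000, sell for $4,000
print(call_exercise_value(10.0, 20.0))  # stock at $10: not worth exercising
```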

Another simple example is given by contracts in commodity futures.  Essentially, these are purchase contracts for a given quantity of a physical commodity (such as crude oil, wheat, or pork bellies) for delivery at a specified future date at a specified price.  Such contracts might be used by a farmer, for example, to “lock in” a price for a portion of his wheat crop in advance of the harvest.
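The “lock in” mechanics can be illustrated with a toy calculation.  (The prices and quantities here are entirely made up; real futures contracts involve standardized lot sizes, delivery terms, and margin requirements, all of which this sketch omits.)

```python
def hedged_wheat_revenue(bushels: float, futures_price: float,
                         spot_at_harvest: float) -> float:
    """Revenue for a farmer who sold futures on his crop before the harvest.

    Selling the crop at the harvest-time spot price, plus the gain (or loss)
    on the short futures position, nets out to the locked-in futures price.
    """
    crop_sale = bushels * spot_at_harvest
    futures_gain = bushels * (futures_price - spot_at_harvest)
    return crop_sale + futures_gain

# Whatever the spot price turns out to be, revenue is bushels * futures_price:
print(hedged_wheat_revenue(1000, 5.00, 3.50))  # spot fell: futures gain makes it up
print(hedged_wheat_revenue(1000, 5.00, 7.25))  # spot rose: futures loss offsets it
```

Either way the farmer ends up with $5,000 for 1,000 bushels, which is exactly the point of the hedge: the price risk is transferred to the other side of the contract.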

These simple types of derivative contracts are fairly benign, for reasons which we’ll explore in a moment, but even they have been accused of enabling all kinds of mischief, as the article points out:

In 1958 American onion farmers, blaming speculators for the volatility of their crops’ prices, lobbied a congressman from Michigan named Gerald Ford to ban trading in onion futures. Supported by the president-to-be, they got their way. Onion futures have been prohibited ever since.

It is easy to cite examples like this one to give the impression that worry about derivatives is often silly.  After all, no one has ever blamed a major financial panic on fluctuations in the price of onions (although the Dutch did get into it with tulip bulbs).

I would agree that worrying about things like onion futures, or options on Microsoft stock, is foolish if one’s concern is the overall stability of financial markets.  But I would like to note that derivatives like these (commodity futures, traded options, and similar contracts) have some common characteristics, which I will argue are of substantial significance:

  • The underlying assets are traded in an open and public market, in which transaction prices are reported, or at least easily observed.
  • The terms of the derivative contracts are standardized, and relatively simple.
  • The contracts themselves are traded on an open market, and their trade prices are reported.
  • The derivative transactions are settled via an organized exchange or clearing house, which acts as one party to each transaction, and sets capital provision rules for the participants (for example, the required posting of earnest money deposits or “margin”).
  • In part because the contracts are standardized, there are published, widely-examined industry standards for the valuation and risk assessment of the contracts.

With all this, it would seem that these markets would be fairly orderly, as in fact they are.  So why then, back at the end of 2002, did Warren Buffett, arguably the most successful investor in the world, say that derivatives were “financial weapons of mass destruction” in his annual letter [PDF] to the shareholders of Berkshire Hathaway?  I’ll start to explore that in the next post.


Opera 10.10 Released

November 23, 2009

Opera Software has released a new version of its Opera Web browser, version 10.10.  It is available for these operating systems: Windows, Mac OS X, Linux, FreeBSD, Solaris (Intel or SPARC), QNX, OS/2, and BeOS, and can be downloaded here.  This version contains a number of new features, including Opera Unite, a technology for sharing information with colleagues or friends, and Opera Link, a facility for synchronizing bookmarks and other data across multiple computers.  There are also improvements to the user interface, search functions, and security.  In addition to the Web browser, Opera also includes an E-mail client.


Health-Care Tradeoffs

November 22, 2009

Readers, at least those in the US, have undoubtedly heard by this time that a new report by the US Preventive Services Task Force recommends that the use of routine mammograms be reduced from annual to every other year for women over 50, and be eliminated for women between 40 and 50.  I have been bemused and somewhat perturbed by some of the reaction to this recommendation (which, by the way, is just that).

Let me first say that I have not examined all of the evidence, so I am in no position to evaluate the Task Force’s recommendation.  Even if I had seen it all, I might not be qualified to judge some of it fairly.  I am not at all surprised, or bothered, that there may be some controversy or disagreement over the interpretation of some parts of the evidence.  Making a recommendation such as this one involves reviewing a mass of statistical and other evidence, and trying to make an intelligent trade-off between the advantages and disadvantages of more (or less) testing.  This inevitably involves a certain amount of professional judgment, and exactly where the trade-off should be made is a matter about which reasonable people can disagree.

That sort of disagreement does not bother me.  What does bother me is the reaction that seems to assume that no trade-off needs to be made: something along the lines of, “If even one additional cancer is detected, it is worth whatever it costs.”  The first observation that must be made is that the costs, certainly in this instance, are not just financial.  Excessive exposure to ionizing radiation (like X-rays) can cause cancer.  Follow-up biopsies have their own risks, including infections.  The Task Force’s report cited the unnecessary distress that may result from a false positive test result.  Also, of course, there are bound to be some false positive diagnoses that result in unnecessary surgery.  And the economic costs are real enough.  No one likes to think about putting a monetary value on health, but we do have to do it, all the time.

Pretending that a trade-off is not required is just a retreat to magical thinking: somehow, we can have it all if we only close our eyes and wish with all our might.  Unfortunately, when we open our eyes, we still have to grow up.


New Internet Explorer Vulnerability

November 22, 2009

A new exploit affecting Microsoft’s Internet Explorer has been published, as reported by the SANS Institute and Network World.   The flaw is, apparently, due to an invalid pointer in the HTML viewer library, mshtml.dll, and affects Internet Explorer versions 6 and 7; version 8, the current version, is apparently not affected, but the older versions are still in widespread use.  The currently published exploit is not entirely reliable, according to security vendor Symantec; but the likelihood of a better exploit appearing is high, because flaws of this type are coveted by the Bad Guys, since they allow software to be installed on the victim’s machine if he merely visits a compromised or malicious Web site.

There is no information from Microsoft at the moment on a potential fix.  The exploit does require that JavaScript be enabled in order to work, so disabling JavaScript should mitigate the threat.

Update Monday, November 23, 11:52

Brian Krebs at the Washington Post has posted a note about this on his “Security Fix” blog.

To repeat something I’ve said before: Internet Explorer, especially in its older versions (6 and 7), is a security nightmare.  You really should consider switching to Firefox or Opera; at the very least, upgrade to Internet Explorer version 8.

