Malware in Plain English

November 25, 2009

In the ongoing attempt to keep our computers safe from malicious programs, or malware, there is a constant “arms race” between the Bad Guys who launch the attacks and the system administrators and users who try to defend against them.  Most users are familiar with anti-virus programs, which work primarily by comparing suspect material to a “signature” database, which lists characteristics of known malware.  (For example, a given virus might have a particular sequence of 16 bytes starting 32 bytes from the beginning of the file.)  This approach can work well, but its obvious limitation is that the anti-virus vendor has to have seen samples of the malware in order to derive the signatures, meaning that it is not useful against totally new threats.
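
As a very rough illustration of the idea (and nothing like a real scanner’s implementation), a signature check of this kind amounts to little more than the following sketch, where both the byte pattern and the offset are made up for the example:

```python
# Minimal sketch of signature-based scanning.  Real anti-virus engines use far
# more elaborate signature formats, wildcards, and matching machinery.

# A hypothetical signature: a particular 16-byte sequence expected to begin
# 32 bytes from the start of an infected file (both values are invented here).
SIGNATURE = bytes.fromhex("e8000000005d81ed0010400060e80100")
OFFSET = 32

def matches_signature(path: str) -> bool:
    """Return True if the file has the known byte pattern at the known offset."""
    with open(path, "rb") as f:
        f.seek(OFFSET)
        chunk = f.read(len(SIGNATURE))
    return chunk == SIGNATURE
```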

To get around this problem, defenders employ various heuristics, which it is hoped will work even against a new malicious program.  Some of these are behavior-based: a program that tries to carry out certain actions (for example, modifying the processing of keyboard interrupts) is regarded as ipso facto suspicious.  Another approach is to assume that the malware, because by definition it must contain executable code, will have certain characteristics that differentiate it from, say, plain text.  Some malware authors disguise the “dirty work” part of their product (the payload) by encoding it, but even so there must be at least a simple routine that does the decoding and start-up.  So some detection schemes focus on looking for that decoding program.
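
One very simple content-based heuristic, sketched below purely for illustration (real products use much more sophisticated statistical models), is to look at the distribution of byte values: English text consists almost entirely of printable ASCII characters, while native machine code tends to use the full range of byte values.

```python
# Illustrative only: a crude content heuristic that flags data whose byte
# distribution looks more like machine code than like English text.
def printable_fraction(data: bytes) -> float:
    """Fraction of bytes that are printable ASCII or common whitespace."""
    text_like = sum(1 for b in data if 32 <= b <= 126 or b in (9, 10, 13))
    return text_like / len(data) if data else 1.0

def looks_suspicious(data: bytes, threshold: float = 0.85) -> bool:
    """Flag data in which too few bytes are text-like.

    The threshold is arbitrary; the point of the research described below is
    that a payload encoded as English-like text sails right past this test.
    """
    return printable_fraction(data) < threshold
```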

Unfortunately, a new paper [PDF] that was presented at last week’s ACM Conference on Computer and Communications Security suggests that this approach is not as robust as one might wish.  The researchers — Joshua Mason and Sam Small from Johns Hopkins University, Fabian Monrose from the University of North Carolina, and Greg MacManus of iSight Partners — have developed a technique for encoding malicious software so that both the payload and the initial decoding program are disguised as pseudo-English text.  The resulting text would not necessarily seem sensible to a human reader, but it would have the same superficial statistical properties as English text, making automatic detection very difficult.  For example, the following text was generated to encode a routine that simply calls the system exit(0) function; some of the text is just padding, and is skipped by immediately preceding “Jump” instructions:

There is a major center of economic activity, such as Star Trek, including the Ed Sullivan Show.  The former Soviet Union.  International organization participation Asian Development Bank, established in the United States Drug Enforcement Administration, and the Palestinian Territories …

This is clearly going to come across as a bit odd to a human reader, but will be very hard for a text analysis program to distinguish from, say, a newspaper article. (The overall “flavor” of the generated text depends on the body of legitimate text used to derive the encoding.)
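
To see why text like this can double as code, note that many printable ASCII byte values are themselves valid x86 opcodes.  The snippet below is my own illustration (not the authors’ encoder) and assumes the third-party Capstone disassembly library; it simply disassembles the opening words of the sample above as 32-bit x86 code.

```python
# Illustration: ordinary English bytes interpreted as x86 machine code.
# Requires the third-party 'capstone' disassembler.
from capstone import Cs, CS_ARCH_X86, CS_MODE_32

text = b"There is a major center of economic activity"
md = Cs(CS_ARCH_X86, CS_MODE_32)
for insn in md.disasm(text, 0):
    print(f"{insn.address:3d}: {insn.mnemonic} {insn.op_str}")

# Many (though not all) byte sequences drawn from English text disassemble into
# valid instructions; the paper's contribution is a search procedure that finds
# English-like text whose instructions actually do something useful.
```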

These results are another reminder that there is no silver bullet when it comes to detecting malicious code when it is mixed together with arbitrary data streams, and that diligence in preventing code injection attacks is still of vital importance.

Update Friday, November 27, 14:05

The New Scientist now has an article about this research.


Happy Anniversary

November 25, 2009

Yesterday marked the 150th anniversary of the first publication of Charles Darwin’s historic book, On the Origin of Species by Means of Natural Selection.  (Fairness dictates that I mention that Alfred Russel Wallace independently developed much the same idea.)  It’s hard to overestimate the importance of this work; the theory, along with Mendel’s work in genetics and the much later discovery of the structure of DNA by Watson and Crick, changed biology from a glorified form of stamp collecting into a real science.  Today, the theory, in its developed form, is really the foundation of all the life sciences.  The Origin, like Newton’s Principia, ranks as a landmark in the development of modern science.


There’s Always Time to Do It Over

November 24, 2009

Some of us who have worked in technology for a few years, or more than a few, sometimes think that the expression “penny wise and pound foolish” must have been invented just for our line of work.  It is all too easy for the allure of some near-term cost reduction to override the sensible planning and design work that would make the resulting system considerably more valuable over its lifetime.  (A related phenomenon is captured by the expression, “There’s never time to do it right, but there’s always time to do it over.”)  Fred Brooks wrote about this back in the 1970s in his classic book on project management, The Mythical Man-Month, so it is hardly news.

A recent story in the Richmond [Virginia] Times-Dispatch indicates that the problem is alive and well.  It seems that, in 2005,  Virginia entered into a service agreement with Northrop Grumman to run many of the state’s computer systems:

In a unique public-private venture, Virginia agreed in 2005 to let the giant defense and information contractor Northrop Grumman run nearly all the state’s IT systems.

The 10-year, $2.3 billion project aims to modernize 85 state government agencies’ computer networks, PCs, phones, servers and e-mail systems, while holding down costs.

Now there is nothing wrong with this kind of deal in principle, and it is a good thing that the state government is considering creative ways to provide the services that it requires.

The actual deal that Virginia got, however, seems to leave something to be desired.  Part of the system, used by the state Department of Transportation [DOT] and the Division of Motor Vehicles [DMV, responsible for licensing drivers and vehicles], relies on a computer network linking offices all over the state, including many in relatively rural southern and western Virginia.  Unfortunately, the new agreement apparently did not provide for any redundancy in communications facilities, even though the facilities it replaced did have such redundancy, and network outages have been a big problem:

In just five weeks this fall, the Virginia Department of Motor Vehicles suffered 12 computer system outages, putting individual offices out of business for a total of more than 100 hours. One outage lasted 29 hours, another 17.

“The problem of no-redundancy . . . accounts for 90 percent of our outages,” said David W. Burhop, the DMV’s chief information officer.

During the first six months of the year, state Department of Transportation workers faced 101 significant IT outages totaling 4,677 hours: an average of more than 46 hours per outage. One took 360 hours to fix.

Given that six months is roughly 180 days, or about 4,320 hours, and that the reported outages totaled 4,677 hours, this means that, to a first approximation, there was on average a service outage in progress somewhere all the time.  That this interferes with the state’s employees’ work is obvious; and going to the DMV is enough of a hassle for the average citizen without adding computer-induced problems to the mix.
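
A quick back-of-the-envelope check of that claim, using only the figures quoted above:

```python
# Back-of-the-envelope: total outage hours vs. hours in the six-month period.
outages = 101
total_outage_hours = 4677
hours_in_six_months = 180 * 24                     # about 4,320 hours

print(total_outage_hours / outages)                # ~46.3 hours per outage
print(total_outage_hours / hours_in_six_months)    # ~1.08 outages in progress, on average
```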

The problem here is not that a function previously run by the state was outsourced to a private-sector firm, but that a cost saving was apparently achieved, at least in part, by providing a service of considerably lower quality.  Buying a complex technological product like a state computer system is a different sort of problem than buying a loaf of bread.  Comparing the costs is the easy part; making sure that you understand and have properly specified what you are buying is what makes the difference.


Microsoft Advisory on IE

November 23, 2009

Microsoft has now issued a Security Advisory (977981) about the Internet Explorer vulnerability I mentioned yesterday.  The advisory does not supply a patch for the flaw, or a projected timetable for a patch.  It does, however, provide some suggested mitigation techniques.


Regulating Derivatives, Part 1

November 23, 2009

Recently, The Economist had a briefing article on the financial derivatives markets and the challenges involved in regulating them appropriately.  The issue has, of course, been brought to the fore by the recent crash in the financial sector, in which excessively risky derivative positions played a large part.  (I have previously written about some of this in a series of posts: “Formulas for Disaster, Parts 1, 2, 3, and 4”.)  The Economist article is well worth reading, and I will be writing, in a later segment of this article, about some of its recommendations.  But first I want to talk a bit about some of the institutional aspects of the derivatives markets; these markets vary considerably with respect to some key characteristics, and I think understanding some of those variations is crucial to understanding why derivatives became such a problem.

As the article points out, derivatives have been around for a long time:

Derivatives have a long history, stretching back thousands of years. In the 17th century the Japanese traded simple rice futures in Osaka and the Dutch bought and sold derivatives in Amsterdam.

Derivatives, as their name suggests, derive their value by reference to another asset or financial indicator (sometimes called the underlying asset, or simply the underlying).  A relatively simple example is an option on a traded common stock.  A (hypothetical) three-month call option on Microsoft at $20.00 would give the holder the right, but not the obligation, to buy a specified quantity (typically, 100 shares) of Microsoft stock at $20.00 per share at any time within the next three months.  If the option were not exercised by then, it would expire.  Clearly, the value of this option would depend on the price of Microsoft stock.  If that were $40 per share today, one could exercise the option, buy 100 shares for $2,000, and then immediately sell them for $4,000, making a tidy profit.  If the price were $10 per share today, the option would probably not be worth very much.
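
The payoff at exercise is easy to state precisely.  Here is a minimal sketch, using the hypothetical Microsoft contract above, of a call option’s intrinsic value (what exercising it immediately would be worth):

```python
# Intrinsic value of a call option: max(stock price - strike price, 0) per share.
# The numbers correspond to the hypothetical Microsoft option described above.
def call_intrinsic_value(stock_price: float, strike: float, shares: int = 100) -> float:
    """Value of exercising the option right now."""
    return max(stock_price - strike, 0.0) * shares

print(call_intrinsic_value(40.0, 20.0))  # 2000.0 -- the "tidy profit" above
print(call_intrinsic_value(10.0, 20.0))  # 0.0 -- not worth exercising
```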

Another simple example is given by contracts in commodity futures.  Essentially, these are purchase contracts for a given quantity of a physical commodity (such as crude oil, wheat, or pork bellies) for delivery at a specified future date at a specified price.  Such a contract might be used, for example, by a wheat farmer to “lock in” a price for a portion of his crop in advance of the harvest.
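
To see how the “lock in” works, here is a minimal sketch with made-up numbers: a farmer who sells futures against part of his crop ends up with the same effective price per bushel no matter where the market price goes at harvest, because the gain or loss on the futures position offsets the change.

```python
# Illustrative only: hedging with a short futures position locks in the sale price.
def effective_price(futures_price: float, spot_at_delivery: float) -> float:
    """Revenue per unit: sell the crop at the spot price, plus the gain (or loss)
    on the short futures position.  The two always net out to today's futures price."""
    futures_gain = futures_price - spot_at_delivery
    return spot_at_delivery + futures_gain

for spot in (3.00, 5.00, 7.00):                  # possible wheat prices at harvest
    print(spot, effective_price(5.50, spot))     # effective price is always 5.50
```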

These simple types of derivative contracts are fairly benign, for reasons which we’ll explore in a moment, but even they have been accused of enabling all kinds of mischief, as the article points out:

In 1958 American onion farmers, blaming speculators for the volatility of their crops’ prices, lobbied a congressman from Michigan named Gerald Ford to ban trading in onion futures. Supported by the president-to-be, they got their way. Onion futures have been prohibited ever since.

It is easy to cite examples like this one to give the impression that worry about derivatives is often silly.  After all, no one has ever blamed a major financial panic on fluctuations in the price of onions (although the Dutch did famously get into trouble with tulip bulbs).

I would agree that worrying about things like onion futures, or options on Microsoft stock, is foolish if one’s concern is the overall stability of financial markets.  But I would like to note that derivatives like these (commodity futures, traded options, and similar contracts) have some common characteristics, which I will argue are of substantial significance:

  • The underlying assets are traded in an open and public market, in which transaction prices are reported, or at least easily observed.
  • The terms of the derivative contracts are standardized, and relatively simple.
  • The contracts themselves are traded on an open market, and their trade prices are reported.
  • The derivative transactions are settled via an organized exchange or clearing house, which acts as one party to each transaction, and sets capital provision rules for the participants (for example, the required posting of earnest money deposits or “margin”).
  • In part because the contracts are standardized, there are published, widely-examined industry standards for the valuation and risk assessment of the contracts.

With all this, it would seem that these markets would be fairly orderly, as in fact they are.  So why then, back at the end of 2002, did Warren Buffett, arguably the most successful investor in the world, say in his annual letter [PDF] to the shareholders of Berkshire Hathaway that derivatives were “financial weapons of mass destruction”?  I’ll start to explore that in the next post.


Opera 10.10 Released

November 23, 2009

Opera Software has released a new version of its Opera Web browser, version 10.10.  It is available for these operating systems: Windows, Mac OS X, Linux, FreeBSD, Solaris (Intel or SPARC), QNX, OS/2, and BeOS, and can be downloaded here.  This version contains a number of new features, including Opera Unite, a technology for sharing information with colleagues or friends, and Opera Link, a facility for synchronizing bookmarks and other data across multiple computers.  There are also improvements to the user interface, search functions, and security.  In addition to the Web browser, Opera includes an e-mail client.

