Microsoft Releases Workaround for IE Flaw

December 31, 2012

Microsoft has released a “FixIt” workaround for the Internet Explorer vulnerability (in IE versions 6, 7, and 8) that I wrote about yesterday; the Security Advisory (2794220) has been updated to reflect this change.  Microsoft has also released a Knowledge Base article that contains links to the installation programs to enable or disable the workaround.  This is not a patch for the underlying vulnerability, but a sort of “quick fix” that prevents known exploits from working.

If you are using a Windows system that requires the workaround, you can install it directly from the Knowledge Base page.  Alternatively, you can save the file to disk, and then run it manually on one or more other systems.  There is also a link to disable the workaround, in case it causes problems on your system.

Another mitigation step suggested in the Security Advisory is the use of a Microsoft utility, the rather Orwellian name of which is the Enhanced Mitigation Experience Toolkit (EMET).   The EMET utility implements a variety of general-purpose protections against malicious software.  It can be quite an effective tool, but it does involve some risk of incompatibility with particular applications.  I strongly suggest that you test it carefully before installing it on critical systems.  A general description and download links are in the Knowledge Base article (2458544) Enhanced Mitigation Experience Toolkit.  For more detailed and technical information on EMET, a TechNet blog post describes the latest version (3.0).

Since an example exploit seems to have been posted on the Web, I think it is prudent to take this vulnerability seriously.  If you have a vulnerable version of Internet Explorer, I suggest that you take one or more of these steps:

  • Switch to a different browser (e.g., Firefox or Chrome) and avoid Internet Explorer
  • Upgrade to Internet Explorer version 9 or 10 (not possible on Windows XP systems)
  • Apply the FixIt workaround, and possibly EMET if it’s workable in your environment

On Thursday, January 3, Microsoft should be announcing the security bulletins it will release this month.  I hope that a regular patch for this vulnerability can be ready in time to be included in that batch, which should be released Tuesday, January 8.

New Internet Explorer Vulnerability

December 30, 2012

Microsoft has issued a Security Advisory (2794220) concerning a new “zero-day” vulnerability in Internet Explorer.  Versions 6, 7, and 8 of the browser are affected, on all versions of Windows; versions 9 and 10 are not.  The flaw is a use-after-free memory corruption bug; if exploited, it could allow an attacker to execute arbitrary code with the same privileges as the logged-in user.  The most likely exploit would involve clicking on a link to a malicious Web site, sent to the user via an E-mail or instant message.  An exploit would not necessarily require the Web site itself to be compromised; a site that hosts user-supplied content might also serve as an attack vector.

Microsoft has assigned this vulnerability CVE-2012-4792; however, at this point there is no further information available in the CVE database.  A more detailed technical explanation of the vulnerability is available in this Microsoft blog post.

As I mentioned earlier, this vulnerability does not affect Internet Explorer version 9 or 10.  However, those of you still using Windows XP are out of luck on that score, because upgrading to one of those IE versions is not an option.  (If you are still using XP, I hope you have started planning a transition to a newer version of Windows.)  Microsoft says that it is investigating the problem, and that it will “take appropriate action” once the investigation is complete.  The Security Advisory has some suggestions for possible mitigations.  As always, never click on an unsolicited link someone sends you.

Update Sunday, 30 December, 23:16 EST

It appears that a sample exploit for this vulnerability has been published on the Web.  If you are using a vulnerable version of Internet Explorer, I suggest that you switch to Firefox or Chrome, or at least apply Microsoft’s recommended mitigations as soon as you can.

SEC Gets Real-Time Market Data

December 27, 2012

I’ve written here before about the enormous growth in high-frequency equity trading that has taken place in the last few years, and about some of the side effects of that growth, such as the “Flash Crash” in May, 2010, or the trading disruptions on August 1 of this year.  After these events, it is customary for government regulators to issue reports on what went wrong; the Securities and Exchange Commission (SEC) issued a report on the Flash Crash in October, 2010.

You might visualize these regulatory agencies working in a way similar to air traffic control, monitoring the activity and health of the financial markets continuously throughout the day.  That picture is plausible enough, but it does not reflect reality, at least up to the present.

Earlier this week, The Washington Post published a report that the SEC was about to “go high tech” by installing a system, called MIDAS,  that would, for the first time, provide the regulator with real-time market data. To date, the agency’s information systems have been left far behind by developments in the markets it is supposed to monitor.

As computing power and big data have revolutionized stock trading in recent years, one market player has lagged far behind: the Securities and Exchange Commission, whose job policing the markets has been hampered by a serious technology gap.

Although the amount of data to be handled has increased significantly in the recent past, the technology of digital, nearly real-time financial market data is not new.   When I first began working in the industry, in the mid-1970s, market participants had this kind of data available, although often it was in the form of a video feed (essentially, a TV picture of a data display).  Even before that technology, market data was distributed by electro-mechanical stock “tickers”.   By the mid-1980s, there was a substantial movement toward digital distribution of market data; that change meant that the data could not only be looked at by traders, but also fed into spreadsheets and other applications.  (I did some work on digital data distribution in the late 1980s and early 1990s.)

Today, of course, the high-frequency trading that has become so significant is entirely based on the rapid processing of real-time data; speed is of the essence, as I noted back in 2010:

The time frames used in these strategies are in some  cases so short (measured in milliseconds) that firms aggressively bid for computer locations physically close to the exchange’s data center: the network propagation delay (at the speed of light!) has to be taken into account.
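To put that propagation delay in concrete numbers, here is a quick back-of-the-envelope calculation.  The figures (fiber velocity factor, distances) are illustrative assumptions of mine, not from the quoted post:

```python
# Rough propagation-delay arithmetic.  Light in optical fiber travels at
# roughly 2/3 of c, so every kilometer of cable adds about 5 microseconds
# of delay each way.
C_VACUUM_KM_S = 299_792.458   # speed of light in vacuum, km/s
FIBER_FACTOR = 2 / 3          # typical velocity factor for optical fiber

def one_way_delay_us(distance_km, velocity_factor=FIBER_FACTOR):
    """One-way propagation delay in microseconds over fiber."""
    return distance_km / (C_VACUUM_KM_S * velocity_factor) * 1_000_000

# Being 10 km from the exchange vs. 100 m inside the data center:
far = one_way_delay_us(10.0)   # ~50 microseconds
near = one_way_delay_us(0.1)   # ~0.5 microseconds
print(f"10 km: {far:.1f} us, 100 m: {near:.2f} us")
```

A 50-microsecond round trip is an eternity to a strategy competing on millisecond (or sub-millisecond) time scales, which is why the co-location bidding is so aggressive.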

Given how important this type of trading has become, it is somewhat surprising that the SEC has not had the capability to monitor it.  Although it’s been a few years since I was actively involved in the industry, I was taken aback by the article, and apparently I am not alone:

“I scratch my head and say, ‘How could the SEC not have had this in place already?’ ” said Joseph C. Saluzzi, co-head of a brokerage firm called Themis Trading. “Why are they still playing catchup?”

Whatever the reasons for the delay, it is encouraging that the SEC is getting this facility in place; however, as the article points out, the system is just a starting point.  There is still more work to be done to create a comprehensive market surveillance system.

But experts who track the agency say more needs to be done. They are eager for the launch of the “consolidated audit trail,” a system that would require broker-dealers to report all their activity to a central repository and track the identities of those dealers and their clients.

Most critically, the SEC needs to be able to recruit and retain staff members who can use the data effectively, and to cultivate an organizational culture that is less legally focused and more data-driven.

National Strategy for Information Sharing and Safeguarding

December 23, 2012

The US government, through its various intelligence operations, collects an enormous amount of information; especially recently, private organizations and businesses have assembled some pretty impressive collections of their own (think Google or Facebook).  These collections have the potential to tell us a lot about the emergence of threats to either physical or information systems assets.  The problem has always been that it is much more challenging to sift through and analyze the information than it is to collect it in the first place.  I’m sure most readers have heard the narrative about all the warning signs of the 9/11 attacks; they were not hard to find after the fact, but no one “connected the dots” beforehand.  Furthermore, even among government agencies, information was not always shared, either because of inter-agency politics, or just inertia.  Information exchange between government and private-sector entities was even more problematic.

In the last decade, there have been efforts made to improve this situation.   As part of that overall effort, this past week the White House released a new National Strategy for Information Sharing and Safeguarding [PDF here, 24 pp. total].  As the title implies, the Strategy recognizes that information must be shared, but in a controlled way; sharing everything with everyone risks giving too much information to potential adversaries.  Citizens’ rights and privacy concerns also need to be taken into account.

Our national security relies on our ability to share the right information, with the right people, at the right time. As the world becomes an increasingly networked place, addressing the challenges to national security—foreign and domestic—requires sustained collaboration and responsible information sharing.

It also recognizes that many entities, not all of them governmental, are involved:

The imperative to secure and protect the American public is a partnership shared at all levels including Federal, state, local, tribal, and territorial. Partnerships and collaboration must occur within and among intelligence, defense, diplomatic, homeland security, law enforcement, and private sector communities.

To the extent that this reflects a shift toward looking at this problem as a whole, and not just at individual pieces, this is a welcome development.

I have had a quick preliminary read of the Strategy; although it is, like many similar documents from large organizations, over-supplied with jargon, its basic thrust seems sound.  The approach is based on three basic principles:

  • Information is a National Asset
  • Information Sharing and Safeguarding Requires Shared Risk Management
  • Information Informs Decisionmaking

The last is perhaps the most important, in the context of recent history.  Information in a form that cannot be used to inform decisions is not worth much.

The Strategy identifies five broad goals going forward:

  • Drive Collective Action through Collaboration and Accountability
  • Improve Information Discovery and Access through Common Standards
  • Optimize Mission Effectiveness through Shared Services and Interoperability
  • Strengthen Information Safeguarding through Structural Reform, Policy, and Technical Solutions
  • Protect Privacy, Civil Rights, and Civil Liberties through Consistency and Compliance

Each of these is discussed, and further broken down to more specifics.  The Strategy then goes on to identify objectives for action going forward.

As is often the case with security policy issues, the devil is very much in the details of implementation; but it is encouraging that a reasonable framework has been developed as a starting point.

Supercomputing Reaches New Heights

December 22, 2012

I’ve written here before about the semi-annual Top 500 ranking of the world’s supercomputer installations, based on their performance on a computational benchmark.  The Top 500 site has a report of a new system that, while it does not qualify for inclusion in the Top 500 list, has a distinction of its own: it is located at an elevation of 5,000 meters (16,400 feet) in the Andes in northern Chile, making it the highest system in existence.

The system is installed at the site of the Atacama Large Millimeter/submillimeter Array (ALMA) telescope, the most elaborate ground-based telescope in history.  ALMA, an international astronomy facility, is a partnership of Europe, North America and East Asia in cooperation with the Republic of Chile.  The giant telescope’s main array uses 50 dish antennas, each 12 meters (39.3 feet) in diameter, separated by as much as 16 kilometers (10 miles).  There is also a smaller array of four 12-meter and twelve 7-meter (23 feet) antennas.  ALMA functions as an interferometer, which means that the signals from all the antennas in use must be processed together in order to be useful.

The computing system, called the ALMA Correlator, contains 134 million processors, and can handle data from up to 64 antennas simultaneously.  In doing this, it performs approximately 17 quadrillion (1.7 × 10¹⁶) operations per second.  Because it is a specialized system, it is not directly comparable to the supercomputers in the Top 500 list (which are ranked on the basis of the LINPACK benchmark).  Nonetheless, its operation rate is of the same order as that of the TITAN system, which is currently ranked number one in the Top 500, at 1.76 × 10¹⁶ floating-point operations per second.  (The European Southern Observatory has published an announcement of the ALMA Correlator with more details.)
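To give a feel for where all those operations go, here is a toy sketch of the core job of a correlator: multiply-and-accumulate over pairs of antenna signals at a range of time lags.  This is my own illustration, not ALMA’s actual algorithm, and the sample streams are made up:

```python
def cross_correlate(x, y, max_lag):
    """Unnormalized cross-correlation of two equal-length sample lists."""
    n = len(x)
    return [sum(x[i] * y[i + lag] for i in range(n - lag))
            for lag in range(max_lag + 1)]

def baseline_count(n_antennas):
    """Number of antenna pairs (baselines) the correlator must process."""
    return n_antennas * (n_antennas - 1) // 2

print(baseline_count(64))     # 2016 pairs for a 64-antenna array

# Two fake antenna streams, the second delayed by two samples:
a = [0, 0, 1, 3, 1, 0, 0, 0]
b = [0, 0, 0, 0, 1, 3, 1, 0]
corr = cross_correlate(a, b, 4)
print(corr.index(max(corr)))  # the correlation peak at lag 2 recovers the delay
```

The pair count grows quadratically with the number of antennas, and each pair must be correlated over many lags and frequency channels at high sample rates, which suggests why the operation count is so enormous.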

The radiation wavelengths (millimeter and sub-millimeter) that ALMA studies come from some of the coldest objects in the universe.  Because these wavelengths are significantly absorbed by water vapor, the observatory is located at one of the highest and driest places on earth, the high plateau at Chajnantor, in northern Chile.

Apart from the logistical difficulties involved in building an observatory in such a remote place, the high altitude and correspondingly thin atmosphere create other problems.  Because the air is so thin, the air flow needed to cool the system is approximately  twice that which would be needed at sea level.  Standard hard disk drives rely on “floating” the read/write heads above the platters on an air cushion; that doesn’t work at this altitude, so the system must be diskless.  Human performance is affected, too; a photo accompanying the article shows a technician working on the machine and wearing a supplemental oxygen supply.  (I have never worked at 16,000 feet, but I can say from personal experience that walking 50 yards at a 10,000 foot elevation is a noticeable effort.)  The site is also in a zone of regular seismic activity, so the system must be able to withstand earthquake vibrations.
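The “twice the airflow” figure is easy to sanity-check with the simple isothermal barometric formula.  This is a rough approximation of my own, not the article’s calculation:

```python
# Back-of-the-envelope check: air density falls off roughly exponentially
# with altitude, so at 5,000 m the air is about half as dense as at sea
# level, and roughly twice the volumetric airflow is needed to move the
# same cooling mass of air.
import math

SCALE_HEIGHT_M = 8_400.0   # approximate atmospheric scale height, meters

def density_ratio(altitude_m):
    """Air density at altitude relative to sea level (isothermal model)."""
    return math.exp(-altitude_m / SCALE_HEIGHT_M)

ratio = density_ratio(5_000)   # ~0.55
print(f"density ratio: {ratio:.2f}, airflow needed: ~{1/ratio:.1f}x")
```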

The ALMA observatory is scheduled to be completed in late 2013, but it has already begun making some observations.  This is fascinating science; in effect, it gives us a “time machine” with which we can observe some of the earliest, and most distant, objects in the universe.

Raspberry Pi Has an App Store

December 21, 2012

I’ve posted here several times before about the Raspberry Pi single-board Linux computer, developed by the Raspberry Pi Foundation.   The Pi has proved to be extremely popular since it was introduced earlier this year.   The Foundation has now announced the opening of the Pi Store, a marketplace for applications for the Pi.

Currently, the Pi Store has 29 selections available, all but two of which (The Chimera game engine, and the game “Storm in a Teacup”) are free.   Eight of the entries are issues of the MagPi community-led magazine, which has articles for a range of users.  One of the notable applications is LibreOffice, the free open-source office suite, which has a range of capabilities comparable to Microsoft Office.  (I’ve written about LibreOffice before, most recently here.)  Another is the Asterisk open-source telecommunications software, which provides a set of building blocks for telecommunications systems, such as a conference bridge or a PBX.

If you have a Raspberry Pi, the store is directly accessible via an X application in the Raspbian OS (a derivative of Debian Linux for the Raspberry Pi).  It is also accessible on the Web.

Wired also has an article on the launch of the store.

Microsoft Re-Issues Security Bulletin MS12-078

December 21, 2012

Microsoft originally released security bulletin MS12-078 as part of its regular monthly patch bundle on Tuesday, December 11.   The bulletin, rated as Critical severity, addressed a flaw in the handling of TrueType or OpenType font files by kernel-mode drivers in Windows.  The bulletin included two patches, identified by Microsoft Knowledge Base numbers KB2753842 and KB2779030.

Microsoft has now re-issued Security Bulletin MS12-078, including an updated version of the KB2753842 patch.  According to Microsoft, the originally issued patch does resolve the security vulnerability, but it introduces some other problems with the handling of OpenType fonts.

Microsoft re-released this bulletin to address a known issue in the KB2753842 update related to OpenType Fonts (OTF) not properly rendering in applications after the original update was applied. Customers who have successfully installed the original KB2753842 update are protected from the vulnerability described in CVE-2012-2556. However, customers need to install the rereleased KB2753842 update to resolve the issue with improper OpenType font rendering and to keep the affected binaries up to date.

Download links for the patches are given in the updated Security Bulletin.  I do suggest that you apply the updated patch, even if you applied the original version; having old, slightly odd versions of system software hanging around is asking for trouble.
