Copyrights and Wrongs, Revisited

May 23, 2011

I have written here a couple of times before about the origins of copyright law, and about the questionable nature of some of the “evidence” of widespread infringement presented by the content-producing industry.  Apart from being obviously self-serving, the industry’s claims of economic loss due to infringement assume that every unauthorized copy would have become a sale if only enforcement were better, an assumption not supported by any evidence I have seen.  The claims also ignore the difficult-to-estimate, but almost certainly non-zero, benefits of fair use.

All of this has been an ongoing controversy here in the US, but it affects other countries, too.  Now, as reported in a post on the “Law & Disorder” blog at Ars Technica, an independent review of intellectual property law in the United Kingdom, commissioned by Prime Minister David Cameron and conducted by Professor Ian Hargreaves, has been published.  The report, Digital Opportunity: A Review of Intellectual Property and Growth [PDF], although it recognizes the economic importance of “intangibles” to societies like the UK, raises many of the same concerns that have been expressed here in the US; in particular, as Prof. Hargreaves writes in the Foreword, the law has not kept up with changes in society and technology:

Could it be true that laws designed more than three centuries ago with the express purpose of creating economic incentives for innovation by protecting creators’ rights are today obstructing innovation and economic growth?

The short answer is: yes. We have found that the UK’s intellectual property framework, especially with regard to copyright, is falling behind what is needed.

The review also finds that, as has been suggested in the US, much of the data presented to support the case for enhanced copyright protection is somewhat suspect.

Much of the data needed to develop empirical evidence on copyright and designs is privately held.  It enters the public domain chiefly in the form of ‘evidence’ supporting the arguments of lobbyists (‘lobbynomics’) rather than as independently verified research conclusions.

The review makes a strong case for moving to an evidence-driven process for setting policy, with the evidence being developed and vetted by someone who does not have an axe to grind.  It also strongly recommends that fair use exceptions to copyright take account of the benefits of such use, and that restrictions on copying for private use (e.g., for time- or format-shifting) be minimal.

The review also takes a strong position against the increasingly common practice of retroactive copyright extension, pointing out that the incentive for creative work that is supposedly the prime justification for copyright does not even make sense in that context.

Economic evidence is clear that the likely deadweight loss to the economy exceeds any additional incentivising effect which might result from the extension of copyright term beyond its present levels. …  This is doubly clear for retrospective extension to copyright term, given the impossibility of incentivising the creation of already existing works, or work from artists already dead.

The report is worth reading if you have an interest in this area; I intend to do my best to persuade my Congress Critters to read it.


Clues from the Japanese Earthquake

May 22, 2011

Most of us have probably followed the ongoing story of the March earthquake and tsunami in Japan, and the dangerous situation with the nuclear power plants there.  The quake, estimated at magnitude 9, was the largest to strike Japan in modern times.    We’ve also seen other major earthquakes in Haiti and New Zealand.   At this point, we know a good deal about the underlying structure of the tectonic plates that make up the Earth’s crust, and also about the mechanics of the faults between plates; the buildup and subsequent sudden release of stress along these faults is an underlying cause of earthquakes.  We haven’t made too much headway, though, when it comes to predicting earthquakes.

The “Physics ArXiv” blog at Technology Review recently summarized some very interesting results, gathered from observations just before the Japanese earthquake.   The results [abstract, full article PDF available], which are preliminary and subject to revision, were presented at the 2011 European Geosciences Union conference in Vienna.

Dimitar Ouzounov at the NASA Goddard Space Flight Centre in Maryland and a few buddies present the data from the Great Tohoku earthquake which devastated Japan on 11 March. Their results, although preliminary, are eye-opening.

The researchers examined  data on infrared radiation, measured by satellite and provided by the Climate Prediction Center at NOAA; they also looked at ionospheric observations from three sources: the GPS/TEC ionosphere maps, ionospheric tomography from satellites in low earth orbit, and data from ground stations in Japan.  They discovered two interesting effects:

  • Beginning on March 8 (three days prior to the earthquake), there was a rapid increase in the emitted infrared radiation, centered over the area of the quake’s epicenter.
  • During the period March 3-11, there was an increase in electron density observed by all three systems, peaking on March 8.

As the Physics ArXiv post points out, this is consistent with an existing hypothesis on the interactions between events in the lithosphere and the atmosphere.

These kinds of observations are consistent with an idea called the Lithosphere-Atmosphere-Ionosphere Coupling mechanism. The thinking is that in the days before an earthquake, the great stresses in a fault as it is about to give cause the release of large amounts of radon.

Radon [Rn, atomic number 86] is a radioactive gas, a natural product of the radioactive decay of uranium.  It is an α-emitter; its most stable isotope, 222Rn, has a half-life of 3.8 days.  It is certainly plausible that a large release of radon into the atmosphere could cause a significant increase in electron density (it is called ionizing radiation for a reason, after all).  Moreover, since water molecules are electrically polar, water vapor molecules are attracted to ions in the atmosphere.  This provides nucleation sites for the condensation of water vapor to liquid water; because of water’s unusually high heat of vaporization (40.68 kJ/mol), the condensation could warm the atmosphere, and hence increase the emitted infrared radiation.
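To get a feel for the magnitudes involved, here is a rough back-of-the-envelope sketch in Python.  This is my own illustration with assumed numbers, not figures from the paper: how quickly released radon decays away, and how much latent heat a modest amount of condensation dumps into a parcel of air.

```python
# Back-of-the-envelope sketch (assumed numbers, not from the paper).

RADON_HALF_LIFE_DAYS = 3.8        # half-life of 222Rn
L_VAP_J_PER_G = 40.68e3 / 18.015  # latent heat of water, J per gram (40.68 kJ/mol)
CP_AIR = 1005.0                   # specific heat of dry air, J/(kg*K)

# Fraction of released radon still undecayed after a few days:
for days in (1, 3, 7):
    remaining = 0.5 ** (days / RADON_HALF_LIFE_DAYS)
    print(f"After {days} day(s), {remaining:.0%} of the radon remains")

# Warming from condensing 1 g of water vapor into 1 kg of air (assumed parcel):
condensed_g = 1.0
delta_T = condensed_g * L_VAP_J_PER_G / CP_AIR
print(f"Condensing {condensed_g} g of vapor warms 1 kg of air by ~{delta_T:.1f} K")
```

Even with these crude assumptions, the numbers are at least not implausible: a radon release persists on a timescale of days, and a gram of condensed vapor per kilogram of air yields roughly 2 K of warming, which is the sort of signal an infrared satellite might notice.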

Once again, these results are preliminary, and need to be confirmed.  (And it is hard to predict how soon that might happen, since getting some of that confirmation will probably require us to observe more earthquakes.)   If the results hold up, they might provide a starting point for better predictions of earthquake risk.  They might also help to account for the persistence of anecdotal evidence that some animals seem to sense imminent earthquakes.


Common Vulnerability Reporting Framework Proposed

May 20, 2011

Back in 2008, a group of five technology providers formed the Industry Consortium for Advancement of Security on the Internet [ICASI]; the original five member companies — Cisco Systems, IBM, Intel, Juniper Networks, and Microsoft — have since been joined by Nokia, as a Founding Member, and by Amazon.  The idea behind the effort was to provide a mechanism for cooperative work on security issues.

ICASI will allow IT vendors to work together to address multi-vendor security threats. The consortium will provide a mechanism for international vendor and customer involvement, and allow for a government-neutral way of resolving significant global, multi-product security incidents.

This past week, ICASI released a free white paper proposing a new Common Vulnerability Reporting Framework [CVRF], which attempts to provide a uniform format for reporting security information.  As the white paper points out, some basic standardization of security information has been achieved with, for example, the Common Vulnerabilities and Exposures [CVE] database; but most security information is still published in a variety of formats, often vendor-specific.  The CVRF proposal aims to provide a standard format for this reporting.

The Common Vulnerability Reporting Framework (CVRF) is an XML-based language that is designed to provide a standard format for the dissemination of security-related information. CVRF is intended to replace the myriad of nonstandard vulnerability reporting formats with one format that is machine readable.
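To illustrate what a machine-readable format buys you, here is a small Python sketch that pulls a few fields out of a simplified, CVRF-style XML document.  The element names below are placeholders of my own invention, not the actual CVRF schema (the white paper defines the real element set); the point is only that structured data can be processed by tools rather than read by humans.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified CVRF-style document -- element names are
# illustrative only, not the actual CVRF schema.
SAMPLE = """
<advisory>
  <title>Example Multi-Vendor Advisory</title>
  <vulnerability>
    <cve>CVE-2011-0001</cve>
    <severity>High</severity>
    <affected product="ExampleRouter" version="1.2"/>
  </vulnerability>
</advisory>
"""

root = ET.fromstring(SAMPLE)
print("Advisory:", root.findtext("title"))
for vuln in root.findall("vulnerability"):
    # Structured fields mean tooling can triage advisories from many
    # vendors without a human parsing prose bulletins.
    print(" ", vuln.findtext("cve"), "-", vuln.findtext("severity"))
    for item in vuln.findall("affected"):
        print("    affects", item.get("product"), item.get("version"))
```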

Appendix A of the paper contains a list of Frequently Asked Questions.

I think anyone who has had the dubious pleasure of reading through vulnerability reports and security bulletins from multiple vendors would probably agree that the objective of standardizing this information is a worthy one.  It remains to be seen, of course, whether the various participants will get on board.


Memristors’ Workings Elucidated

May 20, 2011

Last fall, I wrote a note about a new business venture between Hewlett-Packard [HP] and Hynix (a Korean electronics manufacturer) to produce memory devices using memristor technology.  The projection that the devices would be commercially available by sometime in 2013 seemed a bit speculative, given the many questions surrounding the development of a new technology.

This week, an article at the BBC News site reports that at least some of the fundamental questions about how memristors work have begun to be answered.  A group of HP researchers has used X-ray techniques to analyze how current flows through the devices, and how the heat produced affects the structure of the materials in the device.

Now, researchers at Hewlett-Packard including the memristor’s discoverer Stan Williams, have analysed the devices using X-rays and tracked how heat builds up in them as current passes through. … The passage of current caused heat deposition, such that the titanium dioxide surrounding the conducting channel actually changed its structure to a non-conducting state.

The research was reported in the journal Nanotechnology [abstract, PDF download available], and examines the chemical, structural, and thermal characteristics of the device as its electrical state changes.

Dr. Williams told the BBC that this information would be of great importance in developing memristor technology.

The detailed knowledge of the nanometre-scale structure of memristors and precisely where heat is deposited will help to inform future engineering efforts, said Dr Williams.

He contrasted this with Thomas Edison’s development of the incandescent light bulb, which was characterized by a series of trial and error experiments.   The technology is still very new — memristors were first predicted theoretically in the 1970s, with the first prototype device being built at HP in 2008 — but it has the potential to produce some further amazing gains in electronics’ performance.
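As background (my gloss, not from the BBC article): Leon Chua’s 1971 paper defined the memristor by the relation between charge and flux linkage, which is why its resistance depends on the history of the current that has passed through it:

```latex
% Defining relations of the memristor (Chua, 1971)
v(t) = M\bigl(q(t)\bigr)\, i(t), \qquad
M(q) = \frac{d\varphi}{dq}, \qquad
q(t) = \int_{-\infty}^{t} i(\tau)\, d\tau
```

Because the memristance M depends on the accumulated charge q, the device retains its state when the power is removed — which is what makes it attractive as non-volatile memory, and why knowing exactly where the heat goes matters for keeping that state stable.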


Better BIOS Security

May 19, 2011

An article at the Science Daily site reports that the National Institute of Standards and Technology [NIST] has issued a new report on guidelines for achieving better security in the PC BIOS, the initial firmware executed when the PC begins its boot sequence.  (I have talked a bit about what the BIOS does here.)  The report, BIOS Protection Guidelines [NIST SP 800-147] [PDF], presents and discusses a set of guidelines for a secure BIOS implementation.  The key points are:

  1. An approved BIOS Update mechanism.  All updates to the BIOS code must be via an authenticated update mechanism, or a physically-secure local update mechanism.
  2. BIOS updates should be digitally signed using an approved cryptographic algorithm, and the signature verified before the update is applied (a rough sketch of this kind of check follows the list).
  3. An optional provision may be made for a physically secure local update mechanism, for emergency use.
  4. The BIOS code stored in the machine’s non-volatile memory should be protected against modification or corruption.
  5. It should not be possible to bypass any of the protection mechanisms.
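As a rough sketch of what item 2 implies, here is a Python illustration of an authenticated update check: the new image is accepted only if it carries a valid signature from the vendor’s key.  This is my own simplified illustration, not the mechanism NIST specifies or any vendor’s actual implementation (a real system performs the check in the firmware itself, against a public key held in a protected store), and it assumes the third-party cryptography package is installed.

```python
# Hedged sketch of an authenticated firmware update check -- not the
# NIST-specified mechanism or any vendor's actual implementation.
# Requires the third-party 'cryptography' package.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.exceptions import InvalidSignature

# In a real system the vendor's public key lives in a protected root of
# trust; here we just generate a throwaway key pair for the demo.
vendor_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
vendor_public = vendor_key.public_key()

firmware_image = b"...new BIOS image bytes (placeholder)..."
signature = vendor_key.sign(firmware_image, padding.PKCS1v15(), hashes.SHA256())

def apply_update(image: bytes, sig: bytes) -> bool:
    """Flash the image only if the vendor signature verifies."""
    try:
        vendor_public.verify(sig, image, padding.PKCS1v15(), hashes.SHA256())
    except InvalidSignature:
        print("Rejected: signature check failed")
        return False
    print("Signature OK -- proceeding with (simulated) flash")
    return True

apply_update(firmware_image, signature)                  # accepted
apply_update(firmware_image + b"tampered", signature)    # rejected
```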

This is a new area of security attention for the NIST, and a welcome one.  I have often smiled to myself when examining purportedly secure PC workstations which, in too many cases, leave access to the BIOS settings (often called “Setup”, or something similar) completely unprotected, even though modern BIOS installations generally provide at least password protection.  If an attacker can access the Setup routine, he can typically boot from a device of his choice (such as a USB flash drive or a CD-ROM).  If he can then go further and modify the BIOS code, he has a wide menu of malicious possibilities.

Without appropriate protections, attackers could disable systems or hide malicious software by modifying the BIOS. This guide is focused on reducing the risk of unauthorized changes to the BIOS.

As Ken Thompson pointed out in his 1984 Turing Award lecture, Reflections on Trusting Trust, malicious code at this level can be very difficult to find.

In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect.

The report is timely, because many manufacturers are moving to new BIOS implementations, including those based on the new Unified Extensible Firmware Interface [UEFI] specification.

The NIST report also makes suggestions for system management practices to complement improved BIOS security.

The publication also suggests management best practices that are tightly coupled with the security guidelines for manufacturers. These practices will help computer administrators take advantage of the BIOS protection features as they become available.

It also contains a very good summary of how the boot process works, both for conventional and UEFI implementations.


Android Authentication Vulnerability

May 18, 2011

Back in February of this year, Dan Wallach, a computer science professor at Rice University, posted an article at the Freedom to Tinker blog about the results of a class experiment using a WiFi “sniffer” to eavesdrop on communications from an Android smartphone.  One of the things he noted was that communications to and from some services, such as Google Calendar, were done in clear text.  This issue has resurfaced in the last few days, because a group of researchers at Universität Ulm in Germany published a note on their work, demonstrating that there was an exploitable security vulnerability.

The vulnerability exists in Android versions prior to 2.3.4 (the most recent version).  It is not really a flaw in the Android OS itself, but a consequence of the ClientLogin protocol, used to authenticate the user, being carried out in clear text.  Under this protocol, when the user has entered his user ID and password, an authentication token is returned to the client application, and is used for subsequent session authentication.  If the plain-text token can be sniffed, an attacker can impersonate the user.  (This is very similar to the browser session hijacking technique embodied in the Firesheep extension for Firefox.)  The exposure is made greater because the authentication tokens typically have a long lifetime (up to two weeks).
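To make the risk concrete, here is a sketch of the replay: an attacker who passively captures the token simply attaches it to his own requests.  The endpoint URL, header format, and token value below are placeholders of my own, not Google’s actual API details; the sketch assumes the third-party requests library.

```python
# Hedged illustration of token replay.  The URL, header format, and token
# are placeholders, not Google's actual endpoints or values.
import requests

# Captured off the air by a passive WiFi sniffer -- no password needed:
stolen_token = "DQAAA...example-token..."   # hypothetical value

resp = requests.get(
    "https://calendar.example.com/api/events",   # placeholder endpoint
    headers={"Authorization": f"GoogleLogin auth={stolen_token}"},
)
# Because the server authenticates the request purely by the token, the
# attacker's request is indistinguishable from the victim's -- and the
# token may remain valid for up to two weeks.
print(resp.status_code)
```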

Beginning with Android 2.3.4, the ClientLogin exchange and subsequent communications are carried out over an encrypted (HTTPS) connection, at least for the Calendar and Contacts applications, which removes the vulnerability.  (Some applications, such as Picasa, still appear to be vulnerable.)  However, as an article at Ars Technica points out, the absence of a systematic software update procedure for smartphones means that most users are still vulnerable.

Although the bug has already been fixed (for calendar and contact sync, but not Picasa) in Android 2.3.4—the latest version of the operating system—the vast majority of mobile carriers and handset manufacturers haven’t issued the update yet. According to Google’s own statistics, this means that 99.7 percent of the Android user population is still susceptible to the vulnerability.

This really is a problem that should have been foreseen.  After all, we have plenty of experience with the security consequences of putting general-purpose computing devices in users’ hands without a plan for fixing software problems.  And, especially for wireless devices, there is really no justification for using anything but encrypted communications.

Fortunately, there are some encouraging developments.  A group of large handset vendors and cellular carriers has formed a task force to develop systematic update policies and guidelines.

Although the initiative is still at a very early stage and the policies it formulates will be entirely voluntary, it already has preliminary buy-in from enough prominent Android stakeholders to make it credible. The leading Android handset manufacturers and all four of the major US carriers are currently involved.

I hope that the group does manage to put some reasonable system in place; we have seen all too often that leaving all of the systems administration to the users is a recipe for trouble.


Kansas City, Missouri, Added to Google Test

May 18, 2011

Back at the end of March, I posted a note here about Google’s selection of Kansas City, Kansas, as the location for its experiment in providing high-speed (1 Gbps) residential Internet connectivity.  Now, according to an article in the “Law and Disorder” blog at Ars Technica, Kansas City, Missouri, will be included in the experiment as well, under a new agreement between Google and the local electric utility, Kansas City Power & Light.

That deal provides cash in exchange for Google’s access to the electrical infrastructure, including poles and substations, to make deployment of its fiber fast and (relatively) inexpensive.

The deal substantially increases the number of potential customers.  The population of Kansas City MO is about 480,000, while the population of Kansas City KS is about 150,000.  (Population of the Kansas City metro area is ~2 million.)

The rollout of the new service is expected to begin early next year.

