Google Releases Chrome 16.0.912.77

January 24, 2012

Google has released a new stable version, 16.0.912.77, of its Chrome browser, for Windows, Linux, Mac OS X, and Chrome Frame.  The new release contains fixes for five security vulnerabilities.  Google has rated one of these vulnerabilities as Critical severity, and the other four as High.  More details on the changes in this version are in the release announcement.

Windows and Mac users should get the new version via the built-in update mechanism.  Linux users should get the updated package from their distributions’ repositories, using their standard package maintenance tools.


DARPA Seeks New Authentication Methods

January 23, 2012

I’ve talked many times here about the problems with passwords as a means of authenticating computer users (most recently here and here), and about the search for better alternatives.  Some improvements are available, such as two-factor authentication, but these have issues of their own, and are not always especially secure, either.

Network World reports on a new effort being launched by DARPA, the Defense Advanced Research Projects Agency, to develop new techniques for authenticating users.  The project, which DARPA calls “Active Authentication”, takes a slightly different approach from most past efforts in this area.

… the agency’s Active Authentication program looks to develop what DARPA calls “novel ways of validating the identity of the person at the console that focus on the unique aspects of the individual through the use of software-based biometrics.”

The “biometrics” that are mentioned here are not the usual ones, like fingerprints or hand geometry, but are drawn from a broader set of user characteristics and behavior.

Active Authentication focuses on the computational behavioral traits that can be observed through how we interact with the world.

Examples of the kinds of user behavior that might be considered as authentication factors include:

  • keystrokes
  • eye scans
  • how the user searches for information (verbs and predicates used)
  • eye tracking on the page
  • speed with which the individual reads the content

Some of this is similar in concept to some earlier work on user profiles for security.  In its current announcement, DARPA emphasizes that the first phase of the project will concentrate on developing techniques that can be implemented without installing additional hardware devices in a standard office environment.  Later phases might consider new types of sensor technology.
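To make one of these behavioral factors concrete, here is a minimal sketch of keystroke dynamics in Python. This is purely illustrative and not DARPA's method; the digraph-timing profile, the scoring function, and the threshold are all invented for the example.

```python
# Illustrative keystroke-dynamics sketch: enroll a user by recording the
# mean time between key pairs (digraphs), then score new typing samples
# by how far they deviate from that profile.
from statistics import mean

def profile(timings):
    """Build a profile: mean inter-key interval (seconds) per digraph."""
    return {digraph: mean(intervals) for digraph, intervals in timings.items()}

def score(prof, sample):
    """Mean absolute deviation of a sample from the profile (lower = closer)."""
    diffs = [abs(sample[d] - prof[d]) for d in sample if d in prof]
    return sum(diffs) / len(diffs) if diffs else float("inf")

# Enrollment: intervals observed while the legitimate user typed.
enrolled = profile({"th": [0.11, 0.12, 0.10], "he": [0.09, 0.08, 0.10]})

# A new session's timings; a low score suggests the same typist.
session = {"th": 0.11, "he": 0.09}
print(round(score(enrolled, session), 3))  # ~0.0, i.e. a close match
```

A real system would, of course, use far more digraphs, account for variance as well as the mean, and re-score continuously during the session, which is what makes this kind of authentication "active".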

This is an intriguing approach.  The use of multiple authentication factors should increase the reliability of the system; also, as DARPA points out, it might help detect intrusions from logged-in workstations left unattended, since the behavioral authentication factors can be measured on an ongoing basis.


This Time’s for Real: IPv6 Day

January 21, 2012

Last year, the Internet Society organized World IPv6 Day, in order to provide the first global test of the Internet infrastructure changes needed to support the new IPv6 [Internet Protocol, version 6] addressing scheme.   The test was conducted on June 8, 2011, and included several major Internet companies, including Google, Yahoo!, Facebook, and Akamai.  Some minor glitches occurred, but on the whole the test was reasonably successful.  Although the IPv6 changes have been on the Internet standards track for more than a decade, and the reason it is needed is all too clear (the supply of old-style IPv4 addresses is effectively exhausted), uptake of the new standard has been slow.
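The scale of the change is easy to see with a quick calculation using Python's standard ipaddress module: IPv4 addresses are 32 bits, about 4.3 billion in total, while IPv6 addresses are 128 bits.

```python
# Compare the total IPv4 and IPv6 address spaces with the standard library.
import ipaddress

ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses   # 2**32
ipv6_total = ipaddress.ip_network("::/0").num_addresses        # 2**128

print(ipv4_total)                # 4294967296
print(ipv6_total // ipv4_total)  # 79228162514264337593543950336 (2**96)

# IPv6 addresses are usually written compressed; .exploded shows all 128 bits.
print(ipaddress.IPv6Address("2001:db8::1").exploded)
# 2001:0db8:0000:0000:0000:0000:0000:0001
```

In other words, IPv6 provides 2^96 times as many addresses as IPv4, which is why it removes the exhaustion problem for any foreseeable future.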

An article at Ars Technica reports that another IPv6 event, World IPv6 Launch, has been scheduled for June 6, 2012.  Once again, many of the large Internet services will participate: Google, Microsoft’s Bing, Yahoo!, and Facebook.  In addition, several large ISPs are participating this year, including Comcast, Time Warner Cable, and AT&T, as well as Free Telecom in France, and XS4ALL in the Netherlands.  Cisco/Linksys and D-Link will also begin enabling IPv6 by default in their home routers.  But the most important difference in World IPv6 Launch is that, this time, it’s not just a test.  The participants will permanently enable IPv6 for their sites and networks.

There will, inevitably, be some configuration errors and other problems that will surface once IPv6 connectivity is in ongoing use.  But forcing the issue is probably the only realistic way to get people to change.  And, as Ars points out, the Web itself, and the HTTP protocol, are relatively tolerant of a mixed environment; other services, such as Skype, really need to move to IPv6, but have not done much so far.  So there will probably be some inconveniences along the way, but there really is no practical alternative to making the change.
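One way to see whether a given site has enabled IPv6 is to ask the resolver for its IPv6 (AAAA) addresses. A small sketch using only the Python standard library; the hostname is just an example, and the result depends on your network and resolver.

```python
# Query the resolver for a hostname's IPv6 addresses.
import socket

def ipv6_addresses(host, port=80):
    """Return the IPv6 addresses published for a hostname (empty if none)."""
    try:
        infos = socket.getaddrinfo(host, port, socket.AF_INET6,
                                   socket.SOCK_STREAM)
    except socket.gaierror:
        return []  # no AAAA records, or the name does not resolve
    return sorted({info[4][0] for info in infos})

# Actual output depends on your connectivity; an empty list means the
# resolver returned no IPv6 addresses for the name.
print(ipv6_addresses("www.google.com"))
```

A host that returns addresses here is publishing AAAA records, although actually reaching it over IPv6 also requires that your own ISP and router support IPv6, which is exactly what the Launch-day participants are turning on.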


Internet Strike Has an Impact

January 19, 2012

By this time, I’m sure that readers know about the one-day Internet “strike” yesterday, January 18, either from seeing the notices on many prominent sites (including Wikipedia and Google), or from reports in the media.  The action was taken to protest two pieces of legislation that are currently under consideration in the US Congress: the Stop Online Piracy Act [SOPA] in the House of Representatives, and the Protect Intellectual Property Act [PIPA] in the Senate. (I’ve written about the PIPA bill before.) Although there are minor differences between the two bills, either would have the effect of setting up a “control system” for the Internet, allowing sites to be blacklisted and removed from the Domain Name System [DNS], with very little in the way of due process.  This is proposed in order to prevent the unauthorized copying and distribution of copyrighted or trademarked material.

I will not attempt to describe the details of this legislation, since others have already produced good summaries.  Wired has an article explaining some of the reasons for the protest.  Google also has a page on the issue, and the Electronic Frontier Foundation has a three-part article going into more depth.  The prime movers behind this legislation are the content producers, especially those represented by the RIAA [Recording Industry Association of America] and the MPAA [Motion Picture Association of America].  They are upset because the advent of digital distribution makes some parts of their traditional business model economically unsustainable.

The protest and other advocacy by the technology industry and others does seem to have had some effect.  Last weekend, the White House issued a statement saying that it would not support the bills in their current form.  And just in the last two days, a number of Senators have backed away from support of PIPA.

This is progress, but Congress needs to understand that this kind of legislation, benefiting one specific group while potentially causing great “collateral damage”, is a really bad idea.  As Bruce Schneier often reminds us, putting in place the surveillance and censorship mechanisms of a police state is not good civic hygiene.


JStor Research Archive to Offer Some Open Access

January 16, 2012

I’ve written here several times about the growing trend among organizations, including The Royal Society, Princeton, Yale, and the National Academies Press, to make some or all of their content available at no charge on the Web.  Now Technology Review is reporting that JStor, an extensive archive of scholarly publications, is about to begin a program to provide free access to articles from 70 different journals in its database.

An organization that maintains a huge database of academic research plans to soon let the public view some of the trove of information for free—a big boost for the idea of “open access” to the world’s knowledge.

JStor, which is run by a non-profit organization, was set up in the mid-1990s to relieve libraries of the burden of storing and cataloging paper journals.  Its total archive includes more than 1,400 journals, so the material to be included in the beta Register & Read program is just a small chunk of the archive, but JStor indicates that more content may be added if the initial experiment is successful.

JStor previously launched its Early Journal Content program, which provides free access to journal articles published prior to 1923 in the United States and prior to 1870 elsewhere.  The organization says that both this and the Register & Read program are part of an attempt to find sustainable ways to provide JStor access to people not affiliated with a participating institution.

Obviously, there is a significant cost associated with running a facility like JStor, and the money to pay the bills has to come from somewhere.  I think JStor is to be commended for trying to provide better free access; there are many people in the world who have no realistic chance of getting “official” access, and it’s just possible that some of them might have significant contributions to make.


Itsy-Bitsy Bits

January 15, 2012

Magnetic storage media have been around, in various forms, for about as long as we’ve had computers.  Magnetic tapes have long been used for storage of really large data sets and for backups.  Back in the days of the System/360, a common IBM hard disk drive (DASD, Direct Access Storage Device, in IBM parlance) was about the size of a domestic washing machine.  Floppy disks started out as 8-inch monsters, even before the introduction of the PC; they were used, for example, in dedicated word processors from Wang and IBM back in the late 1970s.  In the PC world, we first had 5.25-inch and then 3.5-inch floppies, and we still have our smaller, faster, and much more capacious hard disks.

Now, according to reports at Technology Review and the New York Times, researchers at IBM’s Almaden lab have developed a new type of magnetic storage material that is dramatically more compact than existing technologies, using only a dozen atoms per bit.

The smallest magnetic-memory bit ever made—an aggregation of just 12 iron atoms created by researchers at IBM—shows the ultimate limits of future data-storage systems.

By comparison, the highest-density magnetic storage devices made today use ~1 million atoms per bit.  The research is reported in the current issue of Science [abstract].  The technology is not quite ready to be included in your next laptop or smart phone.  It only works at very low temperatures (< 5K), and can only retain the data stored for a few hours.  Researchers feel that this can be improved by using a slightly larger number of atoms, ~150, but there is still no scalable method of manufacturing the material.
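The jump in density is easy to quantify from the figures above, with a back-of-the-envelope calculation:

```python
# Atoms needed per bit: roughly a million in today's densest drives,
# twelve in IBM's experimental antiferromagnetic bit.
atoms_per_bit_today = 1_000_000
atoms_per_bit_ibm = 12

# How many times fewer atoms the new material needs per bit.
print(atoms_per_bit_today // atoms_per_bit_ibm)  # 83333

# Atoms required to store one gigabyte (8 * 10**9 bits) at the new density.
bits_per_gb = 8 * 10**9
print(bits_per_gb * atoms_per_bit_ibm)  # 96000000000 (~10**11 atoms)
```

So the experimental material is roughly four orders of magnitude denser, per atom, than today's drives, which is what the "ultimate limits" claim in the quote refers to.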

Nonetheless, the work is of great interest because it shows the possibilities for aggressive use of nanotechnology.  The experimental device is assembled using a scanning tunneling microscope (invented in the early 1980s at IBM Zürich) to place individual iron atoms in an anti-ferromagnetic alignment.  In conventional ferromagnetic materials, like the magnetic coating on disk platters, or the magnet on your fridge, the magnetic spins of individual atoms are aligned; this can lead to instability when components are made smaller.  The new material does not have this alignment, so some degree of stability can be achieved with a tiny amount of material.
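The difference between the two alignments can be pictured with a toy model (purely illustrative, not the physics of the actual experiment): represent each atom's spin as +1 or -1 and sum them to get the net magnetic moment.

```python
# Ferromagnet: all 12 spins aligned -> large net magnetic moment.
ferromagnetic = [+1] * 12

# Antiferromagnet: neighbouring spins alternate -> moments cancel.
antiferromagnetic = [(-1) ** i for i in range(12)]

print(sum(ferromagnetic))      # 12
print(sum(antiferromagnetic))  # 0
```

The cancelling moments are why neighbouring antiferromagnetic bits do not disturb each other the way tightly packed ferromagnetic bits do.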

It’s too soon to say for sure that this approach will produce a new type of computer storage; but we are approaching the physical limits of what can be done with silicon fabrication technology, so it’s good to see new approaches being explored.

Update Sunday, 15 January, 17:05 EST

Wired also has an article on this research, complete with an image of a byte of memory in the new material, with eight little 12-atom clumps.

