Particle Physics Group Makes Open Access Deal

September 26, 2012

I’ve written here from time to time (most recently in June and July) about the growing movement to provide more free and open access to scholarly publications.  Historically, these articles have been published mostly in limited-circulation journals, many of which have very high subscription prices and access fees.  (For example, getting access to a single article can easily cost $25-30, or more.)  The publishers claim that the high prices reflect the value they add, by providing peer review and editorial services; they also claim that high costs (for example, for typesetting mathematics articles) justify higher prices.  There is some truth to these claims; however, much of the (highly skilled) labor required for peer review and editing is provided by academics at no charge, and technology has greatly reduced the cost of specialized production.  In particular, the marginal cost of additional digital copies of a given article is effectively zero.

A recent article at the Nature news site reports that a newly-negotiated agreement between journal publishers and a particle physics consortium, the Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3), may soon make most publications in the field available under open access.

The entire field of particle physics is set to switch to open-access publishing, a milestone in the push to make research results freely available to readers.

The field is already one of the leaders in providing some degree of open access, notably via the arXiv site, but the process has been clumsy.

Particle physics is already a paragon of openness, with most papers posted on the preprint server arXiv. But peer-reviewed versions are still published in subscription journals, and publishers and research consortia at facilities such as the Large Hadron Collider (LHC) have previously had to strike piecemeal deals to free up a few hundred articles.

In exchange for open access, the deal gives the participating journal publishers a contractual payment per article.  Funds for this have been pledged by a group of libraries, research funding agencies, and research consortia; effectively, some funds previously budgeted for journal subscriptions will, under the deal, be used for these payments.  The SCOAP3 group currently has a pledged budget of about €10 million ($12.5 million).  Assuming all goes well, the arrangement will take effect beginning in 2014.

The deal is not perfect, and its implementation is not a foregone conclusion.

  • Some very prestigious journals, such as Physical Review Letters, are not part of the deal.
  • It has been agreed in principle that participating libraries should pay a reduced subscription fee to the journals, reflecting their up-front commitment. However, the details of this are still to be worked out.
  • The SCOAP3 group has to actually collect on the libraries’ pledges.  If the articles are truly open access, there is a classic “free rider” problem waiting in the wings.

Nonetheless, I think this is an important step forward.  In moving to open access, as in many human endeavors, one of the key, if unspoken, obstacles is “We’ve always done it this way”.  (Other appeals to unreason are often also significant, of course.)  If scholars in an intellectually prestigious field, like particle physics, adopt open access as the normal way of doing things, it may stimulate thinking about the issue in other fields.


Google Releases Chrome 22

September 25, 2012

Google has released version 22.0.1229.79 of its Chrome browser, for all platforms (Linux, Windows, Mac OS X, and Chrome Frame).  According to the Release Announcement, the new version fixes 25 identified security vulnerabilities; of these, Google rates one as Critical severity, 15 as High, 6 as Medium, and 3 as Low.  There is also a mitigation for a Critical Windows kernel memory corruption flaw (CVE-2012-2897).

This version also incorporates some new capabilities, including:

  • Mouse Lock API availability for JavaScript
  • Additional Windows 8 enhancements
  • Continued polish for users of HiDPI/Retina screens

A post at the Official Chrome Blog also mentions improvements for gaming in Web applications.
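The Mouse Lock API (since standardized as the Pointer Lock API) is what makes browser-based first-person games practical: it lets a page capture raw mouse-movement deltas rather than the absolute cursor position.  Here is a minimal sketch of how a page might use it; the function name enableMouseLock is my own, and the vendor-prefixed webkit names reflect my understanding of how Chrome shipped the API at the time:

```javascript
// Minimal sketch: ask the browser to "lock" the mouse to a canvas element,
// so the page receives relative movement deltas instead of cursor positions.
function enableMouseLock(canvas) {
  // Chrome shipped the API behind a webkit prefix; also try the unprefixed
  // name used by the later Pointer Lock standard.
  var request = canvas.requestPointerLock || canvas.webkitRequestPointerLock;
  if (!request) {
    return false; // API not available in this browser
  }
  // The lock request must come from a user gesture, such as a click.
  canvas.addEventListener('click', function () {
    request.call(canvas);
  });
  // While the lock is active, mousemove events carry relative deltas.
  document.addEventListener('mousemove', function (e) {
    var dx = e.movementX || e.webkitMovementX || 0;
    var dy = e.movementY || e.webkitMovementY || 0;
    // ...feed dx and dy into, e.g., a game's camera controls...
  });
  return true;
}
```

In the standardized API, the user can press Esc (or the page can call document.exitPointerLock) to release the lock, so the page never traps the mouse permanently.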

Because of the security content of the new version, I recommend that you update your systems as soon as you conveniently can. Windows and Mac users should get the new version via the built-in update mechanism. Linux users should get the updated package from their distributions’ repositories, using their standard package maintenance tools.

You can check the version of Chrome that you have by clicking on the tool menu icon (the little wrench), and then selecting “About Google Chrome.”


Watson in the Clouds

September 24, 2012

I’ve written here several times about IBM’s Watson system, which first gained some public notice as a result of its convincing victory in a Jeopardy! challenge match against two of the venerable game show’s most accomplished human champions.  Since then, IBM has announced initiatives to put Watson to work in a variety of areas, including medical diagnosis, financial services, and marketing.  All of these applications rely on Watson’s ability to process a very large database of information in natural language, and to use massively parallel processing to draw inferences from it.  (The Watson system that won the Jeopardy! challenge match used 10 racks of servers, containing 2,880 processor cores and 16 terabytes of memory.)

Now an article in the New Scientist suggests an intriguing new possibility for Watson, as a cloud-based service.

Watson, the Jeopardy-winning supercomputer developed by IBM, could become a cloud-based service that people can consult on a wide range of issues, the company announced yesterday.

The details of this are, at this point, fuzzy at best, but making Watson available as a cloud service would certainly make it accessible to a much larger group of users, given the sizable investment required for a dedicated system.

Because Watson can respond to natural language queries, it is tempting to compare it to other existing systems.  Apple’s Siri, for example, can interpret and respond to spoken requests, but the back-end processor is essentially a search engine.  The Wolfram|Alpha system also responds to natural-language queries, but its ability to deliver answers depends on a structured database of information, as Dr. Stephen Wolfram has explained.  Watson really is a new sort of system.

All of this is still in the very early stages, of course, but it will be fascinating to see how it develops.


Ig®Nobel Prizes, 2012

September 23, 2012

Last Thursday evening, at Harvard University’s Sanders Theater, the Annals of Improbable Research presented the 2012 Ig®Nobel Prizes, as scheduled, “for achievements that first make people laugh, and then make them think”.  As I’ve noted in the past, the awards are generally given for actual research that has a humorous, quirky, or slightly off-the-wall character.  The official awards page has citations for the relevant articles; here are a few of my favorites from this year’s awards:

  • ACOUSTICS PRIZE: Kazutaka Kurihara and Koji Tsukada [JAPAN] for creating the SpeechJammer — a machine that disrupts a person’s speech, by making them hear their own spoken words at a very slight delay.   I daresay readers here will agree that there are many possibilities for the use of this device.
  • ANATOMY PRIZE: Frans de Waal [The Netherlands and USA] and Jennifer Pokorny [USA] for discovering that chimpanzees can identify other chimpanzees individually from seeing photographs of their rear ends.  I hope this does not become the next biometric panacea for identifying computer users.
  • FLUID DYNAMICS PRIZE: Rouslan Krechetnikov [USA, RUSSIA, CANADA] and Hans Mayer [USA] for studying the dynamics of liquid-sloshing, to learn what happens when a person walks while carrying a cup of coffee.  Apparently both the size and shape of the container and the biomechanics of a person’s walking are significant factors.
  • LITERATURE PRIZE: The US Government General Accountability Office, for issuing a report about reports about reports that recommends the preparation of a report about the report about reports about reports.  An outgrowth of work commissioned by the Department of Defense, it should perhaps also be commended for illustrating the potential pitfalls of recursive procedures.

The ceremony, as usual, featured several actual Nobel laureates, who presented the prizes, along with talks and a blizzard of paper airplanes.

Ars Technica also has an article on this year’s awards.


IgNobel Prelude

September 20, 2012

I have always had a certain fondness for slightly off-the-wall events, and one of my favorite annual events is the Ig®Nobel Prizes, awarded by the journal Annals of Improbable Research.

The Ig Nobel Prizes honor achievements that first make people laugh, and then make them think. The prizes are intended to celebrate the unusual, honor the imaginative — and spur people’s interest in science, medicine, and technology.

Although there are generally one or two awards that are based primarily on spectacular stupidity, most of them are given for real, published research that meets the fundamental criterion above.  I’ve also enjoyed blogging about the Prizes, since I started this blog in 2009; my posts about the awards are here: 2011, 2010, and 2009.

This year’s Prizes are scheduled to be awarded this evening at the usual venue, Sanders Theater at Harvard University in Cambridge, MA.

In the meantime, Wired has an article and photo gallery from previous years’ IgNobel awards.


The FTC’s New Chief Technologist

September 18, 2012

In my post yesterday, I talked about Prof. Ed Felten’s stint as the first Chief Technologist of the US Federal Trade Commission [FTC], and his comments on that experience.   Prof. Felten was successful at the FTC in at least one other important way: there will be a second Chief Technologist.

I am very glad to see that the FTC has made another excellent choice in appointing Prof. Steven M. Bellovin to the post.   Dr. Bellovin is a professor of computer science at Columbia University; previously, he worked for many years at AT&T Research. He has made many contributions to the development of the Internet, having served as a member of the Internet Engineering Task Force and the Internet Architecture Board.  He describes his research interests as “Networks, security, and especially why the two don’t get along”, and is co-author of the classic book, Firewalls and Internet Security: Repelling the Wily Hacker, first published in 1994, a copy of which has been on my shelves for many years.  Prof. Bellovin, in his new role,  has an introductory post on the Tech@FTC blog.

It seems to me that getting experts of the caliber of Ed Felten and Steve Bellovin involved in the FTC’s policy making process is a good thing from any reasonable point of view, and I think the FTC should be commended for making it happen.


Prof. Felten’s Take on Washington

September 17, 2012

Back in November 2010, I wrote about the appointment of Prof. Ed Felten, of Princeton University, as the Federal Trade Commission’s Chief Technologist.  This was a term appointment, and Dr. Felten is now back at Princeton as a professor of computer science and public affairs.  He is also resuming his role as Director of the university’s Center for Information Technology Policy, and as a frequent contributor to the Freedom to Tinker blog.

Ars Technica has an interview with Prof. Felten, focused on his experience in Washington.

So what’s it like to be a geek in the land of lawyers? Ars Technica interviewed Felten by phone on Tuesday to find out.

The interview is short, but well worth reading for anyone interested in technology policy.  As the article points out, many people in policy-making positions in Washington have little to no technical background; many are lawyers.  And many of these people, regardless of their background, have some odd ideas about technology in general.

Computer scientists are a rare breed in lawyer-dominated Washington, DC, and Felten said it was sometimes a challenge helping policymakers understand the nature and limits of technology.

For example, he said a lot of people in Washington have a misconception that any problem “can obviously be solved if you try hard enough.”

In the absence of technical knowledge and understanding, many policy makers rely on getting advice from people they trust, on the basis of personal relationships.  This, of course, is at the root of the enormous lobbying business, but it is not all bad.  If the trusted people are actually competent, and not just pre-scripted automatons, it provides a means for technically qualified people to communicate their views.

… Felten said there are ways ordinary geeks can influence the policy process. The most important thing they can do, he said, is to develop relationships with people who do have direct connections to the policy process.

Although technology and science evolve quite rapidly, human nature has really not changed all that much.  Technical people ignore or discount personal relationship building at their peril.

