Getting Warmer

October 31, 2011

I think by now most people are familiar with the basic idea of global warming: the hypothesis that the Earth is getting warmer because human activities (notably the heavy use of fossil fuels) have increased the concentration of “greenhouse gases” (such as carbon dioxide, or methane) in the atmosphere, causing more heat to be retained and less to be radiated into space.   Although I think it is fair to say that the consensus among scientists is that something like global warming is going on, the idea has generated a lively argument, and a group of fairly vocal skeptics.  Some skeptics accept the evidence that the planet’s temperature is rising, but question whether this is attributable to human activities; others have expressed doubt that any change is really occurring at all.

The Associated Press, in an article carried at Yahoo!, reports on a newly-released study, carried out by a prominent scientist heretofore skeptical of the change, that will probably not be welcome news to that second group of skeptics.

A prominent physicist and skeptic of global warming spent two years trying to find out if mainstream climate scientists were wrong. In the end, he determined they were right: Temperatures really are rising rapidly.

The study, the Berkeley Earth project, was led by Dr. Richard Muller, a physics professor at UC Berkeley, who also works at the Lawrence Berkeley National Laboratory.   Its independent analysis, based on temperature data since 1800 from 15 sources, produced essentially the same results as earlier studies by NASA and the National Oceanic and Atmospheric Administration [NOAA]: land temperatures have risen by 1° C since the 1950s.

It is mildly amusing, in an ironic way, that a significant chunk of the funding for the study, according to the AP report,

… came from the Charles Koch Foundation, whose founder is a major funder of skeptic groups and the tea party. The Koch brothers, Charles and David, run a large privately held company involved in oil and other industries, producing sizable greenhouse gas emissions

Skeptics had previously argued that the reported increase might not be real, because of two potential problems with the data:

  • The weather stations from which the data were collected provide inaccurate and unreliable measurements.
  • Cities tend to be warmer than rural areas (for example, because asphalt absorbs solar heat better than trees), and create “heat islands” that may skew the reported data.

Dr. Muller and his colleagues examined both of those possibilities carefully, and found that they did not have any significant effect on the results.

“Our biggest surprise was that the new results agreed so closely with the warming values published previously by other teams in the US and the UK,” said Richard Muller. “This confirms that these studies were done carefully and the potential biases identified by climate change skeptics did not seriously affect their conclusions.”

The four draft papers from the project, which are in the process of peer review, are available at the project site.   Dr. Muller also wrote an editorial at the Wall Street Journal site, in which he says that, although there may have been grounds for skepticism originally, the issues have now been explored and should be put behind us.

When we began our study, we felt that skeptics had raised legitimate issues, and we didn’t know what we’d find. Our results turned out to be close to those published by prior groups. We think that means that those groups had truly been very careful in their work, despite their inability to convince some skeptics of that.

The Berkeley study examined the question of urban “heat islands” specifically, by recomputing the results using only “very rural” locations.  They found no significant difference from the overall results.

The result showed a temperature increase similar to that found by other groups. Only 0.5% of the globe is urbanized, so it makes sense that even a 2ºC rise in urban regions would contribute negligibly to the global average.
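As a quick back-of-the-envelope check on that claim, using only the figures quoted above (0.5% of the globe urbanized, and an assumed urban excess of 2 °C), the largest possible contribution to the global average would be roughly

\[
0.005 \times 2\,^{\circ}\mathrm{C} = 0.01\,^{\circ}\mathrm{C},
\]

about a hundredth of a degree, which is indeed negligible next to the roughly 1 °C rise reported.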

Similarly, although it is true that the data quality from some weather stations leaves a good deal to be desired, there is no evidence that this affected the results.  The study looked separately at stations rated as having good accuracy, and those whose accuracy was rated as poor.  Once again, there was no significant difference in the results.

Remarkably, the poorly ranked stations showed no greater temperature increases than the better ones. The most likely explanation is that while low-quality stations may give incorrect absolute temperatures, they still accurately track temperature changes.

That last point is quite an important one.  What we are interested in is the change, if any, over the time period studied.  If poor quality stations just provide data with a lot of random noise, that will tend to average out over a large sample; even if those stations provide biased data (for example, they always report the temperature 5° F too high), that will not affect the measured temperature difference.  Similarly, the potential “heat island” effect, if it were constant, would not affect measurements of temperature change.  A misleading result might be obtained if a significant proportion of the area surveyed changed from being rural to urban over the survey period, but I rather doubt any such change has happened since the 1950s.  (If we were looking at data since, say, 1500, this might be a noticeable effect.)
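To make that concrete, here is a minimal simulation sketch (my own illustration, not the Berkeley Earth methodology; the station count, bias sizes, and noise levels are invented for the example).  It shows that constant per-station biases and random noise shift the reported temperature level, but leave the measured change essentially untouched:

```python
# A toy illustration of why constant station biases and random noise do not
# corrupt a measured temperature *change*.  All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)

years = np.arange(1950, 2011)
true_trend = 0.01 * (years - 1950)      # an assumed "true" warming: 0.01 °C per year

n_stations = 1000
bias = rng.normal(0.0, 2.0, size=n_stations)                   # constant bias per station (°C)
noise = rng.normal(0.0, 1.0, size=(n_stations, years.size))    # random measurement error (°C)

readings = true_trend + bias[:, None] + noise   # what the (imperfect) stations report
station_mean = readings.mean(axis=0)            # average over all stations, year by year

# The absolute level is off by the average bias, but the change is preserved.
print(f"true change 1950-2010:     {true_trend[-1] - true_trend[0]:.2f} °C")
print(f"measured change 1950-2010: {station_mean[-1] - station_mean[0]:.2f} °C")
```

Running this, the two changes should agree to within a few hundredths of a degree, even though individual stations are off by a couple of degrees in absolute terms.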

In the WSJ editorial, Dr. Muller expresses the hope that these results will allow everyone to accept that the world is getting warmer, and focus on the questions of why this is happening, and what to do about it.

Global warming is real. Perhaps our results will help cool this portion of the climate debate. How much of the warming is due to humans and what will be the likely effects?

I’m afraid he is a bit more optimistic than I am about moving forward; as George Bernard Shaw put it:

Reformers have the idea that change can be achieved by brute sanity.

Nonetheless, having additional carefully collected evidence is a good thing.

There are additional reports on this research at Ars Technica and the New Scientist.


Open Data Centers Gain Friends

October 30, 2011

I’ve posted here a couple of times previously about Facebook’s effort to apply open-source principles to data center design, and its Open Compute Project.   Facebook contributed a large body of design information to the project, based on a new, highly-efficient data center the company recently completed in Prineville, Oregon.  That data center is one of the most efficient in the world, in terms of power consumption; it uses 38% less electricity than other Facebook data centers.

According to an article at Ars Technica, Facebook has just announced the creation of a foundation to support the project’s work.  In order to create a data center like the one at Prineville, Facebook had to design a great deal of the equipment used: servers, power supplies, battery backup systems, and a power distribution system.  Intel, AMD, Dell, and Asus also contributed intellectual property to the project.  And Facebook’s situation is hardly unique.

Google, Amazon, and others all have had to follow a similar path, according to Arista Network chief development officer and Sun cofounder Andy Bechtolsheim, who spoke at today’s event in New York. “Literally all the large-scale data centers in the world are built on off-the-shelf mother boards,” he said. “Because there was no standards, everyone had to do their own thing.”

The new Open Compute Project Foundation is a non-profit organization along the lines of the Apache Foundation, which sponsors the development of the very successful, and widely-used, Apache Web server.   Members of the foundation’s board include Mr. Bechtolsheim, Goldman Sachs managing director Don Duet, Facebook’s Frank Frankovsky, Rackspace chief operating officer Mark Roenigk, and Intel data center group general manager Jason Waxman.    The new foundation also has a relationship with the Open Data Center Alliance, a consortium of IT customers and some universities.

The focus of the Open Compute project, so far, has been on designs for very large, scalable data centers like those run by Facebook, Amazon, or Google.  This sector certainly has some very special and specific needs.  But the project aims to broaden its scope in time.

There are signs that the Open Compute designs could become more practical for a broader array of data center customers in the future. One of the new participants in the project is Digital Realty Trust, the world’s largest operator of third-party data center space.

The Data Center Knowledge site also has an article on the project.


Steampunk Exhibition

October 29, 2011

Wired has a photo gallery article from an amusing exhibition, Steampunk: Form and Function: An Exhibition of Innovation, Invention and Gadgetry, described as “the Jules-Verne-meets-Bill-Gates school of contraption art”, being shown at the Charles River Museum of Industry and Innovation in Waltham, Massachusetts, just outside Boston.

The Steampunk Time Machine Antique Master Bathroom Computer Workstation, designed by Bruce Rosenbaum and Walter Parker, melds a modern computer with antique plumbing components, including a ribcage shower, toilet and pipes.

I don’t think the plumbing components are intended to be functional.  There is also a computer desk built from an Eastman Kodak Century No. 1 Studio Camera.  I particularly like the Victorian “Sojourner” Keyboard, by Rich Nagy, and the Waterproof USB Drives by Derrick Culligan.

The show includes more than 30 digitally rejiggered antiques, including clocks, coffeemakers, humidifiers, workstations and grand pianos. It’s all displayed, appropriately enough, in a former textile factory built in 1814.

The exhibit runs through January 15, 2012.

 


Another Chrome Update

October 29, 2011

Google has released a new version, 15.0.874.106, of its Chrome browser, for all platforms (Linux, Windows, Mac OS X, and Chrome Frame).  This release fixes a specific login problem (issue 101274) that affected Dow Jones sites (Barron’s and the Wall Street Journal).  Unless you are experiencing the problem, I don’t think there’s any urgency to update your system.  The release announcement is here.


Getting Down to Basics

October 28, 2011

I have posted a few notes here about the effort to find a new definition for the kilogram, the only fundamental unit of the SI [Le Système International d’Unités] system of units that is still defined by a physical artifact: the mass of a particular cylinder of platinum/iridium alloy, stored in a vault at the Bureau International des Poids et Mesures [BIPM] at Sèvres, outside of Paris.  Other fundamental SI units are defined in terms of fundamental physical processes; the meter, for example, is defined as the distance traveled by light in a vacuum in 1/299,792,458 second.   The advantages of this kind of definition are that it does not depend on the integrity of a physical object, and that it can be replicated anywhere that suitable conditions and apparatus are available.

According to a report at New Scientist, this year’s meeting of the quadrennial Conférence Générale des Poids et Mesures [CGPM] has unanimously adopted a proposal to redefine the kilogram, as well as three related units: the mole, the ampere, and the kelvin.

The General Conference on Weights and Measures (CGPM) in Paris, France, has unanimously agreed on a proposal that would lead to reform of the mole, kilogram, kelvin and ampere, according to the international system of units (SI).

The change will need to be confirmed at the next meeting of the CGPM in four years’ time; if it is confirmed, it will be the most significant change in SI for a century.

The new definitions will tie the SI units to fundamental physical constants; to put it another way, the units will be defined on the basis that the constants are really constant, and have known values.  The proposed new definitions are:

  • Ampere (unit of electric current) will be defined such that the elementary charge (the charge on one proton or electron) is exactly 1.60217653 × 10⁻¹⁹ coulombs.
  • Kelvin (unit of absolute temperature) will be defined such that the Boltzmann constant is 1.3806505 × 10⁻²³ joules/kelvin.
  • Mole (amount of a substance) will be defined such that Avogadro’s constant is 6.0221415 × 10²³ mole⁻¹.
  • Kilogram (unit of mass) will be defined such that Planck’s constant is 6.6260693 × 10⁻³⁴ joule-second (a sketch of how this pins down the kilogram follows this list).
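To see how tying a unit to a constant works, here is a rough sketch of the logic for the kilogram (my own illustration, not the official wording of the resolution).  The Planck constant has dimensions of kg·m²·s⁻¹, and the meter and second are already defined via the speed of light and the cesium transition frequency, so fixing the numerical value of h quoted above leaves the kilogram as the only unknown:

\[
h = 6.6260693 \times 10^{-34}\ \mathrm{kg\,m^{2}\,s^{-1}}
\quad\Longrightarrow\quad
1\ \mathrm{kg} = \frac{h}{6.6260693 \times 10^{-34}\ \mathrm{m^{2}\,s^{-1}}}.
\]

In practice, realizing the definition means relating a macroscopic mass to h with an experiment such as a watt balance, which balances mechanical power against electrical power measured in terms of h; the cylinder in the vault at Sèvres then becomes just another object to be weighed.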

The New Scientist also has a handy chart showing the old and new definitions of the fundamental units, including those that will remain unchanged: the second, the meter, and the candela.

The BIPM Web site has the press release [PDF], as well as the (fairly technical) text of Resolution 1 [PDF].

Making a change like this is a slow process, because the people responsible, from the various national metrology labs, tend to be conservative, for good reasons; one of their objectives is that definition changes will not cause dislocations in everyday life and commerce.  Watching the process can seem almost as exciting as watching paint dry.  It’s probably useful to remember that this problem exists only because we have learned to measure the physical world to a degree of accuracy undreamed of when the metric system was first formulated in the 18th century.


Built-In PDF Viewer for Firefox

October 28, 2011

One of the features of Google’s Chrome browser that is rather nice is the built-in viewer for documents in Adobe’s Portable Document Format [PDF].  The PDF format is very widely used for distributing documents, since Adobe has always made its Reader (previously Acrobat Reader) software available at no cost, and has published the PDF specification (although it is still owned and controlled by Adobe).  Thus, a PDF document can be read by just about anyone, regardless of the particular platform they are using.

There are downsides to using PDFs, though.  Because they are widely used, across platforms, they have become a popular attack target for the Bad Guys, and there have been many instances of security vulnerabilities.  This has happened, at least in part, because the traditional method of accessing PDFs has been via Adobe’s Reader, and its associated browser plug-in.  Reader is a very large program with tons of features — it can, for example, display documents with embedded Flash video, and is used by the US Internal Revenue Service for downloadable, “fill in the blank” tax forms — most of which are not used in typical reports or articles.  That complexity presents a large attack surface to probe for security holes; it also makes Reader a rather lumbering beast.   Having to keep up with a separate patching and update process for Reader and the browser plug-in is also something of a nuisance.

One of the recommended mitigations for all this has been to use an alternative PDF viewer for routine tasks.  For Windows, there is Foxit Reader; Linux users can use the very small and speedy xpdf.   The built-in reader in Chrome is another choice.  Now, according to an article posted at Geek.com, an early version of a similar built-in PDF reader for Mozilla’s Firefox is available.  The Mozilla viewer has some distinctive features:

  • It is implemented in JavaScript.
  • It is entirely open source (unlike the Chrome reader, which apparently incorporates some code from Foxit).
  • Its feature set, like the Chrome reader’s, is not as complete as Adobe Reader’s, but will be suitable for routine documents.
  • The development project is also open, and documented at the PDF.js page on MozillaWiki.

This, it seems to me, is a very worthwhile development, giving Firefox users a simple, free, and open alternative.


Royal Society Archives Available Free

October 27, 2011

One of the more worthy ideas that has motivated the development of the Internet is that it would make an enormous body of knowledge available to people all over the world at relatively low cost.   The reality of the Internet’s evolution has not always matched these ideals, what with the efforts of content producers to restrict the availability of information; but we have seen some progress toward more open access.  One obvious example is the continuing development of Wikipedia, the open source encyclopedia.  Princeton and Yale have taken steps to make more information available on line, and the National Academies Press has made more than 4,000 of its books available at no cost as PDF documents.

Now, according to an article in the BBC News Magazine, The Royal Society, the oldest scientific academy in the world, is making its archive of journal articles, dating back to 1665, permanently available on line at no cost.  The archive, which contains about 60,000 scientific papers, provides a unique look at several centuries of scientific development.

The plague, the Great Fire of London and even the imprisonment of its editor – just a few of the early setbacks that hit the Royal Society’s early editions of the Philosophical Transactions.  But against the odds the publication, which first appeared in 1665, survived.  Its archives offer a fascinating window on the history of scientific progress over the last few centuries.

The archive has papers by Isaac Newton, Benjamin Franklin, and James Clerk Maxwell, as well as Crick and Watson’s paper on the structure of DNA.  (The Royal Society’s own announcement is here.)   I mentioned some of these in a post back in 2009, on the occasion of The Royal Society’s 350th year.  There is, of course, vastly more available to explore, sometimes wonderful, sometimes a bit wacky.  You can search the archive at Royal Society Publishing here.

The Royal Society is to be commended for making this material available to everyone; I hope it inspires other organizations to take similar actions.

