Opera Releases 10.53, Security Update

April 30, 2010

Just a couple of days after releasing version 10.52 of the Opera browser for Windows and Mac OS X, Opera Software has released another new version of the browser, 10.53, also for Windows and Mac.  This update addresses a specific security vulnerability, rated by Opera as “extremely severe”.  The (brief) Release Notes are here.

The new version should be available via the built-in update mechanism (main menu: Help / Check for Updates), or it can be downloaded here.  Because of its security content, I recommend that you install this update as soon as you conveniently can.


Ubuntu Linux 10.04 Released

April 29, 2010

The good folks at Canonical Ltd have announced the release of the newest version of Ubuntu Linux, 10.04 (LTS), code-named Lucid Lynx.  This is a Long-Term Support (LTS) release, meaning that it will get software bug fixes and security updates for three years for the desktop edition, and five years for the server edition.  (There is also a Netbook Edition optimized for less powerful devices with smaller screens.)  The ISO CD image can be downloaded from this page.  As usual, the image is for a bootable “live CD”, so you can try out the system without installing anything to your PC’s hard disk.  (Running from the CD, rather than from the hard disk, is generally slower, of course.)  Once you have downloaded the CD image, just re-boot your computer from the CD.

The base system installation from the CD is simple, and gives you a system with a fairly complete set of basic software.  It includes the OpenOffice.org office suite, the Firefox browser, an E-mail client, media players, photo editors, and much more.  There is also a huge selection of free software available online via the Software Center.

It is also possible to buy Ubuntu CDs and DVDs directly from Canonical, from Amazon in the USA, and from various other suppliers.

The core Ubuntu distribution for desktops uses the GNOME graphical desktop.  An official alternative version, Kubuntu, uses the KDE desktop, which may seem a bit more familiar to Windows users, and is available here.  Particularly if you have older hardware, you might also be interested in Xubuntu, which uses the Xfce desktop and requires less in the way of hardware resources.  There are several other derivative versions for specific environments.

The Ubuntu Linux system, and the tools included with it, are all free software; you are not only allowed, but encouraged to share copies with your friends.

Update Thursday, 29 April, 17:35 EDT

I forgot to include a link to the Release Notes.  Also, Ars Technica has a brief overview of the new release.


Skip the Soda

April 28, 2010

You’ve probably heard that drinking too much soda is a bad idea, perhaps because the sugar will rot your teeth, or perhaps because all those extra calories are bound to show up somewhere.  Now, according to some recent research summarized at the PhysOrg site, there’s another potential reason to avoid soda, and other processed foods containing high levels of phosphates: they make you get older faster.

New research published online in the FASEB Journal shows that high levels of phosphates may add more “pop” to sodas and processed foods than once thought. That’s because researchers found that the high levels of phosphates accelerate signs of aging.

The research team, led by Dr. M. Shawkat Razzaque of the Harvard School of Dental Medicine, studied three groups of mice.  The first group had a genetic modification that resulted in their having abnormally high phosphate levels in their bodies.  A second group had a modification that resulted in low phosphate levels.  The third group was genetically identical to the second, but was fed a diet that produced high phosphate levels.

Mice in both the first and the third group — the mice that had high phosphate levels, induced by either genetics or diet — had substantially shorter lifespans than the mice in the second group.   There was evidence that high phosphate levels were associated with renal, cardiovascular, and skeletal diseases.  From the paper’s abstract:

The results of our dietary and genetic manipulation studies provide in vivo evidence for phosphate toxicity accelerating the aging process and suggest a novel role for phosphate in mammalian aging.

Perhaps it’s just as well that I’ve always liked coffee better.


Opera Releases Version 10.52

April 28, 2010

The folks at Opera Software have released a new version, 10.52, of their Web browser for Windows and Mac OS X.  The changes, which are summarized here for Windows, and here for Mac,  are mainly stability improvements and bug fixes.  The new version is available from the download page.

Linux users should note that the current version for their platform is still 10.10, although a 10.5x release is promised “shortly”.


The Significance of Statistics

April 27, 2010

The science columnist Clive Thompson has an interesting article over at Wired on the problematic interplay between the average person’s lack of statistical understanding, and the increasing importance of statistical evidence in public policy debates.  He takes as his starting point events around the major snowstorms we had this past winter here in the Washington DC area.  Many of those skeptical of the idea of global warming immediately cited the snowstorms as proof that global warming couldn’t be true.

Now, if Thompson’s point were just that this argument was idiotic, there really wouldn’t be much else to say.  But he goes on to talk about something that has become a concern of mine in the last few years: the serious ignorance of journalists — even science journalists — and the public at large when it comes to understanding statistical data.

We live in a world where the thorniest policy issues increasingly boil down to arguments over what the data mean. If you don’t understand statistics, you don’t know what’s going on — and you can’t tell when you’re being lied to.

Climate change is just one example of an area where lack of understanding leads to foolish or pernicious claims.  Lack of statistical understanding also bedevils the criminal justice system: for example, with respect to evidence based on fingerprints or DNA.  It also occasions a great deal of confusion in medical testing, particularly with regard to the base rate fallacy. As Thompson points out, there are many other examples of very dubious conclusions arrived at due to lack of statistical understanding.
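
The base rate fallacy, in particular, is worth a concrete illustration.  The numbers below are my own, chosen purely for illustration, but the pattern is typical: when a condition is rare, even a quite accurate test produces mostly false positives.

```python
# Base rate fallacy, illustrated: a test with 99% sensitivity and
# 95% specificity, applied to a condition with 1% prevalence.
# How likely is it that someone who tests positive is actually sick?
# (All numbers are hypothetical, for illustration only.)

def posterior_positive(prevalence, sensitivity, specificity):
    """P(disease | positive test), via Bayes' theorem."""
    true_pos = prevalence * sensitivity            # sick and test positive
    false_pos = (1 - prevalence) * (1 - specificity)  # healthy, test positive
    return true_pos / (true_pos + false_pos)

p = posterior_positive(prevalence=0.01, sensitivity=0.99, specificity=0.95)
print(f"P(disease | positive) = {p:.1%}")  # about 17%, far below 99%
```

The intuitive answer — “the test is 99% accurate, so a positive result means a 99% chance of disease” — is off by a factor of almost six, because the healthy population is so much larger than the sick one.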

I’ve sometimes been asked, by parents of high-school students, whether their children should take probability and statistics courses, and why those courses are important.  My answer has basically two parts.  The first is that, as Thompson’s article suggests, understanding at least the basics of statistical analysis is important to playing a constructive role as a citizen.  The second is that, although the mechanics and calculations of statistical analysis are usually not especially difficult, the mental discipline required to think about problems carefully is neither easy nor intuitive.

Granted, thinking statistically is tricky. We like to construct simple cause-and-effect stories to explain the world as we experience it. “You need to train in this way of thinking. It’s not easy,” says John Allen Paulos, a Temple University mathematician.

Doing it right takes instruction and practice.  Even very simple problems can be traps for the unwary or unskilled.  I’ll talk about that a bit more, and give some examples, in a follow-up post.
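
As a small preview, here is a quick simulation of one classic trap, the Monty Hall problem, where the intuitive answer (that switching doors can’t matter) is wrong.  The simulation is my own sketch, not from Thompson’s article:

```python
import random

def monty_hall(trials=100_000, switch=True, seed=42):
    """Estimate the win probability in the Monty Hall game."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)    # door hiding the car
        pick = rng.randrange(3)   # contestant's initial choice
        # The host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"switching wins: {monty_hall(switch=True):.3f}")   # close to 2/3
print(f"staying wins:   {monty_hall(switch=False):.3f}")  # close to 1/3
```

Switching wins about two-thirds of the time — a result that famously strikes most people, including some mathematicians, as impossible until they work through it carefully.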


More Surprises from Inner Space

April 26, 2010

I’ve talked here before about the complex ecosystem of bacteria and other microbes that exists inside each of us, and about the concern that some of our well-intentioned efforts at improving health may upset a delicate equilibrium that helps keep us healthy.  Now some biologists at Caltech have discovered what may be some additional pieces of the puzzle.

According to a report at the PhysOrg site, disturbances to the microbial equilibrium in the gut have been linked to inflammatory bowel disease and colon cancer.

“It has been proposed that the coupled equilibrium between potentially harmful and potentially beneficial bacteria in the gut mediates health versus disease,” says Sarkis K. Mazmanian, assistant professor of biology at Caltech. “If the balance is altered,” say, by changes in diet, the effects of stress, or the use of antibiotics, “then the immune response in the intestines is also changed.”

The normal microbial population of the gut includes something like 1000 different species of bacteria.  It may include some distinctly harmful (pathogenic) bacteria; it also contains a large number of beneficial bacteria (symbionts).  The Caltech researchers suggest that there is a third category, which they have dubbed “pathobionts”, exemplified by an organism called Helicobacter hepaticus.  This organism can live in the gut of a healthy individual for many years without causing any ill effects; however, it can cause symptoms similar to inflammatory bowel disease in individuals whose immune system is compromised.

The team found that the “tolerant” relationship in healthy individuals was maintained by the bacterium’s secretion system, “a collection of proteins the microbe uses to send chemical messages to its host.”   These messages apparently work to negotiate a “cease fire” between the microbe and the immune system.  When the secretion system is disrupted, the population of H. hepaticus increases sharply, and the activity of the immune system also increases, producing inflammation.

“The bacteria appear to have struck a deal with their host,” Mazmanian says. They keep their own numbers low so they don’t overwhelm the immune system, and in return, the immune system leaves them alone. “The bacteria need the secretion system to put the host in ‘don’t attack’ mode.”

The paper describing this research appears in the April 22 issue of Cell Host & Microbe, and is available online here.


Programming for the Cloud

April 25, 2010

I’ve written here before about some of the issues and problems raised by the development of multi-core processors, and the attempt to parallelize computations to take advantage of the additional hardware capabilities.  Programming effectively for the “cloud computing” environment is a similar, but more difficult problem.  The degree of concurrency (e.g., the number of available processors) is often not known in advance.  Timing problems and potential race conditions can be tricky even on a single machine with multiple processors; on a “machine” composed of many distinct physical machines, connected by IP networks, relative timing is essentially impossible to predict.

According to an article at Technology Review, a research group at the University of California, Berkeley, is developing a new set of tools to make programming for the cloud easier.  (The Technology Review article is part of the TR 10 series, an annual list of the most important emerging technologies, as seen by the editors.)  Their starting point is the idea behind database programming languages, ranging from the venerable Structured Query Language [SQL] to more complex systems like Google’s Map-Reduce.  These languages describe what is to be done with the data, not how it is to be processed.  This has the effect of abstracting the data manipulation from the details of how the data is stored and retrieved.  It also provides a framework (for example, the relational algebra underlying SQL) in which the data problem can be analyzed.
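
To make the “what, not how” distinction concrete, here is a toy word-count in the map-reduce style, sketched in Python.  (This is my own illustration, not Google’s actual API.)  The programmer supplies only the map and reduce functions; partitioning, grouping, and scheduling are the framework’s business — here represented by a few lines of driver code, but in a real system spread across many machines:

```python
from collections import defaultdict
from itertools import chain

def map_fn(line):
    """Map step: emit (word, 1) for every word in a line of input."""
    for word in line.split():
        yield (word.lower(), 1)

def reduce_fn(word, counts):
    """Reduce step: combine all the counts emitted for one word."""
    return (word, sum(counts))

def run_mapreduce(lines, map_fn, reduce_fn):
    """A single-machine stand-in for the framework's runtime."""
    groups = defaultdict(list)
    # The "shuffle" phase: group all mapped values by key.
    for key, value in chain.from_iterable(map_fn(l) for l in lines):
        groups[key].append(value)
    return dict(reduce_fn(k, v) for k, v in groups.items())

counts = run_mapreduce(["the quick fox", "the lazy dog"], map_fn, reduce_fn)
print(counts)  # {'the': 2, 'quick': 1, 'fox': 1, 'lazy': 1, 'dog': 1}
```

Nothing in `map_fn` or `reduce_fn` says anything about where the data lives or how many machines process it — which is exactly the abstraction the Berkeley group wants to extend.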

(I can remember the introduction of the first relational database products that used SQL.  In the IBM mainframe world, they were DB2, for MVS systems, and SQL/DS, for VM systems.  Prior database systems were notoriously difficult to program, because the application had to be aware of how the data was stored, indexed, and so on.)

The group at Berkeley, led by Joseph Hellerstein, proposes to extend this idea to incorporate the specification of temporal variation in the data.

The solution, ­Hellerstein explains, is to build into the language the notion that data can be dynamic, changing as it’s being processed. This sense of time enables a program to make provisions for data that might be arriving later–or never.
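
Bloom itself has not been released, but the flavor of the idea can be suggested with a rough sketch — entirely my own invention, not Bloom syntax.  Events may arrive out of order over the network, and the program declares how late data is allowed to be, rather than hand-coding the synchronization:

```python
import heapq

class StreamBuffer:
    """Hold out-of-order events until a watermark says they are safe
    to process.  (A hypothetical illustration, not part of Bloom.)"""

    def __init__(self, max_delay):
        self.max_delay = max_delay  # how late an event may arrive
        self.heap = []              # pending (event_time, payload) pairs

    def arrive(self, event_time, payload):
        """Record an event, whatever order it shows up in."""
        heapq.heappush(self.heap, (event_time, payload))

    def advance(self, now):
        """Emit, in time order, all events old enough to be final."""
        watermark = now - self.max_delay
        out = []
        while self.heap and self.heap[0][0] <= watermark:
            out.append(heapq.heappop(self.heap))
        return out

buf = StreamBuffer(max_delay=5)
buf.arrive(3, "a")
buf.arrive(1, "b")        # arrives out of order
print(buf.advance(10))    # [(1, 'b'), (3, 'a')] -- emitted in time order
```

The application code just declares its tolerance for lateness; deciding when data is complete enough to act on becomes the runtime’s job, much as query planning is the database’s job under SQL.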

This is a very intriguing idea.  Just as, for example, SQL allows the process of query optimization to be done “under the covers” without the direct involvement of the application programmer, these tools might allow concurrency problems to be dealt with without introducing unnecessary complexity in applications.

The project, called BOOM (for Berkeley Orders Of Magnitude), is still in a relatively early stage, but the team hopes to have a version of its new, open-source declarative language (called, somewhat confusingly, Bloom) ready for release in late 2010.  They have already done some preliminary development with their new tools:

So far, Hellerstein’s group has used the Bloom language and its predecessors to quickly rebuild and add major features to popular cloud tools such as Hadoop, a platform used to manipulate very large amounts of data. By lowering the complexity barrier, these languages should increase the number of developers willing to tackle cloud programming, resulting in a wave of ideas for new types of powerful applications.

The project has a list of technical papers that are available for download, as well as a FAQ page, which talks about some of the work done with Hadoop.

