Mood-Altering Bugs Inside

August 30, 2011

I’ve mentioned here before the ongoing research on the menagerie of micro-organisms that call us home; the National Institutes of Health is sponsoring the Human Microbiome Project, which is attempting to identify and classify these organisms (which outnumber our own cells by about 10:1) and to characterize their roles.  We’ve also seen that disruption of the normal microbial population of the digestive tract can cause some serious health problems.

The Science Now site has a report on new research suggesting that changes in the digestive system’s microbiome can affect how the brain works.  It has been hypothesized that neurological functions might be affected by toxins produced by bacteria, or by the effect of microbes on the immune system, but this new work seems to suggest a more direct effect.

Now, a new study suggests that gut bacteria can even mess with the mind, altering brain chemistry and changing mood and behavior.

To test for an effect, John Cryan, a neuroscientist at University College Cork in Ireland, and a group of researchers from McMaster University in Canada, fed laboratory mice a broth containing Lactobacillus rhamnosus, a benign bacterium.  The particular species was handy, but it is also of potential interest because other bacteria from the genus Lactobacillus are common in probiotic preparations.  Compared to a control group of mice that received broth without the added bacteria, the treated mice showed lower levels of stress and anxiety.

Mice whose diets were supplemented with L. rhamnosus for 6 weeks exhibited fewer signs of stress and anxiety in standard lab tests, Cryan and colleagues report online today in the Proceedings of the National Academy of Sciences [abstract].

The research team found some evidence that the effect was due, at least in part, to a change in the receptors for GABA [gamma-Aminobutyric acid], an important inhibitory neurotransmitter.

In the brains of the treated mice, the researchers found changes in the activity of genes that encode portions of the receptor for the neurotransmitter GABA. GABA typically dampens neural activity, and many drugs for treating anxiety disorders target its receptors.

The researchers also found that the effect did not occur if the vagus nerve, a major channel for sensory data from the gut to the brain, was severed, providing further evidence that the bacteria were influencing the central nervous system through a direct neural connection, rather than through indirect routes such as toxins or the immune system.

How all this works is still a considerable puzzle, but it is another reminder that all of our bodily systems are connected in many different ways.

The MedicalXpress site also has an article on this research.


Google Releases Chrome 13.0.782.218

August 30, 2011

Google has released a new version, 13.0.782.218, of its Chrome browser, for all platforms (Linux, Mac OS X, Windows, and Chrome Frame).  This release includes the updated version, 10.3.183.7, of Adobe’s Flash Player, released a few days ago.  More details are available in the release announcement on the Chrome Releases blog.

Windows users should get the new version via the built-in automatic update mechanism; you can verify that your system has been updated by clicking on the tools menu (the little wrench), and then on “About Google Chrome”.  Linux users can get the updated package using their distros’ usual update tools.


New Worm Attacks via RDP

August 29, 2011

A new worm, which has been named Morto, has surfaced on the Internet in the last few days, according to a diary post at the SANS Internet Storm Center.  It infects machines via the Remote Desktop Protocol [RDP], developed by Microsoft, and is capable of compromising both Windows servers and workstations.  A key symptom of Morto infection is a very large amount of outbound traffic on TCP port 3389, which is used for RDP; the traffic is the result of the worm searching for other machines to infect via RDP.  The attack itself is fairly basic; the worm tries to log in to the target computer using a list of common user IDs (e.g., admin, root, support) and common dumb passwords (e.g., 1234567, 1qaz2wsx, password).  Once established, the worm attempts to contact one or more remote control servers on the Internet, which can instruct it to launch denial-of-service attacks.  It also attempts to terminate any running processes with names in a list of those commonly used by anti-malware tools.

This is the first prominent worm we have seen for a while, and the first to use RDP as an infection vector.  At one time, worms like SQLSlammer were quite common, but in recent years attackers have largely shifted to more targeted attacks aimed at monetary gain.

Microsoft has articles describing the original worm, and a new variant, at its Malware Protection Center; these articles give extensive details on how the worm operates.  The ThreatPost blog, from security vendor Kaspersky Lab, also has an article on the original worm, and an update on recent developments.

Given the simplicity of the attack mechanism, properly administered systems should not be very vulnerable (surely you do not use ‘1234567’ as a password!), but it is worth checking that you are up-to-date on patches, that your firewall is properly configured, and that you are not running RDP services if they are not required.  And if you see a lot of TCP/3389 traffic, it merits a closer look.
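
If you want to take a quick look at your own machine, the sketch below is one simple way to do it.  (A minimal example only, and not a substitute for real network monitoring; the use of the third-party psutil package and the filtering shown are my own choices, not anything taken from the Morto write-ups.)  It lists local TCP connections that involve port 3389, along with the owning process where that can be determined.

```python
# List TCP connections that touch port 3389 (RDP) on the local machine.
# Requires the third-party psutil package: pip install psutil
# (On some systems, listing all connections needs elevated privileges.)
import psutil

RDP_PORT = 3389

def rdp_connections():
    """Yield (pid, process name, local address, remote address) for TCP
    connections where either endpoint uses the RDP port."""
    for conn in psutil.net_connections(kind="tcp"):
        local, remote = conn.laddr, conn.raddr
        if (local and local.port == RDP_PORT) or (remote and remote.port == RDP_PORT):
            name = "?"
            if conn.pid:
                try:
                    name = psutil.Process(conn.pid).name()
                except (psutil.NoSuchProcess, psutil.AccessDenied):
                    pass
            yield conn.pid, name, local, remote

if __name__ == "__main__":
    for pid, name, local, remote in rdp_connections():
        print(pid, name, local, remote)
```

A machine that should not be serving RDP at all, but shows a steady stream of outbound connections to port 3389 on other hosts, deserves a much closer look.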


Another Look at Gravity

August 28, 2011

Back in 2010, I posted a couple of articles about a new approach to gravity, proposed by Dr. Erik Verlinde of the University of Amsterdam, which attempts to explain why gravity exists.  That there is a question about this may come as something of a surprise; but both Newton’s theory of gravity, which is good enough to send a spacecraft to Mars, and Einstein’s theory of General Relativity, which models gravity as a “warping” of four-dimensional space-time, are basically descriptive.  Gravity still presents some puzzles.  It is by far the weakest of the four fundamental forces (the others are the electromagnetic force and the strong and weak nuclear forces); gravity is also the one force that is not described by the standard model of particle physics.  So far, no one has been able to reconcile General Relativity with quantum mechanics, even though both theories are very successful at predicting physical phenomena, General Relativity on the very large scale, and quantum mechanics on the very small.  Dr. Verlinde’s proposal suggested that gravity might be an emergent property caused by the increasing entropy of the universe.

He [Verlinde] suggested that gravity is merely a manifestation of entropy in the Universe, which always increases according to the second law of thermodynamics. This causes matter to distribute itself in a way that maximises entropy. And the effect of this redistribution looks like a force which we call gravity.
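
To give a rough flavor of the argument (this is only a schematic of the reasoning in Verlinde’s paper, with the details and caveats stripped out): an entropic force is one that arises because a system’s entropy depends on its configuration.  Verlinde postulates that the entropy associated with a holographic screen changes by an amount proportional to the displacement Δx of a nearby particle of mass m, and combines that with the Unruh temperature seen by an accelerated observer; Newton’s second law then falls out.

```latex
F\,\Delta x = T\,\Delta S,\qquad
\Delta S = 2\pi k_B\,\frac{mc}{\hbar}\,\Delta x,\qquad
k_B T = \frac{\hbar a}{2\pi c}
\quad\Longrightarrow\quad F = ma .
```

Recovering the full inverse-square law of gravitation takes a further step, counting the information bits on the screen and applying equipartition of energy, which the paper spells out.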

A major attraction of this idea is that it might provide a means of reconciling quantum mechanics with General Relativity.  Now, according to an article in the “Physics arXiv” blog at Technology Review, Archil Kobakhidze at The University of Melbourne in Australia has argued that existing experimental data are not consistent with Verlinde’s emergent gravity hypothesis.  He argues that, because emergent gravity requires that each particle be acted on by a large number of other particles, it has particular implications for the force experienced by each particle, and that those implications differ from those resulting from the traditional view of gravity.

In other words, the emergent and traditional views of gravity make different predictions about the gravitational force a quantum particle ought to experience. And that opens the way for an experimental test.

As it happens, several recent experiments have measured the gravitational force on neutrons, and Dr. Kobakhidze says that these measurements show that the emergent hypothesis is wrong.

“Experiments on gravitational bound states of neutrons unambiguously disprove the entropic origin of gravitation,” he says.

Kobakhidze’s paper, which is quite technical, is available on the arXiv.org site: [abstract, with link to PDF download].

At least for the present, gravity seems set to remain more than a bit mysterious.


Flash Player Updated

August 27, 2011

Adobe has quietly released an updated version, 10.3.183.7, of its Flash Player for all platforms (Windows, Linux, Mac OS X, and Solaris).  This release is described as addressing “compatibility issues that were encountered with Flash Player 10.3.183.5”.  Further details are in the Adobe forum post about the release.  The announcement also says that users of version 10.3.183.5 will not be prompted automatically to update their installations.  However, anyone downloading the player will get the new version.

If you are not experiencing any of the issues identified in the forum post, I don’t think there is any urgency about updating your system.  You can always check to see if you have the most recent version of Flash Player by visiting the “About Flash Player” page.  Adobe has also, apparently, set up an RSS feed to provide notice of updates to Flash Player and AIR; you can subscribe here.


A Really Big Disk Drive

August 26, 2011

It’s a commonplace of today’s technology that we are able to obtain more and more capability in smaller and smaller packages.  I can carry in my shirt pocket, in a USB drive about the size of a pack of chewing gum, more storage than the first System/360 computer I worked on ever had.  A less obvious result of the same advances in technology is that we can build much bigger systems than would have been imaginable even a couple of decades ago.  (We use some of these systems — Google and Facebook come to mind — all the time, but most of us never see them physically, or have occasion to think about the totality of the system.)

Now, according to an article at Technology Review, IBM Research in Almaden, CA, is in the process of building a disk drive array at least ten times bigger than any in existence.  The new, as yet unnamed, system, which is being built for an unnamed client, will use 200,000 individual drives to achieve a capacity of 120 petabytes, or 1.2 × 10^17 bytes.  To put it another way, it’s 120 million gigabytes.
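
A quick back-of-the-envelope check of those figures (assuming decimal units, which is how drive capacities are normally quoted):

```python
# Rough arithmetic on the reported figures (decimal SI units assumed).
total_bytes = 120e15                                       # 120 petabytes
drives = 200_000

print(f"{total_bytes:.1e} bytes")                          # 1.2e+17
print(f"{total_bytes / 1e9:,.0f} gigabytes")               # 120,000,000 (120 million)
print(f"{total_bytes / drives / 1e9:,.0f} GB per drive")   # 600
```

That last line is interesting in itself: the quoted numbers imply roughly 600 GB of capacity per individual drive.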

The giant data container is expected to store around one trillion files and should provide the space needed to allow more powerful simulations of complex systems, like those used to model weather and climate.

Other potential application areas are seismic processing in the oil/gas industry, and computational chemistry.  Although the initial uses for this array will almost certainly be in supercomputing environments, the technology could be adapted for more conventional cloud computing systems.

The hardware for the system is water-cooled, to accommodate higher equipment density.  With so many individual devices, dealing with component failures is a necessity.  The system uses fairly standard data redundancy techniques (as in RAID systems) to allow data from a failed drive to be rebuilt in spare space, but its software is designed to minimize the effect on overall system throughput.

IBM uses the standard tactic of storing multiple copies of data on different disks, but it employs new refinements that allow a supercomputer to keep working at almost full speed even when a drive breaks down.

When a lone disk dies, the system pulls data from other drives and writes it to the disk’s replacement slowly, so the supercomputer can continue working. If more failures occur among nearby drives, the rebuilding process speeds up to avoid the possibility that yet another failure occurs and wipes out some data permanently.
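
The general idea is easy to sketch.  The snippet below is purely illustrative (it is not IBM’s code or algorithm; the function name, thresholds, and rates are all invented), but it captures the trade-off the article describes: rebuild gently while redundancy is comfortable, and much more aggressively as it erodes.

```python
# Illustration only: pick a rebuild rate based on how much redundancy
# remains for the data affected by a drive failure.  The thresholds and
# rates below are invented, not taken from the IBM system.

def rebuild_rate_mb_per_s(surviving_copies: int) -> int:
    """Choose a rebuild throughput target for data on a failed drive."""
    if surviving_copies >= 3:
        return 50      # plenty of redundancy: rebuild slowly in the background
    if surviving_copies == 2:
        return 400     # redundancy thinning out: speed up the rebuild
    return 2000        # down to the last copy: rebuild as fast as possible

for copies in (3, 2, 1):
    print(copies, "surviving copies ->", rebuild_rate_mb_per_s(copies), "MB/s")
```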

The new system uses IBM’s General Parallel File System [GPFS], a distributed, highly scalable file system, originally developed for high-performance computing to provide faster data access by, for example, “striping” parts of a file across multiple devices.  IBM says that current GPFS implementations have I/O rates of ~100 GB/sec.  GPFS also incorporates features to reduce the overhead associated with maintaining the file system.
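
The striping idea itself is simple to illustrate.  The toy sketch below (my own example, not GPFS code; the block size and device count are arbitrary) splits a file into fixed-size blocks and deals them out round-robin across a set of devices, so that a large sequential transfer can keep all of the devices busy at once.

```python
# Toy illustration of striping: a file's blocks are assigned round-robin
# to several devices, so big reads and writes can proceed in parallel.
BLOCK_SIZE = 1 << 20        # 1 MiB blocks; the size is an arbitrary choice

def stripe(data, num_devices):
    """Return one list of blocks per device; device i receives blocks
    i, i + num_devices, i + 2 * num_devices, ... of the file."""
    blocks = [data[off:off + BLOCK_SIZE]
              for off in range(0, len(data), BLOCK_SIZE)]
    layout = [[] for _ in range(num_devices)]
    for i, block in enumerate(blocks):
        layout[i % num_devices].append(block)
    return layout

# A file of five full blocks plus a partial one, spread across 4 devices:
demo = stripe(bytes(5 * BLOCK_SIZE + 123), num_devices=4)
print([len(device_blocks) for device_blocks in demo])   # [2, 2, 1, 1]
```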

Last month a team from IBM used GPFS to index 10 billion files in 43 minutes, effortlessly breaking the previous record of one billion files scanned in three hours.

(The IBM GPFS pages linked above have more information and links to documentation on GPFS.)   IBM Research says it is working on a new generation of GPFS systems that will be even faster.

IBM Research – Almaden is working with IBM’s product divisions to extend GPFS to support a new 2011-2012 generation of supercomputers featuring up to 16,000 nodes and 500,000 processor cores. Such a system must be capable of achieving I/O rates of several terabytes per second to a single file, be capable of creating 30,000 to 40,000 files per second, and holding up to a trillion files (to create a trillion files, just create 30,000 files per second continuously for a year).
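
The parenthetical arithmetic in that last sentence checks out, at least approximately:

```python
# 30,000 file creations per second, sustained for a year:
files_per_year = 30_000 * 60 * 60 * 24 * 365
print(f"{files_per_year:,}")    # 946,080,000,000, a bit under a trillion
```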

This is pretty amazing stuff.

