Lose the Sweet Tooth

May 27, 2013

One can safely assume that cockroaches are not among the typical urban resident’s favorite animals.  While they are not as dangerous as, say, a malaria-carrying mosquito or a tsetse fly, they can potentially transport infectious agents, and have been implicated in some human allergies.  Mostly, though, people just think that they’re gross.  They’re also quite hardy, and able to go for fairly long periods without food or water; getting rid of them can be a chore.

Back in 1976, the Black Flag company introduced a roach trap called the Roach Motel™; it is a small enclosure, which contains a slow-acting poison mixed with a bait.  (Other companies marketed similar products.) The idea is that roaches will be attracted to the bait, and then carry the accompanying poison back to their nests, thereby getting other roaches as well.  The product was quite successful, in part because of aggressive advertising, and its memorable slogan, “Roaches check in, but they don’t check out.”

More recently, though, users have noticed that the traps have become less effective.  The New Scientist has a recent article on some research that may explain why.  One of the ingredients in the bait is the simple sugar glucose, and it seems that the roaches have evolved a distaste for it.

In the race for world domination, cockroaches have scored another point against Homo sapiens. Their weapons? A distaste for sugar and a helping hand from evolution.

In a paper published in the journal Science, researchers at North Carolina State University in Raleigh reported that some roaches of the species commonly known as the German cockroach [Blattella germanica] have a difference in their neurochemistry that causes glucose to “taste” bitter, a trait they pass on to their offspring.  The use of glucose as a bait in poisoned traps creates selection pressure that favors roaches without a sweet tooth.  Hence, the authors suggest, evolution is at the heart of the traps’ decreased effectiveness.

This kind of evolutionary “arms race” is similar to what we have seen in the development of antibiotic-resistant bacteria, DEET-resistant mosquitoes, and herbicide-resistant weeds.  We humans are quite good, in many cases, at devising ways to modify our environment.  But we too often forget that our environment is not just a passive lump of matter — when we push, it frequently pushes back.


Weather Forecasts: Improving

May 25, 2013

Although there are a lot of different places from which you can get a weather forecast, those forecasts nearly all come from a handful of sources: national weather services that run large numerical weather prediction models on their computer systems.  Two of the major suppliers are the US National Weather Service’s [NWS] Global Forecast System [GFS] (the source for most US forecasts), and the European Centre for Medium-Range Weather Forecasts [ECMWF], located in Reading, England.  Over the last few years, there has been a growing feeling that the US effort was not keeping up with the progress being made at ECMWF.  The criticism became considerably more pointed in the aftermath of last year’s Hurricane Sandy.  Initial forecasts from the GFS projected that the storm would head away from the US East Coast into the open Atlantic.  The ECMWF model correctly predicted that Sandy would make a left turn and strike the coast in the New Jersey / New York region.

According to a story in Monday’s Washington Post, and a post on the paper’s “Capital Weather Gang” blog, at least one good thing will come out of this rather embarrassing forecasting error.  It’s anticipated that the NWS will get additional appropriated funds to allow the computers and the models they run to be updated.

Congress has approved large parts of NOAA’s spending plan under the Disaster Relief Appropriations Act of 2013 that will direct $23.7 million (or $25 million before sequestration), a “Sandy supplemental,” to the NWS for forecasting equipment and computer infrastructure.

This should go a long way toward addressing one of the most pressing needs for the GFS: more computing horsepower.

Computer power is vital to modern weather forecasting, most of which is done using mathematical models of the Earth’s climatic systems.  These models represent various weather features, such as winds, heat transfer, solar radiation, and relative humidity, using a system of partial differential equations.  (A fundamental set of these is called the primitive equations.)  The equations are highly nonlinear; moreover, except for a few special cases, they do not have analytic solutions, but must be solved by numerical methods.

The standard technique for numerical solution of equations of this type involves approximating the differential equations with difference equations on a grid of points.  This is somewhat analogous to approximating a curve with a number of line segments; as we increase the number of segments and decrease their length, the approximation gets closer to the true curve.  Similarly, in weather models, increasing the resolution of the grid (that is, decreasing the distance between points) allows better modeling of smaller-scale phenomena.  But increasing the resolution means that correspondingly more data must be processed and more sets of equations solved, all of which takes computer power.  Numerical weather prediction, although it had been worked on for some years, really only became practical in the 1950s, with the advent of digital computers, and the early weather models had to incorporate sizable simplifications to run in a reasonable time.  (It is not too useful to have a forecasting model, no matter how accurate, that requires more than 24 hours to produce a forecast for tomorrow.)
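To make the finite-difference idea concrete, here is a toy calculation in Python — very much a sketch, using one-dimensional linear advection with a simple upwind scheme rather than the actual primitive equations — that shows how the amount of arithmetic grows as the grid is refined:

    # Toy illustration of finite differences: the 1-D linear advection
    # equation du/dt + c * du/dx = 0 on a periodic domain, solved with a
    # first-order upwind scheme.  (A real weather model solves the full
    # primitive equations in three dimensions, but the principle is the
    # same: more grid points means more arithmetic per time step.)
    import numpy as np

    def advect(n_points, c=1.0, t_final=1.0):
        """Advance a sine-wave initial condition to t_final on an n-point grid."""
        dx = 1.0 / n_points
        dt = 0.5 * dx / c                    # CFL stability condition ties dt to dx
        x = np.linspace(0.0, 1.0, n_points, endpoint=False)
        u = np.sin(2.0 * np.pi * x)          # initial condition
        steps = int(round(t_final / dt))
        for _ in range(steps):
            u -= c * dt / dx * (u - np.roll(u, 1))   # upwind difference
        return u, steps

    for n in (100, 200, 400):
        _, steps = advect(n)
        print(f"{n:4d} points -> {steps:4d} time steps, ~{n * steps:7d} point-updates")

Doubling the resolution in this one-dimensional toy quadruples the work (twice as many points, and twice as many time steps); in a real model with two horizontal dimensions and many vertical levels, the penalty is correspondingly steeper.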

The computation problem is made worse by the problems inherent in data acquisition.  For this type of numerical analysis, the three-dimensional grid would ideally consist of evenly spaced points, covering the surface of the Earth and extending upwards into the atmosphere.  Clearly, this ideal is unlikely to be achieved in practice; getting observations from the center of Antarctica, or the mid-Pacific Ocean, is not terribly convenient.  There are also ordinary measurement errors to deal with, of course.  This means that a good deal of data pre-processing and massaging is required, in addition to running the model itself, adding even more to the computing resources needed.

Many observers point to the GFS’s limited computer power as one of the chief weaknesses in the US effort.  (For example, see this blog post by Cliff Mass, Professor of Atmospheric Sciences at the University of Washington, or this post by Richard Rood, Professor in the Department of Atmospheric, Oceanic and Space Sciences at the University of Michigan.)   The processing speed of the current GFS system is rated at 213 teraflops (1 teraflop = 1 × 10¹² floating point operations per second); the current ECMWF system is rated at 754 teraflops (and is listed as number 38 in the most recent Top 500 supercomputer list, released in November 2012 — the GFS system does not make the top 100).

The projected improvements to the GFS system will raise its capacity to approximately 2600 teraflops; in terms of the most recent Top 500 list, that would place it between the current 8th- and 9th-ranked systems.  (Over the same period, the ECMWF system is projected to speed up to about 2200 teraflops.)   This will enable the resolution of the GFS to be increased.

The NWS projects the Sandy supplemental funds will help enhance the horizontal resolution of the GFS model by around a factor of 3 by FY2015, enough to rival the ECMWF.
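That factor of 3 is the reason so much extra horsepower is needed.  As a rough back-of-the-envelope sketch (an approximation only, ignoring vertical resolution, model physics, and data assimilation, all of which add further cost), refining the horizontal grid by a factor of k multiplies the number of grid columns by about k², and the shorter time step required for numerical stability adds roughly another factor of k:

    # Rough scaling argument for horizontal grid refinement (an
    # approximation: it ignores vertical levels, physics packages,
    # and data assimilation, which all add further cost).
    def work_factor(k):
        """Approximate growth in computation for a k-fold horizontal refinement:
        about k**2 more grid columns, and roughly k times as many time steps."""
        return k ** 2 * k

    for k in (2, 3):
        print(f"{k}x finer horizontal grid -> roughly {work_factor(k)}x more computation")

By this crude estimate, a threefold refinement calls for something like 27 times the arithmetic in the model core alone, which is why the computing upgrade is a prerequisite for the resolution increase.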

There are also plans to make other improvements in the model’s physics, and in its associated data acquisition and processing systems.

These improvements are worth having.  The projected $25 million cost is a very small percentage of the total Federal budget (about $3.6 trillion for fiscal 2012).  As we are reminded all too often, extreme weather events can come with a very large price tag, especially when they are unexpected.  Better forecasts have the potential to save money and lives.


Statistical Twisters

May 22, 2013

During yesterday evening’s ABC World News program, which was largely taken up with coverage of the tornado disaster in and around Moore, Oklahoma, there was a segment on a 90-plus-year-old resident who had lost her house to a tornado for the second time.   (The first time was in May 1999, when a similarly strong twister hit Moore.)  There was then a statement, which caught my attention, that the odds against this happening were “100 trillion to 1”.

Now, those are pretty long odds.  One hundred trillion is 100 × 10¹²; by way of comparison, it is about twenty times the estimated age of the universe, since the Big Bang, measured in days.  If those odds are accurate, we are talking about a really rare phenomenon.
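For the curious, the comparison is easy to check (taking the commonly cited age of the universe, roughly 13.8 billion years, as an assumption for this back-of-the-envelope calculation):

    # Back-of-the-envelope check of the age-of-the-universe comparison.
    age_universe_years = 13.8e9                     # approximate age since the Big Bang (assumed)
    age_universe_days = age_universe_years * 365.25
    print(f"Age of the universe: about {age_universe_days:.1e} days")     # ~5.0e12 days
    print(f"100 trillion / that: about {1e14 / age_universe_days:.0f}")   # ~20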

Thinking about the question this morning, I decided to double-check the report — perhaps I had just misunderstood the number that was being quoted.  I found a report on the ABC News site, which actually made the whole odds business more questionable:

A recent tornado probability study, published by Weather Decision Technologies, predicted the odds of an E-F4 or stronger tornado hitting a house at one in 10,000.

That same study put the odds of that same house getting hit twice at one in 100 trillion.

It is almost impossible to imagine how both these probability assessments could be correct, or even reasonable guesses.  If the odds against the house being hit once are one in 10,000 (probability 0.0001), then, if tornado hits are independent, the probability of a house being hit twice is (0.0001)², or odds of 1 in 100 million.  That would make the quoted odds (1 in 100 trillion) off by a factor of one million.  Of course, if tornado hits are not independent, then my calculation is inappropriate.  But for the numbers to work as quoted, the first hit would have to, in effect, provide truly enormous protection against a second hit.  (If the odds against the first hit are one in 10,000, then the chance of a second hit, given the first, would have to be about one in 10 billion to produce cumulative odds of one in 100 trillion.)
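Here is the arithmetic spelled out, taking the quoted one-in-10,000 figure at face value:

    # Sanity check of the quoted odds, assuming the one-in-10,000 figure.
    p_single = 1.0 / 10_000                    # quoted probability of one EF4+ hit
    p_double_independent = p_single ** 2       # two hits, if independent
    print(f"Two independent hits: 1 in {1 / p_double_independent:,.0f}")    # 1 in 100,000,000

    # For the quoted "1 in 100 trillion" to hold, the chance of a second hit,
    # given that a first has already occurred, would have to be:
    p_double_quoted = 1.0 / 100e12
    print(f"Implied second-hit probability: 1 in {p_single / p_double_quoted:,.0f}")  # 1 in 10,000,000,000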

Now, I don’t actually believe that tornado hits are independent.  Tornadoes certainly do not occur uniformly across the world, or even across the United States.  The NOAA Storm Prediction Center’s Tornado FAQ Site has a map highlighting “tornado alley”, the area where most significant tornadoes occur.  Although a tornado may, in principle, occur almost anywhere, you are considerably more likely to encounter one in Kansas or Oklahoma than you are in northern Maine or the upper peninsula of Michigan.

This question of independence is directly relevant to the news segment I mentioned at the beginning; it turns out that the unfortunate lady who lost two houses built the second one on the same site as the first, which was destroyed in 1999.  If the odds are affected at all by location (as they seem to be, at least “in the large”), then this was not, perhaps, the best possible choice.

I’ve griped before about the widespread ignorance of journalists and others when it comes to statistical information.  I have tried to find a copy of the  “Tornado Probability Study” mentioned in the quote above, so far without success.  I’ll keep trying, and report on anything I discover.  If I’m missing something, I’d like to know; if the probabilities are just made up, I’d like to know that, too.


Triclosan, Still

May 21, 2013

I’ve written here a number of times over the past couple of years (most recently here) about triclosan, an anti-bacterial and anti-fungal agent that is used in a wide variety of consumer products, including anti-bacterial soaps, toothpaste, deodorant, mouthwash, other cosmetic products, and household cleaning supplies.   The US Food and Drug Administration [FDA] has been conducting a safety and effectiveness review of triclosan for some time now. The review was originally scheduled to be released in April, 2011; last summer, it was promised by the end of the year (2012).  We’re all still waiting.

The Singularity Hub site has an article on this ongoing saga.  It gives a bit more of the history: the FDA issued draft guidelines in 1978, which classified triclosan as “not generally recognized as safe and effective”.  Since the guidelines were never finalized, nothing changed.

The FDA has not given an updated timetable for the release of its review.


Is It Warm in Here?

May 18, 2013

The May 11 issue of The Economist has an interesting, though disturbing, short article on one measure of global climate change: the concentration of carbon dioxide [CO2] in the atmosphere, which has just reached a new high for the modern observational record.

At noon on May 4th the carbon-dioxide concentration in the atmosphere around the Mauna Loa Observatory in Hawaii hit 400 parts per million (ppm).

Now, 400 ppm does not sound very high; after all, it is only 0.04%.  However, as the article goes on to point out, this concentration of CO2 has not been routinely present since the Pliocene epoch, about 4 million years ago.

The data series is from the observatory at Mauna Loa in Hawaii, run by the Scripps Institution of Oceanography, part of the University of California at San Diego.  This series (sometimes called the Keeling Curve, in honor of Charles David Keeling, the scientist who initiated the project) is of particular interest for two reasons:

  • The observation site is remote from large centers of human population, minimizing fluctuations due to temporary pollution spikes.
  • The observations have been made consistently, at the same place, since 1958.

There is a regular seasonal fluctuation in CO2 levels, tied to plants’ growth cycles.  In the northern hemisphere, levels tend to peak in May, and then fall until about October, as plants’ growth removes carbon dioxide from the atmosphere.

Carbon Dioxide Levels at Mauna Loa

Source: Scripps Institution of Oceanography

The seasonal pattern is clearly visible in the graph.  The more striking thing, of course, is the steady rise in the carbon dioxide levels, an increase of more than 25% over the observation period.  And there is no evidence that the rate of increase is getting smaller.
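The size of that increase is easy to verify.  If we take the level at the start of the record as roughly 315 ppm (an assumption for this quick check; the earliest readings were close to this value), the rise to 400 ppm works out to about 27%:

    # Rough check of the "more than 25%" figure.
    baseline_1958 = 315.0     # ppm, approximate level at the start of the record (assumed)
    level_2013 = 400.0        # ppm, the May 2013 milestone
    increase = (level_2013 - baseline_1958) / baseline_1958
    print(f"Increase over the observation period: {increase:.1%}")   # about 27%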


Triclosan Again

May 5, 2013

The Yahoo! News site has an article from the Associated Press [AP] about the US Food and Drug Administration’s [FDA] ongoing review of triclosan, an anti-bacterial and anti-fungal agent that is used in a wide variety of consumer products, including anti-bacterial soaps, toothpaste, deodorant, mouthwash, other cosmetic products, and household cleaning supplies.  The FDA’s original goal was to release the results of this review in April, 2011; clearly they are a bit behind schedule.   (According to the article, the results should be released later this year — or, at least, real soon now.)  Triclosan does have one use explicitly approved by the FDA: it is used in some toothpastes to help prevent gingivitis.  Its other uses have not, as far as I know, been subject to any formal approval process.

I’ve written here a couple of times before about the use of triclosan.  It is suspected, based on animal studies, of being an endocrine disruptor, boosting the effect of testosterone and estrogen, and reducing that of thyroid hormones.  Another animal study, reported last summer, suggests that triclosan can interfere with muscle function.   What is most striking, though, is that, for its main use, as an anti-bacterial agent in consumer products, there is essentially no evidence that it has any value at all.  As the FDA website, and other publications, have said for some time:

For other consumer products, FDA has not received evidence that the triclosan provides an extra benefit to health. At this time, the agency does not have evidence that triclosan in antibacterial soaps and body washes provides any benefit over washing with regular soap and water.

This is not to diminish, in any way, the importance of washing in general, and washing one’s hands in particular.  (The Centers for Disease Control have resources on hand hygiene.)  But, as the FDA’s note indicates, the evidence suggests that ordinary soap and water work just fine.  As I wrote in an earlier post:

My own conclusion is that, since I have seen no evidence that these anti-bacterial products provide any benefit, and since there may be some risk, they are not worth using, especially since they cost more than plain old soap.

Apart from the possible negative effects of any particular chemical, there is a general argument for not using anti-microbial products indiscriminately.  There is a possibility that excessive usage may contribute to antibiotic resistance, and there is also a risk of disrupting the normal population of microbes that are part of our personal biosystems, which can lead to serious health problems.  It hardly seems worth much risk to use something, like triclosan, that in most cases doesn’t seem to work anyway.

