Fishy Business

March 3, 2013

Have you ever seen these places that feature “fish sandwiches”? I always think, “Well, that’s kind of general.” I mean, I wouldn’t order something called a “meat sandwich,” would you?
     — George Carlin

As the late comedian George Carlin pointed out, many menus, particularly in fast food restaurants, are not too informative about the ingredients of their fish sandwiches.  It may be, though, that their practice is more honest than that of some much tonier and more expensive establishments.

The Washington Post reported, in a recent article, that consumers buying fish run a considerable risk of getting something other than what the label says.   The ocean conservation organization, Oceana, conducted an investigation from 2010 through 2012, collecting more than 1,200 seafood samples from 674 retail outlets in 21 states to determine if they were honestly labeled.  The results, while not entirely surprising, are not good news for seafood lovers: on the order of one-third of the fish offered for sale in groceries and restaurants is mislabeled.

Ninety-five percent of the sushi restaurants, 52 percent of other restaurants and 27 percent of grocery stores surveyed sold mislabeled seafood.

In the interest of fairness, the study did not identify the establishments surveyed, because it is generally not possible to determine where in the supply chain the mislabeling originated.  (You can download a copy of the study [PDF] from the Oceana site; there is also a summary [PDF] of key findings.)

Ordering some varieties of fish almost guarantees that the customer will get something else.  Fish labeled snapper was something else (often tilapia, a cheaper fish) in 87% of the samples tested.  Tuna was also frequently mislabeled, with 59% of the samples labeled tuna being something else.  The samples collected by Oceana included 46 varieties of fish; of these, 27 varieties were sometimes mislabeled.  Of the grocery stores, restaurants, and sushi venues sampled, 44% sold at least some mislabeled fish.  Of the outlets sampled, 18% of grocery stores, 38% of restaurants, and 74% of sushi venues had mislabeled fish.

Incorrectly labeled fish was found all over the US.  For example, 39% of the samples from New York City were mislabeled; in Chicago, the figure was 32%, in Denver 36%, and in Southern California 52%.  Every snapper sample from Washington DC was mislabeled, and every sushi venue sampled there had mislabeled fish.

Some questionable labeling may exist for historical reasons.   For example, some sushi establishments offer “white tuna”, which usually turns out to be a species called escolar; “white tuna” is not the name of any specific species of fish.  The term is also used to designate the kind of albacore tuna that comes in cans.  Sometimes, too, new names for fish species are introduced for marketing reasons.  The Post article cites the case of the Patagonian toothfish, which sold much better (and actually became a threatened species) once it was relabeled as Chilean sea bass.  Another change for marketing reasons was relabeling of the dolphin fish (Coryphaena hippurus or C. equiselis) as mahi-mahi, so that customers would not think they were eating Flipper.

There are some potential health implications of this widespread mislabeling, too.  Escolar, the fish sold as “white tuna”, can cause severe digestive distress in some individuals.  Some of the substitute fish, such as tilefish and king mackerel, are identified by the FDA as being inadvisable for sensitive groups, on account of high levels of mercury.

Some of the problem, I think, comes from the American propensity to prefer food that is divorced, as much as possible, from its natural origins.  Even leaving aside frozen and packaged foods, most fresh meat and seafood in the US is sold wrapped in little plastic trays.  In contrast, the neighborhood fishmonger in London, where I lived for about six years, always had whole fish displayed for sale.  On business and holiday trips to France, I often ate in restaurants where the fish were laid out on ice in a display case; the diner was invited to inspect them and make a selection.  In most US restaurants, you will not find anything resembling a whole fish.

These results, while disturbing, are not particularly surprising.  We have seen before that food products and medicines have been contaminated by ingredients from unscrupulous suppliers; bogus electronic parts have found their way into defense systems.  One of the unintended consequences of economic globalization has been the development of supply chains that are long and frequently rather opaque.  According to Oceana, more than 90% of the seafood consumed in the US is imported, and only a very small percentage of that is inspected by government regulators.

There are some efforts underway to address the problem.  One idea is to tag fish with a traceable ID number, so that prospective purchasers can determine its origin.  A seafood supplier in Washington DC, Profish Ltd., has its own tracing program called FishPrint that it offers to its commercial customers.  I hope these efforts will succeed.  If they do, not only will consumers be better informed, but conservation programs will also benefit.

Caveman Cuisine

April 7, 2012

I’ve mentioned here before the hypothesis, proposed by Dr. Richard Wrangham, Professor of Biological Anthropology at Harvard University, that cooking is not only a distinctly human trait, but also a practice that helped shape human evolution.  The idea is that, because cooking makes nutrients in food more accessible with less effort, the switch to cooked food made it easier to support our big, metabolically expensive brains. One key question about the hypothesis is a historical one: did the routine use of fire for cooking precede or follow the anatomical changes that mark modern humans?

The BBC News site has a report on some new research [abstract], published in the Proceedings of the National Academy of Sciences,  that may shed some additional light on the question.    A team of researchers from South Africa, Canada, the US, Israel, and Germany explored the Wonderwerk Cave in the Northern Cape province of South Africa, and found sediment layers containing burned bone and ashes from plants, dating back approximately 1 million years.  Other evidence indicates that our ancestor Homo erectus used the cave.

Stone tools found at Wonderwerk Cave indicate the ancestor in question may have been Homo erectus, a species whose existence has been documented as far back as 1.8 million years ago.

The fire residue was found approximately 30 meters inside the cave entrance, making it unlikely to be the result of a wildfire spreading from the outside.  There is also evidence that fires occurred multiple times at the same spots.  (The complete paper is available here [PDF].)  If the results hold up under scrutiny, they will push back the earliest date of confirmed use of fire by about 300,000 years.

In an article on this research at the New Scientist, Dr. Wrangham says that the results are most interesting, but need further exploration.

Wrangham calls the work an “exciting breakthrough”. “There are other sites in Africa, more than 1 million years old, that now bear re-examination,” he says.

This evidence is not conclusive proof of Wrangham’s idea, of course, but it does suggest that some of our ancestors may have been more advanced than perhaps we thought.

Antibiotics in Agriculture

December 28, 2010

I’ve written several times before (here, for example) about the problem of bacteria resistant to antibiotics.  From a present-day perspective, it is hard to realize that infections, before the advent of antibiotics, were a very serious health threat, and not infrequently fatal.  So the possibility of running out of effective antibiotic therapies is something to take seriously.

It is obvious that any use of antibiotics creates evolutionary selection pressure favoring resistant organisms.  (Sir Alexander Fleming, discoverer of penicillin, saw evidence of developed resistance in his work.)   The use of antibiotics to treat infections in people has clearly been a huge positive for human health.  We don’t want to lose that; we do want to discourage doctors from prescribing antibiotics indiscriminately (or just to make the patient feel that something is being done), and we want to encourage people to finish any course of antibiotics that is prescribed.
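The mechanism of selection pressure is easy to see with a toy calculation.  The survival rates below are made-up numbers for illustration, not measured values; the point is only that even a tiny resistant minority comes to dominate after repeated rounds of exposure.

```python
# Toy model of selection pressure under repeated treatment.
# The survival rates are illustrative assumptions, not real data.

def resistant_fraction(p0, survival_susceptible, survival_resistant, generations):
    """Track the resistant share of a population across treatment rounds."""
    p = p0
    for _ in range(generations):
        resistant = p * survival_resistant
        susceptible = (1 - p) * survival_susceptible
        p = resistant / (resistant + susceptible)
    return p

# Start with 1 in 10,000 organisms resistant; each round of exposure
# kills 90% of susceptible organisms but only 10% of resistant ones.
final = resistant_fraction(1e-4, 0.1, 0.9, 10)
# After 10 rounds, the population is almost entirely resistant.
```

Each round multiplies the resistant-to-susceptible odds by the ratio of the two survival rates, so the takeover is exponential; that is why overuse accelerates the problem so dramatically.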

There are other potential sources of antibiotic resistance, though.  There is evidence that some other commonly used chemicals (such as triclosan) may also create selection pressure for antibiotic resistance; and large quantities of antibiotics are used in agriculture, not just to treat sick animals, but as a regular addition to food or water, in order to promote faster growth.  The food industry has always dismissed the idea that this could be a meaningful source of antibiotic resistance in humans; it has been difficult to get information on the quantities of antibiotics actually used.

A recent article at Wired provides some fresh evidence, based on data from a post at the Center for a Livable Future [CLF] blog.   (The blog is written by staff at the Center for a Livable Future at Johns Hopkins Bloomberg School of Public Health; posts on the blog are personal views, not official positions of the School.)   The FDA recently released information on the total amount of various antibiotics sold in the US for veterinary or agricultural purposes.  The CLF researchers have managed to come up with figures for the total amount of antibiotics used in humans, from a separate study.  The combined results show that 80% of the antibiotics used in the US are used on farm animals.   For example, about 10.2 million pounds of tetracycline antibiotics are given to animals each year, more than the amount of all antibiotics given to people.

The food industry, of course, denies that this is a problem, partly on the grounds that some of the antibiotics it uses are not exactly the same as those used in human medicine (though many are identical).  This is really a red herring; as the CLF authors wrote:

LivableFutureBlog readers might recall an October blog post in which Dr. Ellen Silbergeld, professor of environmental health sciences at the Johns Hopkins Bloomberg School of Public Health, warned that, “Bacteria respond to chemical structures, not brand names, and resistance to one member of a pharmaceutical class results in cross resistance to all other members of the same class.”

I have not heard anyone object to the therapeutic use of antibiotics for animals that are actually sick; the issue is the routine addition of antibiotics to the animals’ diets to promote faster (= cheaper) growth.   Since this practice is of economic value to the producers, they are virtually certain to resist any attempt to curtail it, but the scientific evidence suggests that to continue with business as usual would be almost breathtakingly stupid.


The Ice Man Came

September 13, 2010

The “This Day in Tech” blog at Wired has an interesting article about the first delivery of ice, collected in New England, to Calcutta, India, on September 13, 1833.

The transoceanic operation, undertaken by the Tudor Ice Co., began in early May 1833, when approximately 180 tons of freshwater ice was loaded into the insulated hold of the sailing ship Tuscany in Boston.

The ice was “harvested” from frozen lakes and ponds during the winter.  (Thoreau writes about one such operation at Walden Pond, in Concord, Massachusetts, in the chapter “The Pond in Winter” in Walden.)

Of course, refrigeration had not yet been invented, so the ships carrying the ice, and the storage places for it, had to be built with considerable insulation — contemporary accounts suggest that a foot or more of insulating material surrounded the ice on all sides.

Having relatively pure ice, in a country with a warm climate, was a novelty and a luxury.  Even the Romans, who made a form of ice cream, had to rely on snow brought down from the mountains by runners.   The New England ice was, apparently, a big hit in India:

Locals marveled at the giant, icy cubes as they were unloaded from the specially outfitted seafaring vessels.

It would only be a few decades before mechanical refrigeration made it possible to make ice in India and other places where it never occurred naturally.  But it’s still kind of startling to realize that something that we take so much for granted had to be shipped halfway around the world not so long ago.

Wasting Energy

August 2, 2010

A recent article at the New Scientist gives some startling estimates of the amount of food that is wasted in the US.  Current estimates are that about 16 percent of energy consumption in the US is tied to food production, distribution, storage, and preparation.  Yet, according to a paper [abstract, PDF link to full paper available] by Amanda Cuéllar and Michael Webber of the University of Texas at Austin, published in the journal Environmental Science and Technology by the American Chemical Society, about 25% of food produced is wasted (and this is surely an underestimate, since it does not count food wasted “on the farm”, or waste in fishing).   From this, the authors calculate that about 2.15 × 10¹⁵ kilojoules of energy is effectively wasted on the production of food that is thrown away.

That’s more than could be gained from many popular strategies to improve energy efficiency. It is also more than projections for how much energy the US could produce by making ethanol biofuel from grains.

The biggest areas of waste are in dairy foods and vegetables.
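To put the paper’s headline figure in perspective, here is a back-of-envelope check.  The US primary energy total below (roughly 101.5 quadrillion BTU, the approximate 2007 figure) is my own assumption, not a number from the article.

```python
# Sanity check on the wasted-energy figure, using an assumed total
# for US primary energy consumption (~101.5 quadrillion BTU, ca. 2007).

BTU_TO_KJ = 1.055
us_primary_energy_kj = 101.5e15 * BTU_TO_KJ    # roughly 1.07e17 kJ
food_share = 0.16                              # share of US energy tied to food
wasted_food_energy_kj = 2.15e15                # the paper's headline figure

food_energy_kj = food_share * us_primary_energy_kj
share_of_food_energy = wasted_food_energy_kj / food_energy_kj
share_of_total_energy = wasted_food_energy_kj / us_primary_energy_kj
# Wasted food accounts for roughly 13% of food-related energy,
# or about 2% of total US energy consumption.
```

Two percent of national energy consumption going into food that is thrown away is indeed comparable to the potential gains from many efficiency programs.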

This really is an astonishingly large number.  The only bright spot is this: just think how fat some of our fellow citizens would be if it were all eaten — not a pretty picture.


May 8, 2010

I’ve written here several times about the development of antibiotic resistant bacteria, such as methicillin resistant Staphylococcus aureus, owing to the well-intentioned but indiscriminate use of antibiotics and other anti-microbial agents.  Using these agents creates selection pressure for the evolution of resistant organisms, and overuse accelerates it.

There is no reason to suspect that this general principle applies only to microbes, of course, and an article in the New York Times reports that the extensive agricultural use of a popular weed killer, glyphosate, sold by Monsanto under the trade name Roundup, has led to a similar sort of problem.   Roundup was originally developed and patented by Monsanto, and its use really took off when the company introduced seed varieties of corn, cotton, and soybeans that had been genetically modified to resist the herbicide.

Sales took off in the late 1990s, after Monsanto created its brand of Roundup Ready crops that were genetically modified to tolerate the chemical, allowing farmers to spray their fields to kill the weeds while leaving the crop unharmed. Today, Roundup Ready crops account for about 90 percent of the soybeans and 70 percent of the corn and cotton grown in the United States.

Roundup was eagerly adopted by farmers because it is fairly safe to use, kills a wide variety of weeds, and breaks down relatively quickly in the environment.

However, the law of unintended consequences has still not been repealed; as one might have expected, widespread use of Roundup created selection pressure for the evolution of resistant weed species.

Just as the heavy use of antibiotics contributed to the rise of drug-resistant supergerms, American farmers’ near-ubiquitous use of the weedkiller Roundup has led to the rapid growth of tenacious new superweeds.

This has forced farmers to return to some methods, such as frequent plowing, that they had abandoned, as well as the use of more toxic weedkillers.  These changes can bring along adverse environmental impacts of their own.

Monsanto, Bayer, and other agricultural chemical companies are now exploring the use of herbicide combinations, and the development of crop varieties resistant to other, older herbicides, in order to keep the problem of resistant weeds manageable.  Still, this is another reminder that these large, uncontrolled experiments have a habit of producing some rather undesirable consequences.

Skip the Soda

April 28, 2010

You’ve probably heard that drinking too much soda is a bad idea, perhaps because the sugar will rot your teeth, or perhaps because all those extra calories are bound to show up somewhere.  Now, according to some recent research summarized at the PhysOrg site, there’s another potential reason to avoid soda, and other processed foods containing high levels of phosphates: they make you get older faster.

New research published online in the FASEB Journal shows that high levels of phosphates may add more “pop” to sodas and processed foods than once thought. That’s because researchers found that the high levels of phosphates accelerate signs of aging.

The research team, led by Dr. M. Shawkat Razzaque of the Harvard School of Dental Medicine, studied three groups of mice.  The first group had a genetic modification that resulted in their having abnormally high phosphate levels in their bodies.  A second group had a modification that resulted in low phosphate levels.  The third group was genetically identical to the second, but was fed a diet that produced high phosphate levels.

Mice in both the first and the third group — the mice that had high phosphate levels, induced by either genetics or diet — had substantially shorter lifespans than the mice in the second group.   There was evidence that high phosphate levels were associated with renal, cardiovascular, and skeletal diseases.  From the paper’s abstract:

The results of our dietary and genetic manipulation studies provide in vivo evidence for phosphate toxicity accelerating the aging process and suggest a novel role for phosphate in mammalian aging.

Perhaps it’s just as well that I’ve always liked coffee better.
