Fuel Cell Development Agreement Signed

January 28, 2013

BBC News has a report that a new agreement to develop fuel-cell technology has been reached by three major auto manufacturers: Ford, Daimler AG, and the Renault-Nissan alliance.  The aim of the joint project is to speed up the development of fuel cells as an automotive power source, and also to encourage the development of supporting infrastructure (e.g., hydrogen filling stations).

Ford, Renault-Nissan and Daimler have agreed to jointly develop a fuel cell system to try to speed up the availability of zero-emission vehicles.

The carmakers hope to launch “the world’s first affordable, mass-market fuel cell car” by 2017.

Fuel cells, which produce electricity by combining hydrogen and oxygen, are an environmentally attractive technology because their “exhaust” product is water.  To date, though, they have not seen widespread use, except in specialized applications like the space program, because the cost of manufacturing the devices is too high.  But Daimler, at least, has been interested for some time; back in 2009, I wrote about the development of prototype fuel-cell vehicles by Mercedes-Benz.  Work on the project will be carried out at several locations around the world.  The partners hope that their combined global presence will also increase the visibility and impact of the project.
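
As a quick sanity check on why the chemistry is attractive: the overall reaction is 2 H2 + O2 → 2 H2O, and the maximum (reversible) cell voltage follows from the Gibbs free energy of that reaction.  A rough sketch in Python, using standard textbook values rather than any figures from the report:

```python
# Back-of-the-envelope check on the fuel cell reaction (2 H2 + O2 -> 2 H2O).
# The constants below are standard textbook figures, not from the article.

FARADAY = 96485.0    # C per mole of electrons
DELTA_G = -237.1e3   # J/mol, Gibbs free energy of forming liquid water (standard conditions)
N_ELECTRONS = 2      # electrons transferred per H2 molecule oxidized

# Maximum (reversible) cell voltage: E = -dG / (n * F)
ideal_voltage = -DELTA_G / (N_ELECTRONS * FARADAY)
print(f"Theoretical cell voltage: {ideal_voltage:.2f} V")  # ~1.23 V
```

Real cells deliver less than this under load, of course, which is one reason stacks of many cells (and expensive catalysts) are needed.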

The companies have issued a joint press release.

Update Monday, 28 January, 21:55 EST

Wired also has an article on this development, focused especially on the infrastructure issues involved.


TSA Pulls Plug on “Porno Scanners”

January 21, 2013

I have written several times about the ongoing controversy over the US Transportation Security Administration’s [TSA] use of full-body scanners (which the TSA calls “Advanced Imaging Technology”[AIT]) as part of its security protocol for screening air travelers.   The machines began to be introduced in the fall of 2010, and immediately created controversy.  One criticism, voiced by many security professionals, was that the effectiveness of the machines was questionable.  Another issue was the very detailed anatomical images produced by the devices, which led some privacy advocates to dub them “Porno Scanners”.  There was also a safety concern with one type of scanner, which uses backscatter X-ray technology, since it would expose the passenger to a small dose of ionizing radiation.  (A second type of scanner, which uses millimeter-wavelength radio waves, does not involve radiation exposure.)

Last summer, there were also developments in a court case, brought by a group of plaintiffs led by the Electronic Privacy Information Center [EPIC], challenging the use of the AIT devices, and asking the court to force the TSA to follow the normal review process for new government regulations.  On July 15, 2011, the US Court of Appeals for the District of Columbia Circuit had ruled that the TSA had to follow the normal procedure for issuing new regulations, as specified in the Administrative Procedure Act of 1946.  The TSA has now begun to comply with the review process, and has commissioned the National Academy of Sciences to look at the question of radiation exposure from the X-ray devices. It has also, as ordered by Congress, moved to replace the “anatomically correct” scan images with generic body images generated by software.

Now, according to an article at the Washington Post, the TSA has decided to remove 174 of the backscatter X-ray scanners from airports, because the vendor has not managed to equip them with the new generic-imaging software.

The Transportation Security Administration will remove 174 full-body scanners from airport security checkpoints, ending a $40 million contract for the machines, which caused an uproar because they revealed spectral naked forms of passengers.

TSA Administrator John S. Pistole issued the order this week after concluding that new software that made the machines less intrusive could not be developed by a June 1 deadline mandated by Congress.

The new software has apparently been successfully developed for the millimeter-wave scanners, which will continue to be used, and which will replace most of the X-ray machines that are being removed.

I have felt all along that the most disturbing part of this story was not the “porno” images, or even the safety questions, but the TSA’s apparent attitude that, because the machines were being used to “prevent terrorism”, it could just ignore inconvenient laws and regulations.  So this climb-down is a good thing, though it will doubtless be “spun” as something else.

There are also brief articles on this story at Ars Technica and Wired.


More New Old Tech

January 19, 2013

I’ve mentioned here before some instances in which current technical problems have sometimes been amenable to old technologies, dusted off and updated a bit.  For example, there is the use of the venerable AC induction motor, patented by Nikola Tesla in 1888, in new electric vehicles, as well as the renewed interest in the use of DC power distribution for data centers.

Now, according to an article at Technology Review, another old technology, for a type of Diesel engine, is getting another look.  The basic design, called the Jumo engine, was originated back in the 1930s by Junkers, a German aircraft manufacturer.  It was dirty, but very efficient.  In contrast to a conventional Diesel engine, which uses a single piston per cylinder to compress the intake air (into which the fuel is then injected), the Jumo engine uses two opposed pistons per cylinder, compressing the charge between them.  The efficiency advantage arises from expending less energy heating up the cylinder head, leaving more to drive the pistons.

A California company called Achates Power has updated the engine design to allow it to meet current emission standards, at least in a one-cylinder prototype.  The US Army has given Achates, together with a partner company, AVL Powertrain Engineering, a $4.9 million grant to develop a multi-cylinder prototype.  The company believes that the engine can be made smaller and cheaper than existing Diesel engines, while boosting fuel economy by 20%.  Compared to a gasoline engine, the fuel economy would of course look even better.

This is still a prototype, and the new design is not likely to make the 2014 model year for new cars.  Still, it is encouraging that progress can be made without requiring a “great leap forward” in every instance.


Requiring Black Boxes for Cars

January 5, 2013

Back in May of 2012, I wrote about the current and potential use of event data recorders [EDRs], so-called “black boxes”, in automobiles.  Similar devices have been used for years on commercial aircraft, and the data obtained from them has been of great value in understanding crashes and improving safety.   Many newer cars already have some kind of event recording device.  The Department of Transportation’s National Highway Traffic Safety Administration [NHTSA] has established some required standards for the amount and type of data that must be collected by installed EDRs; however, the installation of the devices was not required.

The NHTSA has now published a proposed regulation (Docket No. NHTSA-2012-0177) in the December 13, 2012 edition of the Federal Register (copies available as plain text or PDF) that would require the installation of EDRs in most autos and light trucks manufactured on or after September 1, 2014.  These devices would be required to meet the already existing standards for data collection.

The proposed regulation is open to public comment until February 11, 2013.  You can submit comments online using the docket page at the Regulations.gov web site (it also has a viewable copy of the rulemaking notice).  Alternatively, you can submit comments by mail or fax by following the instructions in the notice.  All submitted comments will become a matter of public record; online submissions can be viewed via the docket page.

There are legitimate privacy issues surrounding the collection of this data, and the ownership of the collected data needs to be clarified.  Still, there is a good case to be made, on safety grounds, for collecting the data; it should be possible to arrive at a reasonable trade-off.


Combining Catalysts for Biofuels

November 7, 2012

Back in September of last year, I wrote about a process, developed by scientists at Tulane University, that used bacteria (of genus Clostridium) to produce butanol [C4H9OH] from cellulose.   Ars Technica now has a report on some further research along the same lines by a group of researchers at the University of California, Berkeley [UCB].  The process uses a combination of bacterial fermentation and metal catalysts to produce longer-chain hydrocarbons (~11 carbon atoms); the resulting mixture has characteristics similar to petroleum-based diesel fuel.  The paper describing this process has been published in Nature [abstract]; UCB has also issued a news release.

The first stage of the process involves a fermentation originally described by the chemist Chaim Weizmann, in which the bacterium Clostridium acetobutylicum ferments sugars into a mixture of butanol, ethanol [C2H5OH], and acetone [CH3-CO-CH3].  (Weizmann, born near Pinsk in what is now Belarus, emigrated to Britain, where he became a lecturer in chemistry at the University of Manchester.  Later in life, he would become the first president of Israel.)   The process, discovered at the beginning of World War I, was originally valued for the acetone produced, which was needed to produce cordite, a replacement for gunpowder.  Left to its own devices, the reaction shuts down in time, because these metabolic products are harmful to the bacteria.

The UCB scientists have found that a class of organic solvents, in particular glyceryl tributyrate, can be used to extract the butanol and acetone from the fermentation mixture, leaving most of the ethanol behind in the original, water-based solution.  The researchers then used a catalyst of potassium phosphate [K3PO4] and palladium [Pd] metal in a condensation reaction, in which the acetone and butanol combine to produce a longer-chain ketone.  Further condensation produces a mixture of ketones, about half of which is an 11-carbon compound.  This mixture, although not chemically the same as conventional diesel fuel, has similar properties, so that it can be used as a feedstock for fuel production.
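
For the chemically inclined, the carbon arithmetic of the condensation step is easy to check: each alkylation splices a butanol-derived four-carbon chain onto the ketone, splitting out water, so no carbon is lost along the way.  A quick bit of bookkeeping (my own sketch, not taken from the paper):

```python
# Hypothetical carbon bookkeeping for the condensation steps described above.
# Each condensation joins a butanol-derived C4 chain to the ketone and
# releases water, so all the carbon atoms are retained in the product.

ACETONE_C = 3   # CH3-CO-CH3
BUTANOL_C = 4   # C4H9OH

one_step = ACETONE_C + BUTANOL_C        # single condensation -> a C7 ketone
two_step = ACETONE_C + 2 * BUTANOL_C    # double condensation -> a C11 ketone

print(one_step, two_step)  # 7 11
```

The 11-carbon figure matches the ~C11 product that makes up about half of the reported mixture.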

At present, this process is not economically competitive with producing fuel from petroleum.  One issue is the cost of the palladium catalyst; however,  the researchers feel that alternative catalysts, less expensive but equally effective, can be found.  The fermentation and extraction process is already fairly efficient, compared to conventional distillation.

As with the previous work in this area, there is a good deal of work to be done before the research leads to a commercially viable process.  Nonetheless, it is encouraging that different avenues are being explored.  After all, petroleum and other fossil fuels were formed by chemical transformations of organic materials, albeit over long time spans.  We just need to speed things up a bit.


SAE Endorses Electric Vehicle Charging Standard

October 19, 2012

Back in September of last year, I wrote about an announcement that a group of auto manufacturers (Audi, BMW, Daimler, Ford, General Motors, Porsche and Volkswagen) had agreed on a standard set of connections and protocols for charging the batteries in electric vehicles [EVs].  Now, a post on the “Autopia” blog at Wired reports that the Society of Automotive Engineers [SAE] has officially adopted a version of this standard (called J1772 Revision B) for the United States and Europe.  The standard specifies the connectors and electrical interfaces to be used in public charging stations for electric and plug-in hybrid vehicles.

Using electricity instead of fossil fuels as a vehicle energy source has some significant attractions; but one problem that needs to be solved, in order for large-scale adoption of EVs to become reality, is the establishment of a charging infrastructure.  (We don’t think much about this with respect to our traditional, gasoline-powered cars, since the refueling infrastructure — gas stations — has been in place for many years.)  Having standards for the charging system is of obvious importance: imagine a world where a different kind of gas pump was required, depending on whether you drove a VW, or a Toyota, or a Ford.  It is, in a way, analogous to the question of whether we should drive on the right (as in the US), or on the left (as in the UK).  It isn’t obvious, at least to me, that either choice has any intrinsic or essential merit relative to the other; however, it is clearly quite useful for all of us to agree on a single choice.

Agreeing on a standard is also complicated by the number of factors to be taken into account.  It’s probably a fair assumption that most EV owners, most of the time, will use their standard domestic electricity supply (whatever that is, another variable) to recharge their car’s batteries.  That can be a slow process, though (measured in hours), using the standard US domestic supply at (nominally) 110-115 volts AC.  The standard also has to provide for an implementation that is safe to use in an uncontrolled environment (that is,  outdoors) in less-than-ideal conditions.

The new standard also makes some technical progress, while remaining backward-compatible with earlier versions of the J1772 standard.  In particular, it allows for high-voltage (~ 500 volts) direct current [DC] charging, which could reduce the time required for a full charge to 30 minutes or less.
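
To see why high-voltage DC matters, here is a back-of-the-envelope comparison.  The battery capacity and current figures below are my own assumptions (a 24 kWh pack is typical of current EVs), and charging losses are ignored:

```python
# Rough charge-time comparison: ordinary household AC vs. high-voltage DC.
# Assumes a 24 kWh battery pack (typical of current EVs; not a figure from
# the standard) and ignores conversion and charging losses.

PACK_KWH = 24.0

def hours_to_charge(volts, amps, pack_kwh=PACK_KWH):
    """Hours for a full charge at the given voltage and current."""
    kw = volts * amps / 1000.0
    return pack_kwh / kw

home_ac = hours_to_charge(115, 12)    # standard US household circuit
dc_fast = hours_to_charge(500, 100)   # high-voltage DC fast charging

print(f"Household AC: {home_ac:.1f} h")        # ~17.4 h
print(f"DC fast:      {dc_fast * 60:.0f} min") # ~29 min
```

Even with generous rounding, the difference between an overnight charge and a coffee-break charge is clear.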

This agreement on a standard is a good thing, but the picture is somewhat clouded, because some Japanese carmakers (particularly Mitsubishi and Nissan) had already adopted a Japanese standard called CHAdeMO to accommodate fast charging.

“We are disappointed that SAE has approved a fast-charging standard that will not accommodate more than 70 percent of the electric vehicles on U.S. roadways today,” Nissan America said in a statement. “At the time of launch, the Nissan Leaf was designed to comply with the CHAdeMO standard of quick charging, which was the only existing quick-charge standard certified at the time.”

Now, if this difference is only about two alternative connectors, voltage levels, and that sort of thing, then, as long as the standards are published, we should shortly expect to see adapters to go from CHAdeMO to J1772-RevB, and vice versa.   I hope, though, that the auto makers will recognize that having a common, agreed-upon standard for recharging EVs is something they all should want.


Take the Road Train, Revisited

September 16, 2012

I’ve written here before about some of the work being done to develop “self-driving” cars, including Google’s tests of a fully-autonomous vehicle, and Volvo’s work on developing “road trains”, essentially convoys of semi-autonomous vehicles that follow a lead vehicle with a human driver.  Volvo’s  work is part of the European Union’s Project SARTRE (Safe Road Trains for the Environment).

The New Scientist site has an article reporting on a recent demonstration of the road train technology.  This approach probably has the higher likelihood of practical application in the near term, because it is largely based on technology that is already present, at least in some high-end cars.

Almost all the sensors and actuators that keep me from flying off the road now come as standard in most new Volvos (and other manufacturers for that matter). They are the exact same ones that enable cars to stay in lanes and avoid hitting other cars and pedestrians.

In contrast, completely autonomous cars, like those being tested by Google, require a considerable amount of added equipment to function.

Both approaches have the potential to provide significant improvements in safety.  The autonomous “driver” will not drive while sleepy or intoxicated; nor will it be distracted by sightseeing, fiddling with a cell phone, or turning around to smack the kid in the back seat.  An automatic system can also react more quickly than a human driver.

That faster reaction time means, in practice, that cars, particularly in a road train system, can follow one another much more closely than would be safe or legal with a human driver.  In the test reported in the article, the following distance at a speed of 90 km/hour [56 mph] was about 6 meters [19.7 feet].  By comparison, a driver reaction time of 500 milliseconds at that speed translates into about 12.5 meters [41 feet] of additional separation, just to cover the distance traveled before braking even begins.  Putting vehicles closer together, with fewer speed fluctuations, should help reduce road congestion.  Obviously, all this assumes that the lead driver is highly competent.
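
The arithmetic behind that separation figure is simple enough to check:

```python
# Extra separation a human driver needs just to cover reaction time,
# at the road-train test speed reported in the article.

SPEED_KMH = 90.0    # test speed from the article
REACTION_S = 0.5    # assumed 500 ms driver reaction time

speed_ms = SPEED_KMH * 1000.0 / 3600.0   # 25 m/s
reaction_gap_m = speed_ms * REACTION_S   # distance covered before braking begins
reaction_gap_ft = reaction_gap_m * 3.28084

print(f"{reaction_gap_m:.1f} m (~{reaction_gap_ft:.0f} ft)")  # 12.5 m (~41 ft)
```

That gap is on top of the actual braking distance, which is why automated following at 6 meters is out of reach for human reflexes.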

The ability to follow other vehicles more closely might also improve fuel economy, via the phenomenon that cyclists everywhere know as “drafting”.  As speed increases, the amount of power required just to overcome air resistance increases as the third power of the vehicle’s relative air speed (that is, taking into account any head- or tail-wind).   At a speed of 15 mph on level ground, for example, most of a cyclist’s power is used just to make a hole in the air. [Source: Bicycling Science, 2nd Edition, by Frank R. Whitt and David G. Wilson; Cambridge MA: MIT Press, 1997].  The effect is not so pronounced for cars, since they are typically more streamlined (that is, have a lower drag coefficient), but it is still significant.
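
For the curious, here is that cubic relationship in concrete terms.  The drag coefficient and frontal area below are assumed ballpark figures for a modern sedan, not numbers from the article:

```python
# Aerodynamic drag power: P = 0.5 * rho * Cd * A * v^3.
# Cd and frontal area are assumed ballpark values for a passenger sedan.

RHO = 1.225   # air density, kg/m^3, at sea level
CD = 0.30     # drag coefficient (typical modern sedan)
AREA = 2.2    # frontal area, m^2

def drag_power_kw(speed_kmh):
    """Power (kW) needed just to push through still air at the given speed."""
    v = speed_kmh / 3.6
    return 0.5 * RHO * CD * AREA * v ** 3 / 1000.0

p50, p100 = drag_power_kw(50), drag_power_kw(100)
print(f"{p50:.2f} kW at 50 km/h, {p100:.2f} kW at 100 km/h")
```

Doubling the speed multiplies the drag power by eight, which is why shielding a car from the airstream ahead of it, as in a road train, can pay off in fuel savings.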

Vehicles driving in such tight formations with fewer speed fluctuations should dramatically reduce congestion, says Erik Coelingh, Volvo’s senior technical specialist who is heading the research near Gothenburg. The reduction in drag could potentially cut fuel consumption by as much as 20 per cent, he says.

The technology is certainly interesting, and seems to have a good deal of potential.  Whether the legal and cultural obstacles to its adoption can be overcome remains to be seen.

