Poorly Defined Questions Lead to Meaningless Poll Results, by Indur Goklany
Master Resource, January 23, 2009
The American Geophysical Union’s house organ, Eos, has an article entitled “Examining the Scientific Consensus on Climate Change,” written by Peter Doran and Kendall Zimmerman of the University of Illinois at Chicago. (h/t Roger Pielke, Sr.)
The paper explains that the two “primary questions” asked were:
1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?
2. Do you think human activity is a significant contributing factor in changing mean global temperatures?
The article reports that over 90 percent of the respondents agreed that mean global temperatures have “risen” relative to pre-1800 levels (Question 1). But this is unremarkable. Everyone knows that in the 1700s and early 1800s the world was in the grip of the Little Ice Age, so a recovery toward normalcy alone would lead to global warming. No one I know — and I know plenty of skeptics (not yet a criminal offense, I hope) — has ever disputed this point. So the real surprise to me is that it wasn’t 99 percent–plus!
With respect to the second question, the authors report that 82 percent agreed that “human activity is a significant contributing factor” in changing mean global temperatures.
But what precisely does this mean? How did respondents interpret the terms “human activity” and “significant contributing factor”?
I am reminded that under the Clean Air Act, an area that has relatively clean air is deemed to be “significantly deteriorated” if the concentration of an air pollutant in that area increases by 2–8% of the National Ambient Air Quality Standard. Under this interpretation, a contribution of 2–8% would be significant! So the question is: What instructions or guidance, if any, was provided to respondents in interpreting what precisely was meant by “significant contributing factor” in the second question? Is a 2% contribution significant? Why not 20%, or 40%, or 60%?
In addition, what exactly is the “human activity” that is supposed to be significantly contributing to global warming? Is it land clearance, paving over surfaces, producing black-carbon emissions, generating well-mixed greenhouse gases, or a combination of all these activities? I claim that most people would interpret “human activity” to encompass “all of the above.”
Moreover, how do the researchers know that there is a uniform understanding of the question, and what precisely is that uniform understanding?
Given that the two key terms are imprecisely defined, a person who believed that well-mixed greenhouse gases have contributed no more than, say, 10% to the global warming since 1800 might be compelled to agree that “human activity is a significant contributing factor” to post-1800 global warming.
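To make the ambiguity concrete, here is a minimal simulation sketch (all numbers are hypothetical and illustrative, not data from the Doran-Zimmerman survey): suppose each respondent privately believes humans caused some fraction of the post-1800 warming, and each applies his own private cutoff for “significant,” anywhere from the Clean Air Act’s 2% up to 40%. The headline agreement rate then says very little about what respondents actually believe.

```python
import random

random.seed(0)

# Hypothetical respondents: each believes humans caused some fraction
# of post-1800 warming (uniform 5%-95%), and each applies a private
# threshold for "significant" (uniform 2%-40%, echoing the Clean Air
# Act's 2-8% example above). All figures are made up for illustration.
N = 10_000
respondents = [
    (random.uniform(0.05, 0.95),   # believed human contribution
     random.uniform(0.02, 0.40))   # private "significance" threshold
    for _ in range(N)
]

# A respondent answers "yes" to Question 2 whenever his believed
# contribution clears his own private threshold.
yes_group = [belief for belief, cutoff in respondents if belief >= cutoff]

print(f"Agreement with Question 2: {len(yes_group) / N:.0%}")
print(f"Share of 'yes' answers believing the contribution is under 40%: "
      f"{sum(b < 0.40 for b in yes_group) / len(yes_group):.0%}")
# With these made-up inputs, agreement lands near 80%, yet roughly a
# quarter of the "yes" group believes the human contribution is below
# 40% -- the headline number cannot distinguish them.
```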
From the perspective of policy makers, to whom this article is partly addressed, identifying the precise kind of human activity that contributes significantly to global warming (however defined) is absolutely critical. If well-mixed greenhouse gases (GHGs) such as carbon dioxide do not contribute significantly, then no amount of GHG reductions will make a significant difference (however defined) to the degree of future warming. Thus, the imprecise wording of Question 2 precludes this poll from closing the debate among policy makers.
The fact that the article doesn’t raise, let alone address, the issues posed by its inexact question indicates that the Eos article is premature, if not erroneous, in its claim that “the debate on the authenticity of global warming and the role played by human activity is largely nonexistent among those who understand the nuances and scientific basis of long-term climate processes.” It also reflects poorly on the pollsters and Eos reviewers — assuming it was reviewed — that they missed such obvious difficulties with the study.
The article also claims that “[t]he challenge, rather, appears to be how to effectively communicate this fact to policy makers and to a public that continues to mistakenly perceive debate among scientists.”
But the real challenge is this: how can pollsters accurately communicate their questions to potential respondents so that there is a common and precise understanding, on everyone’s part, of the question posed? Without this, one is condemned to obtain meaningless poll results. And the rest of the world is subjected to reading (and reading about) one more poorly designed study.
Brazil Expands Investment in Offshore Drilling Projects, by Andrew Downie
TNYT, January 24, 2009
SÃO PAULO, Brazil — Brazil’s state-controlled oil company, Petrobras, announced a crisis-busting investment plan Friday to spend more than $174 billion over the next five years, much of it for prodigious deep-water oil and gas exploration.
The investment covers the 2009-2013 period and represents a rise of 55 percent over the $112.4 billion the company had vowed to spend on development between 2008 and 2012.
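As a quick sanity check on the arithmetic (a back-of-the-envelope sketch using only the two figures quoted above, nothing from outside the article):

```python
old_plan = 112.4  # announced 2008-2012 spending, in billions of dollars
new_plan = 174.0  # announced 2009-2013 spending, in billions of dollars

increase = (new_plan - old_plan) / old_plan
print(f"Increase: {increase:.1%}")  # ~54.8%, consistent with the reported 55 percent
```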
This investment is “very robust and very important for the continuity of Petrobras’s growth,” José Sergio Gabrielli, the company’s chief executive, told reporters Friday at a news conference in Rio de Janeiro.
Petrobras, whose full name is Petróleo Brasileiro, had promised to unveil its spending plans in September but delayed the announcement several times because of the world’s financial turmoil.
In 2007 and 2008, Petrobras and partners including Repsol YPF of Spain and the BG Group of Britain discovered vast deposits of oil under more than 4,000 meters of water, rock and salt.
Although the finds are at previously untapped depths and will be costly to extract, they hold an estimated 8 billion to 12 billion barrels of oil, according to Petrobras figures. Company officials and oil experts say that other reserves of that size could be nearby.
One of the finds alone, named the Tupi, holds the equivalent of 5 billion to 8 billion barrels of light crude oil and is the world’s biggest new field since a 12-billion-barrel find in Kazakhstan in 2000.
President Luiz Inácio Lula da Silva of Brazil has said repeatedly that developing these oil reserves is vital to the country’s future, and Petrobras has set aside $28 billion to that end.
In all, new drilling could produce 219,000 barrels a day by 2013, 582,000 barrels a day by 2015 and 1.82 million barrels a day by 2020, Mr. Gabrielli predicted.
Natural gas extraction would rise from 7 million cubic meters a day in 2013 to 40 million a day in 2020, the company added.
Petrobras produced a daily average of 2.18 million barrels of oil and gas last year.
Libertarian views on global warming: "Glacier Slowdown in Greenland: How Inconvenient"
In this week’s Science magazine, science writer Richard Kerr reports on some of the goings-on at this past December’s annual meeting of the American Geophysical Union.
While he didn’t cover our presentation at the meeting in which we described our efforts at creating a reconstruction of ice melt across Greenland dating back into the late 1700s (we found that the greatest period of ice melt occurred in the decades around the 1930s), Kerr did cover some other recent findings concerning the workings of Greenland’s cryosphere in his article titled “Galloping Glaciers of Greenland Have Reined Themselves In.”
Here is how Kerr starts things off:
Things were looking bad around southeast Greenland a few years ago. There, the streams of ice flowing from the great ice sheet into the sea had begun speeding up in the late 1990s. Then, two of the biggest Greenland outlet glaciers really took off, and losses from the ice to the sea eventually doubled. Some climatologists speculated that global warming might have pushed Greenland past a tipping point into a scary new regime of wildly heightened ice loss and an ever-faster rise in sea level.
And some non-climatologists speculated disaster from rapidly rising seas as well.
During his An Inconvenient Truth tour, Gore was fond of spinning the following tale:
[E]arlier this year [2006], yet another team of scientists reported that the previous twelve months saw 32 glacial earthquakes on Greenland between 4.6 and 5.1 on the Richter scale - a disturbing sign that a massive destabilization may now be underway deep within the second largest accumulation of ice on the planet, enough ice to raise sea level 20 feet worldwide if it broke up and slipped into the sea. Each passing day brings yet more evidence that we are now facing a planetary emergency - a climate crisis that demands immediate action to sharply reduce carbon dioxide emissions worldwide in order to turn down the earth’s thermostat and avert catastrophe.
Oh how things have changed in the past 2 years.
For one, the “team of scientists” that reported on the Greenland earthquakes now think that the earthquakes were the result of processes involved with glacial calving, rather than something “underway deep within the second largest accumulation of ice on the planet” (Nettles et al., 2008).
For another, Gore’s “massive destabilization” mechanism for which the earthquakes were a supposed bellwether (meltwater lubrication of the flow channel) has been shown to be ineffective at producing long-term changes in glacier flow rate (e.g., Joughin et al., 2008; van de Wal et al., 2008).
And for still another, the recent speed-up of Greenland’s glaciers has even more recently slowed down.
Here is how Kerr describes the situation:
So much for Greenland ice’s Armageddon. “It has come to an end,” glaciologist Tavi Murray of Swansea University in the United Kingdom said during a session at the meeting. “There seems to have been a synchronous switch-off” of the speed-up, she said. Nearly everywhere around southeast Greenland, outlet glacier flows have returned to the levels of 2000. An increasingly warmer climate will no doubt eat away at the Greenland ice sheet for centuries, glaciologists say, but no one should be extrapolating the ice’s recent wild behavior into the future.
The last point is driven home by new results published last week (and described in our last WCR and in our piece over at MasterResource) by researchers Faezeh Nick and colleagues. They modeled the flow of one of Greenland’s largest glaciers and determined that while glaciers were quite sensitive to changing conditions at their calving terminus, they responded rather quickly to those changes, and the increase in flow rate was rather short-lived. Nick et al. included these words of warning: “Our results imply that the recent rates of mass loss in Greenland’s outlet glaciers are transient and should not be extrapolated into the future.”
All told, it is looking more like the IPCC’s estimates of a few inches of sea level rise from Greenland during the 21st century aren’t going to be that far off—despite loud protestations to the contrary from high-profile alarm pullers.
Maybe Gore will go back and remove the 12 pages’ worth of pictures and maps from his book showing what high-profile places of the world will look like with a 20-foot sea level rise (“The site of the World Trade Center Memorial would be underwater”). But then again, probably not—after all, the point is not to be truthful in the sense of reflecting a likely possibility, but to scare you into a particular course of action.
References:
Intergovernmental Panel on Climate Change, 2007. Climate Change 2007: The Physical Science Basis. Solomon, S., et al. (eds.), Cambridge University Press, Cambridge, U.K., 996pp.
Joughin, I., et al., 2008. Seasonal speedup along the western flank of the Greenland Ice Sheet. Science, 320, 781-783.
Kerr, R. A., 2009. Galloping glaciers of Greenland have reined themselves in. Science, 323, 458.
Nettles, M., et al., 2008. Step-wise changes in glacier flow speed coincide with calving and glacial earthquakes at Helheim Glacier, Greenland. Geophysical Research Letters, 35, doi:10.1029/2008GL036127.
Nick, F. M., et al., 2009. Large-scale changes in Greenland outlet glacier dynamics triggered at the terminus. Nature Geoscience, DOI:10.1038, published on-line January 11, 2009.
van de Wal, R. S. W., et al., 2008. Large and rapid melt-induced velocity changes in the ablation zone of the Greenland ice sheet. Science, 321, 111-113.
Medicine's Miracle Man. By John E. Calfee
Maurice Hilleman's remarkable period of industrial scientific research yielded the most cost-effective medicines ever made.
The American, Friday, January 23, 2009
The pharmaceutical industry has been under attack for longer than most people realize. In the 1950s and 1960s, when for the first time in history we had quite a few drugs that worked very well—including many antibiotics, the first miracle drugs—there was the full panoply of congressional hearings, outraged newspaper editorials, and dour experts who described an industry in which prices were too high, marketing too important, and innovation in decline amid a flood of “me-too” drugs barely distinguishable from the original innovative brands. But I doubt that the atmosphere then was as hostile as it has been in the past five years or so. A flood of books, including some by authors with academic credentials, has recirculated many of the same arguments (albeit with more emphasis on safety). The more scholarly works include Merrill Goozner’s The $800 Million Pill: The Truth Behind the Cost of New Drugs; Jerome Kassirer’s On The Take: How Medicine’s Complicity with Big Business Can Endanger Your Health; and Jerry Avorn’s Powerful Medicines: The Benefits, Risks, and Costs of Prescription Drugs. Others have a more muckraking tone, beginning with the much-quoted The Truth About the Drug Companies by former New England Journal of Medicine editor Marcia Angell, and continuing on to many others including Ray Moynihan and Alan Cassels’s Selling Sickness: How the World’s Biggest Pharmaceutical Companies Are Turning Us All Into Patients; Howard Brody’s Hooked: Ethics, the Medical Professions, and the Pharmaceutical Industry; Alison Bass’s Side Effects: A Prosecutor, a Whistleblower, and a Bestselling Antidepressant on Trial (about the drug Paxil); and Philip R. Lee’s Pills, Profits, and Politics.
What’s with R&D?
To my mind, the most serious of these indictments focus on industry research. No doubt, the stakes are high for the industry. If drugs are truly innovative life-modifiers or life-savers, the argument over prices and spending tends to be marginalized. But if there hasn’t been a lot of innovation and if the innovation we do get comes mainly from the taxpayer-supported National Institutes of Health and other nonprofit organizations, the politics of drugs becomes difficult for the industry to handle.
I have had occasion to write about innovation and its sources in the pages of The American. As I explained in “The Golden Age of Medical Innovation” (March/April 2007), the critics have paid too much attention to the annual count of new drug approvals by the FDA and too little attention to two crucial developments. One is the increasing importance of research that occurs after a drug is approved. Newer drugs, especially so-called biotech drugs including monoclonal antibodies, involve complex biological processes that are themselves subject to ever more sophisticated research on everything from DNA to drug interactions. Basic research and clinical trials have been running side by side, often with drugs themselves serving as research tools to find out what happens when a particular receptor is suppressed (such as the epidermal growth factor receptor, or EGFR, to cite a target that is important for cancer and much more). Sometimes, scientists harvest a series of improved treatments using existing drugs without actually getting a new one approved. Rituxan, originally approved for certain types of non-Hodgkin’s lymphoma, is now approved for other types of cancer and for rheumatoid arthritis, and is being studied as a treatment for multiple sclerosis, Crohn’s disease, lupus, idiopathic thrombocytopenic purpura, and chronic lymphocytic leukemia.
The other phenomenon that has been largely lost in popular discussion of drug R&D and its discontents is the extraordinary role played by “follow-on” drugs (a much more accurate term than “me-too”). The story with statin cholesterol-reducing drugs, where a decade or more of research on follow-ons revolutionized the prevention and treatment of coronary heart disease, is a familiar one. Similar stories are playing out now, but much faster. Competition among rapidly developed drugs to attack a promising target (such as tumor necrosis factor inhibitors for rheumatoid arthritis) can bring about revolutions in treatments as doctors and patients dance through one drug after another while dealing with the unique mix of side effects and drug resistance that plague each individual patient.
The interested reader can get a flavor of this blend of basic science and practical drug development by reading the fascinating discussion by Jan Vermorken, et al., of evolving treatments for head-and-neck cancer in the September 11, 2008 issue of The New England Journal of Medicine. Much of this story involves Erbitux, the monoclonal antibody that put Martha Stewart in jail after a disappointing FDA decision put the owners of ImClone, the developer, into a panic. The many years of up-and-down research and results on that drug, costing hundreds of millions or even billions of dollars back when no revenues were in sight, are probably as good an example as any of the vagaries and payoffs of high-risk drug research informed by ongoing work in pure science.
In another recent article in The American, I addressed the thorny question of the role of publicly supported basic research in drug development (“The Indispensable Industry,” May/June 2008). To put it in the simplest terms, a close look reveals a striking pattern that seems to be little noticed by the critics of private drug development: no matter how far-flung the curiosity-driven NIH-supported research is, the only results that seem to get translated into useable drugs are the ones that are grabbed by drug firms and put through the difficult research necessary to produce appropriate quantities of promising substances to run through years of arduous clinical trials. Take away the private sector, and basic research nearly always languishes with little practical effect, as is unceasingly and tragically illustrated by the dearth of new drugs and vaccines for malaria and tuberculosis. Sometimes, the drug firms themselves do perform crucial basic research, as in the case of Genentech’s Avastin for cancer and Lucentis to prevent blindness. These were the fruits of the firm’s own top-tier basic research forces.
Just Two Words
But there is something else in drug development that hardly gets talked about: the sheer energy and determination that you find in the private sector. Combine that with substantial financial resources and you get what John Maynard Keynes called “animal spirits,” a singular motivating force in creative capitalism. When this force attacks big problems, the results can be both spectacular and unexpected, sometimes with fabulous benefits for mankind. It so happens that animal spirits are very much involved in one of the great blessings of modern medicine: an armamentarium of vaccines, mainly given to children, which have been saving lives by the millions at astonishingly low costs. “The most cost-effective treatments ever created by mankind” is a typical summary of the value of vaccines for mumps, measles, rubella (German measles), and half a dozen or so others, including those for diphtheria, whooping cough, hepatitis A, and hepatitis B.
Where, you might ask, did all those life-saving vaccines come from? Amazingly, for half or more of them, the answer can be summarized in two words: Hilleman and Merck. You’ve likely never heard of Maurice Hilleman even though he probably saved more lives than any other scientist in the 20th century. For most of his career, Hilleman was a biologist at Merck, where he developed one vaccine after another, stretching through four extraordinary decades of productive work. Along the way, he pioneered new ways to create, test, and manufacture vaccines, and played a crucial role in the creation of an entirely different class of drugs known as interferons.
We know a lot about Hilleman’s career thanks to a wonderful book published last year by Paul Offit: Vaccinated: One Man’s Quest to Defeat the World’s Deadliest Diseases. Offit was the perfect vehicle for getting this story the attention it deserves. A prominent academic immunologist at Children’s Hospital of Philadelphia and the University of Pennsylvania Medical School, Offit is also a vaccine developer. He is a co-inventor and co-developer of RotaTeq, the first fully successful vaccine for rotavirus, a cause of deadly dehydration that kills thousands of children annually in poor nations.
Offit is attuned to public policy. He has been a member of the Advisory Committee on Immunization Practices, whose child vaccination recommendations are gospel for physicians and payers. His previous book, The Cutter Incident, was an insightful historical account of how litigation over an early miracle vaccine—for polio—helped shape (very much for the worse) the entire litigation environment of vaccines and pharmaceuticals. Offit’s academic journal articles and newspaper op-eds on the consequences of unscientific attacks on vaccine safety are required reading for anyone interested in this contentious topic.
Offit’s Vaccinated is informed by 11 interviews with the 85-year-old Hilleman in 2005, during the last months of his life before he succumbed to cancer. Judging by dozens of meaty quotes, Offit is a probing interviewer, capturing a great scientist’s personality and working style to a degree that cannot be matched without personal experience with the subject, and is seldom matched even then.
Who was Maurice Hilleman and what did he do? Born to a German-American family in 1919 and raised on a Montana farm near his birthplace, Hilleman was a brilliant student on scholarship at Montana State University. After graduation he moved to the Midwest intellectual mecca at the University of Chicago, where in 1944 he finished a Ph.D. in microbiology based on groundbreaking research on the chlamydia bacterium (previously thought to be a virus). To the dismay of his new intellectual peers, Hilleman left academia to work for a pharmaceutical firm, E.R. Squibb, where he achieved advances in flu vaccine development and manufacturing. In 1948, he moved to the Walter Reed military hospital in Washington. His work there culminated in an extraordinary episode in 1957 when he correctly forecast the arrival of a new Asian Flu to which almost no one was immune. He led the development and manufacturing (by private firms) of a vaccine in time to save hundreds of thousands of lives and perhaps many more.
In 1957 Hilleman returned to the private sector, this time at Merck, where he was head of virus and cell biology in Merck’s relatively new vaccine enterprise. Hilleman apparently set out to develop vaccines for the chief life-threatening viral and bacterial infectious diseases of childhood. Amazingly, he came close to clearing the table. First was the mumps, with the approval in 1967 of the “Jeryl Lynn” vaccine based on a mumps virus taken from his daughter of that same name. A measles vaccine arrived the next year. In 1969, we got a vaccine for rubella. Hilleman soon concocted the immensely useful idea of combining these three vaccines into a single shot. Approved in 1971, this proved a blessing to untold millions of small children and their mothers. The 1981 vaccine for hepatitis B (not really a childhood disease, of course) was a scientific and technological tour de force essentially from start to finish. In 1995 came the hepatitis A vaccine. For chicken pox, pneumococcus, and Hib (haemophilus influenzae type b), Hilleman transformed relatively untested vaccines into the mass-produced tools with which we are now familiar. It is hard to imagine the cumulative benefits of this research. (Hilleman also developed a vaccine for a destructive form of chicken cancer, rescuing a substantial part of the poultry industry.)

Hilleman’s work sometimes ranged beyond vaccines. Starting in the late 1950s, he figured out how to mass-produce a newly discovered virus-killing substance in chickens called interferon. He soon detailed interferon’s basic physical, chemical, and biological properties, discovering that it was produced in many animals, including humans, and that it could impede or kill many viruses, such as those involved in cancer. He correctly predicted that interferon could be used to treat chronic infections and cancer. Today, it is used against hepatitis B, hepatitis C, and several types of cancer.
Problem Solving for Fun and Profit
This is more than the history of medicine, science, and technology. It is also business history, a classic story of problem solving for fun and profit and humanity. How was Hilleman able to accomplish so much in basic research, drug development, and manufacturing technology, often working essentially from scratch because vaccine development was still in its infancy when he set to work? The answer lies in Hilleman’s decision to work at Merck instead of pursuing a top-tier academic career. He realized that to attack the most pressing illnesses susceptible to immune-based prevention, he would have to marshal massive forces even after solving the purely intellectual puzzles. Merck had supported that kind of work before in Max Tishler’s research on the vitamin B complex. Offit tells us relatively little about internal Merck affairs, but it is clear that Hilleman enjoyed an extraordinary degree of autonomy combined with generous funding increases for low-profit products (now there’s a combination we’d all like to have!).
Hilleman sometimes exercised an iron fist over such normally mundane matters as manufacturing, where any deviation from his recipe could result in undetectable dangers. Indeed, Hilleman was apparently a bit of a tyrant, demanding almost as much of his staff as of himself, facilitated by his mastery of the art of profanity. Nonetheless, he retained the respect and often the devotion of his hard-driven staff along with near-legendary status among his academic peers.
In 1984, when Hilleman reached Merck’s mandatory retirement age of 65, he refused to retire and Merck kept him on. One result was the hepatitis A virus vaccine that arrived in 1995, along with a steady stream of academic work of all sorts until shortly before his death in 2005. Hilleman never jettisoned the problem solving method of a successful Montana farmer. Like Orville and Wilbur Wright when they built the first successful heavier-than-air flying machine, Hilleman’s basic strategy was simple: solve whatever problems had to be solved in order to reach the goal, which was usually a new vaccine. The list of problems included daunting scientific puzzles, excruciating judgments about whether dangerous side effects had been defeated, and the vagaries of regulation (much easier before the FDA got into the action).
As the 80-plus-year-old Hilleman approached death, Offit and other academic scientists lobbied the Nobel committee to award Hilleman the Nobel Prize for Medicine, based partly on his vaccine work and partly on his contributions to the basic science of interferons. The committee made clear that it was not going to award the prize to an industry scientist. (Offit has assured me that the situation was even more hopeless than he describes in his book.) It is hard not to see this as a miscarriage of scientific justice. Perhaps Hilleman would have done better if his volcanic personality had not included a surprising element of self-effacement. None of the vaccines or the crucial agents or processes he created was named after him. At one point, he even called the developer of a new rubella vaccine to say that he thought it should replace his own because it was better. Hilleman’s absence from the academic and public spotlight was quite extraordinary. In one of the most striking of the dozens of anecdotes told by Offit, Hilleman’s death was announced to a meeting of prominent public health officials, epidemiologists, and clinicians gathered to celebrate the 50th anniversary of the Salk polio vaccine. Not one of them recognized Hilleman’s name!
Next…
Thanks to Offit and his book, Hilleman’s light and the extraordinary research achieved by the Merck company will shine for many, many years. What about vaccine research itself? There have been formidable obstacles. One was the liability system, which in the 1980s nearly killed off the child vaccine market before Congress removed child vaccines from the liability system altogether. Another, more persistent problem has been low reimbursement rates, especially by government, for traditional child vaccines (including most of Hilleman’s crop). This can discourage new research and production, and cause shortages. The situation was sufficiently worrisome to trigger a 2003 report by the federally sponsored Institute of Medicine entitled “Financing Vaccines in the 21st Century: Assuring Access and Availability.” Reimbursement seems to have improved recently. Better yet, newer vaccines are sufficiently protected by patents so that prices are set through ordinary market forces rather than government fiat. Merck and its competitors, such as GlaxoSmithKline and Sanofi-Aventis plus smaller firms, have developed a series of important new vaccines—notable among them are the pneumococcal vaccine, a vaccine for the human papilloma virus (which causes cervical cancer), and two rotavirus vaccines (including the one co-invented by Offit). Traditional vaccine research is now flourishing but will probably never again be dominated by a single person’s laboratory like the one run by Hilleman in his prime.
But we need to look ahead, and when we do it’s hard not to get excited. The entire field of immunology—roughly speaking, the harnessing of the human immune system to fight disease—has taken off along with so much else in this age of biotechnology. We are discovering faster and more efficient ways to manufacture traditional vaccines (especially for the flu), better methods for identifying newly arrived infectious agents such as avian flu (the dreaded “bird flu” that could cause an epidemic on the scale of the one in 1918 that killed millions worldwide), and new techniques for developing vaccines once their targets have been identified.
And there is the extraordinary prospect of therapeutic vaccines, i.e., vaccines that harness the immune system to attack illnesses already present in the body rather than just preparing the body to reject infections that have not yet been encountered. None has been approved, but a prostate cancer vaccine from the biotech firm Dendreon received a favorable rating from an FDA advisory committee and may yet gain approval from the FDA (despite its reluctance to approve highly innovative drugs in this era of attacks on it for paying too much attention to new benefits and too little attention to safety). Alzheimer’s Disease vaccines have achieved striking results against the beta-amyloid plaques typically found in the brains of Alzheimer’s patients. Other therapeutic vaccines are in various stages of testing.
It’s about time for the biotech revolution to hit the vaccine industry in a big way. It has already upended the treatment of rheumatoid arthritis and a few cancers, and is starting to do the same for multiple sclerosis and other conditions, including psoriasis. Now let us see what happens in this once-quiet corner of the biopharmaceutical market. As Hilleman’s career demonstrates, when industrial science is harnessed to the profit motive, enormous advances in human welfare are possible.
John E. Calfee is a resident fellow at the American Enterprise Institute, which is about to publish a new book on recent developments in the vaccine market, U.S. Vaccine Markets: Overview, Case Studies, and Comparisons with Pharmaceuticals and Other Biologics, by economists Ernest Berndt, Rena N. Denoncourt, and Anjli C. Warner of MIT. It provides the best summary yet published of vaccine development in the past two decades, along with a preview of what is on the way.
Maurice Hilleman's remarkable period of industrial scientific research yielded the most cost-effective medicines ever made.
The American, Friday, January 23, 2009
The pharmaceutical industry has been under attack for longer than most people realize. In the 1950s and 1960s, when for the first time in history we had quite a few drugs that worked very well—including many antibiotics, the first miracle drugs—there was the full panoply of congressional hearings, outraged newspaper editorials, and dour experts who described an industry in which prices were too high, marketing too important, and innovation in decline amid a flood of “me-too” drugs barely distinguishable from the original innovative brands. But I doubt that the atmosphere then was as hostile as it has been in the past five years or so. A flood of books, including some by authors with academic credentials, have re-circulated many of the same arguments (albeit with more emphasis on safety). The more scholarly works include Merrill Goozner’s The $800 Million Pill: The Truth Behind the Cost of New Drugs; Jerome Kassirer’s On The Take: How Medicine’s Complicity with Big Business Can Endanger Your Health; and Jerry Avorn’s Powerful Medicines: The Benefits, Risks, and Costs of Prescription Drugs. Others have a more muckrakian tone, beginning with the muchquoted The Truth About the Drug Companies by former New England Journal of Medicine editor Marcia Angell, and continuing on to many others including Ray Moynihan and Alan Cassels’s Selling Sickness: How the World’s Biggest Pharmaceutical Companies Are Turning Us All Into Patients; Howard Brody’s Hooked: Ethics, the Medical Professions, and the Pharmaceutical Industry; Alison Bass’s Side-effects: A Prosecutor, a Whistleblower, and a Bestselling Antidepressant on Trial (about the drug Paxil); and Philip R. Lee’s Pills, Profits, and Politics.
What’s with R&D?
To my mind, the most serious of these indictments focus on industry research. No doubt, the stakes are high for the industry. If drugs are truly innovative life-modifiers or life-savers, the argument over prices and spending tends to be marginalized. But if there hasn’t been a lot of innovation and if the innovation we do get comes mainly from the taxpayer-supported National Institutes of Health and other nonprofit organizations, the politics of drugs becomes difficult for the industry to handle.
We need to look ahead, and when we do it’s hard not to get excited. The entire field of immunology has taken off along with so much else in this age of biotechnology.
I have had occasion to write about innovation and its sources in the pages of The American. As I explained in “The Golden Age of Medical Innovation” (March/April 2007), the critics have paid too much attention to the annual count of new drug approvals by the FDA and too little attention to two crucial developments. One is the increasing importance of research that occurs after a drug is approved. Newer drugs, especially so-called biotech drugs including monoclonal antibodies, involve complex biological processes that are themselves subject to ever more sophisticated research on everything from DNA to drug interactions. Basic research and clinical trials have been running side by side, often with drugs themselves serving as research tools to find out what happens when a particular receptor is suppressed (such as the epidermal growth factor receptor, or EGFR, to cite a target that is important for cancer and much more). Sometimes, scientists harvest a series of improved treatments using existing drugs without actually getting a new one approved. Rituxan, originally approved for certain types of non-Hodgkin’s lymphoma cancer, is now approved for other types of cancer along with multiple sclerosis, rheumatoid arthritis, and Crohn’s Disease, and is being researched to treat lupus, idiopathic thrombocytopenia purpura, and chronic lymphocytic leukemia.
The other phenomenon that has been largely lost in popular discussion of drug R&D and its discontents is the extraordinary role played by “follow-on” drugs (a much more accurate term than “me-too”). The story with statin cholesterol-reducing drugs, where a decade or more of research on follow-ons revolutionized the prevention and treatment of coronary heart disease, is a familiar one. Similar stories are playing out now, but much faster. Competition among rapidly developed drugs to attack a promising target (such as tumor necrosis factor inhibitors for rheumatoid arthritis) can bring about revolutions in treatments as doctors and patients dance through one drug after another while dealing with the unique mix of side effects and drug resistance that plague each individual patient.
The interested reader can get a flavor of this blend of basic science and practical drug development by reading the fascinating discussion by Jan Vermorken, et al., of evolving treatments for head-and-neck cancer in the September 11, 2008 issue of The New England Journal of Medicine. Much of this story involves Erbitux, the monoclonal antibody that put Martha Stewart in jail after a disappointing FDA decision put the owners of ImClone, the developer, into a panic. The many years of up-and-down research and results on that drug, costing hundreds of millions or even billions of dollars back when no revenues were in sight, is probably as good an example as any of the vagaries and payoffs from high-risk drug research informed by ongoing work in pure science.
Hilleman set out to develop vaccines for the chief life-threatening viral and bacterial infectious diseases of childhood. Amazingly, he came close to clearing the table.
In another recent article in The American, I addressed the thorny question of the role of publicly supported basic research in drug development (“The Indispensable Industry,” May/June 2008). To put it in the simplest terms, a close look reveals a striking pattern that seems to be little noticed by the critics of private drug development: no matter how far-flung the curiosity-driven NIH-supported research is, the only results that seem to get translated into useable drugs are the ones that are grabbed by drug firms and put through the difficult research necessary to produce appropriate quantities of promising substances to run through years of arduous clinical trials. Take away the private sector, and basic research nearly always languishes with little practical effect, as is unceasingly and tragically illustrated by the dearth of new drugs and vaccines for malaria and tuberculosis. Sometimes, the drug firms themselves do perform crucial basic research, as in the case of Genentech’s Avastin for cancer and Lucentis to prevent blindness. These were the fruits of the firm’s own top-tier basic research forces.
Just Two Words
But there is something else in drug development that hardly gets talked about: the sheer energy and determination that you find in the private sector. Combine that with substantial financial resources and you get what John Maynard Keynes called “animal spirits,” a singular motivating force in creative capitalism. When this force attacks big problems, the results can be both spectacular and unexpected, sometimes with fabulous benefits for mankind. It so happens that animal spirits are very much involved in one of the great blessings of modern medicine: an armamentarium of vaccines, mainly given to children, which have been saving lives by the millions at astonishingly low costs. “The most cost-effective treatments ever created by mankind” is a typical summary of the value of vaccines for mumps, measles, rubella (German measles), and half a dozen or so others, including those for diphtheria, whooping cough, hepatitis A, and hepatitis B.
Where, you might ask, did all those life-saving vaccines come from? Amazingly, for half or more of them, the answer can be summarized in two words: Hilleman and Merck. You’ve likely never heard of Maurice Hilleman even though he probably saved more lives than any other scientist in the 20th century. For most of his career, Hilleman was a biologist at Merck, where he developed one vaccine after another, stretching through four extraordinary decades of productive work. Along the way, he pioneered new ways to create, test, and manufacture vaccines, and played a crucial role in the creation of an entirely different class of drugs known as interferons.
We know a lot about Hilleman’s career thanks to a wonderful book published last year by Paul Offit: Vaccinated: One Man’s Quest to Defeat the World’s Deadliest Diseases. Offit was the perfect vehicle for getting this story the attention it deserves. A prominent academic immunologist at Children’s Hospital of Philadelphia and the University of Pennsylvania Medical School, Offit is also a vaccine developer. He is a co-inventor and co-developer of Rotateq, the first fully successful vaccine for rotavirus, a cause of deadly dehydration that kills thousands of children annually in poor nations.
Offit is attuned to public policy. He has been a member of the Advisory Committee on Immunization Practices, whose child vaccination recommendations are gospel for physicians and payers. His previous book, The Cutter Incident, was an insightful historical account of how litigation over an early miracle vaccine—for polio—helped shape (very much for the worse) the entire litigation environment of vaccines and pharmaceuticals. Offit’s academic journal articles and newspaper op-eds on the consequences of unscientific attacks on vaccine safety are required reading for anyone interested in this contentious topic.
Offit’s Vaccinated is informed by 11 interviews with the 85-year-old Hilleman in 2005, during the last months of his life before he succumbed to cancer. Judging by dozens of meaty quotes, Offit is a probing interviewer, capturing a great scientist’s personality and working style to a degree that cannot be matched without personal experience with the subject, and is seldom matched even then.
His basic strategy was simple: solve whatever problems had to be solved in order to reach the goal, which was usually a new vaccine.
Who was Maurice Hilleman and what did he do? Born to a German-American family in 1919 and raised on a Montana farm near his birthplace, Hilleman was a brilliant student on scholarship at Montana State University. After graduation he moved to the Midwest intellectual mecca at the University of Chicago, where in 1944 he finished a Ph.D. in microbiology based on groundbreaking research on the chlamydia bacterium (previously thought to be a virus). To the dismay of his new intellectual peers, Hilleman left academia to work for a pharmaceutical firm, E.R. Squibb, where he achieved advances in flu vaccine development and manufacturing. In 1948, he moved to the Walter Reed military hospital in Washington. His work there culminated in an extraordinary episode in 1957 when he correctly forecast the arrival of a new Asian Flu to which almost no one was immune. He led the development and manufacturing (by private firms) of a vaccine in time to save hundreds of thousands of lives and perhaps many more.
In 1957 Hilleman returned to the private sector, this time at Merck, where he was head of virus and cell biology in Merck’s relatively new vaccine enterprise. Hilleman apparently set out to develop vaccines for the chief life-threatening viral and bacterial infectious diseases of childhood. Amazingly, he came close to clearing the table. First was the mumps, with the approval in 1967 of the “Jeryl Lynn” vaccine based on a mumps virus taken from his daughter of that same name. A measles vaccine arrived the next year. In 1969, we got a vaccine for rubella. Hilleman soon concocted the immensely useful idea of combining these three vaccines into a single shot. Approved in 1971, this proved a blessing to untold millions of small children and their mothers. The 1981 vaccine for hepatitis B (not really a childhood disease, of course) was a scientific and technological tour de force essentially from start to finish. In 1995 came the hepatitis A vaccine. For chicken pox, pneumococcus, and Hib (haemophilus influenzae type b), Hilleman transformed relatively untested vaccines into the mass-produced tools with which we are now familiar. It is hard to imagine the cumulative benefits of this research. (Hilleman also developed a vaccine for a destructive form of chicken cancer, rescuing a substantial part of the poultry industry.) Hilleman’s work sometimes ranged beyond vaccines. Starting in the late 1950s, he figured out how to mass-produce a newly discovered virus-killing substance in chickens called interferon. He soon detailed interferon’s basic physical, chemical, and biological properties, discovering that it was produced in many animals, including humans, and that it could impede or kill many viruses, such as those involved in cancer. He correctly predicted that interferon could be used to treat chronic infections and cancer. Today, it is used against hepatitis B, hepatitis C, and several types of cancer.
Problem Solving for Fun and Profit
This is more than the history of medicine, science, and technology. It is also business history, a classic story of problem solving for fun and profit and humanity. How was Hilleman able to accomplish so much in basic research, drug development, and manufacturing technology, often working essentially from scratch because vaccine development was still in its infancy when he set to work? The answer lies in Hilleman’s decision to work at Merck instead of pursuing a top-tier academic career. He realized that to attack the most pressing illnesses susceptible to immune-based prevention, he would have to marshal massive forces even after solving the purely intellectual puzzles. Merck had supported that kind of work before in Max Tishler’s research on the vitamin B complex. Offit tells us relatively little about internal Merck affairs, but it is clear that Hilleman enjoyed an extraordinary degree of autonomy combined with generous funding increases for low-profit products (now there’s a combination we’d all like to have!).
The Nobel Prize committee was not willing to award a prize to an industry scientist. It is hard not to see this as a miscarriage of scientific justice.
Hilleman sometimes exercised an iron fist over such normally mundane matters as manufacturing, where any deviation from his recipe could result in undetectable dangers. Indeed, Hilleman was apparently a bit of a tyrant, demanding almost as much of his staff as of himself, facilitated by his mastery of the art of profanity. Nonetheless, he retained the respect and often the devotion of his hard-driven staff along with near-legendary status among his academic peers.
In 1984, when Hilleman reached Merck’s mandatory retirement age of 65, he refused to retire and Merck kept him on. One result was the hepatitis A virus vaccine that arrived in 1995, along with a steady stream of academic work of all sorts until shortly before his death in 2005. Hilleman never jettisoned the problem solving method of a successful Montana farmer. Like Orville and Wilbur Wright when they built the first successful heavier-than-air flying machine, Hilleman’s basic strategy was simple: solve whatever problems had to be solved in order to reach the goal, which was usually a new vaccine. The list of problems included daunting scientific puzzles, excruciating judgments about whether dangerous side effects had been defeated, and the vagaries of regulation (much easier before the FDA got into the action).
As the 80-plus-year-old Hilleman approached death, Offit and other academic scientists lobbied the Nobel committee to award Hilleman the Nobel Prize for Medicine, based partly on his vaccine work and partly on his contributions to the basic science of interferons. The committee made clear that it was not going to award the prize to an industry scientist. (Offit has assured me that the situation was even more hopeless than he describes in his book.) It is hard not to see this as a miscarriage of scientific justice. Perhaps Hilleman would have done better if his volcanic personality had not included a surprising element of self-effacement. None of the vaccines or the crucial agents or processes he created were named after himself. At one point, he even called the developer of a new rubella vaccine to say that he thought it should replace his own because it was better. Hilleman’s absence from the academic and public spotlight was quite extraordinary. In one of the most striking of the dozens of anecdotes told by Offit, Hilleman’s death was announced to a meeting of prominent public health officials, epidemiologists, and clinicians gathered to celebrate the 50th anniversary of the Salk polio vaccine. Not one of them recognized Hilleman’s name!
Thanks to Offit and his book, the light of Hilleman and of the extraordinary research done at Merck will shine for many, many years. What about vaccine research itself? There have been formidable obstacles. One was the liability system, which in the 1980s nearly killed off the childhood vaccine market before Congress removed childhood vaccines from that system altogether. Another, more persistent problem has been low reimbursement rates, especially by government, for traditional childhood vaccines (including most of Hilleman’s crop). Low rates can discourage new research and production and cause shortages. The situation was sufficiently worrisome to trigger a 2003 report by the federally sponsored Institute of Medicine entitled “Financing Vaccines in the 21st Century: Assuring Access and Availability.” Reimbursement seems to have improved recently. Better yet, newer vaccines are sufficiently protected by patents that prices are set through ordinary market forces rather than government fiat. Merck and its competitors, such as GlaxoSmithKline and Sanofi-Aventis, along with smaller firms, have developed a series of important new vaccines—notable among them the pneumococcal vaccine, a vaccine for the human papillomavirus (which causes cervical cancer), and two rotavirus vaccines (including the one co-invented by Offit). Traditional vaccine research is now flourishing but will probably never again be dominated by a single person’s laboratory like the one Hilleman ran in his prime.
But we need to look ahead, and when we do it’s hard not to get excited. The entire field of immunology—roughly speaking, the harnessing of the human immune system to fight disease—has taken off along with so much else in this age of biotechnology. We are discovering faster and more efficient ways to manufacture traditional vaccines (especially for the flu), better methods for identifying newly arrived infectious agents such as avian flu (the dreaded “bird flu” that could cause a pandemic on the scale of the 1918 outbreak that killed millions worldwide), and new techniques for developing vaccines once their targets have been identified.
And there is the extraordinary prospect of therapeutic vaccines, i.e., vaccines that harness the immune system to attack illnesses already present in the body rather than just preparing the body to reject infections it has not yet encountered. None has been approved, but a prostate cancer vaccine from the biotech firm Dendreon received a favorable rating from an FDA advisory committee and may yet gain approval from the FDA (despite the agency’s reluctance to approve highly innovative drugs in an era when it is attacked for paying too much attention to new benefits and too little to safety). Alzheimer’s disease vaccines have achieved striking results against the beta-amyloid plaques typically found in the brains of Alzheimer’s patients. Other therapeutic vaccines are in various stages of testing.
It’s about time for the biotech revolution to hit the vaccine industry in a big way. It has already upended the treatment of rheumatoid arthritis and a few cancers, and is starting to do the same for multiple sclerosis, psoriasis, and other conditions, including some rare diseases. Now let us see what happens in this once-quiet corner of the biopharmaceutical market. As Hilleman’s career demonstrates, when industrial science is harnessed to the profit motive, enormous advances in human welfare are possible.
John E. Calfee is a resident fellow at the American Enterprise Institute, which is about to publish a new book on recent developments in the vaccine market, U.S. Vaccine Markets: Overview, Case Studies, and Comparisons with Pharmaceuticals and Other Biologics, by economists Ernest Berndt, Rena N. Denoncourt, and Anjli C. Warner of MIT. It provides the best summary yet published of vaccine development in the past two decades, along with a preview of what is on the way.
Was There Ever a Default on U.S. Treasury Debt?
Was There Ever a Default on U.S. Treasury Debt? By Alex J. Pollock
AEI, Friday, January 23, 2009
As the government continues to bail out financial institutions and finance these rescues with government debt, one might wonder whether a default on Treasury debt is imaginable. It is--in 1933, the United States intentionally defaulted on its Treasury debt, an action that was supported by both Congress and the Supreme Court.
As the bailouts in the current bust inexorably mount, financed by rapidly increasing U.S. government debt, one might wonder whether a default on Treasury debt is imaginable. In the course of history, did the U.S. ever default on its debt?
Well, yes: The United States quite clearly and overtly defaulted on its debt as an expediency in 1933, the first year of Franklin Roosevelt's presidency. This was an intentional repudiation of its obligations, supported by a resolution of Congress and later upheld by the Supreme Court.
Granted, the circumstances were somewhat different in those days, since government finance still had a real tie to gold. In particular, U.S. bonds, including those issued to finance the American participation in the First World War, provided the holders of the bonds with an unambiguous promise that the U.S. government would give them the option to be repaid in gold coin.
Nobody doubted the clarity of this "gold clause" provision or the intent of both the debtor, the U.S. Treasury, and the creditors, the bond buyers, that the bondholders be protected against the depreciation of paper currency by the government.
Unfortunately for the bondholders, when President Roosevelt and the Congress decided that it was a good idea to depreciate the currency in the economic crisis of the time, they also decided not to honor their unambiguous obligation to pay in gold. On June 5, 1933, Congress passed a "Joint Resolution to Assure Uniform Value to the Coins and Currencies of the United States," of which two key points were as follows:
"Provisions of obligations which purport to give the obligee a right to require payment in gold obstruct the power of the Congress."
"Every provision contained in or made with respect to any obligation which purports to give the obligee a right to require payment in gold is declared to be against public policy."
"Purport"? "Against public policy"? Interesting rhetoric. In plain terms, the Congress was repudiating the government's obligations. So the bondholders got only depreciated paper money. The resulting lawsuits ended up in the Supreme Court, which upheld the ability of the government to refuse to pay in gold by a vote of 5-4.
The Supreme Court gold clause opinions of 1935 make instructive reading. The majority opinion, written by Chief Justice Hughes, includes these thoughts:
"The question before the Court is one of power, not policy."
"Contracts, however express, cannot fetter the constitutional authority of the Congress."
Justice McReynolds, writing on behalf of the four dissenting justices, left no doubt about their view:
"The enactments here challenged will bring about the confiscation of property rights and repudiation of national obligations."
"The holder of one of these certificates was owner of an express promise by the United States to deliver gold coin of the weight and fineness established."
"Congress really has inaugurated a plan primarily designed to destroy private obligations, repudiate national debts, and drive into the Treasury all gold within the country in exchange for inconvertible promises to pay, of much less value."
"Loss of reputation for honorable dealing will bring us unending humiliation."
The clearest summation of the judicial outcome was in the concurring opinion of Justice Stone, as a member of the majority:
"While the government's refusal to make the stipulated payment is a measure taken in the exercise of that power, this does not disguise the fact that its action is to that extent a repudiation."
"As much as I deplore this refusal to fulfill the solemn promise of bonds of the United States, I cannot escape the conclusion, announced for the Court, that the government, through exercise of its sovereign power, has rendered itself immune from liability."
So five of the nine justices explicitly stated that the obligations of the United States had been repudiated. The candid conclusion of this highly interesting chapter of our national financial history is that, under sufficient threat, crisis, and pressure, a clear default on Treasury bonds did occur.
About 250 years ago, in a celebrated essay, "Of Public Credit," David Hume wrote:
"Contracting debt will almost infallibly be abused in every government. It would scarcely be more imprudent to give a prodigal son a credit in every banker's shop in London, than to empower a statesman to draw bills upon posterity."
Hume would have looked down from philosophical Valhalla in 1933-35 and seen his views confirmed. What, one wonders, would he be thinking now?
Alex J. Pollock is a resident fellow at AEI.
Geithner's China Bash
Geithner's China Bash. WSJ Editorial
Jan 24, 2009
Timothy Geithner's tax oversights drew most of the media attention at his confirmation hearing, but the biggest news is the Treasury Secretary-designate's testimony Thursday that he'll ratchet up one of the Bush Administration's worst habits: China currency bashing.
In a written submission to the Senate Finance Committee, Mr. Geithner said the Obama Administration "believes that China is manipulating its currency." He says he wants Treasury to make "the fact-based case that market exchange rates are a central ingredient to healthy and sustained growth." The dollar promptly fell and gold jumped $40 on the news.
We're not sure what Mr. Geithner means by "market exchange rates," given that the supply of any modern currency is set by a monopoly known as the central bank. When Mr. Geithner says China is "manipulating" its currency, what investors around the world hear is that he really wants Beijing to restrain the number of yuan in circulation and increase its value vis-a-vis the dollar. That's a call for a dollar devaluation to help U.S. exporters.
This would seem to be an especially crazy time to undermine the dollar, given that the Treasury will have to issue some $2 trillion to $3 trillion in new dollar debt in the next couple of years. A stronger yuan would also contribute to Chinese deflation and slower growth, which would only mean a deeper world recession. Even the Bush Treasury never formally declared China to be a currency "manipulator" in its periodic reports to Congress. If the Obama Treasury is now going to take that step, hold on to those gold bars. We're in for an even scarier ride than the Fun Slide of the last few months.
Shaping up America's nuclear deterrent
Atomic Bombshells. WSJ Editorial
Shaping up America's nuclear deterrent.
The Secretary of the Air Force and the Air Force Chief of Staff lost their jobs last year after two incidents involving the misuse of nuclear materials. In one, nuclear-armed cruise missiles were loaded on a B-52 bomber and flown across the country without anyone noticing for a day and a half. In the other, nose cones fitted with nuclear triggers were erroneously shipped to Taiwan.
Neither of those mishaps ended badly, and in retrospect the nation can say thanks for the wake-up call. The blunders focused attention on a problem that might otherwise have gone undetected until catastrophe struck: the neglect of U.S. nuclear forces and -- even more dangerous -- a lack of understanding at the Pentagon about nuclear deterrence.
These are the key findings of the Pentagon's task force on nuclear weapons management, which recently released its final report. The task force was appointed by Defense Secretary Bob Gates in the wake of the Air Force scandals and was led by former Defense Secretary James Schlesinger. Its initial report, last September, examined the Air Force's errors in its stewardship of nuclear weapons and made several recommendations. These mostly have been implemented, and the latest report commends the Air Force for its swift action.
The task force has now cast its eye more broadly and concludes that the "lack of interest and attention have been widespread" throughout the Pentagon's leadership. The exception is the Navy, which is responsible for submarine-launched nuclear weapons. Even there, though, not all is well. While the report finds the Navy's handling of nukes acceptable, it says there is evidence of some "fraying around the edges."
The Schlesinger panel makes a series of recommendations aimed at improving oversight and policy. They include establishing a position of assistant secretary of defense for deterrence, reducing the non-nuclear responsibilities of U.S. Strategic Command, and beefing up inspections.
But the task force's most worrisome finding will require a new mindset. The panel finds a "distressing degree" of inattention to the role of nuclear deterrence among senior civilian and military leaders, especially regarding its psychological and political value. It proposes educational measures to "enhance understanding" of why we have a nuclear deterrent -- which, put simply, is to avoid the use of nuclear weapons. If adversaries believe the U.S. deterrent is weak, they might be tempted to use nukes against us or threaten to do so.
But there's a proliferation point too. The U.S. provides a nuclear umbrella for 30-plus countries. If our allies lose confidence, Mr. Schlesinger said at a press conference announcing the report, "five or six of those nations are quite capable of beginning to produce nuclear weapons on their own." This is precisely the opposite of what the nuclear-free-world types like to argue: If only the U.S. would get rid of its nukes, other countries would follow suit.
It's now up to the Obama Administration to move on the task force's findings. But adopting the management and personnel changes the report recommends won't be enough. "Strengthening the credibility of our nuclear deterrent should begin at the White House," the report states. If the new President makes clear his commitment to the U.S. nuclear deterrent, that attitude will echo down the chain of command.
A Primer on Whether Stimuli Stimulate
Does Stimulus Stimulate? By Bruce Bartlett
Forbes, Jan 23, 2009, 12:01 AM ET
In a few weeks, Congress will likely enact the largest fiscal stimulus legislation in history. The whole idea of such a stimulus is much more controversial among A-list economists than I would have expected, given the depth and breadth of the economic malaise. Although the debate is rather technical, it's important to try to understand it because much is at stake.
Eighty years ago, the conventional view among economists was that government had nothing to do with business cycles--it neither caused them nor was there anything it could do about them. They were like the weather; you just coped the best you could.
Eventually, economists came to understand that vast numbers of individuals and businesses throughout the economy don't make exactly the same mistakes simultaneously unless something has changed the rules of the game. Government isn't always responsible--bubbles can occur on their own, as they have over the centuries--but systemic errors usually result from government policy.
The Federal Reserve, our nation's central bank, is the institution mainly responsible for altering the terms of trade. That is because it has the power to change the value of the currency, which is the intermediary in every single economic transaction, and also to alter the terms of every intertemporal transaction--those between the present and future, such as saving today to consume tomorrow--by raising or lowering the interest rate.
No one today believes that the Great Depression just happened or dragged on as long as it did because the private sector kept making mistake after mistake after mistake. It only made them and continued to do so because government interfered with the normal operations of the market and prevented readjustment from taking place.
The Great Depression resulted from a confluence of governmental errors--the Fed was too easy for too long in the 1920s, tightened too much in 1928-29 and then failed to fix its mistake, thus bringing on a general deflation that was very difficult to arrest once downward momentum had set in. Herbert Hoover compounded the problem by signing into law the Smoot-Hawley Tariff and sharply raising taxes in 1932.
Unfortunately, Franklin D. Roosevelt misunderstood the nature of the economy's problem and tried to fix prices to keep them from falling--thus preventing the very readjustment that would have brought about recovery. (See this paper by UCLA economists Harold Cole and Lee Ohanian.) He doesn't seem to have ever understood the critical role of Fed policy and mistakenly thought that arbitrarily raising the price of gold would make money easier.
Then, in 1937, just as the economy was starting to build some upward momentum, Roosevelt decided to raise taxes and cut spending, and the Fed suddenly concluded that inflation, rather than deflation, was the main problem and tightened monetary policy. (Note: According to the National Bureau of Economic Research, the Great Depression was basically two severe recessions--one from August 1929 to March 1933, and another from May 1937 to June 1938--not a continuous downturn.)
The result was an economic setback that didn't really end until both monetary and fiscal policy became expansive with the onset of World War II. At that point, no one worried any more about budget deficits, and the Fed pegged interest rates to ensure that they stayed low, increasing the money supply as necessary to achieve this goal.
It was then and only then that the Great Depression truly ended. As a consequence, economists concluded that an expansive monetary and fiscal policy, which had been advocated by economist John Maynard Keynes throughout the 1930s, was the key to getting out of a depression.
Keynes was right, but many of his followers weren't. They thought that budget deficits would stimulate growth under all circumstances, not just those of a deflationary depression. When this medicine was applied inappropriately, as it was in the 1960s and 1970s, the result was inflation.
Economists then concluded that it was a mistake to pursue countercyclical fiscal policy, and "fine-tuning" became a derogatory term. Even those who continued to believe it was theoretically possible to counter recessions with public works or government jobs programs were eventually forced to concede that it was impossibly difficult to make them work in a timely manner.
In the 1980s and 1990s, economists came around to the view that only monetary policy could act quickly enough to reverse or moderate a recession. But they never really came to grips with the Fed's responsibility for causing recessions in the first place. It always tightened a little too much when inflation was the problem and eased too much when slow growth was the problem.
For a time, a cult grew up around Fed Chairman Alan Greenspan. Many who should have known better convinced themselves that the "Maestro," as journalist Bob Woodward called him, would fix everything. Investors began seriously talking about a "Greenspan put"--the idea that the Fed would always protect them from a severe decline in the market. Nitwits wrote and bought books predicting astronomical levels for the stock market because Greenspan had permanently reduced the level of risk.
As we have seen, the Fed could not prevent the greatest financial downturn the world has seen since 1929. This has revived the idea that fiscal policy must be the engine that pulls us out.
Somewhat surprisingly, there has been rather heated opposition to the very principle of fiscal stimulus--a return to pre-Keynesian economics. And among those expressing dissent are some of the leading lights of economic theory over the last 40 years.
To be sure, the idea that fiscal policy was impotent never entirely disappeared. In 1969, economist Milton Friedman argued strenuously that only monetary policy really matters and that fiscal policy has no meaningful effect. Said Friedman, "In my opinion, the state of the budget by itself has no significant effect on the course of nominal income, on inflation, on deflation or on cyclical fluctuations."
Yet at the same time, monetarists argued that monetary policy had no lasting effect on the same economic variables. In the long run, they said, monetary policy could only affect nominal incomes, not real incomes. Real incomes were a function of things like growth of the labor force and productivity per work hour.
This led to a renewed emphasis on fiscal policy, but on the tax side rather than the spending side on which Keynesians tend to focus. Supply-siders argued that certain changes in tax policy--lowering marginal tax rates, reducing taxes on entrepreneurial income--were especially powerful economically. Keynesians think that just putting dollars in people's pockets in order to stimulate consumption is the key to growth.
We have now had several tests of the Keynesian idea--most recently with last year's $300 tax rebate, which was supposed to prevent a recession. According to a new paper by University of Michigan economists Matthew Shapiro and Joel Slemrod, only a third of the money was spent, thus providing very little "bang for the buck."
The failure of rebates has shifted the focus to public works and other direct spending measures as a means of stimulating aggregate spending. A study by Obama administration economists Christina Romer and Jared Bernstein predicts that the stimulus plan being debated in Congress will raise the gross domestic product by $1.57 for every $1 spent.
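The disagreement is easier to follow once the arithmetic is laid out. The sketch below is purely illustrative: the only figures taken from the text are the "one-third spent" finding and the 1.57 multiplier; the $800 billion package size, and the equation of the spent share with the marginal propensity to consume, are my simplifying assumptions.

```python
# Illustrative multiplier arithmetic. Only the one-third-spent figure
# and the 1.57 multiplier come from the text; everything else is an
# assumption made for the sake of the example.

def rebate_multiplier(mpc: float) -> float:
    """Textbook tax-cut multiplier: mpc / (1 - mpc)."""
    return mpc / (1 - mpc)

MPC = 1 / 3           # treating "a third of the rebate spent" as the MPC
SPENDING_MULT = 1.57  # Romer-Bernstein estimate quoted above
PACKAGE = 800e9       # assumed package size, dollars (illustration only)

print(f"Rebate multiplier at MPC={MPC:.2f}: {rebate_multiplier(MPC):.2f}")
# -> 0.50: a $100B rebate raises GDP by only about $50B

print(f"GDP boost from ${PACKAGE / 1e9:.0f}B of spending at 1.57: "
      f"${SPENDING_MULT * PACKAGE / 1e9:.0f}B")
# -> about $1,256B -- if, as the critics dispute, nothing is crowded out
```

The gap between a 0.50 rebate multiplier and a 1.57 spending multiplier is precisely the territory the critics and defenders are fighting over.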
Such a multiplier effect has been heavily criticized by a number of top economists, including John Taylor of Stanford, Gary Becker and Eugene Fama of the University of Chicago and Greg Mankiw and Robert Barro of Harvard. The gist of their argument is that the government cannot expand the economy through deficit spending because it has to borrow the funds in the first place, thus displacing other economic activities. In the end, the government has simply moved around economic activity without increasing it in the aggregate.
Other reputable economists have criticized this position as being no different from the pre-Keynesian view that helped make the Great Depression so long and deep. Paul Krugman of Princeton, Brad DeLong of the University of California at Berkeley and Mark Thoma of the University of Oregon have been outspoken in their belief that theory and experience show that government spending can expand the economy under conditions such as we are experiencing today.
I think the critics of an activist fiscal policy are forgetting the essential role of monetary policy as it relates to fiscal policy. As Keynes himself made very clear, the whole point of fiscal stimulus is to mobilize monetary policy and inject liquidity into the economy. This is necessary when nominal interest rates get very low, as they are now, because Fed policy becomes impotent. Keynes called this a liquidity trap, and I think there is strong evidence that we are in one right now.
The problem is that fiscal stimulus needs to be injected right now to counter the liquidity trap. If it were, I think we might well get a very high multiplier effect this year. But if much of the stimulus doesn't come online until next year, when we are likely to be past the worst of the slowdown, then crowding out will greatly diminish its effectiveness, just as the critics argue. According to the Congressional Budget Office, only a fraction of the proposed infrastructure spending can be spent before October of next year; the bulk would come long after.
Thus the argument really boils down to a question of timing. In the short run, the case for stimulus is overwhelming. But in the longer run, we can't enrich ourselves by borrowing and printing money. That just causes inflation.
The trick is to front-load the stimulus as much as possible while putting in place policies that will tighten both fiscal and monetary policy next year. As terrible as our economic crisis is right now, we don't want to repeat the errors of the past and set off a new round of stagflation.
For this reason, I think there is a better case for stimulating the economy through tax policy than has been made. Congress can change incentives instantly by, for example, declaring that new investments in machinery and equipment made after today qualify for a 10% investment tax credit, with the credit available only for investments largely completed this year. Businesses would start placing orders tomorrow, as the sketch below illustrates. By contrast, it will take many months before spending on public works begins to flow through the economy, and it is very hard to stop that spending when the economy turns around.
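Here is a minimal sketch of how such a credit changes incentives (the firm and the dollar amounts are hypothetical; only the 10% rate and the completed-this-year condition come from the text):

```python
# Hypothetical illustration of Bartlett's proposed 10% investment
# tax credit (ITC). The firm and amounts are made up.

ITC_RATE = 0.10  # credit rate proposed in the text

def after_credit_cost(investment: float, completed_this_year: bool) -> float:
    """Net cost of new machinery and equipment after the ITC.

    The credit applies only to investments largely completed this
    year, which is what front-loads the incentive.
    """
    credit = ITC_RATE * investment if completed_this_year else 0.0
    return investment - credit

machine = 2_000_000.0  # hypothetical equipment purchase
print(after_credit_cost(machine, completed_this_year=True))   # 1800000.0
print(after_credit_cost(machine, completed_this_year=False))  # 2000000.0
# A firm deciding today sees an immediate $200,000 reason to order
# now rather than wait -- the instant incentive Bartlett describes.
```

The point of the eligibility cutoff is timing: the credit rewards orders placed immediately, not projects that trickle out over several years.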
Stimulus based on private investment also has the added virtue of establishing a foundation for future growth, whereas consumption spending does not. As economist Hal Varian of the University of California at Berkeley recently put it, "Private investment is what makes possible future increases in production and consumption. Investment tax credits or other subsidies for private sector investment are not as politically appealing as tax cuts for consumers or increases in government expenditure. But if private investment doesn't increase, where will the extra consumption come from in the future?"