Monday, March 30, 2009
The Progressive: We should stop treating our schools as businesses
We should stop treating our schools as businesses.
The Progressive, March 24, 2009
Since the early 20th century, prominent business leaders have acted on the belief that, because they are good at making money, they are the most qualified people to decide how best to educate the country's young.
Entranced by the power and efficiency of American industry, many educational leaders have looked to these businessmen for leadership and for models of operation. They have tried to govern school systems as if they were corporations, organize schools as if they were something akin to factories and orient education toward testing and tracking students toward presumed “real world” destinies.
Today’s mantra is to allow the much-ballyhooed magic of the market to solve educational problems. Thus the emphasis on consumer choice among schools through vouchers or charters or plans to pay teachers based on test-score improvements.
There are many flaws inherent in imagining that schools will work well once they adopt factory or free-market models. Perhaps most fundamental is the presumption that schools work best when they emulate business.
But schools are not businesses.
When they flourish, they are living communities defined by powerful and caring collaboration.
Students are not things to be produced. They are human beings who are learning and growing in ways that are too complex for any standardized scores to truly measure.
Nor are teachers mere robots that drill students in how to take a test. The most talented and dedicated teacher is better nourished by a supportive work culture than by narrow appeals to individual self-interest, which pit teacher against teacher.
The purposes of schooling should not be degraded into privatized preparation toward the fattest paycheck.
Clearly, schools should prepare students to earn decent livelihoods. But just as importantly, they should prepare students to look toward — and even demand — jobs that are a major source of fulfillment and creative expression.
Schools should go far beyond preparing students for work. There are many non-market (perhaps even anti-market) lessons that schools impart: They inculcate an appreciation of the arts, establish healthy habits of exercise, teach cooperation, promote citizenship and show our children how to live together peacefully.
If schools do these tasks well, students when they become adults are much more likely to participate in socially positive ways, such as creating art and music, preventing domestic violence, working for racial equality, promoting clean energy and opposing war.
We have to remember that education is a humane and human process with social values beyond the bottom line. Business leaders have no expertise in this quest, and business models do not apply.
For that matter, now that casino capitalism has imploded, isn’t it time to stop looking to the corporate elite for advice on how to run the schools? These “experts”— the bankers and corporate CEOs — couldn’t even manage the one thing they are supposed to be good at: running their own businesses.
Educators should shed their subordinate status and sense of inferiority. Schools work best when teachers — in dialogue with parents and other citizens — design the educational experience, not corporate officials.
Wayne Au, Bill Bigelow and David Levine are editors of Rethinking Schools (www.rethinkingschools.org), a quarterly magazine based in Milwaukee.
WaPo: Immigrants from the storm-ravaged Haiti should be allowed to stay in the US
Immigrants from the storm-ravaged island should be allowed to stay in the United States.
Monday, March 30, 2009; A16
HAITI WAS already an island of unimaginable suffering, a country ravaged by war and roving gangs where four out of five residents lived in extreme poverty. Then, in less than a month last year, four vicious storms lashed it, killing up to 800 people, leaving as many as 1 million homeless and inflicting at least $1 billion in damage -- 15 percent of the country's gross domestic product. The State Department cautions visitors that there are no "safe areas" in Haiti, and that "kidnapping, death threats, murders, drug-related shootouts, armed robberies, break-ins and carjackings are common." Yet, it is U.S. policy to deport the estimated 30,000 Haitians in this country back to this hotbed of violence and squalor. The United States grants temporary protected status (TPS) to immigrants from countries with extreme economic or political conditions; Haitian immigrants more than qualify.
Haitians in the United States are one of the few sources of stability for their home country, sending back remittances that total an estimated one-fourth of Haiti's GDP. Deporting Haitians, and thereby diminishing millions of dollars in what is essentially foreign aid, would devastate a country that can ill afford to take more economic hits. A surge of deportees, who would arrive without either homes or jobs, would also place an impossible burden on Haiti's skeletal social services.
Critics say that granting TPS would bring a rush of Haitians to the United States in search of citizenship. But TPS would apply only to Haitians in the United States at the time the order is issued. As the Miami-Dade Board of County Commissioners in Florida wrote recently in a letter to President Obama, there was no "mass exodus" of Haitians to the United States after the Clinton administration granted a stay of deportation in 1998.
The Bush administration was consistently inflexible on the issue, turning down Haitian applications for TPS with minimal explanation. After last year's storms, the Bush administration temporarily suspended the deportations, only to resume them months later, while the country was still reeling from the disasters. The Obama administration has so far maintained the Bush administration's policy, but advocates have met with Homeland Security Secretary Janet Napolitano and were encouraged by her response.
Mr. Obama recently issued an order that allowed Liberian immigrants to stay temporarily in the United States. Immigrants from Somalia, El Salvador, Nicaragua and Honduras have also been granted TPS in recent years. Why are immigrants from a disaster-wracked country that is the poorest in the hemisphere less deserving?
WaPo on Russia: Mr. Obama isn't contemplating change solely on the part of the US
Mr. Obama isn't contemplating change solely on the part of the United States.
WaPo, Monday, March 30, 2009; A16
WITH A FIRST presidential meeting set for this week between Barack Obama and Russia's Dmitry Medvedev, it appears that the two sides may have different ideas of what to expect from the "reset" in relations that the Obama administration has promised.
The Russian view seems to be that the resetting has to come primarily from the Americans. Foreign Minister Sergei Lavrov furthered that impression in an interview with the Financial Times last week. "Practically on any problematic issue which we inherited from the past eight years, I understand the Obama administration is undertaking a review which we welcome," Mr. Lavrov said. Russian officials appear to hope that such a review will mean less U.S. pressure to form a united front against Iran's development of a nuclear weapon and, above all, acceptance of a Russian "sphere of influence" over countries that were once part of the Soviet Union or the Warsaw Pact -- what Mr. Medvedev has called "a region of privileged interest."
Indications from Washington, recently reinforced by Mr. Obama, suggest that his administration does not share this view of a one-sided need for change. The administration is hoping for improved relations across a range of issues, including Iran, Afghanistan, and fighting terrorism and the spread of nuclear weapons. It will be more willing than the Bush administration to engage in arms control talks, especially to extend the Strategic Arms Reduction Treaty, which expires at the end of this year. But Mr. Obama has given no signs of being less alarmed than was President George W. Bush about Iran's nuclear program and certainly has shown no willingness to acquiesce in the "privileged" position that Mr. Medvedev claims over his neighbors.
On the contrary, at the same time that Vice President Biden introduced the "reset" concept, in a speech in February in Munich, he also repudiated the concept of spheres of influence. And after meeting with the secretary general of NATO last Wednesday, Mr. Obama reiterated the point. "My administration is seeking a reset of the relationship with Russia," the president said, "but . . . we are going to continue to abide by the central belief that countries who seek and aspire to join NATO are able to join NATO." The message: Georgia and Ukraine, former Soviet republics, should be free to form and join alliances as they choose, notwithstanding Russia's vitriolic objections.
The administration believes, in other words, that it can develop constructive relations with Russia without sacrificing the interests of Russia's neighbors. Whether such a reset will be acceptable to Mr. Medvedev or to Russia's de facto top ruler, Prime Minister Vladimir Putin, remains to be seen.
Conservative comments on WaPo and Abu Zubaydah
The Corner/NRO, Mar 30, 2009
Excerpts:
[The assault] on the CIA program continues with today's front-page story about the interrogation of Abu Zubaydah: "Detainee's Harsh Treatment Foiled No Plots." The story, like so many on this program, is rife with errors and misinformation.
For example, the Post states:
“Abu Zubaida quickly told U.S. interrogators of [Khalid Sheikh] Mohammed and of others he knew to be in al-Qaeda, and he revealed the plans of the low-level operatives who fled Afghanistan with him. Some were intent on returning to target American forces with bombs; others wanted to strike on American soil again, according to military documents and law enforcement sources. Such intelligence was significant but not blockbuster material. Frustrated, the Bush administration ratcheted up the pressure — for the first time approving the use of increasingly harsh interrogations, including waterboarding.”
This is either uninformed or intentionally misleading.
In fact, what Abu Zubaydah disclosed to the CIA during this period was the fact that KSM was the mastermind behind the 9/11 attacks and that his code name was "Muktar" -- something Zubaydah thought we already knew, but in fact we did not. Intelligence officials had been trying for months to figure out who "Muktar" was. This information provided by Zubaydah was a critical piece of the puzzle that allowed them to pursue and eventually capture KSM. This fact, in and of itself, discredits the premise of the Post story -- to suggest that the capture of KSM did not "foil plots" to attack America is absurd on the face of it.
The Post also acknowledges that Zubaydah’s “interrogations led directly to the arrest of Jose Padilla” but dismisses Padilla as the man behind a fanciful “dirty bomb” plot and notes that Padilla was never charged in any such plot. In fact, Padilla was a hardened terrorist who had trained in al Qaeda camps in Afghanistan, and was a protégé of al Qaeda’s third in command, Mohammed Atef. And when he was captured, Padilla was being prepared for a much more sinister and realistic attack on America.
In June of 2001, Padilla met in Afghanistan with Atef, who asked him if he was willing to undertake a mission to blow up apartment buildings in the United States using natural gas. He agreed, and was sent to a training site near the Kandahar airport to prepare for the attack under close supervision of an al Qaeda explosives expert, who taught him about switches, circuits, and timers needed to carry it out. He was training in Afghanistan when Coalition forces launched Operation Enduring Freedom. Atef was killed by a Coalition airstrike, and Padilla joined the other al Qaeda operatives fleeing Afghanistan.
It was at this time that he met Abu Zubaydah, who helped arrange his passage across the Afghan-Pakistan border. At the time, Padilla told Zubaydah of his idea for a "dirty bomb" plot. Zubaydah was skeptical but sent him to see KSM, and told KSM he was free to use Padilla for his planned follow-on operations in the US. Instead of the dirty bomb plot, KSM directed Padilla and an accomplice to undertake the apartment-buildings operation for which he had initially trained. KSM's right-hand man, Ammar al Baluchi, gave Padilla $10,000 in cash, travel documents, a cell phone, and an email address to be used to notify al Baluchi once Padilla arrived in America. The night before his departure, KSM, al Baluchi, and KSM's nephew and 9/11 plotter Ramzi bin al Shibh hosted a farewell dinner for him and his accomplice. Think about that for a moment: Padilla was feted on the eve of his departure for America by the mastermind of 9/11 and two of his key accomplices.
Padilla left Pakistan on April 5, 2002, bound for the US by way of Zurich. En route, he spent a month in Egypt, and then arrived at Chicago's O'Hare airport on May 8, where he was apprehended -- because, as even the Post acknowledges, of the information provided by Abu Zubaydah. At the time of his apprehension, he was carrying the $10,000 given him by his al Qaeda handlers, the cell phone, and the email address for al Baluchi. (For a detailed account of Jose Padilla's activities, see this speech by former Deputy Attorney General James Comey.)
So again, the premise of the Post story is wrong.
Since his capture, Abu Zubaydah had provided the CIA with the critical link that had identified KSM as “Muktar” and the mastermind of 9/11, as well as information that led to the capture of Padilla and the disruption of a planned attack on the American homeland. The CIA knew he had more information that could save American lives, but now he had stopped talking. So the CIA used enhanced interrogation techniques to get him talking again -- and these techniques worked.
Zubaydah soon began to provide information on key al Qaeda operatives, including information that helped us find and capture more of those responsible for the attacks of September 11, including Ramzi bin al Shibh. At the time of his capture, bin al Shibh had been working in Karachi on follow-on operations against the West -- including a plot to hijack passenger planes in Europe and fly them into Heathrow airport. Bin al Shibh had identified four operatives for the operation when he was taken into custody.
Together, Zubaydah and bin al Shibh provided information that helped in the planning and execution of the operation that captured KSM. KSM then provided information that led to the capture of a Southeast Asian terrorist named Zubair -- an operative with the terrorist network Jemaah Islamiyah, or JI. Zubair then provided information that led to the capture of a JI terrorist leader named Hambali -- KSM's partner in developing a plot to hijack passenger planes and fly them into the tallest building on the West Coast: the Library Tower in Los Angeles. Told of Hambali's capture, KSM identified Hambali's brother "Gun Gun" as his successor and provided information that led to his capture. Hambali's brother then gave us information that led us to a cell of JI operatives who were going to carry out the West Coast plot.
KSM also provided vital information that led to the disruption of an al Qaeda cell that was developing anthrax for attacks inside the United States. He gave us information that helped us capture Ammar al Baluchi. At the time of his capture, al Baluchi was working with bin al Shibh on the Heathrow plot, as well as a plot to carry out an attack against the US consulate in Karachi. According to his CIA biography, al Baluchi “was within days of completing preparations for the Karachi plot when he was captured.”
In addition, KSM and other senior terrorists helped identify individuals that al Qaeda deemed suitable for Western operations, many of whom we had never heard about before. These included terrorists who were sent to case targets inside the United States, including financial buildings in major cities on the East Coast. They painted a picture of al Qaeda's structure and financing, and communications and logistics. They identified al Qaeda's travel routes and safe havens, and explained how al Qaeda's senior leadership communicates with its operatives in places like Iraq. They provided information that allowed the CIA to make sense of documents and computer records that we have seized in terrorist raids. They identified voices in recordings of intercepted calls, and helped us understand the meaning of potentially critical terrorist communications. It is the official assessment of our intelligence community that “Were it not for this program, our intelligence community believes that al Qaeda and its allies would have succeeded in launching another attack against the American homeland.”
And the whole chain I have just described began with the interrogation of Abu Zubaydah.
The Left is desperate to discredit the efficacy of this program, and they have launched a desperate campaign to destroy it. Last week it was the leak of an ICRC document describing some of the techniques allegedly used in the program -- one of the most damaging leaks of classified information since the war on terror began, because it allows al Qaeda to train against the techniques. And now we have this highly uninformed front-page story in the Washington Post. All of this is incredibly damaging to the security of the United States. And if America is attacked again, those responsible for the disclosure of this information will bear much of the blame.
Saturday, March 28, 2009
Nosophobia and Fear of Invisible Toxins
ACSH, March 27, 2009
“If it smells bad, it’s bad; if it smells good, it’s bad,” says Aileen Gagney, asthma and environmental health manager with the American Lung Association in Seattle. (1) Obviously then, the key to a healthy life is to have no smells around you. How unfortunate, since we are excellent smellers!
The tongue can detect sweetness at a dilution of one part in 200, saltiness at one in 400, sourness at one in 130,000, and bitterness at one in 2 million. (2) All of this pales when compared with our ability to detect extremely low levels of smells (i.e., in the range of 50 parts per trillion to 800 parts per billion). (3)
If you are inclined to agree with Ms. Gagney, perhaps you have nosophobia, the irrational fear of contracting a disease -- or perhaps I should say nose-ophobia, since we so often hear of people fearful of contracting a disease by smelling. Let’s talk about fragrances and smoking to provide some present-day examples.
Elizabeth Whelan reports, “Fragrances now join a growing list of allegedly harmful products -- plastic bottles, rubber duckies, shower curtains, Astroturf, traditional produce raised with agricultural chemicals, aspartame, acrylamide, etc. The list seems to be growing like…well, ‘toxic mold.’ Nosophobia is causing us to abandon safe, useful products of modern technology to avoid phantom risks while obscuring the real risks around us. While parents are fretting over BPA traces in baby bottles or phthalates in plastic toys, they may well be giving short shrift to the real threats to their children’s health, including failure to use seatbelts, bike helmets, smoke detectors, vaccinations, proper nutrition and exercise.” (4)
When University of Washington professor Anne Steinemann analyzed a variety of fragranced consumer products such as air fresheners, laundry supplies, personal care products and cleaners, she found 100 different volatile organic compounds (VOCs) measuring 300 micrograms/m3 or more. (5)
Wow, you say! Sounds scary! But let's look at concentrations. VOCs were identified from GC/MS headspace analysis; only those with a headspace concentration greater than 300 micrograms/m3 were reported. Average headspace concentrations of VOCs for the six products tested ranged from 1,000 micrograms/m3 to 74,000 micrograms/m3. In Steinemann's work, 10 of the 100 volatile organic compounds identified qualified under federal rules as toxic or hazardous, and two of those, 1,4-dioxane and acetaldehyde, are classified as Hazardous Air Pollutants (HAPs). (5)
What are the OSHA permissible exposure limits for 1,4-dioxane and acetaldehyde? The current OSHA permissible exposure limit for dioxane is 100 ppm (350 milligrams/m3) as an 8-hour time-weighted average. Note: THIS IS MORE THAN 1,000 TIMES THE LEVEL reported in Steinemann's paper. A similar case can be made for acetaldehyde: the OSHA 8-hour time-weighted average for acetaldehyde is 200 ppm (360 milligrams/m3). So the amounts at issue in the fragrance scare are considerably below allowable regulatory limits. Yet the comment is made: "Consumers are breathing these chemicals. No one is doing anything about it." (1) Great scare tactics!
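To see the unit arithmetic at a glance, here is a minimal back-of-the-envelope sketch in Python; the constants are simply the figures quoted above, and the comparison is against Steinemann's 300 micrograms/m3 reporting cutoff:

```python
# Back-of-the-envelope unit check for the dioxane comparison above.
# Values are the figures quoted in the text, nothing more.

OSHA_PEL_DIOXANE_MG_M3 = 350.0   # OSHA 8-hour TWA for 1,4-dioxane
REPORTING_CUTOFF_UG_M3 = 300.0   # Steinemann's headspace reporting cutoff

pel_in_ug_m3 = OSHA_PEL_DIOXANE_MG_M3 * 1000.0  # mg/m3 -> micrograms/m3
ratio = pel_in_ug_m3 / REPORTING_CUTOFF_UG_M3
print(f"The PEL is roughly {ratio:,.0f} times the reporting cutoff")  # ~1,167
```

Even measured against the highest average headspace concentration reported (74,000 micrograms/m3), the PEL is still nearly five times higher.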
These days scientists can find anything in anything, and as this example shows, that can lead to trouble. The minute something is found in food, in someone's blood, in the air, etc., some folks get very concerned and start creating a fuss. The very act of being able to measure something can give the impression that if it's quantifiable, it's dangerous. How unfortunate, since scientists are getting more clever all the time. Folks forget the old adage that 'the dose makes the poison' and act on the principle that the mere detection of anything is cause for alarm.
Here’s another one for nosophobiasts. There is now such a thing as ‘third hand smoke.’ Dr. Jonathan Winickoff, lead author of a recent paper in Pediatrics, says the following, “If the smell of the cigarettes lingers, so does the danger. Your nose isn’t lying. If your body detects it, then it’s there,” he says. Sounds like the opening comments in this article from Aileen Gagney.
What was the scientific study, which incidentally was given great TV coverage by Dr. Nancy Snyderman of the Today Show? Dr. Winickoff and his colleagues surveyed 1,500 households across the US and asked folks if they agreed with the statements:
• Breathing air in a car today where people smoked yesterday can harm the health of babies and children.
• Breathing air in a room where people smoked yesterday can harm the health of babies and children.
Those surveyed who stated they agreed or agreed strongly were categorized as believing third hand smoke harmed the health of babies and children. This is the “evidence” that third hand smoke had been “identified” as a health danger. (6)
Think about this for a moment. You could have been called as part of this survey and had your chance to play ‘scientist’ and provide data for this analysis. You may not like third hand smoke, but is a telephone solicitation of non-scientists really scientific evidence that it is bad?
There turned out to be much more to this story. What consumers didn't hear from reporters was that the survey was conducted by the National Social Climate Survey of Tobacco Control, a special interest group working to legislate bans on tobacco. The Tobacco Consortium, which sets the group's agenda, is chaired by Dr. Winickoff of Harvard, the lead author of this paper. Makes one wonder about the review process for papers that appear in the journal Pediatrics.
For that matter, a fair number of scientists debate whether second hand smoke exposure during childhood is harmful to long-term health. Sandy Szwarc reports, “The world’s largest study ever done to examine the association between exposure to environmental tobacco smoke (ETS) and lung cancer was conducted by the International Agency for Research on Cancer in Lyon, France, and published in 1998 in the Journal of the National Cancer Institute. (7) It included lung cancer patients up to 74 years of age, and a control group, in 12 centers from seven European countries, looking at cases of lung cancer and exposures to ETS. They found ‘no association between childhood exposure and ETS and lung cancer risk.’” (8)
The authors of the Pediatrics article suggested dangers at exposure levels far below the levels of second hand smoke -- third hand smoke exposures -- even without regard to ventilation, the number of cigarettes parents smoked, or length of exposure.
Szwarc adds, “Any exposure at all, and at levels barely detectable with modern instrumentation, is now being suggested as able to cause cancer and brain damage in children. This turns everything that science knows about toxicology on its head and denies the most fundamental law: that the dose makes the poison. In other words, there is no credible medical evidence to support the suggestion that trace exposures lingering in the air or on clothing are harming children.” (8)
In an entertaining and insightful opinion piece for the Daily Mail, Tom Utley, a smoker, described how scare tactics like these, which are such obvious hoaxes, actually undermine the effectiveness of efforts to reduce smoking and children's exposure to it. No one is arguing that smoking is a healthful habit or would encourage young people to take up smoking, but he 'most vehemently challenged Dr. Winickoff's right to dress up this insulting, scaremongering, palpable drivel as science.' Utley adds, "The very last thing I want is to encourage anybody to take up my disgusting and ruinously expensive habit, which I'm sure will be the death of me. But then I hear the latest hysterical rubbish from the anti-smoking lobby and my determination to remain silent goes the same way as my annual New Year's resolution to give up the vile weed. Why, when there are so many excellent reasons to quit, do these fanatics feel obliged to keep on inventing new and obviously bogus ones?" (9)
Szwarc concludes, "This is one of the most egregious examples of the increasingly common and unethical practice of politicizing science and using a 'study' to advance an agenda of a biased media failing to do its job. Neither is in the interest of the facts or truth. The healthiest thing for all of us might be a helpful dose of common sense and respect for other people's choices. Otherwise, we may next hear about the dangers of fourth hand smoke, a term coined by John Boston of the Santa Clarita Valley Signal: someone sitting next to someone who is thinking about someone else smoking. And someone will believe it and report it as news." (8)
References
1. Lisa Stiffler, “Fresh scent may hide toxic secret,” seattlepi.com, July 23, 2008
2. Peter Farb and George Armelagos, Consuming Passions (Boston: Houghton Mifflin Company, 1980), 22
3. Herbert S. Rosenkranz and Albert R. Cunningham, “Environmental odors and health hazards,” The Science of the Total Environment, 313, 15, 2003
4. Elizabeth M. Whelan, “Toxics (from Seattle Post-Intelligencer)”, American Council on Science and Health, July 25, 2008
5. Anne C. Steinemann, "Fragranced consumer products and undisclosed ingredients," Environ. Impact Assess. Rev. (2008), doi:10.1016/j.eiar.2008.05.002
6. Jonathan P Winickoff et al., “Beliefs About the Health Effects of ‘Third hand’ Smoke and Home Smoking Bans,” Pediatrics, 123, e74, January 1, 2009
7. P. Boffetta, A. Agudo, et al., "Multicenter case-control study of exposure to environmental tobacco smoke and lung cancer in Europe," J. Natl. Cancer Inst., 90, 1440, October 7, 1998
8. Sandy Szwarc, "Third hand smoke and chemtrails -- invisible toxin fears," junkfoodscience.com, January 10, 2009
9. Tom Utley, dailymail.co.uk, January 9, 2009
Jack Dini is a scientist and science writer living in Livermore, CA.
Three Mile Island and Chernobyl: What Went Wrong and Why Today’s Reactors Are Safe
Heritage, WebMemo #2367, March 27, 2009
This Saturday marks the 30th anniversary of the partial meltdown of the Three Mile Island (TMI) nuclear reactor. This occasion is a good time to consider the advances in nuclear power safety since that time and to address the misinformation about this incident and about the 1986 nuclear accident in Chernobyl, Ukraine, which is often associated with TMI.
Three Mile Island: What Happened
On March 28, 1979, a cooling circuit pump in the non-nuclear section of Three Mile Island's second station (TMI-2) malfunctioned, causing the reactor's primary coolant to heat and internal pressure to rise. Within seconds, the automated response mechanism thrust control rods into the reactor and shut down the core. An escape valve opened for 10 seconds to vent steam into a pressurizer, as it was supposed to, but it failed to close. Control room operators only saw that a "close" command was sent to the relief valve, but nothing displayed the valve's actual position.[1] With the valve open, too much steam escaped into the pressurizer, sending misinformation to operators that there was too much pressure in the coolant system. Operators then shut down the water pumps to relieve the "pressure."
Operators allowed coolant levels inside the reactor to fall, leaving the uranium core exposed, dry, and intensely hot. Even though inserting control rods halted the fission process, the TMI-2 reactor core continued to generate about 160 megawatts of "decay" heat, declining over the next three hours to 20 megawatts.[2] Approximately one-third of the TMI-2 reactor was exposed and began to melt.
By the time operators discovered what was happening, superheated and partially radioactive steam had built up in auxiliary tanks, which operators then moved to waste tanks through compressors and pipes. The compressors leaked. The steam leakage released a radiation dose equivalent to that of a chest X-ray scan, about one-third of the radiation humans absorb in one year from naturally occurring background radiation.[3] No damage to any person, animal, or plant was ever found.[4]
The Outcome
The local population of 2 million people received an average estimated dose of about 1 millirem--minuscule compared to the 100-125 millirems that each person receives annually from naturally occurring background radiation in the area. Nationally, the average person receives 360 millirems per year.[5]
No significant radiation effects on humans, animals, or plants were found. In fact, thorough investigation and sample testing of air, water, milk, vegetation, and soil found that there were negligible effects and concluded that the radiation was safely contained.[6] The most recent and comprehensive study was a 13-year evaluation of 32,000 people living in the area that found no adverse health effects or links to cancer.[7]
Technological Improvements and Lessons Learned
A number of technological and procedural changes have been implemented by industry and the Nuclear Regulatory Commission (NRC) to considerably reduce the risk of a meltdown since the 1979 incident. These include:
- Plant design and equipment upgrades, including fire protection, auxiliary feedwater systems, containment building isolation, and automatic plant shut down capabilities;
- Enhanced emergency preparedness, including closer coordination between federal, state, and local agencies;
- Integration of NRC observations, findings, and conclusions about plant performance and management into public reports;
- Regular plant performance analysis by senior NRC managers who identify plants that require additional regulatory attention;
- Expansion of NRC's resident inspector program, whereby at least two inspectors live nearby and work exclusively at each plant;
- Expanded performance- and safety-oriented inspections;
- Additional accident safety equipment to mitigate and monitor conditions during accidents;[8] and
- Establishment of the Institute for Nuclear Power Operators, an industry-created non-profit organization that evaluates plants, promotes training and information sharing, and helps individual plants overcome technical issues.
Chernobyl: What Happened
Seven years after the incident at Three Mile Island, on April 25, 1986, a crew of engineers with little background in reactor physics began an experiment at the Chernobyl nuclear station. They sought to determine how long the plant's turbines' inertia could provide power if the main electrical supply to the station was cut. Operators chose to deactivate automatic shutdown mechanisms to carry out their experiment.[9]
The four Chernobyl reactors were known to become unstable at low power settings,[10] and the engineers' experiment caused the reactor to become exactly that. When the operators cut power and switched to the energy from turbine inertia, the coolant pump system failed, causing heat and extreme steam pressure to build inside the reactor core. The reactor experienced a power surge and exploded, blowing off the cover lid of the reactor building and spewing radioactive gases and flames for nine days.
The episode was exacerbated by a second design flaw: The Chernobyl reactors lacked fully enclosed containment buildings, a basic safety installation for commercial reactors in the U.S.[11]
The Outcome
Chernobyl was the result of human error and poor design. Of the approximately 50 fatalities, most were rescue workers who entered contaminated areas without being informed of the danger.
The World Health Organization says that up to 4,000 fatalities could ultimately result from Chernobyl-related cancers. Though such deaths could still emerge, as yet they have not. The primary health effect was a spike in thyroid cancer among children, with 4,000-5,000 children diagnosed with the cancer between 1992 and 2002. Of these, 15 children unfortunately died. Though these deaths were unquestionably tragic, no clear evidence indicates any increase in other cancers among the most heavily affected populations.
Interestingly, the World Health Organization has also identified a condition called "paralyzing fatalism," which is caused by "persistent myths and misperceptions about the threat of radiation."[12] In other words, the propagation of ignorance by anti-nuclear activists has caused more harm to the affected populations than has the radioactive fallout from the actual accident. Residents of the area assumed a role of "chronic dependency" and developed an entitlement mentality because of the meltdown.[13]
Technology Improvements and Lessons Learned
Comparing the technology of the nuclear reactor at Chernobyl to U.S. reactors is not fair. First, the graphite-moderated, water-cooled reactor at Chernobyl had a high positive void coefficient. While the scientific explanation[14] of this characteristic is not important, its real-life application is. Essentially, it means that under certain conditions coolant inefficiency can cause heightened reactivity. In other words, the reactor's reactivity can rapidly increase as its coolant heats (or is lost), resulting in more fissions, higher temperatures, and ultimately meltdown.[15]
This is in direct contrast to the light-water reactors used in the United States, which would shut down under such conditions. U.S. reactors use water to both cool and moderate the reactor. The coolant keeps the temperature from rising too much, and the moderator is used to sustain the nuclear reaction. As the nuclear reaction occurs, the water heats up and becomes a less efficient moderator (cool water facilitates fission better than hot water), thus causing the reaction to slow down and the reactor to cool. This characteristic makes light water reactors inherently safe and is why a Chernobyl-like reactor could never be licensed in the U.S.
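Schematically (an illustrative relation only, not the full neutron kinetics treated in note [14]), the sign of the void coefficient tells the story:

\[ \Delta\rho \approx \alpha_V \, \Delta V \]

where Δρ is the change in core reactivity and ΔV is the change in steam-void fraction as coolant boils. In Chernobyl's RBMK design the coefficient α_V was positive: boiling added reactivity, power rose, and still more coolant boiled -- a runaway loop. In U.S. light-water designs the combined void and moderator feedback makes the coefficient negative: the same boiling subtracts reactivity, and the chain reaction damps itself.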
Given the inherent problems with the Chernobyl reactor design, many technological changes and safety regulations were put in place to prevent another Chernobyl-like meltdown from occurring. Designers renovated the reactor to make it more stable at lower power, to have the automatic shutdown operations activate more quickly, and to install automated and other safety mechanisms.[16]
Chernobyl also led to the formation of a number of international efforts to promote nuclear power plant safety through better training, coordination, and implementation of best practices. The World Association of Nuclear Operators is one such organization and includes every entity in the world that operates a nuclear power plant.
Myths Persist
The circumstances, causes, and conditions of the Chernobyl meltdown are far removed from the American experience. Important lessons should be taken from both accidents. Thankfully, many improvements in the technology and regulatory safety of nuclear reactors are among them.
Jack Spencer is Research Fellow in Nuclear Energy and Nicolas D. Loris is a Research Assistant in the Thomas A. Roe Institute for Economic Policy Studies at The Heritage Foundation.
Notes
[1]World Nuclear Association, "Three Mile Island: 1979," March 2001, at http://www.world-nuclear.org/info/inf36.html (March 26, 2009).
[2]Smithsonian Institution, National Museum of American History, "Three Mile Island: The Inside Story," at http://americanhistory.si.edu/tmi/tmi03.htm (March 26, 2009).
[3]American Nuclear Society, "What Happened and What Didn't in the TMI-2 Accident," at http://www.ans.org/pi/resources/sptopics/tmi/whathappened.html (March 26, 2009).
[4]U.S. Nuclear Regulatory Commission, "Fact Sheet on the Three Mile Island Accident," at http://www.nrc.gov/reading-rm/doc-collections/fact-sheets/3mile-isle.html (March 26, 2009).
[5]United States Department of Energy, Office of Civilian Radioactive Waste Management, "Facts About Radiation," OCRWM Fact Sheet, January 2005, at http://www.ocrwm.doe.gov/factsheets/doeymp0403.shtml (November 6, 2008).
[6]Nuclear Regulatory Commission, "Fact Sheet on the Three Mile Island Accident."
[7]World Nuclear Association, "Three Mile Island: 1979."
[8]U.S. Nuclear Regulatory Commission, "Fact Sheet on the Three Mile Island Accident," at http://www.nrc.gov/reading-rm/doc-collections/fact-sheets/3mile-isle.html (June 24, 2008).
[9]For a full description of what caused the accident at Chernobyl, see Richard Rhodes, Nuclear Renewal (New York: Penguin Books, 1993), ch. 5.
[10]World Nuclear Association, "Chernobyl Accident," May 2008, at http://www.world-nuclear.org/info/chernobyl/inf07.html (March 26, 2009).
[11]Simon Rippon et al., "The Chernobyl Accident," Nuclear News, June 1986, at http://www.ans.org/pi/resources/sptopics/chernobyl/docs/nn-1986-6-chernobyl-lores.pdf (March 26, 2009).
[12]Press release, "Chernobyl: The True Scale of the Accident," World Health Organization, International Atomic Energy Agency, and U.N. Development Programme, September 5, 2005, at http://www.who.int/mediacentre/news/releases/2005/pr38/en/print.html (November 6, 2008).
[13]World Nuclear Association, "Chernobyl Accident."
[14]"Neutron Kinetics of the Chernobyl Accident," ENS News, Summer 2006, at http://www.euronuclear.org/e-news/e-news-13/neutron-kinetics.htm (March 27, 2009).
[15]International Atomic Energy Agency, "The Chernobyl Accident: Updating of INSAG-1," 1992, at http://www-pub.iaea.org/MTCD/publications/PDF/Pub913e_web.pdf (August 27, 2008).
New Global Currency Proposal: Good Diplomatic Theater but Bad Policy
Heritage WebMemo #2364, March 26, 2009
Recently, both China and Russia have called for the replacement of the dollar as the international reserve currency of choice, suggesting use of IMF Special Drawing Rights (SDRs) instead. Don't rush to sell your greenbacks, however: The proposal has far more to do with the theater of international diplomacy than the workings of the world economy.
Even as he made the proposal, Chinese central banker Zhou Xiaochuan acknowledged that it would require "extraordinary political vision and courage." That is diplomatic speak for "We know this is impossible." The fact that Zhou and his Russian counterpart in proposing the idea, Finance Minister Alexei Kudrin, placed the timeline for the change far in the future--30 years in the case of Kudrin--offers an additional strong clue that the proposal is politically motivated rather than intended to address a real and pressing economic problem.
For both Zhou and Kudrin, an attack on the dollar just before the G-20 economic summit is a great theatrical device with which to express displeasure at U.S. dominance of the international financial system. It is also a marker of their unhappiness with the ineffective U.S. approach to restoring world growth and protecting international trade and financial flows, sending a clear signal that they have no intention of rubber-stamping U.S. proposals at the summit.
The Dollar Works for Everybody
The U.S. dollar is currently the principal international reserve currency, a role it assumed following the collapse of the gold standard and the dissolution of the British Empire in the second half of the 20th century. The U.S. gains significant advantages from the use by others of the dollar as a reserve currency: Worldwide demand for dollars helps keep the value of the dollar high, meaning imported goods--including basic commodities like oil--are cheaper for Americans. The willingness of foreigners to hold dollar stocks (or dollar-denominated assets such as U.S. Treasury securities) makes possible the long-running U.S. trade deficit, to the benefit of American consumers.
Some assert that there is a corresponding cost to U.S. producers, who lose export opportunities or even jobs to foreign producers because of the high value of the dollar--a logical idea that holds true for countries whose currency is not held as a reserve. In most countries, if its currency appreciates in value, its exports become more expensive and demand for them decreases. However, demand for U.S. exports has consistently risen, even as the value of the dollar remained high. The reason is that the value to other countries of the dollar as a reserve currency creates additional demand for the dollar over and above what would be generated by normal trade flows. Thus people are willing to sell Americans more goods than they otherwise would in order to get extra dollars to hold as reserves. But they still want to buy U.S. goods and services, too.
The benefits for other countries are substantial as well. The utility of the reserve currency function--having a stable and readily convertible commodity (the dollar in this case) in which to hold wealth--is self-evident. The dollar provides a readily available medium of exchange; it can be used to pay for almost anything. Furthermore, consumers can also be confident that the dollar will buy about as much tomorrow as it buys today. It is hard to imagine international trade without some medium of exchange like the dollar.
Even more important, however, is the additional demand created in the United States for exports from other countries--a byproduct of strong global desire for dollars. This U.S. consumer demand has served as the primary engine of growth for the world economy. Worldwide, that growth has lifted millions of people out of poverty over the last decade. For China, this export-led growth has fueled job creation that promotes social stability and raises incomes, at least in certain sectors and geographic areas that have been permitted by the authoritarian Chinese government to link to the global economy.
An Unrealistic Solution to a Non-Existent Problem
The creation of a new international reserve currency to replace the dollar is a solution looking for a problem. So far, the dollar is working just fine as a reserve currency. The continued strength of the dollar testifies to its continued utility as a reserve currency and the confidence of the markets in its future value. That confidence extends, by the way, to the Russian and Chinese governments, both of which continue to hold large stocks of dollars.
In contrast, there are many reasons why moving to an SDR makes no sense:
- The SDR has no intrinsic value. The SDR is backed by nothing other than the good faith and credit, if you will, of the IMF. It has no intrinsic value and, at the moment at least, can't be used to purchase anything. It is true that people like to say that the dollar is backed only by the good faith and credit of the U.S. government. In reality, however, the dollar is backed by the goods and services produced by the American people and their willingness to trade those goods and services for dollars. With this willingness to trade real things for dollars extending to people around the world, the value of the dollar becomes backed not just by the U.S. people or the U.S. government but by literally all the world's producers and consumers interconnected through global supply chains: Arab oil traders, Bangladeshi textile producers, Japanese and Korean auto manufacturers, and, yes, even Russian finance ministers and Chinese central bankers. The IMF, by contrast, produces nothing.
- A one-size-fits-all international currency will not meet diverse world needs. Countries growing at different rates have different monetary policy needs. Faster-growing countries need a more rapidly increasing supply of money. Slower-growing countries must have less in order to prevent inflation. The SDR could not accommodate these differing needs. Its value is set by the policies of the IMF, which in turn are subject to the competing political and economic interests of international diplomats.
- Embracing the SDR will result in a loss of transparency. The process of setting the supply and value of the dollar is highly transparent. People around the world know exactly what the Federal Reserve is doing as it adds dollars to the system or adjusts interest rates. Even the rationale for the changes is quickly apparent from Fed statements and the open grilling to which the U.S. subjects Fed governors. The IMF is far more opaque: Each country's representative would likely tell a different story about what was done and why.
- The SDR will create new financial complexities and opportunities for corruption. Use of the SDR would add an additional step to every international transaction. Buyers and sellers would have to convert their local currency into SDRs. Some would have to change first into a convertible currency and then into SDRs. This would create new opportunities for arbitrage and corruption or, at the very least, make the process more expensive with an additional transaction fee. Fans of derivatives will love the SDR: Like derivatives, SDRs are an additional layer away from anything of real value. They will provide wonderful opportunities for manipulation and skimming of value by currency traders and financial speculators. They will also be less transparent and harder for normal people to understand. And they will be controlled by an international organization that has little if any democratic legitimacy or accountability.
A Silly Idea from Serious People
Why would the Russians and Chinese propose such a thing? They are serious people, and serious people do not usually propose silly ideas. The easy answer would be resentment toward the U.S. for its unique and dominant role in the system. This may be an element especially for the Russians. Far more likely, however, is that the Chinese and Russians are motivated instead by fear that the U.S. is not doing a very good job of managing its economy and its international economic role right now. Floating an unrealistic but provocative proposal is precisely the way to diplomatically express that concern without actually threatening the current system on which they depend just as does the rest of the world.
One would have to guess that the U.S. response--categorical defense of the dollar and its role as a reserve currency by both Treasury Secretary Geithner and Fed Chairman Bernanke[1]--was exactly what the Chinese and Russians were trying to inspire. It is a diplomatic coup for them as well as a warning to the U.S. that what this nation does right and what it does wrong have major implications for other nations as well as for itself. America should not expect other countries to remain idle if U.S. policies begin hurting their economic growth as well as its own.
Ambassador Terry Miller is Director of the Center for International Trade and Economics at The Heritage Foundation.
Notes
[1]Geithner, in either a huge mistake or (hopefully) a simple misunderstanding of a reporter's question, initially gave a positive response to the idea of expanding the role of SDRs, but he, the President, and Fed Chairman Bernanke quickly set things straight. See "Geithner and Bernanke Reject New Global Currency Idea," Reuters, March 24, 2009, at http://uk.news.yahoo.com/22/20090324/tpl-uk-forex-usa-geithner-sb-d1a0d5d.html (March 26, 2009).
Let's Put Bylines on Our 'National' Intelligence Estimates. Anonymity leads to mediocrity and politicization
Anonymity leads to mediocrity and politicization.
WSJ, Mar 28, 2009
Charles Freeman's withdrawal from his appointment as the chairman of the National Intelligence Council (NIC) offers an opportunity to assess whether personal views should have any role in intelligence analysis. Mr. Freeman's opinions on Israel, the Middle East and China proved too strong for critics. Yet the NIC's National Intelligence Estimates (NIEs) are often politicized and debased precisely because their anonymous authors need take no personal responsibility for their opinions.
No one knows if the upcoming new Iran estimate will be as politicized as was its 2007 predecessor, which damaged the diplomacy of both the U.S. and our European allies. In any case, the Obama administration will likely one day be handed a politically convulsive NIE that makes the president wonder why these estimates are ever drafted.
Anonymity is the byword of the intelligence profession. In operations, it is usually mandatory. In analysis, however, a collective effort that hides individual authorship is a dubious approach.
In theory, anonymity gives analysts protection from political pressure. And a collective judgment -- NIEs are the most consensual intelligence products that the executive branch puts out -- is supposed to be more reliable and convincing than the views of a particular analyst or agency.
In practice, however, this anonymous, collective approach has guaranteed that mediocre analysis goes unchallenged, and that analysts and senior officials within the NIC go unscarred when they're wrong. No one would want to invest money with a stockbroker who consistently gives bad advice. No one would want to retain a coach who loses more than he wins. Yet obtaining an analytical score card on analysts and their managers within the NIC, the CIA's Directorate of Intelligence or the office of the Director of National Intelligence is nearly impossible.
NIEs rarely offer insights not available elsewhere at far less cost. They have often been egregiously wrong -- my personal favorites were the NIEs written before Iran's Islamic Revolution that predicted the Pahlavi dynasty's survival into the 21st century -- and when right, often unspectacularly so (seeing enmity among the Serbs, Croats and Muslim Bosnians in post-Cold War Yugoslavia wasn't particularly perspicacious). Yet where once NIEs attracted little attention, they now can become political footballs even before they are finished.
In part, this is because the nature of Washington has changed. Estimates were once either closely guarded or easily forgotten -- the secrecy of estimates was better kept and no one expected presidents or members of Congress to accept them as guides for foreign policy. Today, Americans have unprecedented access to secret information. And within the State Department and Congress, where partisan policy battles are fierce, members feel no hesitation in using NIEs as political battering rams. At dizzying speeds, politicians and their allied journalists can denigrate an NIE for its "group think," as was the case with the 2003 report on Iraqi WMD. Or they can applaud another for its supposed willingness to speak truth to power -- as we heard with the Iran "no-nuke" NIE of 2007. With the system we have, this isn't going to change.
President George W. Bush missed an enormous opportunity to reassess the CIA's operational and analytical branches -- the vital center of the American intelligence community -- after 9/11. He embraced the status quo, putting it on steroids by increasing its funding, adding more personnel, and canning no one.
Identifying the primary drafters of NIEs -- or any major analytical report requested by Congress -- could significantly improve the quality of these analyses and diminish the potential for politicians, the media and the intelligence community to politically exploit the reports. Senior managers at the CIA, the NIC or in any of America's other intelligence agencies should have their names appended to an assessment if they've had any substantive role in writing its conclusions. Although everyone in the intelligence community likes to get their fingers into an NIE, there are usually just a small number of individuals who do the lion's share of the work. They should all be known, and should be expected to defend their conclusions in front of Congress and senior executive-branch officials.
Contrary to what some journalists suggested about the prelude to the Iraq war, good analysts live to be questioned by senior officials. Intrepid analysts want to get out of their cubicles.
Think tankers can generally run circles around government analysts and managers on substance, and especially in "strategic" vision, because they operate in more open, competitive and intellectually rigorous environments. Anonymous collective official analysis tends to smother talent and parrot the current zeitgeist in Washington. Liberating first-rate analysts from the bureaucratic disciplining and expectations of their own agencies by allowing them to build public reputations is probably the most efficient and inexpensive way to introduce "contrarian views," an oft-stated reason for Mr. Freeman's appointment.
Mr. Freeman's strongly held personal views proved to be his appointment's stumbling block. If Dennis Blair, the director of National Intelligence, believes that views such as those held by Mr. Freeman would help the intellectual mix at the NIC, then he should allow these views to be heard and argued in an open environment.
In an environment where analysts have publicly tracked reputations, we are likely to see people take their tasks more seriously instead of hiding behind their agencies. Those who are confident in their assessments won't fear change. But those who are fence-sitters, more concerned about melding their views into what is bureaucratically palatable and politically acceptable, will likely drift to the rear as they grope for what is accepted orthodoxy.
A more open system still may not make the intelligence community's product competitive with the best from the outside on the big issues ("Whither China, Russia and Iran?"). But it will certainly make estimates more interesting to read than what we have now. Denied the imprimatur of saying "the intelligence community believes," estimates will come down to earth. And from that angle, it will be much harder for anyone again to use an NIE to damage their political opponents.
Mr. Gerecht, a former Central Intelligence Agency officer, is a senior fellow at the Foundation for Defense of Democracies.
WaPo: President Obama's plan for Afghanistan and Pakistan is ambitious and expensive. It is also hard-headed.
President Obama's plan for Afghanistan and Pakistan is ambitious and expensive. It is also hard-headed.
Saturday, March 28, 2009; page A12
THE STRATEGY for Afghanistan and Pakistan announced by President Obama yesterday is conservative as well as bold. It is conservative because Mr. Obama chose to embrace many of the recommendations of U.S. military commanders and the Bush administration, based on the hard lessons of seven years of war. Yet it is bold -- and politically brave -- because, at a time of economic crisis and war-weariness at home, Mr. Obama is ordering not just a major increase in U.S. troops, but also an ambitious effort at nation-building in both Afghanistan and Pakistan. He is right to do it.
Few Americans would dispute Mr. Obama's description yesterday of the continuing threat from Afghanistan and Pakistan's tribal areas. "Multiple intelligence estimates have warned that al-Qaeda is actively planning attacks on the U.S. homeland from its safe haven in Pakistan," he said. "And if the Afghan government falls to the Taliban -- or allows al-Qaeda to go unchallenged -- that country will again be a base for terrorists who want to kill as many of our people as they possibly can." The goal he stated was similarly simple and clear: "to disrupt, dismantle and defeat al-Qaeda in Pakistan and Afghanistan, and to prevent their return to either country in the future."
What distinguishes the president's plan -- and opens him to criticism from some liberals as well as conservatives -- is its recognition that U.S. goals cannot be achieved without a major effort to strengthen the economies and political institutions of Pakistan and Afghanistan. The Bush administration tried to combat the al-Qaeda threat with limited numbers of U.S. and NATO troops, targeted strikes against militants, and broad, mostly ineffective, aid programs. It provided large sums of money to the Pakistani army, with few strings attached, in the hope that action would be taken against terrorist camps near the Afghan border. The strategy failed: The Taliban has only grown stronger, and both the Afghan and Pakistani governments are dangerously weak.
The lesson is that only a strategy that aims at protecting and winning over the populations where the enemy operates, and at strengthening the armies, judiciaries, and police and political institutions of Afghanistan, can reverse the momentum of the war and, eventually, allow a safe and honorable exit for U.S. and NATO troops. This means more soldiers, more civilian experts and much higher costs in the short term: Mr. Obama has approved a total of 21,000 more U.S. troops and several hundred additional civilians for Afghanistan, and yesterday he endorsed two pieces of legislation that would provide Pakistan with billions of dollars in nonmilitary aid as well as trade incentives for investment in the border areas. More is likely to be needed: U.S. commanders in Afghanistan hope to obtain another brigade of troops and a division headquarters in 2010, and to double the Afghan army again after the expansion now underway is completed in 2011. Mr. Obama should support those plans.
Such initiatives are not the product of starry-eyed idealism or an attempt to convert either country into "the 51st state" but of a realistic appreciation of what has worked -- and failed -- during the past seven years. As Mr. Obama put it, "It's far cheaper to train a policeman to secure his or her own village or to help a farmer seed a crop than it is to send our troops to fight tour after tour of duty with no transition to Afghan responsibility." That effort will be expensive and will require years of steadiness. But it offers the best chance for minimizing the threat of Islamic jihadism -- to this country and to the world.
Friday, March 27, 2009
Did the Fed Cause the Housing Bubble? WSJ Symposium
WSJ, Mar 27, 2009
Don't Blame Greenspan. By David Henderson
It's become conventional wisdom that Alan Greenspan's Federal Reserve was responsible for the housing crisis. Virtually every commentator who blames Mr. Greenspan points to the low interest rates during his last few years at the Fed.
The link seems obvious. Everyone knows that the Fed can drive interest rates lower by pumping more money into the economy, right? Well, yes. But it doesn't follow that that's why interest rates were so low in the early 2000s. Other factors affect interest rates too. In particular, a sudden increase in savings will drive down interest rates. And such a shift did occur. As Mr. Greenspan pointed out on this page on March 11, there was a surge in savings from other countries. Although he named only China, some of the Middle Eastern oil-producing countries were also responsible for much of this new saving. Shift the supply curve to the right and, wonder of wonders, the price falls. In this case, the price of saving and lending is the interest rate.
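To make that supply-and-demand logic concrete, here is a minimal numerical sketch in Python; the linear curves and every figure in it are invented for illustration and do not come from the article.

# Loanable-funds sketch: a rightward shift in the supply of saving
# lowers the equilibrium interest rate. Curves and numbers invented.
def equilibrium(demand_intercept, demand_slope, supply_intercept, supply_slope):
    # Solve a - b*q = c + d*q for quantity, then back out the rate.
    q = (demand_intercept - supply_intercept) / (demand_slope + supply_slope)
    rate = demand_intercept - demand_slope * q
    return q, rate

# Demand for loans: r = 8 - 0.5q. Domestic saving supply: r = 2 + 0.5q.
q0, r0 = equilibrium(8.0, 0.5, 2.0, 0.5)

# A foreign-saving surge shifts supply rightward: r = 0.5 + 0.5q.
q1, r1 = equilibrium(8.0, 0.5, 0.5, 0.5)

print(f"rate before surge: {r0:.2f}%  rate after surge: {r1:.2f}%")  # 5.00% -> 4.25%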
But how do we know that it was an increase in saving, not an increase in the money supply, that caused interest rates to fall? Look at the money supply.
Between 2001 and 2006, the year-over-year growth rate of MZM (money of zero maturity, which is M2 minus small time deposits plus institutional money market funds) fell from over 20% to nearly 0%. Over the same period, M2 (which is M1 plus time deposits) growth fell from over 10% to around 2%, and M1 (which is currency plus demand deposits) growth fell from over 10% to negative rates.
The annual growth rate of the monetary base, the magnitude over which the Fed has the most control, fell from 10% in 2001 to below 5% in 2006. Moreover, nearly all of the growth in the monetary base went into currency, an increasing proportion of which is held abroad.
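The nested definitions in the preceding paragraph can be spelled out directly. The sketch below uses the article's shorthand definitions with invented component values, so only the relationships among the aggregates are meaningful, not the figures.

# The aggregate definitions cited above, applied to hypothetical
# component values (every dollar figure here is invented).
currency        = 700.0   # $bn held by the public
demand_deposits = 600.0   # $bn in checking accounts
time_deposits   = 900.0   # $bn in time deposits, of which...
small_time_dep  = 600.0   # ...this many $bn are small-denomination
inst_mmf        = 1300.0  # $bn in institutional money market funds

m1  = currency + demand_deposits       # M1: currency plus demand deposits
m2  = m1 + time_deposits               # M2: M1 plus time deposits (the article's shorthand)
mzm = m2 - small_time_dep + inst_mmf   # MZM: M2 less small time deposits plus institutional funds

def yoy_growth(current, previous):
    # Year-over-year growth rate of an aggregate, in percent.
    return 100.0 * (current - previous) / previous

print(f"M1 = {m1:.0f}, M2 = {m2:.0f}, MZM = {mzm:.0f} ($bn, hypothetical)")
print(f"example growth rate: {yoy_growth(2900.0, 2400.0):.1f}%")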
Moreover, if the Fed was the culprit, why was the housing bubble world-wide? Do Mr. Greenspan's critics seriously contend that the Fed was responsible for high housing prices in, say, Spain?
This is not to say that the Greenspan Fed was blameless. Particularly disturbing is the way the lender-of-last-resort function has increased moral hazard, a trend to which Mr. Greenspan contributed and which current Fed Chairman Ben Bernanke has put on steroids.
But to the extent that the federal government is to blame, the main federal culprits are the beefed-up Community Reinvestment Act and the run-amok Fannie Mae and Freddie Mac. Both played a key role in loosening lending standards.
I'm not claiming that we should have a Federal Reserve. We simply can't depend on getting another good chairman like Mr. Greenspan, and are more likely to get another Arthur Burns or Ben Bernanke. Serious work by economists Lawrence H. White of the University of Missouri, St. Louis, and George Selgin of West Virginia University makes a persuasive case that abolishing the Fed and deregulating money would improve the macroeconomy. I'm making a more modest claim: Mr. Greenspan was not to blame for the housing bubble.
Mr. Henderson is a research fellow with the Hoover Institution, an economics professor at the Naval Postgraduate School, and editor of "The Concise Encyclopedia of Economics" (Liberty Fund, 2008).
What Savings Glut? By Gerald P. O'Driscoll Jr.
Alan Greenspan responded to his critics on these pages on March 11. He singled out an op-ed by John Taylor a month earlier, "How Government Created the Financial Crisis" (Feb. 9), for special criticism. Mr. Greenspan's argument defending his policy is two-fold: (1) the Fed controls overnight interest rates, but not "long-term interest rates and the home-mortgage rates driven by them"; and (2) a global excess of savings was "the presumptive cause of the world-wide decline in long-term rates."
Neither argument stands up to scrutiny. First, Mr. Greenspan writes as if mortgages were of the 30-year variety, financed by 30-year money. Would that it were so! We would not be in the present mess. But the post-2002 period was characterized by one-year adjustable-rate mortgages (ARMs), teaser rates that reset in two or three years, etc. Five-year ARMs became "long-term" money.
The Fed only determines the overnight, federal-funds rate, but movements in that rate substantially influence the rates on such mortgages. Additionally, maturity-mismatches abounded and were the source of much of the current financial stress. Short-dated commercial paper funded investment banks and other entities dealing in mortgage-backed securities.
Second, Mr. Greenspan offers conjecture, not evidence, for his claim of a global savings excess. Mr. Taylor has cited evidence from the IMF to the contrary, however. Global savings and investment as a share of world GDP have been declining since the 1970s. The data is in Mr. Taylor's new book, "Getting Off Track."
The former Fed chairman also cautions against excessive regulation as a policy response to the crisis. On this point I concur. He does not directly address, however, the Fed's policy response. From the beginning, the Fed diagnosed the problem as lack of liquidity and employed every means at its disposal to supply liquidity to credit markets. It has been to little avail and, in the process, the Fed has loaded up its balance sheet with dubious assets.
The credit crunch continues because many banks are capital-impaired, not illiquid. Treasury's policy shifts and inconsistencies under both administrations have sidelined potential private capital. Treasury became the capital provider of last resort. It was late to recognize the hole in banks' balance sheets and consistently underestimated its size. The need to provide second- and even third-round capital injections proves that.
In summary, Fed policy did help cause the bubble. Subsequent policy responses by that institution have suffered from sins of commission and omission. As Mr. Taylor argued, the government (including the Fed) caused, prolonged, and worsened the crisis. It continues doing so.
Mr. O'Driscoll is a senior fellow at the Cato Institute. He was formerly a vice president at the Federal Reserve Bank of Dallas.
Low Rates Led to ARMs. By Todd J. Zywicki
Alan Greenspan's argument that the Federal Reserve's policies on short-term interest rates had no impact on long-term mortgage interest rates overlooks the way in which its policies changed consumer behavior.
A simple yet powerful pattern emerges from 25 years of survey data collected by HSH Associates, the financial publishers: The spread between fixed-rate mortgages (FRMs) and ARMs typically hovers between 100 and 150 basis points, representing the premium a borrower must pay to induce the lender to bear the risk of interest-rate fluctuations. At times, however, the spread between FRMs and ARMs breaks out of this band and becomes either larger or smaller than average, leading marginal consumers to prefer one product to the other. Sometimes the adjustment in the market share of ARMs lags behind changes in the size of the spread, but over time, when the spread widens the percentage of ARMs increases, and vice versa.
In 1987, before subprime lending was even a gleam in Angelo Mozilo's eye, the spread rose to 300 basis points, and the share of ARMs eventually rose to almost 70%, according to the Federal Housing Finance Board. When the spread shrank to near 100 basis points in the late 1990s, the percentage of ARMs fell into the single digits. Other periods show similar dynamics.
In the latest cycle the spread rose from under 50 basis points at the end of 2000 to 230 basis points in mid-2004, and the percentage of ARMs rose from 10% to 40%. The Fed's subsequent increases in short-term rates caused short- and long-term rates to converge, squeezing the spread to about 50 basis points by 2007 and reducing ARMs to less than 10% of the market.
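The mechanics here reduce to a simple calculation. Below is a rough sketch; the 100-150 basis-point "normal" band comes from the article, while the rates themselves are invented.

# Sketch of the spread-and-substitution pattern described above.
def spread_bp(frm_rate, arm_rate):
    # Spread between fixed- and adjustable-rate mortgages, in basis points.
    return (frm_rate - arm_rate) * 100.0

def substitution_pressure(spread):
    # Qualitative reading of the band described in the article.
    if spread > 150.0:
        return "wide spread: ARM share tends to rise"
    if spread < 100.0:
        return "narrow spread: ARM share tends to fall"
    return "normal band: no strong substitution either way"

for frm, arm in [(8.0, 5.7),   # ~230 bp, like mid-2004
                 (6.5, 6.0),   # ~50 bp, like 2007
                 (7.0, 5.8)]:  # ~120 bp, inside the usual band
    s = spread_bp(frm, arm)
    print(f"FRM {frm:.1f}% / ARM {arm:.1f}%: {s:.0f} bp -> {substitution_pressure(s)}")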
Record-low ARM interest rates kept housing generally affordable even as buyers stretched to pay higher prices. Low short-term interest rates, combined with tax and other policies, also drew speculative, short-term home-flippers into certain markets. As the Fed increased short-term rates in 2005-07, interest-rate resets raised monthly payments, triggering the initial round of defaults and falling home prices. Foreclosure rates initially soared on both prime and subprime ARMs, much more than for FRMs.
Why did the substitution into ARMs result in a wave of foreclosures this time, unlike in prior cycles? In earlier periods with high percentages of ARMs, the dip in short-term interest rates was a leading indicator of an eventual decline in long-term rates, reflecting the general downward trend in rates over the past 25 years. By contrast, during this housing bubble the interest rates on ARMs were artificially low and eventually rose back to the level of FRMs. Other factors exacerbated the problem -- most notably increased risk-layering and a decline in underwriting standards -- but the Fed's artificial lowering of short-term interest rates, and the resulting substitution by consumers into ARMs, triggered the bubble and the subsequent crisis.
Mr. Zywicki is a professor of law at George Mason University School of Law and a senior scholar at the university's Mercatus Center. He is writing a book on consumer bankruptcy and consumer credit.
The Fed Provided the Fuel. By David Malpass
The blame for the current crisis extends well beyond the Fed -- to banks, regulators, bond raters, mortgage fraud, the Bush administration's weak-dollar policy and Lehman bankruptcy decisions, and Congress's reckless housing policies through Fannie Mae and Freddie Mac and the Community Reinvestment Act.
But the Fed provided the key fuel with its 1% interest rate choice in 2003 and 2004 and "measured" (meaning inadequate) rate hikes in 2004-2006. It ignored inflationary dollar weakness, higher interest rate choices abroad, the Taylor Rule, and the booming performance of the U.S. and global economies.
Even by the Fed's own backward-looking inflation metrics, the core consumption deflator exceeded the Fed's 2% limit for 18 quarters in a row beginning with the second quarter of 2004, while 12-month Consumer Price Index (CPI) inflation hit 4.7% in September 2005 and 5.4% in July 2008. This despite the Fed's constant assurances that inflation would moderate (unlikely given the crashing dollar).
Despite its role as regulator and rate-setter, the Fed claimed that it could not identify asset bubbles until they popped (see my rebuttal on this page, "The Fed's Moment of Weakness," Sept. 25, 2002). It is clear that the Fed's interest-rate policies cause wide swings in the value of the dollar and huge momentum-based capital flows. These bring predictable -- and avoidable -- deflations, inflations and asset bubbles.
Beginning in 2003, the Fed filled the liquidity punch bowl. Low rates and the weakening dollar created a monumental carry trade (borrow dollars, buy anything). This transmitted the Fed's monetary excess abroad and into commodities. As the punch bowl overflowed, even global bonds bubbled (prices rose, yields fell), contributing to the global housing boom. Alan Greenspan singled out this correlation in his March 11 op-ed on this page, "The Fed Didn't Cause the Housing Bubble."
Given this power, the Fed should itself stop the current deflation and the economic freefall. It has to add enough liquidity to offset frozen credit markets, the collapse in the velocity of money, and bank deleveraging (which has reversed the normal money multiplier).
The Fed was on the right track in late November when it committed to purchasing $600 billion in longer-term, government-guaranteed securities. Equities rose globally, some credit markets thawed, and mortgage rates and corporate bond spreads declined. However, the Fed reversed course in January, delaying its asset purchases and shrinking its balance sheet. Growth in the money supply stopped. Since then, the Fed has increased the amount of assets it intends to purchase, but it has lengthened the time period rather than accelerating the pace of purchases.
Given the magnitude of the crisis and the stakes, the Fed should be buying safe assets fast, not parceling out a few billion. Confidence and money velocity would also increase if the Fed committed itself to dollar stability, not instability, to avoid causing future inflations and deflations.
Mr. Malpass is president of Encima Global LLC.
Loose Money and the Derivative Bubble. By Judy Shelton
The Fed owns this crisis. The buck stops there -- but it didn't.
Too many dollars were churned out, year after year, for the economy to absorb; more credit was created than could be fruitfully utilized. Some of it went into subprime mortgages, yes, but the monetary excess that fueled the most threatening "systemic risk" bubble went into highly speculative financial derivatives that rode atop packaged, mortgage-backed securities until they dropped from exhaustion.
The whole point of having a central bank is to calibrate the money supply to the genuine needs of an economy -- to purchase goods and services, to fund productive investment -- with the aim of achieving maximum sustainable long-term growth. Since price stability is a key factor toward that end, central bankers attempt to finesse the amount of money and credit in the system; if interest rates are kept too low for too long, the result is an unwarranted expansion of credit. As the money supply increases relative to real economic production, the spillage of excess purchasing power results in higher prices for goods and services.
But not always. Sometimes the monetary excess finds its way into a narrow sector of the economy -- such as real estate, or equities, or rare art. This time it was the financial derivatives market.
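The textbook identity behind this argument is the equation of exchange, MV = PY; in growth rates, inflation is roughly money growth plus velocity growth minus real output growth. A rough sketch with invented growth rates:

# Equation of exchange, M*V = P*Y, in growth-rate form. When money
# outruns output, the excess shows up as rising prices -- in goods,
# or, as argued above, in a narrow asset class instead.
def implied_inflation(money_growth, velocity_growth, output_growth):
    # All arguments and the result are annual percentage rates.
    return money_growth + velocity_growth - output_growth

print(implied_inflation(7.0, 0.0, 3.0))  # 4.0: monetary excess spills into prices
print(implied_inflation(3.0, 0.0, 3.0))  # 0.0: money calibrated to real growth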
In the last six years, according to the Bank for International Settlements, the derivatives market exploded as a global haven for speculative investment, its aggregate notional value rising more than fivefold to $684 trillion in 2008 from $127 trillion in 2002. Financial obligations amounting to 12 times the value of the entire world's gross domestic product were written and traded and retraded among financial institutions -- playing off every instance of market turbulence, every gyration in exchange rates, every nuanced statement uttered by a central banker in Washington or Frankfurt -- like so many tulip contracts.
The sheer enormity of this speculative bubble, let alone the speed at which it inflated, testifies to inordinately loose monetary policy from the Fed, keeper of the world's predominant currency. The fact that Fannie Mae and Freddie Mac provided the "underlying security" for many of the derivative contracts merely compounds the error of government intervention in the private sector. Politicians altered normal credit risk parameters, while the Fed distorted housing prices through perpetual inflation.
At this point, dickering over whether Alan Greenspan should have formulated monetary policy in strict accordance with an econometrically determined "rule," or whether the Fed even has the power to influence long-term rates, raises a more fundamental question: Why do we need a central bank?
"There are numbers of us, myself included, who strongly believe that we did very well in the 1870 to 1914 period with an international gold standard." That was Mr. Greenspan, speaking 17 months ago on the Fox Business Network.
In the rules-versus-discretion debate over how best to achieve sound money, that is the ultimate answer.
Ms. Shelton, an economist, is author of "Money Meltdown" (Free Press, 1994).
To Change Policy, Change The Law. By Vincent Reinhart
Anyone seeking an application of the principle that fame is fleeting need look no further than the assessment of Federal Reserve policy from 2002 to 2005.
At the beginning, capital spending was anemic, and considerable wealth had been destroyed by the equity crash. The recovery from the 1990-91 recession was "jobless," and the current one was following the same script. Moreover, inflation was so distinctly pointed down that deflation seemed a palpable threat.
Keeping the federal-funds rate low for a long time was viewed as appropriately balancing the risks to the Fed's dual objectives of maximum employment and price stability. Indeed, the Fed was seen as extending the stable economic performance since 1983 that had been dubbed the "Great Moderation."
Over the period 2002-2005, the federal-funds rate ran below the recommendation of the policy rule made famous by Stanford Professor John Taylor. No doubt, the Taylor Rule provides important guidance on how that rate should change in response to changes in the two mandated goals of policy. First, it should move up or down by more than any change in inflation. Second, the Fed should respond to changes in resource slack. That is, caring about unemployment is not a sign of weakness in a central banker but a source of strength in achieving good results.
The Taylor Rule is less helpful to practitioners of policy in anchoring the level of the federal-funds rate. The rule is fit to experience based on a notion of the rate that should prevail if inflation were at its goal and resources fully employed, which is known as the equilibrium funds rate. That is an important technicality. Using a faulty estimate of the equilibrium funds rate is like flying a plane that is otherwise perfect except for an unreliable altimeter. The exception looms large when flying over a mountainous region.
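For readers who have not seen it written out, the standard Taylor (1993) rule is short. Below is a minimal Python sketch with the textbook coefficients and invented inflation and output-gap inputs; the loop at the end illustrates the altimeter problem, since the prescription moves one-for-one with the assumed equilibrium rate, which cannot be observed directly.

# A minimal version of the Taylor (1993) rule. The 0.5 coefficients
# and the 2% targets are the textbook values; inputs are invented.
def taylor_rate(inflation, output_gap, equilibrium_real_rate=2.0, inflation_target=2.0):
    # Overall response to inflation is 1.5, i.e. the prescribed rate
    # moves by more than any change in inflation, as noted above.
    return (equilibrium_real_rate + inflation
            + 0.5 * (inflation - inflation_target)
            + 0.5 * output_gap)

inflation, gap = 1.5, -1.0  # soft inflation, slack economy (hypothetical)

# Same data, different estimates of the unobservable equilibrium rate:
for r_star in (2.0, 1.0, 0.0):
    print(f"r* = {r_star:.1f}% -> prescribed funds rate = {taylor_rate(inflation, gap, r_star):.2f}%")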
From 2002 to 2005, the economic landscape appeared especially changeable, with the contours shaped by lower wealth, lingering job losses, and looming disinflation. To Fed officials at the time, this indicated that the equilibrium funds rate was unusually low. Simply, the only way to provide lift to an economy in which resource use was slack and inflation pointed down was to keep policy accommodative relative to longer-term standards.
That was then. Now, policy during the period is seen as fueling a housing bubble.
The Fed is guilty as charged in setting policy to achieve the goals mandated in the law. Fed policy makers cannot be held responsible for the fuel to speculative fires provided by foreign saving and the thin compensation for risk that satisfied global investors. Nor can the chain of subsequent mistakes that drove a downturn into a debacle be laid at the feet of the Federal Open Market Committee of 2002 to 2005. If the results seem less than desirable in retrospect, change the law those policy makers were following, but do not blame them for following prevailing law.
Mr. Reinhart is a resident scholar at the American Enterprise Institute. From August 2001 to June 2007, he was the secretary and economist of the Federal Open Market Committee.
How Korea Solved Its Banking Crisis
The world can learn from our experience in the late '90s.
WSJ, Mar 27, 2009
When world leaders met at the G-20 summit in Washington, D.C., last November, our hope was that by the first quarter of this year we would have largely overcome the financial crisis. At that time, leaders were primarily concerned with macroeconomic stimulus -- primarily fiscal stimulus -- to shorten the severe global recession.
Unfortunately, we are still struggling to deal with the financial turmoil, and financial institutions have yet to regain investors' confidence. The U.S. government recently announced its expanded plan to buy troubled assets that have been burdening banks. While I join others in hoping for the success of this plan, I believe that a true recovery requires all countries to do everything they can to stabilize the economy. If world leaders fail to come up with creative ways to deal with the current difficulties, credit will not flow.
For this reason, when the G-20 leaders meet in London next week, solving the financial meltdown -- with a special focus on removing impaired assets from the balance sheets of financial institutions -- must be our priority.
In the late 1990s, Korea was hit by a financial crisis, and having successfully overcome it, we have valuable lessons to offer. By committing to the following basic principles based on the Korean experience, world leaders will be well prepared as they create a plan to remove impaired assets.
First, bold and decisive measures, rather than incremental ones, are required to regain market confidence. Korea's successful experience illustrates this point. The Korean government tapped various sources to raise a public fund of $127.6 billion (159 trillion KRW) during the period from 1997 to 2002 -- equivalent to 32.4% of Korea's GDP in 1997 -- to resolve impaired assets and recapitalize financial institutions. Given the magnitude of the current challenges, the world cannot afford a minimalist approach.
Second, our experience suggests that bank recapitalization and creating a "bad bank" are not mutually exclusive options; the simultaneous application of both can have a positive effect. Korea established a specialized independent agency, the Korea Asset Management Corporation (Kamco), as a bad bank, while at the same time the Korea Deposit Insurance Corporation was involved in recapitalizing financial institutions. Kamco purchased the impaired assets and settled the gains or losses with the financial institutions involved once the assets recovered their value. It acquired impaired assets for $30.9 billion -- assets whose book value amounted to $85.1 billion by 2002 -- and recovered $33.9 billion by 2008 by reselling them to private investors through various methods, including public auctions, direct sales, international tenders, securitization and debt-equity swaps.
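A quick back-of-the-envelope check on those Kamco figures; the three dollar inputs come from the article, and the two ratios are derived from them.

# Back-of-the-envelope check on the Kamco figures quoted above.
purchase_price = 30.9  # $bn paid for the impaired assets (from the article)
book_value     = 85.1  # $bn book value of those assets (from the article)
recovered      = 33.9  # $bn recovered by 2008 through resales (from the article)

discount_to_book = 100.0 * (1.0 - purchase_price / book_value)
recovery_on_cost = 100.0 * recovered / purchase_price

print(f"Kamco bought at roughly {discount_to_book:.0f}% below book value "
      f"and recovered about {recovery_on_cost:.0f}% of what it paid.")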
Third, it is critical to ensure that the implemented measures are made politically acceptable and that moral hazard is minimized. A special mechanism should be devised for shareholders, managers, workers and asset holders to bear their fair share of the burden. In the case of Korea, capital injections were limited to financial institutions that were systemically important and deemed to be viable after recapitalization.
Fourth, these measures should have built-in exit strategies with clear time frames. There should be a plan for government-held shares in these entities to be turned over to the private sector. Additionally, nationalization of banks shouldn't be a goal, but a temporary measure.
Fifth, although government will take the lead in such plans, private capital should be encouraged to fully participate in the process. Obviously, the process itself must be transparent. Korea's experience suggests that it would be useful, on a temporary basis, for governments to purchase impaired assets at a price agreed to with the troubled financial institutions, and then settle the gains or losses with the financial institutions after reselling. The problem of impaired assets today may be of a different nature, since they arise from off-balance sheet bundled derivatives. But this difficulty makes the ex post settlement scheme all the more useful.
Sixth, all forms of financial protectionism should be rejected in the process. Ideally, countries would have a common method for dealing with impaired assets. However, since countries face different financial realities, each should be left to craft its own policy. And a coordinated effort is needed to ensure that regular cross-border capital flows are not interrupted.
To that effect, I welcome the G-20 finance ministers' agreement called "Restoring Lending: A Framework for Financial Repair and Recovery," which reflects Korea's proposal. Without abiding by these principles, macroeconomic stimulus measures will not do much good in alleviating the severe global economic recession.
Mr. Lee is president of the Republic of Korea.
Recent Demographic Trends in Metropolitan America
The Brookings Institution, Mar 27, 2009
The new administration taking shape in Washington inherits not only an economic crisis, but also a mammoth apparatus of agencies and programs, many of which were developed a generation or more ago. In view of that, a president and Congress striving to "build a smarter government" should develop new policies or retool old programs with the latest population trends in mind, especially those shaping and re-shaping metropolitan areas, our nation's engines of economic growth and opportunity. These include:
- Migration across states and metro areas has slowed considerably in the past two years due to the housing crisis and looming recession. About 4.7 million people moved across state lines in 2007-2008, down from a historic high of 8.4 million people at the turn of the decade. Population growth has dropped in Sun Belt migration magnets such as Las Vegas, NV, and Riverside, CA, and the state of Florida actually experienced a net loss of domestic migrants from 2007 to 2008. Meanwhile, out-migration has slowed in older regions such as Chicago and New York. Many Midwestern and Northeastern cities experienced greater annual population gains, or reduced population losses, in the past year.
- The sources and destinations of U.S. immigrants continue their long-run shifts. About 80 percent of the nation’s foreign-born population in 2007 hailed from Latin America and Asia, up from just 20 percent in 1970. The Southeast, traditionally an area that immigrants avoided, has become the fastest-growing destination for the foreign-born, with metro areas such as Raleigh, NC; Nashville, TN; Atlanta, GA; and Orlando, FL ranking among those with the highest recent growth rates. As they arrived in these new destinations, immigrants also began to move away from traditional communities in the urban core. Today, more than half of the nation’s foreign-born residents live in major metropolitan suburbs, while one-third live in large cities.
- Racial and ethnic minorities are driving the nation’s population growth and increasing diversity among its younger residents. Hispanics have accounted for roughly half the nation’s population growth since 2000. Already, racial and ethnic minorities represent 44 percent of U.S. residents under the age of 15, and make up a majority of that age group in 31 of the nation’s 100 largest metro areas (and a majority of the entire population in 15). Hispanic populations are growing most rapidly in the Southeast; Asian populations are rising in a variety of Sun Belt and high-tech centers; and the black population continues its move toward large Southern metro areas like Atlanta, Houston, and Washington, D.C.
- The next decade promises massive growth of the senior population, especially in suburbs unaccustomed to housing older people. As the first wave of baby boomers reaches age 65 in less than two years, the senior population is poised to grow by 36 percent from 2010 to 2020. Their numbers will grow fastest in the Intermountain West, the Southeast, and Texas, particularly in metro areas such as Raleigh, NC; Austin, TX; Atlanta, GA; and Boise, ID that already have large pre-senior populations (age 55 to 64). Because the boomers were the nation’s first fully “suburban generation,” their aging in place will cause many major metropolitan suburbs—such as those outside New York and Los Angeles—to “gray” faster than their urban counterparts.
- Amid rising educational attainment overall, the U.S. exhibits wide regional and racial/ethnic disparities. While 56 percent and 38 percent of Asian and white adults, respectively, held post-secondary degrees in 2007, the same was true of only 25 percent and 18 percent of blacks and Hispanics. These deep divides by race and ethnicity coincide with growing disparities across metropolitan areas owing to economic and demographic change. In knowledge-economy areas such as Boston, MA; Washington, D.C.; and San Francisco, CA, more than 40 percent of adults hold a bachelor’s degree. Meanwhile, in metro areas that have attracted large influxes of immigrants, such as Houston, TX; Greenville, NC; and most of California’s Central Valley, more than 20 percent of adults lack a high school diploma. And some Sun Belt metro areas, such as Las Vegas, NV, and Riverside, CA, have fast-growing populations at both ends of the attainment spectrum.
- Even before the onset of the current recession, poverty rose during the 2000s and spread rapidly to suburban locations. Both the overall number of people living in poverty and the poverty rate rose from 2000 to 2007; today, working-age Americans account for a larger share of the poor than at any point in the last 30 years. After widening in the 1970s and 1980s, the gap between central-city and suburban poverty rates has narrowed somewhat. More notably, the suburban poor surpassed the central-city poor in number during this decade, and now outnumber them by more than 1.5 million. The suburban poor have spread well beyond older, inner-ring suburbs, which in 2005-2007 housed less than 40 percent of all poor suburban dwellers. Yet even as poverty spreads throughout the metropolis, the concentration of poverty in highly distressed communities—after dropping in the 1990s—appears to be rising once again in the 2000s.
Job Losses From Obama Green Stimulus Foreseen in Spanish Study
Bloomberg, Mar 27, 2009
Subsidizing renewable energy in the U.S. may destroy two jobs for every one created if Spain’s experience with windmills and solar farms is any guide.
For every new position that depends on energy price supports, at least 2.2 jobs in other industries will disappear, according to a study from King Juan Carlos University in Madrid.
U.S. President Barack Obama’s 2010 budget proposal contains about $20 billion in tax incentives for clean-energy programs. In Spain, where wind turbines provided 11 percent of power demand last year, generators earn rates as much as 11 times higher for renewable energy than for power from fossil fuels.
The premiums paid for solar, biomass, wave and wind power -- which are charged to consumers in their bills -- translated into a $774,000 cost for each Spanish “green job” created since 2000, said Gabriel Calzada, an economics professor at the university and author of the report.
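The study's headline figures combine by simple multiplication. The sketch below applies them to a hypothetical program of 1,000 subsidized jobs; only the $774,000 cost and the 2.2 ratio come from the reporting.

# Applying the study's two reported figures to a hypothetical program
# (the job count is invented; the other two numbers are as reported).
cost_per_green_job = 774_000   # dollars of price supports per job created
jobs_lost_per_job  = 2.2       # jobs foregone elsewhere per green job

green_jobs_created = 1_000     # hypothetical program size
total_supports = green_jobs_created * cost_per_green_job
jobs_foregone  = green_jobs_created * jobs_lost_per_job

print(f"${total_supports / 1e6:.0f} million in supports -> "
      f"{green_jobs_created} green jobs, ~{jobs_foregone:.0f} jobs foregone elsewhere")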
“The loss of jobs could be greater if you account for the amount of lost industry that moves out of the country due to higher energy prices,” he said in an interview.
Spain’s Acerinox SA, the nation’s largest stainless-steel producer, blamed domestic energy costs for deciding to expand in South Africa and the U.S., according to the study.
“Microsoft and Google moved their servers up to the Canadian border because they benefited from cheaper energy there,” said the professor of applied environmental economics.