Yes He Is. By Bruce Reed
Obama calls himself a New Democrat, and shows what it means.
Slate, Wednesday, March 11, 2009, at 3:06 PM ET
For conservatives still trying to fit Barack Obama into their old tax-and-spend-liberal box, Tuesday was a very bad day. In the morning, the president gave a tough-minded education reform speech demanding more accountability from schools, teachers, students, and parents. The same afternoon, he brought members of the House New Democrat Coalition to the White House, and told them, "I am a New Democrat." According to Politico, Obama went on to describe himself as a fiscally responsible, pro-growth Democrat who supports free and fair trade and opposes protectionism.
So much for the ridiculous talk-radio bid to dub Obama a socialist. As Ruth Marcus points out in today's Washington Post, "The notion that President Obama has lurched to the left since his inauguration and is governing as an unreconstructed liberal is bunk." From his education reform agenda to his team of pragmatists to his heavy emphasis on responsibility, Obama is leading the country the way he promised he would: neither to the left nor right, but on a path that's new and different.
Full disclosure: I've always loved the term "New Democrat," and in the early '90s launched a magazine by that name for the Democratic Leadership Council, the organization I now head. The label and the philosophy behind it were an attempt to think anew and move past the ideological logjams of that era.
But that was then, and this is now. The job of my group and other progressive, reform-minded organizations isn't to label Barack Obama or hold him to some old standard – it's to help him enact his reform agenda and succeed at the standard he has set for himself. The challenges of 2009 are different from the challenges of 1992, and what it means to be a New Democrat now cannot be the same as what it meant back then.
Obama has always steered clear of labels, with good reason. One of the great hopes of his campaign and his presidency is the prospect of a new, post-partisan politics that leaves behind old debates and moves beyond old boundaries. That approach has become all the more necessary in the midst of an economic crisis that demands new answers and eschews rigid ideology in favor of doing what works.
The president is right that old labels don't mean anything, but new labels do – and in Obama's capable hands, the term "New Democrat" can take on new meaning. As Obama and others have observed, the traditional terms of the ideological debate – liberal and progressive, moderate and centrist, conservative and right-wing – are stale and imprecise. Obama has the opportunity to define a governing philosophy for our time on his own terms.
In his campaign and as president, Obama has put forth the core of his new philosophy for a new time. In January, he described it as "a grand bargain." "Our challenge is going to be identifying what works and putting more money into that, eliminating things that don't work, and making things that we have more efficient," he said. "Everybody's going to have to give. Everybody's going to have to have some skin in the game."
Obama's inaugural address, his joint address to Congress, and his budget all have reinforced that philosophy. On Sunday, the Washington Post dedicated 1,600 words to the president's use of the word "responsibility" – another sign that the "new era of responsibility" Obama promised is here to stay.
Obama's impressive education speech yesterday provided further proof of his bold agenda for reform. The president explained why transforming education is central to America's economic future, and outlined several smart steps to make it happen. In the economic recovery bill, he secured $100 billion to invest in education. On Tuesday, he committed once again to make sure that investment brings real change. As Rahm Emanuel told the Post, "The resources come with a bow tied around them that says 'Reform.'"
Obama called for rewarding good teachers and making it easier to remove bad ones; challenged states to stop capping the number of charter schools; urged states to adopt rigorous common standards; and repeated his pledge to cut the dropout rate in high schools and college. He also reminded the nation that more resources and more accountability from schools, teachers, and students won't change our education system unless Americans take more responsibility as parents.
On education, Obama showed a path out of gridlock that could work as well in solving other entrenched problems. "For decades, Washington has been trapped in the same stale debates that have paralyzed progress and perpetuated our educational decline," he said Tuesday. "Too many supporters of my party have resisted the idea of rewarding excellence in teaching with extra pay, even though it can make a difference in the classroom. Too many in the Republican Party have opposed new investments in early education, despite compelling evidence of its importance."
The tectonic plates on which the 20th century was built are shifting in the 21st. In the 1930s, New Dealers like FDR had to save capitalism from itself. In the 1990s, New Democrats like Bill Clinton had to modernize progressive government. Over the next few years, Barack Obama has to do both at the same time. For that, as Obama made clear again yesterday, a new president with a new approach is exactly what we need.
Bruce Reed, who was President Clinton's domestic policy adviser, is president of the Democratic Leadership Council and editor-in-chief of Blueprint magazine. Read his disclosure here.
Wednesday, March 11, 2009
Political appointees and lobbyists
Obama Curbing Only Lobbyists Who Disagree with Him. By Jeff Stier, Esq., and Henry I. Miller
Avoiding Conflict of Interest, or Conflicting Ideas?
ACSH, Sunday, February 8, 2009
In order to avoid conflicts of interest, President Barack Obama promised repeatedly that he would not appoint lobbyists to positions in his administration, and one of his first actions in office was to issue an executive order forbidding executive branch employees from working in an agency, or on a program, for which they have lobbied during the previous two years.
It's a commitment that owes more to rhetoric than reality -- brass-knuckle politics under the guise of integrity in government. The president already has violated both the letter and the spirit of his pledge. A politician breaking campaign promises? So what else is new, right? But in government, personnel choices are policy, and policy is what spells the difference between success and failure for individuals and the nation. The president's choices will ensure that (left-wing) politics trumps good government -- and will have dire consequences for the nation's economy and the wellbeing of consumers.
Consider Bill Corr, the deputy secretary-designate of Health and Human Services. He was executive director of the Campaign for Tobacco Free Kids, which (despite its name) partnered with tobacco behemoth Altria to draft the legislation that would give the Food and Drug Administration authority to regulate tobacco. The bill is widely expected to pass this year -- and because the FDA is part of HHS, Corr would be in an influential position as the FDA decides how to implement its new authority. For example, regulators would have to decide whether to allow "harm reduction" claims for smokeless tobacco, which helps smokers reduce the risk of tobacco use -- a potentially useful approach that Corr's organization vigorously opposed under his leadership.
In spite of President Obama's commitment "to change the way Washington does business and curb the influence of lobbyists on our government," he found a way around the rules: Corr would recuse himself from deliberations on tobacco policy. But as President Obama (himself a smoker) must surely know, cigarettes are the most preventable cause of death in this country. How could the deputy secretary of HHS not be allowed to address tobacco issues? That alone should disqualify Corr for the position.
Another former health lobbyist appointed to the top ranks of HHS is Mark B. Childress, the new chief of staff for the HHS secretary. There are at least a dozen others elsewhere in the administration, including former Goldman Sachs Group lobbyist Mark Patterson, the incoming chief of staff for the Treasury secretary.
Apparently, President Obama is not trying to curb the influence of lobbyists -- but only of lobbyists whose views don't gibe with those of the administration.
There is another, broader issue related to the presence of lobbyists in the administration -- namely, the appointment to critical positions of people who have been "unofficial" lobbyists, both within and outside government agencies. Many of the Obama appointments might not have been official "registered lobbyists," but they certainly have been activists and lobbyists by any reasonable definition. Tom Daschle, who withdrew his nomination as HHS secretary because of tax problems, was a perfect example. Since he left the Senate in 2005, he has made millions as an influence-peddler and "policy adviser" to a wide variety of companies and organizations. He has received hundreds of thousands of dollars from the health care industry. Is he a lobbyist? Well, if it walks like a duck and quacks like a duck...
Another important, broader question is whether government bureaucrats themselves may also be considered to be lobbyists of a sort. The late economist Milton Friedman observed that you can usually rely on individuals and institutions (including regulatory agencies) to act in their own self-interest -- and to lobby for and adopt policies that will enhance it. In the case of regulators, their behavior is influenced in large part by the desire to stay out of trouble (which means not making waves or taking unpopular positions) and by the yearning for larger budgets and grander bureaucratic empires.
An example is Lisa Jackson, the new head of the Environmental Protection Agency, who most recently headed New Jersey's environmental agency and is a 16-year veteran of the EPA, during which she developed some of the agency's most unscientific, wasteful and dangerous regulations.
While at EPA, Jackson worked on Superfund (officially the Comprehensive Environmental Response, Compensation and Liability Act), an ongoing EPA program intended to clean up and reduce the risk of toxic-waste sites. It was originally conceived as a short-term project -- $1.6 billion over five years to clean up some 400 sites (by law, at least one per state and, not coincidentally, about one per congressional district). But it has grown into one of the nation's largest public works projects, with more than $30 billion spent on about 1,300 sites.
After the expenditure of tens of billions of dollars, no beneficial results have been demonstrable; on the other hand, Superfund projects have caused a great deal of harm. The risk of fatality to the average cleanup worker -- a dump-truck driver involved in a collision or a laborer run over by a bulldozer, for example -- is considerably larger than the cancer risks to individual residents that might result from exposures to contaminated sites. (And consider that cancer risks are theoretical estimates over many years or decades, while worksite fatalities occur during the much shorter duration of the cleanup.)
With this kind of history, we would consider Ms. Jackson a lobbyist for the interests of the most radical, doctrinaire elements of the environmental community -- and for the bureaucrats who defend and strive to expand the Superfund program. Are former lobbyists for the energy or chemical industry less suited than she to head the EPA?
We were promised change and greater transparency, but behind the rhetoric we're seeing business as usual. We are reminded of Obama's remark during the campaign, "If you put lipstick on a pig, it's still a pig."
Miller is a physician, Hoover Institution fellow and was an FDA official during the Reagan administration. Stier is associate director, American Council on Science and Health.
Dispatch from day two of the International Conference on Climate Change in New York
What Planetary Emergency? By Ronald Bailey
Dispatch from day two of the International Conference on Climate Change in New York
Reason, March 10, 2009
March 9, New York—Assume that man-made global warming exists. So what? That was the premise of a fascinating presentation by Indur Goklany during the second day of sessions at the International Conference on Climate Change. Goklany, who works in the Office of Policy Analysis of the U.S. Department of the Interior and is the author of The Improving State of the World: Why We're Living Longer, Healthier, More Comfortable Lives on a Cleaner Planet, made it clear that he was not speaking on behalf of the federal government.
Goklany's talk looked at three common claims: (1) Human and environmental well-being will be lower in a warmer world than it is today; (2) our descendants will be worse off than if we don't stop man-made global warming; and (3) man-made global warming is the most important problem in the world. Goklany assumed that the United Nations' Intergovernmental Panel on Climate Change's (IPCC) consensus view on future temperature trends is valid. For his analysis, he used data from the fast track assessments of the socioeconomic impacts of global climate change sponsored by the British government, the Stern Review on the Economics of Climate Change, global mortality estimates from the World Health Organization (WHO), and cost estimates from the Intergovernmental Panel on Climate Change.
From the Stern Review, Goklany took the worst case scenario, where man-made global warming produces market and non-market losses equal to 35 percent of the benefits that are projected to exist in the absence of climate change by 2200. What did he find? Even assuming the worst emissions scenario, incomes for both developed and developing countries still rise spectacularly. In 1990, average incomes in developing countries stood around $1,000 per capita and at around $14,000 in developed countries. Assuming the worst means that average incomes in developing countries would rise in 2100 to $62,000 and in developed countries to $99,000. By 2200, average incomes would rise to $86,000 and $139,000 in developing and developed countries, respectively. In other words, the warmest world turns out to be the richest world.
Looking at WHO numbers, one finds that the percentage of deaths attributed to climate change now is 13th on the list of causes of mortality, standing at about 200,000 per year, or 0.3 percent of all deaths. High blood pressure is first on the list, accounting for 7 million (12 percent) of deaths; high cholesterol is second at 4.4 million; and hunger is third. Clearly, climate change is not the most important public health problem today. But what about the future? Again looking at just the worst case of warming, climate change would boost the number of deaths in 2085 by 237,000 above what they would otherwise be according to the fast track analyses. Many of the authors of the fast track analyses also co-authored the IPCC's socioeconomic impact assessments.
Various environmental indicators would also improve. For example, 11.6 percent of the world's land was used for growing crops in 1990. In the warmest world, agricultural productivity is projected to increase so much that the amount of land used for crops would drop to just 5 percent by 2100, leaving more land for nature. In other words, if these official projections are correct, man-made global warming is by no means the most important problem faced by humanity.
Next up on the impacts panel was Paul Reiter, head of the insects and infectious disease unit at the Institut Pasteur in Paris. Members of the global warming fraternity frequently worry that climate change will exacerbate the spread of tropical diseases like malaria. Reiter began his talk by pointing out that malaria was endemic in Yakutsk, the coldest city on earth, until 1959. In 1935, the Soviets claimed that malaria killed nearly 4,000 people in Yakutsk, a number that dropped to just 85 in 1959, the year that the disease was finally eradicated, in part by using the insecticide DDT.
Reiter then described a vast new research program that he is participating in, the Emerging Diseases in a Changing European eNvironment, or EDEN project. Sponsored by the European Union, the EDEN project is evaluating the potential impacts of future global warming on the spread of disease in Europe. The EDEN researchers have been assessing outbreaks of various diseases to see if they could discern any impact climate change may be having on their spread.
Reiter cited a recent analysis of the outbreak of tick-borne encephalitis in the early 1990s in many eastern European countries. The epidemic occurred shortly after the fall of communism, when many former Soviet bloc countries went into steep economic decline. After sifting through the data, it became apparent that the tough economic situation forced many eastern Europeans to spend more time in forests and farms trying to either find wild foods or grow more food on farms and in gardens. This meant that their exposure to deer ticks increased, resulting in more cases of encephalitis. Since the epidemic was coincident with the fall of the Soviet empire and the end of the Cold War, one of Reiter's colleagues quipped that it was caused by "political global warming." Reiter noted that 150 EDEN studies have been published so far and that "none of them support the notion that disease is increasing because of climate change."
Finally, Reiter pointed out that many of the claims that climate change will increase disease can be attributed to an incestuous network of just nine authors who write scientific reviews and cite each other's work. None are actual on-the-ground disease researchers, and many of them write the IPCC disease analyses. "These are people who know absolutely bugger all about dengue, malaria or anything else," said Reiter.
The final presenter of the panel was Stanley Goldenberg, a meteorologist with the National Oceanic and Atmospheric Administration's (NOAA) Hurricane Research Division in Miami, Florida. Again, he stressed that his views were his own, not those of any government agency. Goldenberg is particularly annoyed by former Vice President Al Gore's repeated claim that man-made global warming is making hurricanes more numerous and/or more powerful. For example, at the U.N. Climate Change Conference in Poznan, Poland in December, Gore flat out stated, "The warming ocean waters are also causing stronger typhoons and cyclones and hurricanes."
Goldenberg acknowledged that hurricanes have been more numerous in the North Atlantic in the last decade. But when one looks at the data from the 20th century, two factors stand out. First, the number of hurricanes has increased. So have sea surface temperatures. QED: global warming causes more hurricanes, right? Not so fast, says Goldenberg. The perceived increase in the number of hurricanes is actually the result of observational biases. With the advent of satellites, scientists have become much better at finding and identifying hurricanes. In the first half of the 20th century, he pointed out, if a storm didn't come close to land, researchers would often miss it.
The second factor is that researchers have identified a multi-decadal pattern in the frequency of hurricanes in the North Atlantic. There was a very active period between 1870 and 1900, a slow-down between 1900 and 1925, another active period between 1926 and 1970, a period of fewer storms between 1970 and 1995, and the beginning of a new active period around 1995. According to Goldenberg, this new active period will probably last another 20 to 30 years. Goldenberg was a co-author of a 2001 study published in Science which concluded:
Tropical North Atlantic SST [sea surface temperature] has exhibited a warming trend of [about] 0.3°C over the last 100 years; whereas Atlantic hurricane activity has not exhibited trend-like variability, but rather distinct multidecadal cycles.... The possibility exists that the unprecedented activity since 1995 is the result of a combination of the multidecadal-scale changes in the Atlantic SSTs (and vertical shear) along with the additional increase in SSTs resulting from the long-term warming trend. It is, however, equally possible that the current active period (1995-2000) only appears more active than the previous active period (1926-1970) due to the better observational network in place.
Since this study was published, much more data on hurricane trends has been collected and analyzed. "Not a single scientist at the hurricane center believes that global warming has had any measurable impact on hurricane numbers and strength," concluded Goldenberg. He also suggested that some proponents of the idea that global warming is exacerbating tropical storms have backed off lately. Clearly the former vice president hasn't gotten the news yet.
Yesterday, Bailey walked among the climate change skeptics and reported on talks given by Czech Republic and European Union president Vaclav Klaus and MIT climatologist Richard Lindzen. Read about it here.
Tomorrow: The last day of the International Conference on Climate Change will feature presentations on the Pacific Decadal Oscillation's effect on global temperature trends, the economic impacts of carbon rationing, and how policymakers deal with scientific information.
Ronald Bailey is Reason magazine's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.
Somali Americans Recruited by Extremists
Somali Americans Recruited by Extremists. By Spencer S. Hsu and Carrie Johnson
U.S. Cites Case of Minnesotan Killed in Suicide Blast in Africa
Washington Post, Wednesday, March 11, 2009; Page A01
Senior U.S. counterterrorism officials are stepping up warnings that Islamist extremists in Somalia are radicalizing Americans to their cause, citing their recruitment of the first U.S. citizen suicide bomber and their potential role in the disappearance of more than a dozen Somali American youths.
In recent public statements, the director of national intelligence and the leaders of the FBI and CIA have cited the case of Shirwa Ahmed, a 27-year-old college student from Minneapolis who blew himself up in Somalia on Oct. 29 in one of five simultaneous bombings attributed to al-Shabaab, a group with close links to al-Qaeda.
Since November, the FBI has raced to uncover any ties to foreign extremist networks in the unexpected departures of numerous Somali American teenagers and young men, who family members believe are in Somalia. The investigation is active in Boston; San Diego; Seattle; Columbus, Ohio; and Portland, Maine, a U.S. law enforcement official said, and community members say federal grand juries have issued subpoenas in Minneapolis and elsewhere.
Officials are still trying to assess the scope of the problem but say reports so far do not warrant a major concern about a terrorist threat within the United States. But intelligence officials said the recruitment of U.S. citizens by terrorist groups is particularly worrisome because their American passports could make it easier for them to reenter the country.
Al-Shabaab -- meaning "the youth" or "young guys" in Arabic -- "presents U.S. authorities with the most serious evidence to date of a 'homegrown' terrorist recruitment problem right in the American heartland," Georgetown University professor Bruce R. Hoffman says in a forthcoming report by the SITE Intelligence Group, a private firm that monitors Islamist Web sites.
The extent of al-Shabaab's reach into the U.S. Somali community, estimated at up to 200,000 foreign-born residents and their relatives, will be the subject of a hearing today by the Senate homeland security committee, chaired by Sen. Joseph I. Lieberman (I-Conn.).
U.S. officials give varying assessments of the problem. On Feb. 25, CIA Director Leon E. Panetta told reporters that the relationship between Somalis in the United States and in Somalia "raises real concerns about the potential for terrorist activity" and "constitutes a potential threat to the security of this country."
Two days later, FBI Director Robert S. Mueller III appeared to play down the concern, calling Ahmed "just one manifestation of a problem" since the attacks of Sept. 11, 2001, of young men in the United States being recruited to fight with terrorists overseas. Federal authorities have investigated cases of U.S. fighters in Afghanistan, Iraq, Yemen and Somalia.
Mueller added in a speech to the Council on Foreign Relations: "We certainly believe that [Ahmed] was recruited here in the United States, and we do believe that there may have been others that have been radicalized as well."
Overall, U.S. intelligence officials assess that "homegrown" extremists are not as numerous, active or skilled here as they are in Europe, but authorities remain focused on what Director of National Intelligence Dennis C. Blair called the "likelihood that a small but violent number of cells may develop here."
Domestic radicalization has been a greater concern in Europe than in the United States, whose economic mobility, assimilative culture and historic openness to immigrants have provided some insulation, U.S. officials suggest. In the year before the 2005 London transit attack, Britain in particular struggled with reports that al-Qaeda was secretly recruiting Muslims at British universities and that up to 3,000 Britons had returned over a decade from the terrorist group's camps.
U.S. authorities have been wary of stereotyping Somalis or overstating concerns, with Mueller recently comparing the situation to that in Ireland, another country with civil strife, terrorism and a large immigrant community in the United States but little violence here.
Al-Shabaab's ranks may also diminish now that an Islamist government has replaced a U.S.-backed Ethiopian occupation in Somalia. "It's very difficult to see how launching an attack using a sleeper cell in the United States would in any way serve their interests," said Kenneth J. Menkhaus, a political scientist at Davidson College who specializes in East Africa.
The FBI investigation of Ahmed's death may help determine how broad the problem is. The coordinated bombings in two cities about 500 miles north of Mogadishu represented "a qualitative leap" of terrorist capabilities and were probably the work of al-Shabaab, according to a United Nations monitoring group. In February 2008, the State Department designated al-Shabaab a terrorist group.
Little has been said publicly about Ahmed, a naturalized citizen who reportedly moved to Minneapolis in 1996 and graduated from high school there. As accounts of his death spread, distraught Somali American families came forward in Minneapolis, alleging that the first young man left a year ago, then eight more on Aug. 1, followed by seven on Election Day. Four families spoke out publicly, and U.S. authorities confirmed the names of Burhan Hassan, 17, and Mustafa Ali, 17, high school seniors who families said attended the Abubakar As-Saddique Islamic Center mosque.
Relatives tell similar stories. Hassan was a bookworm who wanted to become a doctor or a lawyer and spent time after school and on weekends at the mosque, Minnesota's largest, which Ali also attended, said Osman Ahmed, 43, Hassan's cousin. Two 19-year-olds were studying medicine and engineering at the University of Minnesota, but they became antisocial, speaking and eating less as they grew more devout, Ahmed said.
Hassan had no job and no money, but when he did not come home Nov. 4, his family discovered that his passport, laptop computer and cellphone were gone, and they found paperwork for nearly $2,000 in airfare, Ahmed said. He said Universal Travel, a nearby travel agency, had said an adult claiming to be a parent paid for tickets for several youths.
"We believe a minority group are recruiting these kids and brainwashing them and financing and arranging the travel," Ahmed said. "Those who are recruiting kids here can harm us here."
He said the young men periodically call their relatives, who say they repeat terse, scripted statements that they are safe and in Somalia studying, but nothing more.
Somalis as a whole may be vulnerable to radical appeals because their home country has been torn by two decades of political strife and they are among the youngest, poorest and newest immigrants to the United States. According to a U.S. census report, nearly 60 percent of Somali immigrants arrived since 2000; their average age is 26.8 years; and 51 percent live in poverty, with a median household income of $21,461, compared with the national median of $61,173.
Omar Jamal, executive director of the Somali Justice Advocacy Center in Minnesota, said the group first alerted the local FBI a year ago, when family members believed Sakaria Sharif Macruf had left and been killed fighting in Somalia. He later turned up alive but with al-Shabaab in Kismaayo, a city in southern Somalia under the group's control, Ahmed said.
Jamal said U.S. government policies since 9/11 helped push alienated youths toward radicals.
"You have high rates of young guys unemployed. You have a high rate of dropouts. They're difficult to integrate and work into the mainstream." He said religious extremists worked with youths and "gave them hope in their lives -- then indoctrinated them into this violent, radical ideology."
Mahir Sherif, a lawyer for the mosque, said its imam, Abdirahman Ahmed, and a youth coordinator were barred from flying last winter by U.S. authorities in connection with the investigation. But Sherif said there is no evidence that mosque leaders recruited, financed or facilitated young men to go to Somalia and accused Jamal and his allies of acting out of hatred for the Muslim religious establishment.
The Abubakar As-Saddique Islamic Center "does not engage in political activities, has not and will not recruit for any political cause and never will be in support of terrorist philosophy or acts," Sherif said, adding: "The center unequivocally condemns suicide bombings and all acts of indiscriminate violence."
E.K. Wilson, a spokesman for the FBI in Minneapolis, declined to comment on the probe but said: "We're aware some of these young kids have traveled to the Horn of Africa to train, possibly to fight, with terrorist groups. We're trying to expand our outreach effort in the Somali American community here in Minneapolis to cover a broader base of the population here."
Earlier last year, Ruben Shumpert, an African American convert to Islam from Seattle, was killed in a U.S.-supported rocket attack near Mogadishu after he fled to Somalia in part to avoid prison after pleading guilty to gun and counterfeiting charges in the United States.
Another man, Boston native Daniel J. Maldonado, now 30, became in February 2007 the first American to be charged with a crime for joining Islamist extremist fighters in Somalia. Maldonado moved to Texas, changed his name to Daniel Aljughaifi and traveled to Africa in 2005, according to government court filings. He was captured by Kenyan soldiers in 2007 and returned to the United States, where he is serving a 10-year prison term.
Staff researcher Julie Tate contributed to this report.
Buffett's Unmentionable Bank Solution
Buffett's Unmentionable Bank Solution. By Holman W. Jenkins, Jr.
WSJ, Mar 11, 2009
Last week's post mortem on the Fannie and Freddie takeover was received better than we might have expected. A few readers assumed Eddie Lampert and Bill Miller, fund managers who lost money when Fan and Fred were seized, and whose letters to their own investors we quoted, were engaged in special pleading.
In fact, the nationalization of Fannie and Freddie is water over the dam. The men's perspective may be one of pain, but it is historical pain.
Now comes Warren Buffett, a big investor in Wells Fargo, M&T Bank and several other banks, who, during his marathon appearance on CNBC Monday, clearly called for suspension of mark-to-market accounting for regulatory capital purposes.
The emphasis belongs on that last phrase, "for regulatory capital purposes," for the benefit of a House hearing tomorrow on this very issue. Mark-to-market accounting is fine for disclosure purposes, because investors are not required to take actions based on it. It's not so fine for regulatory purposes. It doesn't just inform but can dictate actions that make no sense in the circumstances. Banks can be forced to raise capital when capital is unavailable or unduly expensive; regulators can be forced to treat banks as insolvent though their assets continue to perform.
What happens next is exactly what we've seen: Their share prices collapse; government feels obliged to inject taxpayer capital into banks simply to achieve an accounting effect, so banks can meet capital adequacy rules set by, um, government.
(This sounds silly, but has been a big part of government's response so far.)
CNBC, sadly, has been playing a loop of Mr. Buffett's remarks that does a consummate job of leaving out his most important point. Nobody cares about the merits of mark-to-market in the abstract, only about how it affects our current banking crisis. And his exact words were that it is "gasoline on the fire in terms of financial institutions."
What is depressing bank stocks today, he said, is precisely the question of whether banks will be "forced to sell stock at ridiculously low prices" to meet the capital adequacy rules.
"If they don't have to sell stock at distressed prices, I think a number of them will do very, very well."
He also proposed a fix, which CNBC duly omitted from its loop, namely to "not have the regulators say, 'We're going to force you to put a lot more capital in based on these mark-to-market figures.'"
Mr. Buffett obviously understands where we are today, though it seems to elude many of those kibitzing about "nationalization," "letting banks fail" and other lagging notions. Since last year, our banking system no longer rests on capital, but on government guarantees. With those sweeping guarantees in place to protect their depositors and bondholders, banks now are able to earn princely spreads above their cost of funds, however questionable their balance sheets.
Banks will "build equity at a very rapid rate with the spreads that exist now," Mr. Buffett said. With the possible exception of Citigroup, he added, "the banking system largely will cure itself."
Notice he didn't call for subsidizing hedge funds to buy toxic assets. He didn't call for more government capital injections -- which are not merely redundant when comprehensive guarantees are in place, but positively destructive of the ultimate goal of moving back toward a system based on private capital.
Mr. Buffett didn't utter the unstylish words "regulatory forbearance," but letting banks earn their way out of trouble under an umbrella of government guarantees is precisely that.
Hank Paulson started down just this road last July. Bank stocks soared. Then he turned on a dime. Washington needed somebody to punish and felt it couldn't impose haircuts on uninsured depositors and bondholders. That left only shareholders, who have been allowed to face vast dilution and/or government takeover based on mark-to-market regulatory capital standards.
Yet the truth is, you get little or no moral hazard bang from punishing bank shareholders. Equity investors, by definition, accept the risk of losing 100% of their stake in return for unlimited upside. Go ahead and wipe out shareholders: Markets will turn around and create the next 50-to-1 leveraged financial institution as long as the potential return outweighs the risk.
The only real fix for moral hazard, in some future regulatory arrangement, would be truly to dispel the belief of bondholders and uninsured creditors that they will be bailed out.
That's a subject for another day. The recent devastation of bank equity values wasn't inevitable but was a choice (an addled and perhaps not entirely conscious one) by policy makers trying to make sure bank shareholders didn't benefit from the massive safety net rolled out for banks.
As strategies go, it was a terrible one. It greatly increased the toll the banking crisis imposed on the economy, and the cost that fixing the banks will impose on taxpayers. But there's still a chance to avoid a disastrous, taxpayer-financed government takeover of the banking system. The alternative, just as Mr. Buffett spelled it out, begins with forbearance on capital standards.
LOST on China - the USNS Impeccable incident
LOST on China. WSJ Editorial
A bad treaty leads to a naval scrap.
WSJ, Mar 11, 2009
So once again we are reminded of why Ronald Reagan sank the Law of the Sea Treaty.
Thanks of a sort here go to China, which last week sent several ships to shadow and harass the USNS Impeccable, an unarmed U.S. Navy surveillance ship, as it was operating in international waters about 70 miles south of Hainan Island. The harassment culminated Sunday when the Chinese boats "maneuvered in dangerously close proximity" to the Impeccable, according to the Pentagon, forcing the American crew to turn fire hoses on the Chinese. Undeterred, two of the Chinese ships positioned themselves directly in front of the Impeccable after it had radioed its intention to leave and requested safe passage. A collision was barely averted.
The Chinese have a knack for welcoming incoming U.S. Administrations with these sorts of provocations. In April 2001, a hotdogging Chinese fighter pilot collided with a slow-moving U.S. Navy surveillance aircraft, forcing the American plane to make an emergency landing on Hainan, where its 24-member crew remained for 11 days. They were released only after the U.S. issued a letter saying it was "sorry" for the incident without quite apologizing for it.
Thereafter, the Chinese kept their distance from U.S. surveillance planes, and Beijing's relations with the Bush Administration were generally positive. But the Chinese military remains strategically committed to dominating the South China Sea, and it has recently built a large submarine base on Hainan. China also makes a contentious claim to the oil-rich Spratly and Paracel Islands -- an endless source of friction with the Philippines, Malaysia, Taiwan and Vietnam, which have claims of their own. Following Sunday's incident, the Chinese accused the U.S. of violating Chinese and international law.
Which brings us to the U.N.'s Law of the Sea Treaty -- which the Gipper sent to the bottom of the ocean, but the Chinese have signed and which the Obama Administration intends to ratify, with the broad support of the U.S. Navy. The supposed virtue of the treaty is that it codifies the customary laws that have long guaranteed freedom of the seas and creates a legal framework for navigational rights.
The problem is that, as with any document that contains 320 articles and nine annexes, the treaty creates as many ambiguities as it resolves. In this case, the dispute involves the so-called "Exclusive Economic Zones," which give coastal states a patchwork of sovereign and jurisdictional rights over the economic resources of seas to a distance of 200 miles beyond their territorial waters.
Thus, the U.S. contends that the right of its ships to transit through or operate in the EEZs (and of planes to overfly them) is no different than their rights on the high seas, including intelligence gathering, and can point to various articles in the treaty that seem to say as much. But a number of signatories to the treaty, including Brazil, Malaysia, Pakistan and China, take the view that the treaty forbids military and intelligence-gathering work by foreign countries in an EEZ. Matters are further complicated by the claims China made for itself over its EEZ when it ratified the Law of the Sea in the 1990s.
We don't have a view on the legal niceties here, which amounts to a theological dispute in a religion to which we don't subscribe. But the incident with the Impeccable is another reminder that China's ambitions for regional dominance, and for diminishing U.S. influence, remain unchanged despite a new American Administration; and that the Law of the Sea Treaty, far from curbing ambitions or resolving differences, has served only to sharpen both.
Next time the Impeccable sails these waters -- and for the sake of responding to China's provocation it should be soon -- President Obama ought to dispatch a destroyer or two as escorts.
It's a Terrible Time to Reject Skilled Workers
Turning Away Talent. WSJ Editorial
Another harmful 'stimulus' provision.
WSJ, Mar 11, 2009
Bank of America, citing a provision of the stimulus package that became law last month, is rescinding job offers to foreign-born students graduating from U.S. business schools this summer. Protectionists will applaud, no doubt. But denying companies access to talented workers born outside the U.S. will neither jump-start the economy nor serve the nation's long-term interests.
The stated purpose of the amendment, which was sponsored by Vermont Independent Bernie Sanders and Iowa Republican Chuck Grassley, is "to prohibit any recipient of TARP funding from hiring H-1B visa holders." Press reports have suggested that these visa holders are displacing U.S. workers.
Mr. Sanders cited an especially misleading Associated Press story, which said that the major banks requested visas for more than 21,800 foreign workers over the past six years. "Even as the economy collapsed last year and many financial workers found themselves unemployed," said AP, "the dozen U.S. banks now receiving the biggest rescue packages requested visas for tens of thousands of foreign workers to fill high-paying jobs."
What the story left out is that companies file multiple applications for each available slot to comply with Department of Labor wage rules for H-1B hires. By focusing on how many applications were filed rather than how many foreign workers were hired, the story exaggerates actual visa use. In fact, H-1B visa holders have been a negligible percentage of financial industry hires in recent years. In 2007, for instance, Citigroup hired 185 H-1B workers, which represented .04% of its 387,000 employees. Bank of America hired 66 H-1B workers, which represented .03% of its 210,000 employees.
The reality is that cumbersome labor regulations and fees make foreign professionals more expensive to hire than Americans, which undercuts the argument that the banks were looking for cheap labor and explains why H-1B applications tend to fall during economic downturns. But far from displacing U.S. workers, H-1B hires have been associated with an increase in total employment.
A 2008 study of the tech industry by the National Foundation for American Policy found that for every H-1B position requested, U.S. technology companies in the S&P 500 increase their employment by five workers. America must compete in a global economy, and if U.S. companies can't hire these skilled workers -- many of whom graduate from U.S. universities, by the way -- you can bet foreign competitors will.
Obama Takes On Market-Based Government
Obama Takes On Market-Based Government. By Thomas Frank
The GOP fetish for outsourcing deserved to be repudiated.
WSJ, Mar 11, 2009
In a little-noted passage of a mostly ignored speech, President Barack Obama said last week that the government faced "a real choice between investments that are designed to keep the American people safe and those that are designed to make a defense contractor rich."
It was notice, amid all the caterwauling over Mr. Obama's stimulus package, that the president meant to put the kibosh on the GOP's favorite method for spreading the wealth around. A prize ox of the conservative ascendancy was about to be gored.
Under the administration of George W. Bush, you will recall, federal spending grew pretty significantly. At the same time, the number of people directly employed by the federal government shrank. One of the factors that explained the difference was contracting. Guided by Mr. Bush's dictum that "government should be market-based," federal operations under his administration were encouraged to outsource wherever they could.
Indeed, agencies were graded on how many of their duties they opened to private-sector competition. By 2005, according to numbers published by Paul C. Light of New York University, contractor employees outnumbered federal workers four to one.
The new system was supposed to save money; it almost certainly hasn't. It was supposed to deliver quality; instead we witnessed the most spectacular government failures in years.
Politically, though, it did its job. For government workers, a class of citizens despised above all others in conservative mythology, the results were predictably dispiriting. For favored contractors and their lobbyists, on the other hand, the Bush years were a boom time like no other. Certain branches of the government, as per Mr. Obama's description, seemed designed merely to shovel money into contractors' pockets, and the revolving door between government and the companies that did its work spun like a giant roulette wheel.
Until they were bumped off the front pages by the unfolding economic disaster, the many failures of government-by-contractor made for riveting stuff. There was the 22-year-old accused of selling lousy munitions to our allies in Afghanistan. There was also the contractor feeding frenzy in the aftermath of Hurricane Katrina.
But things have changed. These days it doesn't even look like banks can be "market-based," much less government. Republican leaders have rediscovered their distaste for wasteful spending. Contractor bad-boy Blackwater has renamed itself "Xe," perhaps on the theory that it will be more difficult for the media to criticize their doings when readers confuse them with the symbol for the element xenon. And Mr. Obama announced his plan to reform the contracting system.
To judge by his rhetoric, the president was declaring the end of the era of government-by-contractor. "Far too often, [government] spending is plagued by massive cost overruns, outright fraud, and the absence of oversight and accountability," he said. "In some cases, contracts are awarded without competition. In others, contractors actually oversee other contractors."
In an accompanying memorandum, Mr. Obama called for new rules specifying exactly when contracting is appropriate -- since "always" is no longer an acceptable answer -- and also for a "review" of "existing contracts," by which we can only hope he means a thorough and public examination of the most outrageous contractor misconduct of the last eight years.
If so, it's easy to get an idea of where to start. The Senate's Democratic Policy Committee, which has been investigating contractor misbehavior in Iraq since 2003, has unearthed important details about the failure of Parsons Corp. to complete more than 20 of the 150 medical clinics it was hired to build in Iraq. The same committee heard from Bunnatine Greenhouse, a whistleblower who charged the Army Corps of Engineers with an unseemly affection for KBR, the company that manages military logistics in Iraq; and it also furnished a platform for Charles Smith, who once oversaw one of KBR's big contracts and who asserted that, after KBR had turned in a billion dollars' worth of "unsupported charges," the Pentagon blew by his objections and paid up.
This last fellow eventually concluded from his experience that "The interest of a corporation, KBR, not the interests of American soldiers or American taxpayers, seemed to be paramount." It could be the epitaph for an era.
There have always been government contractors, of course, and no doubt there always will be government contractors. But the way Washington manages its contracting isn't static or eternally unchanging. During the last administration, in particular, it was soaked in ideology, from the right's frothing against the federal work force to its cult of outsourcing.
As we embark on the greatest spending program since World War II, getting away from all that is a form of "post-partisanship" that even I will welcome.
The majority of "homeless" children live in homes, despite media claims
Double Trouble? By James Taranto
The majority of "homeless" children live in homes.
WSJ, Mar 11, 2009
Excerpts:
"One of every 50 American children experiences homelessness, according to a new report that says most states have inadequate plans to address the worsening and often-overlooked problem," the Associated Press reports from New York:
"These kids are the innocent victims, yet it seems somehow or other they get left out," said the [National Center on Family Homelessness] president, Dr. Ellen Bassuk. "Why are they America's outcasts?"
The report analyzes data from 2005-2006. It estimates that 1.5 million children experienced homelessness at least once that year, and says the problem is surely worse now because of the foreclosures and job losses of the deepening recession.
"If we could freeze-frame it now, it would be bad enough," said Democratic Sen. Robert Casey of Pennsylvania, who wrote a foreward [sic] to the report. "By end of this year, it will be that much worse."
Horrible if true. But is it true? Not so much. Believe it or not, it turns out that the majority of "homeless" children live in homes.
Seriously! The AP link above includes a graphic that breaks down the "living conditions of homeless children." Fifty-six percent of them are "doubled-up," defined as "sharing housing with other persons due to economic hardship." By this definition, the Meathead on "All in the Family" was "homeless."
Another 7% are listed as living in hotels--a category that, in the report itself, also includes motels, trailer parks and camping grounds. We'll give them campgrounds, but when you think of the homeless, are residents of hotels and trailer parks what come to mind?
Twenty-four percent of "homeless" children live in shelters, according to the AP graphic. That would seem to meet a commonsense definition of homelessness--but it turns out the number conflates those who live in two different types of shelters: "emergency" and "transitional." As the report defines the latter:
Transitional housing bridges the gap between emergency shelters and permanent housing--often providing more intensive services and allowing longer lengths of stay than emergency shelters. Transitional housing models arose in the mid-1980s, when communities realized that for some, emergency shelter services were not sufficient to ensure a permanent exit from homelessness. Transitional housing programs often have a specialized focus on particular barriers to stable housing and provide services and supports to address these issues. For example, programs may be designed exclusively for those fleeing domestic violence, struggling with addictions, or working to reunite with children in the foster care system.
It's not clear what percentage of "homeless" children are in emergency vs. transitional shelters, but the report does say that "Nationally, there are 29,949 units (i.e., housing for one family) of emergency shelter [and] 35,799 units of transitional housing." In any case, a substantial number of the "homeless" who are in "shelters" are actually in facilities the center itself calls "housing" and are on track to finding permanent homes.
The remaining "homeless" children are either "unsheltered" (3%) or "unknown/other" (10%). Among these are children "abandoned in hospitals," "using a primary nighttime residence that is a public or private place not designed for, or ordinarily used as, a regular sleeping accommodation for human beings," and "living in cars, parks, public spaces, abandoned buildings, substandard housing, bus or train stations."
People in most of these categories are plainly homeless--but note how the center slips "substandard housing" in there between abandoned buildings and bus stations. A child should not have to live in substandard housing, and maybe one who does deserves help at the taxpayer's expense. But a lousy home doesn't make you homeless any more than a lousy marriage makes you single.
The AP story is the work of four reporters: David Crary, who gets the byline, plus Linda Stewart Ball in Dallas, Daniel Shea in Little Rock, Ark., and Dionne Walker in Atlanta, who "contributed to this report." Despite all this manpower, it is nothing but a work of stenography. A group whose raison d'être is homelessness has an obvious interest in exaggerating the extent of the problem. The press's complicity is harder to explain.
[...]