Paraphrasing Macroeconomics: Understanding the Wealth of Nations. By David Miles, Imperial College, and Andrew Scott, London Business School. Chichester, UK: John Wiley & Sons, 2005
Soft start:
- [M]acroeconomics is far more than just an intellectual toolkit for understanding current events. It is also about understanding the long-term forces that drive the economy and shape the business environment
- [M]acroeconomics is about the economy as a whole ... how the whole economy evolves over time rather than on any one sector, region, or firm. Yet macroeconomics also considers the important issues from the perspective of the firm and/or the individual consumer. It is the overall, or aggregate, implications of the tens of thousands of individual decisions that companies and households make that generate the macroeconomic outcomes.
- Economics is the study of the allocation of scarce resources ... Not all these needs can be satisfied, but economics should be able to help you (and society) meet as many of them as possible.
- Market economies allocate resources through prices. Prices tell producers what the demand for a particular product is—if prices are high, then producers know the good is in demand, and they can increase production. If prices are low, producers know that demand for the product is weak, and they should cut back production. Thus the market ensures that society produces more of the goods that people want and less of those that they do not.
- Broadly speaking, economics has two components: microeconomics and macroeconomics. ... microeconomics essentially examines how individual units, whether they be consumers or
firms, decide how to allocate resources and whether those decisions are desirable.
- Macroeconomics studies the economy as a whole; it looks at the aggregate outcomes of
all the decisions that consumers, firms, and the government make in an economy. Macroeconomics is about aggregate variables such as the overall levels of output, consumption, employment, and prices—and how they move over time and between countries.
- In terms of prices, microeconomics focuses on, for instance, the price of a particular firm’s product, whereas macroeconomics focuses on the exchange rate (the price of one country’s money in terms of that of another country) or the interest rate (the price of spending today rather than tomorrow).
The Difference between Macro and Microeconomics
[A] gray area exists between micro and macroeconomics that relates to aggregation—at what point do the actions of a number of firms cease to be a microeconomic issue and become a macroeconomic issue?
- another way of outlining the differences ... In microeconomics the focus is on a small group of agents, say a group of consumers or two firms battling over a particular market. ... economists pay a great deal of attention to the behavior of the agents the model is focusing on ... make assumptions about what consumers want or how much they have to spend, or about whether the two firms are competing over prices or market share, and whether one firm is playing an aggressive strategy, and so on. The result is a detailed analysis of the way particular firms or consumers should behave in a given situation.
- [T]his microeconomic analysis does not explain what is happening in the wider economic environment. Think about consumers’ choice of what goods to consume. In addition to consumers’ own income and the price of the goods they wish to purchase, their decisions depend on an enormous amount of other information. How high is unemployment? Is the government going to increase taxes? Is the exchange rate about to collapse, requiring a sharp increase in interest rates? ... [I]f imported materials are important for the firm’s production process, then a depreciating currency will lead to higher import costs, reducing profit margins even before the firm engages in a price war.
- While none of these background influences—shifts in interest rates or movements in the exchange rate—are under the control of the firm or consumer, they still influence their decisions.
Macroeconomics analyzes the backdrop of economic conditions against which firms and consumers make decisions.
- The economy, as a whole, represents the outcome of decisions that millions of individual
firms and consumers make ... The inflation rate reflects the number of firms that are increasing prices and the amount by which each firm is raising prices ... all of the individual pricing decisions that millions of firms make determine the macroeconomic environment.
- While microeconomics is mainly concerned with studying in detail the decisions of a few agents, taking as given the basic economic backdrop, macroeconomics is about studying how the decisions of all the agents who make up the economy create this backdrop.
- Consider, for instance, the issue of whether a firm should adopt the latest developments in information technology (IT), which promise to increase labor productivity by, say, 20%. A microeconomic analysis of this topic would focus mainly on the costs the firm faces in adopting this technology and the likely productivity and profit gains that it would create. Macroeconomics would consider this IT innovation in the context of the whole economy. In particular, it would examine how, if many firms were to adopt this technology, costs in the whole economy would fall and the demand for skilled labor would rise... this would lead to an increase in wages and the firm’s payroll costs... [could] also shift demand away from unskilled towards skilled workers, causing the composition of unemployment and relative wages to change.
- The microeconomic analysis is one where the firm alone is contemplating adopting a new technology, and the emphasis is on the firm’s pricing and employment decisions, probably holding wages fixed... the analysis assumes the firm’s decisions do not influence the background economic environment. [T]he macroeconomic analysis examines the consequences when many firms implement the new technology and investigates how this affects economy-wide output, wages, and unemployment. [W]hich [form of analysis] is more appropriate depends on the issue to be analyzed and the question that needs to be answered.
Why to study macroeconomics - limitations, relevance
- Understanding macroeconomics is not simply a useful aspect of the public relations role of the business person; nor is it solely related to better understanding government policy.
- Economists distinguish between two types of uncertainty: aggregate and idiosyncratic. Aggregate uncertainty affects all firms and sectors in the economy; idiosyncratic uncertainty affects only a few individuals, firms, or industries. Macroeconomics is essentially about the aggregate sources of uncertainty that affect firms, workers, and consumers.
- [W]hich source of uncertainty is more important for individual wealth ... ? Evidence (covering firms and consumers) shows that the biggest source of uncertainty in the short term for most firms is the idiosyncratic component. All firms should worry about loss through illness of key personnel, major clients canceling contracts, litigation, fire and theft, and so forth.
- For households, or individuals, idiosyncratic risk is also generally more important than systematic (or aggregate) uncertainty. Whether you pass an exam; how well you get along with your first boss; whether you avoid serious illness in your forties and fifties—for [most people] these are likely to be more important for their standard of living over their lifetime than ... aggregate output or ... inflation.
- Consider ... unemployment. In recessions unemployment rises, but not everyone becomes unemployed. Most people carry on with their regular job even through the worst recessions. ...
Therefore, the aggregate measure of unemployment, while important, gives an incomplete
picture of what is happening to individuals in the labor market. Idiosyncratic factors are significant—even during the worst recession some firms ... will be doing well and hiring workers; it is just that more firms are doing badly.
- This does not mean macroeconomics is unimportant to business: see the Netherlands case in the early 1980s, and again in the early 1990s (p. 9).
- Aggregate uncertainty is also important because it generates a type of risk that, by definition, all firms and consumers share ... the only source of uncertainty that is fully portable between jobs in different industries is aggregate uncertainty.
- [O]nly a small part of corporate uncertainty in any one year is due to aggregate or macroeconomic uncertainty. However, the further ahead one looks, the more important aggregate uncertainty becomes.
CONCEPTUAL QUESTIONS
1. (Section 1.1) What factors do you think explain why the United States is so rich and Bangladesh is so poor? What do you think accounts for the growth that most economies have shown?
Answer: Despite ideological disputes—proposals to tax the rich more heavily in some Swiss cantons or in New York State, an SPD candidate calling big-business owners locusts, etc.—there is a consensus (see Obama's 2009 inaugural address) that a relatively free market economy and the rule of law are essential for prosperity. Bangladesh's plight can be summarized by saying that the country has neither of those essentials, nor many of the other ingredients of growth. It even had an army mutiny this year, which is not exactly what investors need in order to have confidence.
2. (1.3) Figure 1.5 shows the pattern of bankruptcies and interest rates in the Netherlands. What do you think might account for this pattern? What can firms do to try to minimize this cyclical risk of bankruptcy?
A.: A plausible explanation is that those spikes in interest rates were costly for many companies. As for minimizing that risk and its real effects, I cannot think of anything of much value: if rates go from 7% to 12%, or from 3% to 6%, what can you do? It is a tragedy.
3. (1.3) Figure 1.6 shows the real price of oil since 1913. What other industries besides automobile manufacturers are affected by fluctuations in oil prices, and how are they affected? What are the effects on individual consumers? How do you suppose that national economies of oil-producing nations are affected by changes in oil prices? What about the economies of countries that use a lot of imported oil?
A.: Transportation of all goods, food producers, almost everyone—including individuals and countries that import oil—will see their monthly bills go up for the same or less productive work, with fewer customers and less spending by those customers. Oil-exporting countries, to some extent, get more for less, so they can pay debt (to foreign investors) and debts (pending pensions, civil servants' paychecks, etc.).
4. (1.4) Consider the differing impact of microeconomic and macroeconomic factors on the near-term prospects of:
(a) a graduating student
(b) a restaurant in a village
(c) a restaurant in an airport
(d) a manufacturer of low-price cars
(e) a manufacturer of luxury sports cars
ANALYTICAL QUESTIONS
1. (Section 1.1) Consider the data in Figure 1.2. What growth rate will Bangladesh have to
show to catch up with the 2002 U.K. and U.S. per capita income level within 10 years? 20
years? 30 years? How would population growth affect your calculations?
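The arithmetic behind this question is the compound-growth identity: if income per head starts at y0 and must reach y_target in n years, the required constant growth rate is g = (y_target/y0)^(1/n) - 1. A minimal sketch follows; the dollar figures are illustrative assumptions only, since Figure 1.2 is not reproduced here.

```python
# Required constant annual growth rate to reach a target income level,
# from the compound-growth identity: y0 * (1 + g)**n = y_target
#   =>  g = (y_target / y0) ** (1 / n) - 1

def required_growth(y0, y_target, years):
    """Constant annual growth rate needed to grow y0 to y_target in `years`."""
    return (y_target / y0) ** (1.0 / years) - 1.0

# Illustrative figures only (not taken from Figure 1.2): assume Bangladesh
# per capita income ~ $1,700 and the 2002 U.S. level ~ $36,000.
y_bgd, y_us = 1_700, 36_000

for n in (10, 20, 30):
    print(f"{n} years: {required_growth(y_bgd, y_us, n):.1%} per year")
```

On these assumed numbers the required rates come out implausibly high over 10 years and still very demanding over 30. Population growth makes the target harder still: the rates above are per capita, so total GDP must grow by roughly the per-capita rate plus the population growth rate.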
2. (1.3) Consider an economy made up of five equal-sized firms (labeled A to E). Under one scenario the output of each firm alternates between
Firm A B C D E
Output 1 2 3 4 5
And
Firm A B C D E
Output 5 4 3 2 1
What is the balance between idiosyncratic and aggregate risk in this economy?
A.: Aggregate risk is nil, since total output (15), employment, etc., is the same after the change as before it. But the idiosyncratic damage to firm E is large, and the gains for firm A are equally enormous.
How does your answer change if each firm oscillates between
Firm A B C D E
Output 3 3 3 3 3
And
Firm A B C D E
Output 4 4 4 4 4
A.: Here the risk is entirely aggregate: total output swings between 15 and 20, but every firm moves in lockstep, so idiosyncratic risk disappears. Since all companies earn more in the good state, employees will probably also claim a bigger chunk of the pie, so in the end there will be pressure on costs that will be passed on to the consumer.
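The split the question is driving at can be computed directly: the aggregate component is the change in total output between the two states, and the idiosyncratic component is each firm's change relative to the average firm's change. A small sketch (function name hypothetical):

```python
# Decompose the output change between two states into an aggregate part
# (the change in total output) and an idiosyncratic part (each firm's
# deviation from the average firm's change).

def decompose(state1, state2):
    n = len(state1)
    total_change = sum(state2) - sum(state1)   # aggregate component
    avg_change = total_change / n
    # idiosyncratic component: firm-specific change net of the common change
    idio = [(b - a) - avg_change for a, b in zip(state1, state2)]
    return total_change, idio

# Scenario 1: outputs reverse order -> totals equal (15 vs 15), no aggregate risk
agg1, idio1 = decompose([1, 2, 3, 4, 5], [5, 4, 3, 2, 1])
print(agg1, idio1)   # 0 and [4.0, 2.0, 0.0, -2.0, -4.0]

# Scenario 2: every firm moves from 3 to 4 -> purely aggregate risk
agg2, idio2 = decompose([3, 3, 3, 3, 3], [4, 4, 4, 4, 4])
print(agg2, idio2)   # 5 and [0.0, 0.0, 0.0, 0.0, 0.0]
```

The two scenarios are polar cases: the first is all idiosyncratic risk (total output never moves), the second all aggregate risk (firms never move relative to one another).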
Monday, May 11, 2009
WaPo: The White House should join lawmakers in reforming the state secrets doctrine
Securing Lawsuits. WaPo Editorial
The White House should join lawmakers in reforming the state secrets doctrine.
WaPo, Monday, May 11, 2009
ON CONSECUTIVE days last month, a federal appeals court and the president of the United States revealed that they had come to the same conclusion: The state secrets doctrine, which has been used to shut down litigation that the government claims to be risky for national security, needs to be revamped.
During an April 29 news conference, President Obama called the doctrine "overbroad." "I think it is appropriate to say that there are going to be cases in which national security interests are genuinely at stake and that you can't litigate without revealing covert activities or classified information that would genuinely compromise our safety," Mr. Obama said, but he added, "There should be some additional tools, so that it's not such a blunt instrument."
Yet the Obama administration seized on the most blunt interpretation of the state secrets doctrine this year in a lawsuit brought by five men who sued Jeppesen DataPlan, claiming that the Boeing subsidiary helped the Bush administration carry out extraordinary renditions that led to their torture. The Obama Justice Department -- like the Bush administration -- argued that no part of the case could be litigated without the threat of compromising national security.
The day before the president's news conference, the U.S. Court of Appeals for the 9th Circuit roundly rejected the administration's assertions. In an April 28 opinion, the court ruled that a trial judge should proceed with the case and evaluate individual claims of government secrecy. If a piece of evidence is deemed too sensitive, it may be stricken, but the plaintiff may still try to prove his case using unclassified evidence.
The 9th Circuit's decision squarely conflicts with a 2007 decision of the Richmond-based 4th Circuit to throw out the case of Khaled al-Masri, a German national who was seized in 2004 by U.S. operatives and allegedly tortured. The Justice Department could appeal the 9th Circuit decision, so as to allow the Supreme Court to resolve the conflict. But a better course would be to begin working with lawmakers to fine-tune the proposed State Secrets Protection Act.
The bill, originally championed by Sen. Edward M. Kennedy (D-Mass.) and recently reintroduced by Senate Judiciary Committee Chairman Patrick J. Leahy (D-Vt.), would allow judges to privately review information that the government claims is too sensitive for public dissemination. If a specific piece of evidence was deemed too sensitive, the bill would allow the government to provide unclassified summaries of evidence to plaintiffs' lawyers with appropriate security clearances. If even that proved unworkable, the judge could exclude the evidence, but the entire case would not have to be dismissed.
The executive's prerogatives to protect national security must be respected, as must the rights of private litigants to have a fighting chance in court. Passage of the State Secrets Protection Act would ensure that these factors are weighed fairly no matter who sits in the Oval Office.
Saturday, May 9, 2009
WaPo: The administration's inflexible policy on lobbyists and lobbying is having some perverse effects
Cutting Off Competence. WaPo Editorial
The administration's inflexible policy on lobbyists and lobbying is having some perverse effects.
WaPo, Saturday, May 9, 2009
IN THE WAKE of the Jack Abramoff scandal, President Obama was right to err on the side of strictness in restricting lobbyists entering government, and we supported his rule. But we think it's worth asking whether the costs are outweighing the benefits.
The president was right to slow the revolving door so that those who serve in his administration cannot cash in quickly after leaving. The rules will prevent senior officials from lobbying the executive branch -- not just the departments in which they served -- for the remainder of Mr. Obama's service.
On the entry side of the revolving door, the administration barred those who had been registered lobbyists in the past two years from serving in departments that they had lobbied, even if their government work would not involve issues on which they had lobbied. This limitation had its silly aspects -- you want health-care experts at the Department of Health and Human Services, for example, and some very good health-care experts have been registered lobbyists -- but the administration built in some needed flexibility by allowing for waivers.
However, after an early waiver given to Raytheon Corp.'s chief in-house lobbyist, William J. Lynn, to become deputy defense secretary, the administration was bombarded with accusations of hypocrisy. The lesson that Team Obama took from this was not to use its waivers more wisely but to crack down on them -- and to broaden the prohibition, in practice, to exclude lobbyist candidates from consideration even for jobs in departments or agencies that they hadn't lobbied. As a result, too many qualified candidates have been denied positions for which they are suited simply because of a lobbyist taint.
This approach could have the perverse consequence of driving lobbying underground and reducing the openness that the Obama administration says it wants to promote. The decision about whether to register as a lobbyist isn't always clear-cut; in the past, many people registered out of an abundance of caution. Now, some are saying privately that they will avoid registering if at all possible, shedding less sunlight on lobbying activities.
The Obama administration is also making a mistake by barring lobbyists from, well, lobbying it in some circumstances. The administration's rules on distributing stimulus funds bar registered lobbyists from telephoning or meeting with government officials about specific projects; they can make contact only in writing, with documents to be posted on the government's Web site. We understand the good-government impetus here. But why distinguish between lobbyists and corporate executives or local government officials seeking the funds, who have the biggest interests at stake? The rules are up for review soon. They should be rethought.
Friday, May 8, 2009
Australia prepares for U.S. decline
A Pacific Warning. WSJ Editorial
Australia prepares for U.S. decline.
WSJ, May 08, 2009
Since World War II, U.S. military dominance has underpinned the Asia-Pacific region's prosperity and relative peace. So it's cause for concern when one of America's closest allies sees that power ebbing amid unstable nuclear regimes such as Pakistan and North Korea and the expanding military power of China.
In the preface to a sweeping defense review released Saturday, Australian Defense Minister Joel Fitzgibbon writes: "The biggest changes to our outlook . . . have been the rise of China, the emergence of India and the beginning of the end of the so-called unipolar moment; the almost two-decade-long period in which the pre-eminence of our principal ally, the United States, was without question."
Australia isn't forecasting the end of U.S. dominance soon; the report predicts that will continue through 2030. There are also a few bright spots, such as a stronger India and the emergence of Indonesia as a stable democratic ally.
But without sustained U.S. defense spending and focus on the Asia-Pacific, it's unclear which nation will ultimately dominate the region -- and that could have profound effects on security and trade. The clearest challenge comes from China, which the Pentagon estimates spent $105 billion to $150 billion in 2008 bulking up its forces. Australia also worries about instability among its Pacific island neighbors, terrorism, the proliferation of weapons of mass destruction and emerging threats like cyber war.
In response to this outlook, Canberra is retooling its defense. It is doubling the size of its submarine fleet to 12 from six and buying about 100 Joint Strike Fighters, three destroyers and eight frigates. The ships and subs will be equipped with cruise missiles. It will also upgrade its army and special forces units and look for new ways to cooperate with the U.S. and other regional democratic powers.
Prime Minister Kevin Rudd said Saturday: "Some have argued that in the global economic recession we should reduce defense spending to ease the pressure on the budget. But the government believes the opposite to be true. In a period of global instability Australia must invest in a strong, capable and well resourced defense force."
Australia currently spends around $13.1 billion a year on defense, not counting money for new equipment. The new policy paper says spending will increase by 3% annually until 2018, which isn't much. But the importance of Canberra's message is about priorities. Australia is worried about the end of the "unipolar moment." Americans and Asians should be worried too.
Thursday, May 7, 2009
Review of Mansoor's Baghdad at Sunrise: Rediscovering counterinsurgency in Iraq
The Learning Curve, by Mackubin Thomas Owens
Rediscovering counterinsurgency in Iraq.
The Weekly Standard, May 11, 2009, Volume 014, Issue 32
Review of Baghdad at Sunrise
A Brigade Commander's War in Iraq
by Peter R. Mansoor
Yale, 416 pp., $28
Some years ago, the late Carl Builder of RAND wrote a book entitled The Masks of War, in which he demonstrated the importance of the organizational cultures of the various military services. His point was that each service possesses a preferred way of fighting that is not easily changed.
Rediscovering counterinsurgency in Iraq.
The Weekly Standard, May 11, 2009, Volume 014, Issue 32
Review of Baghdad at Sunrise
A Brigade Commander's War in Iraq
by Peter R. Mansoor
Yale, 416 pp., $28
Some years ago, the late Carl Builder of RAND wrote a book entitled The Masks of War, in which he demonstrated the importance of the organizational cultures of the various military services. His point was that each service possesses a preferred way of fighting that is not easily changed.
Since the 1930s the culture of the U.S. Army has emphasized "big wars." This is the legacy of Emory Upton, an innovative 19th-century officer who became a protégé of William Tecumseh Sherman when Sherman became general-in-chief of the Army after the Civil War. Upton believed that the traditional constabulary focus of the Army was outdated. Dispatched on a world tour by Sherman, Upton was especially impressed by Prussian military policy, Prussia's ability to conduct war against the armies of other military powers, and its emphasis on professionalism. Certainly Prussia's overwhelming successes against Denmark, Austria, and France in the Wars of German Unification (1864-71) made the Prussian Army the new exemplar of military excellence in Europe.
Upon his return home, Upton proposed a number of radical reforms, including replacing the citizen-soldier model with one based on a professional soldiery, reducing civilian "interference" in military affairs, and abandoning the emphasis on the constabulary operations that had characterized Army roles during most of the 19th century (with the exception of the Mexican and Civil wars) in favor of preparing for a conflict with a potential foreign enemy.
Given the tenor of the time, all of his proposals were rejected. In ill health, Upton resigned from the Army and, in 1881, committed suicide. But the triumph of progressivism, a political program that placed a great deal of reliance on scientific expertise and professionalism, the end of the Army's constabulary duties on the Western frontier, and the problems associated with mobilizing for and fighting the Spanish American War, made Upton's proposed reforms more attractive, especially within the officer corps. In 1904 Secretary of War Elihu Root published Upton's Military Policy of the United States, and while many of Upton's more radical proposals remained unacceptable to republican America, the idea of reorienting the Army away from constabulary duties to a mission focused on defeating the conventional forces of other states caught on.
While the Army returned to constabulary duties after World War I, Upton's spirit now permeated the professional culture. World War II vindicated Upton's vision, and his view continued to govern Army thinking throughout the Cold War. The American Army that entered Iraq in 2003 was still Emory Upton's Army. Focused as it has been on state-versus-state warfare, Upton's army has not cared much for counterinsurgency, and this was apparent during the first years of the Iraq War. It is also the theme of several recent books on the conflict.
Baghdad at Sunrise is one of the best, written by a colonel who commanded the 1st Brigade of the 1st Armored Division during a particularly difficult year (May 2003-July 2004), a period that saw the rapid coalition victory over Saddam Hussein give way to a vicious insurgency that came close to defeating the United States in Iraq. A genuine soldier-scholar, Colonel Mansoor provides the unique perspective of a midlevel ground commander adapting to the requirements of fighting an insurgency under the most difficult conditions.
His perspective is enhanced by the fact that, two-and-a-half years after redeploying his brigade to Germany, he returned as executive officer to Gen. David Petraeus as Petraeus implemented the "surge" and the counterinsurgency strategy that helped turn the situation around in Iraq. Mansoor not only observed but helped to implement the Army's painful transition from an organization beholden to Emory Upton to one that recognized the necessity to adapt to an enemy who refused to fight the Upton way.
The conventional wisdom holds that it was civilian interference, especially on the part of Donald Rumsfeld, that was to blame for the difficulties U.S. forces faced in Iraq during the first years of the campaign. According to the dominant narrative, Rumsfeld willfully ignored military advice and initiated the war with a force that was too small. He ignored the need to prepare for post-conflict stability operations, and he failed to adapt to the new circumstances once things began to go wrong, not foreseeing the insurgency that engulfed the country.
It is undeniable that Rumsfeld made many critical mistakes. But the uniformed military was no more prescient than he. Did Rumsfeld insist on an early attack with a smaller force than that recommended by many uniformed officers? Yes. But the plan he pushed was a version of a scheme developed by an Army officer, Col. Douglas Macgregor. The military objective of this plan was not to occupy the country but to liberate Iraq from Saddam and turn governance over to liberal Iraqis. The approach was popular with both Rumsfeld and the military because both took their bearings from the Weinberger Doctrine, a set of rules for the use of force drafted in the 1980s which emphasized the quick, overwhelming application of military force to defeat an enemy, leaving postwar affairs to others.
Did Rumsfeld ignore postwar planning? Again, yes. But in doing so he was merely ratifying the preferences of a uniformed military that had internalized the Weinberger emphasis on an "exit strategy." The fact is that if generals are thinking about an exit strategy they are not thinking about "war termination"--how to convert military success into political success. This cultural aversion to stability operations is reflected in the fact that operational planning for Operation Iraqi Freedom took 18 months while planning for postwar stabilization began half-heartedly only a couple of months before the invasion.
Did Rumsfeld foresee the insurgency and the shift from conventional to guerrilla war? No. But neither did his critics in the uniformed services. Mansoor makes this point clear by observing that, for at least the three decades before the Iraq war, the professional military education system all but ignored counterinsurgency operations. This cultural aversion to counterinsurgency lay at the heart of the difficult years in Iraq (2003-07), and in the absence of a counterinsurgency doctrine the Army fell back on what it knew: conventional offensive operations designed to kill the enemy without protecting the population.
The Army's predisposition toward offensive operations was reinforced in the 1990s by a sort of operational "happy talk" that convinced many (who should have known better) that the American edge in emerging technologies, especially information technologies, would permit the United States to conduct short, decisive, and relatively bloodless campaigns. This was the lesson many learned from the first Gulf war, and the result was an approach that goes under the name of Rapid Decisive Operations. Mansoor observes that Rapid Decisive Operations misunderstood the timeless nature of war: "What we learned [in Iraq]," he writes, "was that the real objective of the war was not merely the collapse of the old regime but the creation of a stable government." As the old saying goes, in war the enemy has a vote, and in the case of Iraq, our adversaries voted not to fight the kind of war Americans preferred.
As the conflict morphed into an insurgency, U.S. ground troops responded by going after the insurgents, adapting conventional tactics to a guerrilla war. In The Gamble Thomas Ricks quotes a speech by an Army officer that captures the essence of the U.S. approach in Iraq until 2007: "Anytime you fight, you always kill the other sonofabitch. Do not let him live today so he will fight you tomorrow. Kill him today."
This approach made sense when the insurgents stood and fought, as they did in Falluja in April and November 2004. It also made sense during the subsequent "rivers campaign" of 2005, designed to destroy the insurgency in al Anbar Province by depriving it of its base and infrastructure in the Sunni Triangle and the "ratlines" west and northwest of Falluja. It unquestionably killed thousands of insurgents, including Abu Musab al Zarqawi, the leader of Al Qaeda in Iraq, as well as many of his top lieutenants, and led to the capture of many more. Intelligence from captured insurgents, as well as from Zarqawi's computer, had a cascading effect, permitting the coalition to maintain pressure on the insurgency.
But while these operations succeeded in disrupting the insurgency, there were too few troops to maintain control of the towns of al Anbar. The insurgents, abandoning their Falluja approach of standing and fighting the Americans, simply melted away, only to return after coalition troops had departed. Thus, while soldiers and Marines were chasing insurgents from sanctuary to sanctuary, they were not providing security for the Iraqi population, leaving them at the mercy of the insurgents who terrorized and intimidated them.
As the insurgency metastasized in 2005 the United States had three military alternatives: continue offensive operations along the lines of those in Anbar after Falluja; adopt a counterinsurgency approach; or emphasize the training of Iraqi troops in order to effect a transition to Iraqi control of military operations. Gen. John Abizaid of Central Command, and Gen. George Casey, the overall commander in Iraq, chose the third option, supported by Rumsfeld and Joint Chiefs chairman Gen. Richard Myers.
But while moving toward Iraqi control was a logical option for the long run, it did little to solve the proximate problem of the insurgency, which had generated sectarian violence. Based on the belief of many senior commanders, especially General Abizaid, that U.S. troops were an "antibody" to Iraqi culture, U.S. forces were consolidated on large "forward operating bases," maintaining a presence only by means of motorized patrols that were particularly vulnerable to attacks by IEDs. In so doing, we ceded territory and population alike to the insurgents. Mansoor describes this approach as a mistake: "Security of the population is the fundamental basis of any successful counterinsurgency strategy."
The withdrawal of American forces to forward operating bases also contributed to a "kick-in-the-door" mentality among troops when they did interact with Iraqis. This was completely at odds with effective counterinsurgency practice, seriously undermining attempts to pacify the country. And yet, despite many difficulties (including resistance from above), some Army and Marine commanders had been implementing a counterinsurgency approach on their own initiative; that is to say, forming partnerships with the Sunni sheikhs in al Anbar Province who had tired of al Qaeda's reign of terror in the Sunni Triangle. By providing security to the people in cooperation with the sheikhs, the Americans were able to isolate Al Qaeda in Iraq. And as U.S. commanders were struggling with the insurgency, the Army and Marine Corps were developing a counterinsurgency doctrine based on this insight, and an operational strategy that would be applied successfully as part of the surge in 2007.
As a close associate of General Petraeus, Colonel Mansoor helped serve as midwife to the remarkable shift in Iraq arising from a more general application of the lessons that he had learned during his 2003-04 command. This new approach rejected the position articulated by Petraeus's predecessor, General Casey, who had told President George W. Bush in 2006 that "to win, we have to draw down." And General Abizaid of Central Command, sticking to his belief that American soldiers were an "antibody" to Iraqi culture, seconded Casey.
But Petraeus agreed with Mansoor's observation that "counterinsurgency is a thinking soldier's war," requiring "the counterinsurgent to adapt faster than the insurgent." The time for applying a new approach was at hand, and to his credit, President Bush saw the necessity for change and took action.
One of the debates triggered by our experience in Iraq concerns U.S. force structure. As Mansoor puts it, "If we accept the premise that [counterinsurgency and] stability operations [are] of primary concern, then the Army's organization for combat should [be] different." This debate pits the "long war" school against "traditionalists." The former argues that Iraq and Afghanistan are most characteristic of the protracted and ambiguous wars America will fight in the future, and that the military should be developing a force designed to fight the "long war" on terrorism, which envisions the necessity of preparing for small wars, or insurgencies.
The traditionalists concede that irregular warfare will occur more frequently in the future and that fighting small wars is difficult. But traditionalists also conclude that such conflicts do not threaten U.S. strategic interests, while large-scale conflicts, which they believe remain a real possibility, will threaten strategic interests. They fear that the "long war" school's focus on small wars and insurgencies will transform the Army back into a constabulary force, whose new capability for conducting stability operations and "nation-building" would be purchased at a high cost: the inability to conduct large-scale conventional war.
This is by no means a parochial debate, of interest only to the uniformed military, and its outcome has implications for broader national security policy: A force structure aligned with the requirement to fight conventional wars would make it more difficult for the United States to fight small wars. This may be a legitimate choice for the United States, but it is one that should be made by policymakers, and not delegated to the uniformed military. To do so would permit military decisions to constrain policy and strategy questions that lie well within the purview of civilian authority, and our experiences in Vietnam and Iraq demonstrate the dangers of leaving military doctrine and force structure strictly to the military.
Mackubin Thomas Owens is editor of Orbis, the journal of the Foreign Policy Research Institute, and professor of national security affairs at the Naval War College.
The Justice Department, torture and the Demjanjuk deportation case
The Justice Department’s Torture Hypocrisy. By Andrew C. McCarthy
Investigate Bush lawyers’ torture analysis one day, cite it favorably the next.
NRO, May 6, 2009 1:30 PM
Federal President's attitude on free trade with Latin America
A Welcome Shift. By Jaime Daremblum
Obama appears to be moving in the right direction on free trade with Latin America.
The Weekly Standard, May 07, 2009 12:00:00 AM
Federal President's attitude toward the rule of law, Chrysler & UAW
White House puts UAW ahead of property rights. By Michael Barone
Washington Examiner, May 05, 2009
Last Friday, the day after Chrysler filed for bankruptcy, I drove past the company’s headquarters on Interstate 75 in Auburn Hills, Mich.
As I glanced at the pentastar logo I felt myself tearing up a little bit. Anyone who grew up in the Detroit area, as I did, can’t help but be sad to see a once great company fail.
But my sadness turned to anger later when I heard what bankruptcy lawyer Tom Lauria said on a WJR talk show that morning. “One of my clients,” Lauria told host Frank Beckmann, “was directly threatened by the White House and in essence compelled to withdraw its opposition to the deal under threat that the full force of the White House press corps would destroy its reputation if it continued to fight.”
Lauria represented one of the bondholder firms, Perella Weinberg, which initially rejected the Obama deal that would give the bondholders about 33 cents on the dollar for their secured debts while giving the United Auto Workers retirees about 50 cents on the dollar for their unsecured debts.
This of course is a violation of one of the basic principles of bankruptcy law, which is that secured creditors — those who lent money only on the contractual promise that if the debt was unpaid they’d get specific property back — get paid off in full before unsecured creditors get anything. Perella Weinberg withdrew its objection to the settlement, but other bondholders did not, which triggered the bankruptcy filing.
After that came a denunciation of the objecting bondholders as “speculators” by Barack Obama in his news conference last Thursday. And then death threats to bondholders from parties unknown.
The White House denied that it strong-armed Perella Weinberg. The firm issued a statement saying it decided to accept the settlement, but it pointedly did not deny that it had been threatened by the White House. Which is to say, the threat worked.
The same goes for big banks that have received billions in government Troubled Asset Relief Program money. Many of them want to give back the money, but the government won’t let them. They also voted to accept the Chrysler settlement. Nice little bank ya got there, wouldn’t want anything to happen to it.
Left-wing bloggers have been saying that the White House’s denial of making threats should be taken at face value and that Lauria’s statement is not evidence to the contrary. But that’s ridiculous. Lauria is a reputable lawyer and a contributor to Democratic candidates. He has no motive to lie. The White House does.
Think carefully about what’s happening here. The White House, presumably car czar Steven Rattner and deputy Ron Bloom, is seeking to transfer the property of one group of people to another group that is politically favored. In the process, it is setting aside basic property rights in favor of rewarding the United Auto Workers for the support the union has given the Democratic Party. The only possible limit on the White House’s power is the bankruptcy judge, who might not go along.
Michigan politicians of both parties joined Obama in denouncing the holdout bondholders. They point to the sad plight of UAW retirees not getting full payment of the health care benefits the union negotiated with Chrysler. But the plight of the beneficiaries of the pension funds represented by the bondholders is sad too. Ordinarily you would expect these claims to be weighed and determined by the rule of law. But not apparently in this administration.
Obama’s attitude toward the rule of law is apparent in the words he used to describe what he is looking for in a nominee to replace Justice David Souter. He wants “someone who understands justice is not just about some abstract legal theory,” he said, but someone who has “empathy.” In other words, judges should decide cases so that the right people win, not according to the rule of law.
The Chrysler negotiations will not be the last occasion for this administration to engage in bailout favoritism and crony capitalism. There’s a May 31 deadline to come up with a settlement for General Motors. And there will be others. In the meantime, who is going to buy bonds from unionized companies if the government is going to take their money away and give it to the union? We have just seen an episode of Gangster Government. It is likely to be part of a continuing series.
Washington Examiner, May 05, 2009
Last Friday, the day after Chrysler filed for bankruptcy, I drove past the company’s headquarters on Interstate 75 in Auburn Hills, Mich.
As I glanced at the pentagram logo I felt myself tearing up a little bit. Anyone who grew up in the Detroit area, as I did, can’t help but be sad to see a once great company fail.
But my sadness turned to anger later when I heard what bankruptcy lawyer Tom Lauria said on a WJR talk show that morning. “One of my clients,” Lauria told host Frank Beckmann, “was directly threatened by the White House and in essence compelled to withdraw its opposition to the deal under threat that the full force of the White House press corps would destroy its reputation if it continued to fight.”
Lauria represented one of the bondholder firms, Perella Weinberg, which initially rejected the Obama deal that would give the bondholders about 33 cents on the dollar for their secured debts while giving the United Auto Workers retirees about 50 cents on the dollar for their unsecured debts.
This of course is a violation of one of the basic principles of bankruptcy law, which is that secured creditors — those who lent money only on the contractual promise that if the debt went unpaid they’d get specific property back — get paid off in full before unsecured creditors get anything. Perella Weinberg withdrew its objection to the settlement, but other bondholders did not, which triggered the bankruptcy filing.
After that came a denunciation of the objecting bondholders as “speculators” by Barack Obama in his news conference last Thursday. And then death threats to bondholders from parties unknown.
The White House denied that it strong-armed Perella Weinberg. The firm issued a statement saying it decided to accept the settlement, but it pointedly did not deny that it had been threatened by the White House. Which is to say, the threat worked.
The same goes for big banks that have received billions in government Troubled Asset Relief Program money. Many of them want to give back the money, but the government won’t let them. They also voted to accept the Chrysler settlement. Nice little bank ya got there, wouldn’t want anything to happen to it.
Left-wing bloggers have been saying that the White House’s denial of making threats should be taken at face value and that Lauria’s statement is not evidence to the contrary. But that’s ridiculous. Lauria is a reputable lawyer and a contributor to Democratic candidates. He has no motive to lie. The White House does.
Think carefully about what’s happening here. The White House, presumably car czar Steven Rattner and deputy Ron Bloom, is seeking to transfer the property of one group of people to another group that is politically favored. In the process, it is setting aside basic property rights in favor of rewarding the United Auto Workers for the support the union has given the Democratic Party. The only possible limit on the White House’s power is the bankruptcy judge, who might not go along.
Michigan politicians of both parties joined Obama in denouncing the holdout bondholders. They point to the sad plight of UAW retirees not getting full payment of the health care benefits the union negotiated with Chrysler. But the plight of the beneficiaries of the pension funds represented by the bondholders is sad too. Ordinarily you would expect these claims to be weighed and determined by the rule of law. But not apparently in this administration.
Obama’s attitude toward the rule of law is apparent in the words he used to describe what he is looking for in a nominee to replace Justice David Souter. He wants “someone who understands justice is not just about some abstract legal theory,” he said, but someone who has “empathy.” In other words, judges should decide cases so that the right people win, not according to the rule of law.
The Chrysler negotiations will not be the last occasion for this administration to engage in bailout favoritism and crony capitalism. There’s a May 31 deadline to come up with a settlement for General Motors. And there will be others. In the meantime, who is going to buy bonds from unionized companies if the government is going to take their money away and give it to the union? We have just seen an episode of Gangster Government. It is likely to be part of a continuing series.
WSJ Editorial Page on difficulties with Guantanamo and detainees
Obama's Gitmo Mess. WSJ Editorial
So where is the Pentagon going to send the Yemenis?
WSJ, May 07, 2009
On his second day in office, President Obama ordered the Pentagon to mothball Guantanamo within one year, purportedly to reclaim the "moral high ground." That earned applause from the anti-antiterror squadrons, yet it is now causing all kinds of practical and political problems in what used to be known as the war on terror.
This mess grew even more chaotic this week, when Democrats refused the Administration's $50 million budget request to transfer some of the remaining 241 Gitmo detainees to a prison likely to be somewhere in the U.S. and perhaps to a new one built with taxpayer dollars. "What do we do with the 50 to 100 -- probably in that ballpark -- who we cannot release and cannot try?" Defense Secretary Robert Gates recently asked Congress.
The best answer is Gitmo. But the antiwar left wants terrorists treated like garden-variety criminals in the civilian courts or maybe military courts martial. The not-so-minor problem is that even states that send leftists to Congress don't want to host Gitmo-II. Think California, where Alcatraz could be an option. The abandoned San Francisco Bay prison has Gitmo's virtue of relative isolation -- but Senator Dianne Feinstein, the chairman of the Intelligence Committee, claims it is a national treasure. The terrorist-next-door problem is also rising to a high boil in Kansas politics, given that Fort Leavenworth is being eyed too.
More urgently, the Administration risks losing all control once enemy combatants set foot on formal U.S. soil, which the courts could determine entitles the terrorists to the same Constitutional protections as U.S. citizens. One federal judge has already ordered that 17 detainees -- the Uighurs, a Chinese ethnic minority -- be released domestically. Another judge has ruled that the Supreme Court's 5-4 Boumediene decision, which granted detainees the right to file habeas petitions in U.S. courts, extends to Bagram Air Base in Afghanistan, where the military is holding three times as many prisoners as Guantanamo.
In his Boumediene dissent, Chief Justice John Roberts indicted the majority's "set of shapeless procedures to be defined by federal courts at some future date," and was he ever right. How will judges prevent the public disclosure of classified material? What about Miranda rights, or evidence obtained under battlefield conditions?
Such questions nearly scuttled the Justice Department's case against Ali Saleh Kahlah al-Marri, which flamed out last week with a sentence of only 15 years. According to the plea agreement, al-Marri entered the U.S. on September 10, 2001 on orders from Khalid Sheikh Mohammed to begin research on chemical weapons and potential targets. Prosecutors were hampered by the possibility of disclosing intelligence sources and methods, as well as (yet another) political flare-up about interrogation and detention.
For these reasons and more, the Obama Administration has done a 180-degree turn on George W. Bush's military commissions. Mr. Obama called this meticulous legal process "an enormous failure" during his campaign and suspended it when he cashiered Gitmo, but now Mr. Gates says it is "still very much on the table." The Administration may soon announce that it will be reactivated, with a few torques to the rules of secrecy and evidence to attempt to appease the human-rights lobby.
The hardest Gitmo cases are those prisoners who are known to be dangerous or were actively involved in terror networks but haven't committed crimes per se. Others involve evidence that is insufficient for successful prosecutions but sufficient to determine that release or transfer would pose a grave security risk. Many of these detainees are Yemeni, and the Yemeni government is demanding that Washington repatriate them.
That would be an unmitigated disaster, whatever Yemen's promises of rehabilitation. Director of National Intelligence Dennis Blair recently reported that Yemen "is re-emerging as a jihadist battleground and potential regional base of operations for al Qaeda to plan internal and external attacks, train terrorists and facilitate the movement of operatives."
Terror groups have conducted some 20 attacks on U.S. or Western targets in Yemen, the most recent in September against the U.S. embassy, which killed six guards and four civilians. The recidivism rate of those detainees who the military has judged to be good candidates for release from Gitmo is already high, and the danger for the 90 or so Yemenis and others ought to be unacceptable.
Which brings us back to Gitmo's new location, if it ever gets one. Since 1987, the political system has been deadlocked over burying a negligible amount of nuclear waste deep within a remote mountain in Nevada, so it's hard to imagine how it will deal with a terrorist problem that is far more -- how to put it? -- radioactive. Safe to say that any new setting will not be in a 2012 swing state, and you don't have to be a cynic to wonder if it will have two Republican Senators. Mr. Obama could have avoided this mess had he kept his Gitmo options open, but to adapt a famous phrase, the President broke Guantanamo so now he owns the inmates.
Regulation Didn't Save Canada's Banks
Regulation Didn't Save Canada's Banks. By Marie-Josee Kravis
Our neighbors to the north keep government out of lending decisions.
WSJ, May 07, 2009
Canada's five largest banks would pass the U.S. government stress test brilliantly. They were profitable in the last quarter of 2008, are well capitalized now, and have had no problems raising additional private capital. On average only 7% of their mortgage portfolios consisted of subprime loans (versus 20% in the U.S.). And no major Canadian bank has required direct government infusions of capital.
Advocates of increased regulation of U.S. financial markets have concluded that more stringent rules governing leverage and capital ratios account for Canada's impressive performance. They champion such measures here. In a Toronto speech earlier this year about reforming the U.S. banking system, former Fed chairman and Obama administration adviser Paul Volcker said the model he is considering "looks more like the Canadian system than it does the American system."
Nevertheless, Canadian banks operate in a very different context. Copying the Canadian banking system in this country, without understanding how its banking and housing sectors operate, would be a mistake.
Start with the housing sector. Canadian banks are not compelled by laws such as our Community Reinvestment Act to lend to less creditworthy borrowers. Nor does Canada have agencies like Fannie Mae and Freddie Mac promoting "affordable housing" through guarantees or purchases of high-risk and securitized loans. With fewer incentives to sell off their mortgage loans, Canadian banks held a larger share of them on their balance sheets. Bank-held mortgages tend to perform more soundly than securitized ones.
In the U.S., Federal Housing Administration programs allowed mortgages with only a 3% down payment, while the Federal Home Loan Bank provided multiple subsidies to finance borrowing. In Canada, if a down payment is less than 20% of the value of a home, the mortgage holder must purchase mortgage insurance. Mortgage interest is not tax deductible.
The differences do not end there. A homeowner in the U.S. can simply walk away from his loan if the balance on his mortgage exceeds the value of his house. The lender has no recourse except to take the house in satisfaction of the debt. Canadian mortgage holders are held strictly responsible for their home loans and banks can launch claims against their other assets.
And yet Canada's homeownership rate equals that in the U.S. (Both fluctuate, in the mid to high 60% range.)
For obvious political reasons, debate in Washington spotlights the need for future financial regulation while glossing over the role of government housing and other regulatory policies in the current crisis. This is dangerous: Without a thorough review of relevant government housing policies, laws and regulations, layering new reforms on top of our current system may only set the stage for another housing crisis in the future.
In response to the current crisis the Canadian government has thus far bought about $55 billion (Canadian) of insured loans from financial institutions (a substantial sum, given that Canada's economy is one-tenth the size of the U.S. economy). It has also played a central role supporting the availability of credit and removing potentially distressed assets from bank balance sheets. Still, these interventions have not arrested a substantial slump in Canadian GDP. Last week the Bank of Canada announced that first quarter 2009 GDP had fallen 7.3%. Bank of Canada Governor Mark Carney (Canada's Ben Bernanke) explained the sharp slowdown in growth: "[I]f we had to boil it down to one issue, it is the slowness with which other G-7 countries have dealt with the problems in their banks."
When it comes to comparing the track record of the U.S. and Canadian banking systems, it is worth noting that Canada's regulations did not prohibit the sale or purchase of asset-backed securities. Early in this decade, Canada's Toronto-Dominion bank was among the world's top 10 holders of securitized assets. The decision to exit these products four to five years ago, Toronto-Dominion's CEO Ed Clark told me, was simple: "They became too complex. If I cannot hold them for my mother-in-law, I cannot hold them for my clients." No regulator can compete with this standard.
Tighter leverage limits in Canada may have dimmed the incentives for its banks to pursue securitization as brashly as their American counterparts. But regulations cannot take all the credit. Even with leverage ratios held on average at 18 to 1 (versus 26 to 1 for U.S. commercial banks and up to 40 to 1 for U.S. investment banks), Canadian banks would not be as healthy as they are had they not disposed of their more problematic securitized assets four to five years ago. Nothing in Canada's regulations banned risk-taking. Good, prudent management prevented excess.
Those who blame financial deregulation for the breakdown of U.S. markets should note that Canada shed its version of Glass-Steagall more than 20 years ago. Major banks thereafter rapidly bought and absorbed investment banks.
At that time, Canada established the Office of the Superintendent of Financial Institutions (OSFI) to provide common, consistent and more centralized regulation for federally regulated banks, insurance companies and pension funds. To this day OSFI is almost obsessively concerned with risk management, leaving social and economic objectives, such as access to affordable housing and diversity, to institutions better-suited to attain those goals.
Those desirous of importing Canadian banking regulations to the U.S. should first delve more deeply into the actual practices of our northern neighbor's housing and financial system. Choosing selectively often leads to choosing poorly.
Ms. Kravis is a fellow at the Hudson Institute.
Wednesday, May 6, 2009
Against the empathy standard for choosing judges
Ruth Marcus’s Misguided Defense of the Obama Standard. By Ed Whelan
Bench Memos/NRO, May 07, 2009
In today’s Washington Post, columnist Ruth Marcus offers a defense of President Obama’s so-called “empathy” standard for judges. Her defense suffers from three basic flaws.
First, while claiming that conservatives present an “absurd caricature” of Obama’s views, Marcus doesn’t present a fair account of Obama’s own words. As I discussed in this essay:
In explaining his vote against [Chief Justice] Roberts, Obama opined that deciding the “truly difficult” cases requires resort to “one’s deepest values, one’s core concerns, one’s broader perspectives on how the world works, and the depth and breadth of one’s empathy.” In short, “the critical ingredient is supplied by what is in the judge’s heart.”
Marcus quotes part of what she calls “Obama’s most controversial formulation of the empathy argument”—“we need somebody who’s got … the empathy to recognize what it’s like to be a young, teenage mom; the empathy to understand what it’s like to be poor or African American or gay or disabled or old”—but she conveniently omits Obama’s closer: “and that’s the criterion by which I’ll be selecting my judges.”
Second, Marcus asserts that “the cases that matter most … inevitably call on the judge to bring to the task his—or her—life experiences.” But she doesn’t support that assertion with argument. If the “right answer” on a constitutional question isn’t “available to a judge who merely thinks hard enough,” one obvious alternative to the judge’s indulging his or her own values—the alternative that judicial restraint requires—is to defer to the democratic enactment. In other words, if a judge can’t say with requisite certainty that an enactment is unconstitutional, the judge shouldn’t use his or her own values as some sort of tiebreaker.
Marcus states that “[a]ll judges are guided to some extent, consciously or unknowingly, by their life experience.” The question is whether they should exercise the discipline to be as dispassionate as possible or should instead indulge their passions.
Third, Marcus asserts that “[p]ossessing the ‘empathy to recognize’ should not determine the outcome of a case, but it should inform the judge’s approach.” But the line that she purports to draw is imaginary: if it’s permissible to indulge one’s own empathy, it’s impossible to say that doing so won’t be outcome-determinative in some cases. Indeed, if doing so doesn’t affect the outcome, then what’s Obama’s point?
It’s the role of the political branches to make law and policy. It’s the role of those who occupy positions in those branches, and not that of judges, to translate competing concepts of empathy and prudence into public policy and to consult their values and life experiences in doing so. President Obama is dead wrong on this fundamental matter.
In favor of the empathy standard for choosing judges
Behind Justice's Blindfold, By Ruth Marcus
WaPo, Wednesday, May 6, 2009
Should the judge be an umpire or an empathizer?
Chief Justice John Roberts memorably likened the judge to a baseball umpire, dispassionately applying existing rules to call balls and strikes.
President Obama is more, well, touchy-feely. As he weighs a replacement for retiring Justice David Souter, the president said, he wants "someone who understands that justice isn't about some abstract legal theory or footnote in a case book; it is also about how our laws affect the daily realities of people's lives." That "quality of empathy," he said, is "an essential ingredient for arriving at just decisions and outcomes."
This is red-alert talk for conservatives. "Those are all code words for an activist judge who is going to . . . be partisan on the bench," Utah Republican Sen. Orrin Hatch warned on ABC's "This Week."
Even before the election, Northwestern University law professor Steven Calabresi, a co-founder of the Federalist Society, was already at Defcon 4. In a Wall Street Journal op-ed, he argued that Obama's "emphasis on empathy in essence requires the appointment of judges committed in advance to violating" the judicial oath to do equal justice to rich and poor. "To the traditional view of justice as a blindfolded person weighing legal claims fairly on a scale, he wants to tear the blindfold off, so the judge can rule for the party he empathizes with most."
I admit to a bit of wincing at the word "empathize," with its sensitive-new-age-guy aura. If I thought Obama was advocating a pick-your-favorite-side approach, I'd be on the barricades, too. But his position is not anything like this absurd caricature. Indeed, it reflects a more thoughtful, more nuanced understanding of the judicial role than Roberts's seductive but flawed umpire analogy.
Like its downscale cousin, the dictate that judges should "interpret the law, not legislate from the bench," the judge-as-umpire trope is fundamentally misleading. Of course judges are supposed to be neutral arbiters of the cases that come before them, ruling on the merits of the claims rather than the sympathy evoked by one party or the other. Of course judges are bound by the text of legislation, the words of the Constitution, the weight of precedent.
Yet if the right answer were always available to a judge who merely thinks hard enough, we could program powerful computers to fulfill the judicial function. That's not possible -- not, anyway, in the cases that matter most. Those inevitably call on the judge to bring to the task his -- or her -- life experiences, conception of the role of the courts and, as Obama put it, "broader vision of what America should be."
Obama's most controversial formulation of the empathy argument came in a 2007 speech to Planned Parenthood. "The issues that come before the court are not sport," he said, disputing the umpire approach. "They're life and death. And we need somebody who's got . . . the empathy to recognize what it's like to be a young, teenage mom; the empathy to understand what it's like to be poor or African American or gay or disabled or old."
Possessing the "empathy to recognize" should not determine the outcome of a case, but it should inform the judge's approach. All judges are guided to some extent, consciously or unknowingly, by their life experience. The late Justice Lewis Powell, the deciding vote in Bowers v. Hardwick, the 1986 case upholding Georgia's sodomy law, told fellow justices -- and even a gay law clerk during that very term -- that he had "never met a homosexual." Would the outcome of Bowers -- an outcome Powell regretted within a few months -- have been different if the justice had known men and women in same-sex relationships?
When Bowers was overruled in 2003, the majority opinion by Justice Anthony Kennedy was infused with a greater understanding that anti-sodomy laws "seek to control a personal relationship." You got the sense that Kennedy actually knew people in such relationships.
And empathy runs both ways. In 2007, when the court rejected Lilly Ledbetter's pay discrimination lawsuit because she had waited too long to complain about her lower salary, the five-justice majority seemed moved by concern for employers unable to defend themselves against claims of discrimination alleged to have occurred years earlier.
Justice's blindfold is a useful metaphor for impartiality. It's not a fixed prescription for insensitivity, or for obliviousness to the real world swirling outside the arid confines of the courthouse.
Hedge Funds Outraged At Bullying But Also Cowering In Fear
Hedge Funds Outraged At Obama Bullying But Also Cowering In Fear. By Clifford S. Asness
Business Insider, May 5, 2009, 12:29 PM
Cliff Asness, managing partner at AQR Capital Management, distributed the following letter after listening to Obama blast the Chrysler hedge-fund holdouts. We picked the letter up at ZeroHedge.
Unafraid In Greenwich
Connecticut
Clifford S. Asness
Managing and Founding Principal
AQR Capital Management, LLC
The President has just harshly castigated hedge fund managers for being unwilling to take his administration’s bid for their Chrysler bonds. He called them “speculators” who were “refusing to sacrifice like everyone else” and who wanted “to hold out for the prospect of an unjustified taxpayer-funded bailout.”
The responses of hedge fund managers have been, appropriately, outrage, but generally have been anonymous for fear of going on the record against a powerful President (an exception, though still in the form of a “group letter”, was the superb note from “The Committee of Chrysler Non-TARP Lenders” some of the points of which I echo here, and a relatively few firms, like Oppenheimer, that have publicly defended themselves). Furthermore, one by one the managers and banks are said to be caving to the President’s wishes out of justifiable fear.
I run an approximately twenty billion dollar money management firm that offers hedge funds as well as public mutual funds and unhedged traditional investments. My company is not involved in the Chrysler situation, but I am still aghast at the President's comments (of course these are my own views, not those of my company). Furthermore, for some reason I was not born with the common sense to keep it to myself, though my title should more accurately be "Not Afraid Enough," as I am indeed fearful writing this... It’s really a bad idea to speak out. Angering the President is a mistake, and my views will annoy half my clients. I hope my clients will understand that I’m entitled to my voice and to speak it loudly, just as they are in this great country. I hope they will also like that I do not think I have the right to intentionally “sacrifice” their money without their permission.
Here's a shock. When hedge funds, pension funds, mutual funds, and individuals, including very sweet grandmothers, lend their money they expect to get it back. However, they know, or should know, they take the risk of not being paid back. But if such a bad event happens it usually does not result in a complete loss. A firm in bankruptcy still has assets. It’s not always a pretty process. Bankruptcy court is about figuring out how to most fairly divvy up the remaining assets based on who is owed what and whose contracts come first. The process already has built-in partial protections for employees and pensions, and can set lenders' contracts aside in order to help the company survive, all of which are the rules of the game lenders know before they lend. But, without this recovery process nobody would lend to risky borrowers. Essentially, lenders accept less than shareholders (meaning bonds return less than stocks) in good times only because they get more than shareholders in bad times.
The above is how it works in America, or how it’s supposed to work. The President and his team sought to avoid having Chrysler go through this process, proposing their own plan for re-organizing the company and partially paying off Chrysler’s creditors. Some bond holders thought this plan unfair. Specifically, they thought it unfairly favored the United Auto Workers, and unfairly paid bondholders less than they would get in bankruptcy court. So, they said no to the plan and decided, as is their right, to take their chances in the bankruptcy process. But, as his quotes above show, the President thought they were being unpatriotic or worse.
Let’s be clear, it is the job and obligation of all investment managers, including hedge fund managers, to get their clients the most return they can. They are allowed to be charitable with their own money, and many are spectacularly so, but if they give away their clients’ money to share in the “sacrifice”, they are stealing. Clients of hedge funds include, among others, pension funds of all kinds of workers, unionized and not. The managers have a fiduciary obligation to look after their clients’ money as best they can, not to support the President, nor to oppose him, nor otherwise advance their personal political views. That’s how the system works. If you hired an investment professional and he could preserve more of your money in a financial disaster, but instead he decided to spend it on the UAW so you could “share in the sacrifice”, you would not be happy.
Let’s quickly review a few side issues.
The President's attempted diktat takes money from bondholders and gives it to a labor union that delivers money and votes for him. Why is he not calling on his party to "sacrifice" some campaign contributions, and votes, for the greater good? Shaking down lenders for the benefit of political donors is recycled corruption and abuse of power.
Let’s also mention only in passing the irony of this same President begging hedge funds to borrow more to purchase other troubled securities. That he expects them to do so when he has already shown what happens if they ask for their money to be repaid fairly would be amusing if not so dangerous. That hedge funds might not participate in these programs because of fear of getting sucked into some toxic demagoguery that ends in arbitrary punishment for trying to work with the Treasury is distressing. Some useful programs, like those designed to help finance consumer loans, won't work because of this irresponsible hectoring.
Last but not least, the President screaming that the hedge funds are looking for an unjustified taxpayer-funded bailout is the big lie writ large. Find me a hedge fund that has been bailed out. Find me a hedge fund, even a failed one, that has asked for one. In fact, it was only because hedge funds have not taken government funds that they could stand up to this bullying. The TARP recipients had no choice but to go along. The hedge funds were singled out only because they are unpopular, not because they behaved any differently from any other ethical manager of other people's money. The President’s comments here are backwards and libelous. Yet, somehow I don’t think the hedge funds will be following ACORN’s lead and trucking in a bunch of paid professional protestors soon. Hedge funds really need a community organizer.
This is America. We have a free enterprise system that has worked spectacularly for us for two hundred plus years. When it fails it fixes itself. Most importantly, it is not an owned lackey of the Oval Office to be scolded for disobedience by the President.
I am ready for my “personalized” tax rate now.
An Economic Analysis of the Distributional Consequences of Cap-and-Trade
An Economic Analysis of the Distributional Consequences of Cap-and-Trade. By Aparna Mathur
Testimony
House American Energy Solutions Group
May 5, 2009
The adoption of a cap-and-trade system would increase carbon prices, leading to an increase in the prices of energy and non-energy goods. Price increases would fall more heavily on low-income households; however, the regional distribution of the burden will depend on the nature of electricity regulation. The cost burden would be distributed relatively evenly if permits were auctioned, whereas free allocation of permits would shift a greater share of the burden onto consumers in states where electricity is not regulated.
Cap and trade systems increase costs of production for firms since firms will either need to undertake abatement measures to reduce carbon emissions (which may be costly) or they will need to buy permits from other firms to be able to continue emitting carbon without abatement. It can be shown that these costs will be incurred irrespective of whether the initial permits are auctioned or they are freely allocated. That result was borne out in the cap-and-trade programs for sulfur dioxide in the United States and for CO2 in Europe, where consumer prices rose even though producers were given allowances for free.
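The opportunity-cost argument above can be sketched numerically. The permit price and abatement cost below are illustrative assumptions, not figures from the testimony; the point is only that a firm's marginal cost of emitting is the same whether it bought its permit or received it free, since using a free permit forgoes the revenue from selling it.

```python
# Illustrative numbers (assumptions, not from the testimony):
permit_price = 15.0    # market price per permit, dollars per ton CO2
abatement_cost = 20.0  # firm's cost to abate one ton instead of emitting

# Marginal cost of emitting one ton under each allocation scheme.
# Auctioned permits: the firm must buy a permit, or abate if that's cheaper.
cost_if_auctioned = min(permit_price, abatement_cost)

# Free permits: using a permit forgoes selling it at the market price,
# so the opportunity cost of emitting is identical.
cost_if_free = min(permit_price, abatement_cost)

print(cost_if_auctioned, cost_if_free)  # same either way
```

Either way the firm rationally passes the permit's market value into its prices, which is why consumer prices rose in the U.S. sulfur dioxide and European CO2 programs even when allowances were handed out free.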
These higher costs of production will translate into higher energy and product prices. In a paper that I co-authored with my colleagues at AEI, we estimate that a cap-and-trade system with a $15 permit price would raise the price of food, clothing, shoes, and home furnishings by about 1 percent, of gasoline by 7.7 percent, of electricity by 12.5 percent, and of natural gas by 12.3 percent. Of course, as previous experience with cap-and-trade programs has shown, permit prices are likely to be extremely volatile and to rise over time, so our $15 price estimate is likely conservative. Other studies suggest that the permit price could exceed $50 in 2015, approach $100 in 2030, and reach about $200 in 2050. Since the price increases scale with the permit price, the actual increases could be some multiple of those we project here.
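The scaling claim in the paragraph above can be made concrete. The sketch below assumes, as the testimony implies, roughly linear pass-through of the permit price into consumer prices; the $15-based percentages are the testimony's, and the linearity is my assumption.

```python
# Price increases estimated at a $15/ton permit price (from the testimony).
BASE_PERMIT_PRICE = 15.0  # dollars per ton CO2
base_increases = {
    "consumer goods": 0.010,  # ~1 percent
    "gasoline": 0.077,
    "electricity": 0.125,
    "natural gas": 0.123,
}

def scaled_increases(permit_price: float) -> dict[str, float]:
    """Scale the $15-based estimates linearly with the permit price.

    Linear pass-through is an assumption; actual pass-through may differ.
    """
    factor = permit_price / BASE_PERMIT_PRICE
    return {good: pct * factor for good, pct in base_increases.items()}

# At the ~$50 permit price some studies project for 2015, every
# estimate is multiplied by 50/15, i.e. a bit more than tripled.
for good, pct in scaled_increases(50.0).items():
    print(f"{good}: {pct:.1%}")
```

Under that assumption a $50 permit price would push the electricity increase from 12.5 percent to over 40 percent, which is the sense in which the $15 estimates are conservative.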
The burden of these higher prices will be felt most by lower-income households. As a fraction of income, the cost is about 4 percent for the bottom 10 percent of the population and about 1 percent for the top 10 percent; in other words, the burden on lower-income households is roughly four times that on the highest-income households. In dollar terms, the total annual cost of a cap-and-trade system would be about $315 for the bottom 10 percent of the population, about $1,324 for the top 10 percent, and about $635 for the average middle-income household (all in 2003 dollars). Of course, if permit prices are higher, these costs could double or triple.
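The regressivity arithmetic above can be checked directly. The dollar costs and burden shares are the testimony's; the implied incomes are back-calculated from them (cost divided by burden share) and are my inference, not figures the testimony reports.

```python
# Annual cap-and-trade cost by income group, 2003 dollars (from the testimony).
annual_cost = {"bottom 10%": 315.0, "top 10%": 1324.0}

# Burden as a share of income (from the testimony).
burden_share = {"bottom 10%": 0.04, "top 10%": 0.01}

# Implied income consistent with those two numbers (my back-calculation).
implied_income = {g: annual_cost[g] / burden_share[g] for g in annual_cost}

# Regressivity: the bottom decile's burden share relative to the top's.
ratio = burden_share["bottom 10%"] / burden_share["top 10%"]

print(implied_income)              # bottom ~$7,875; top ~$132,400
print(f"burden ratio: {ratio:.0f}x")  # prints "burden ratio: 4x"
```

The tax is regressive in shares even though the top decile pays more in absolute dollars ($1,324 versus $315): the bottom decile's much smaller income makes the same class of price increases four times as heavy a load.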
Read the full testimony in the link above.
The luck of the Irish runs out
Waiting for Dough, by Christopher Caldwell
The luck of the Irish runs out.
The Weekly Standard, May 11, 2009, Volume 014, Issue 32
More than any other country over the past two decades--more even than China--Ireland has given up its traditional culture for the global economy. In a quarter century, it went from being a little, poverty-stricken, priest-ridden agricultural backwater to a swingin', low-tax, wide-open, unregulated global-economy entrepôt. Last year, on paper, it was the seventh-richest country, per capita, in the world, ahead of the United States and trailing only a few oil exporters and tax havens. In the decade up to 2007, Ireland's GDP increased 350 percent. House prices quintupled.
At the same time, Ireland abandoned the "backward" parts of its culture. Partly through a string of sex scandals in the 1990s, but largely through its hostility to consumerism, the Catholic Church was discredited, and the culture built on it faded. (One small illustration: There are placards on public garbage cans all over Dublin bearing the catchy but not very Christian sentiment "Litter is disgusting--so are those responsible.") Ireland is not prudish anymore, either. A couple decades ago, 1 in 60 Irish babies were born out of wedlock; today 1 in 3 are. The country has some of the most liberal gay-rights and environmental laws in Europe. Nor is Ireland provincial. Its economy draws immigrants. There is a whole wall of books at the Waterstone's on Dawson Street in Dublin marked "Polskie Ksiazki." Dublin has numerous mosques. Tiny Waterford (pop. 45,775) has an African Women's Forum, not to mention two "adult stores" (in case you're ever in Waterford and need to buy an adult).
This is all very exciting for the Irish, but there is nothing particularly Irish about it. Irish identity has often been--explicitly and officially--a matter of protecting citizens from both the temptations of modernity and the vicissitudes of prosperity. In 1927 a Manchester Guardian journalist asked Eamon de Valera, the father of the modern Irish state, whether he understood that closing Ireland off from trade, the better to protect its culture, would mean a lower standard of living. De Valera replied,
You say "lower" when you ought to say a less costly standard of living. I think it quite possible that a less costly standard of living is desirable and that it would prove, in fact, to be a higher standard of living. I am not satisfied that the standard of living and the mode of living in Western Europe is a right or proper one.
De Valera's Irish Republic was organized around the idea that money doesn't matter that much. This may have been a noble aspiration, it may have been sanctimony and foolishness, but there was at the very least something bold and, as Yeats would say, indomitable about it. Next to De Valera's uncompromising Christian renunciation, those two something-for-nothing ideologies, modern capitalism and modern socialism, are practically indistinguishable. Over the last 20 years, Ireland found riches a good substitute for its traditional culture. But now the country has been harder hit by the financial downturn than any country in Western Europe. We may be about to discover what happens when a traditionally poor country returns to poverty without its culture.
Tiger in the tank
Until around the time the dot-com bubble burst, the Irish described their economy as the Celtic Tiger, after the high-tech and pharmaceutical companies that opened European offices there in the 1980s and 1990s. One senses De Valera wouldn't have liked these places. Much of the world's Viagra is made by Pfizer in the western village of Ringaskiddy. Botox comes from the elegant town of Westport, and one of the largest silicone-breast-implant factories in the world was until recently located in Arklow. Reductil, the slimming drug sold on the Internet, comes from Sligo. Google's European offices are in Dublin. Intel and Dell are still Ireland's two largest high-tech employers. But neither of those employs more than 5,000 people, and Dell laid off over 2,000 of them this winter.
The Celtic Tiger was partly the result of global economic conditions and partly the result of the country's policies. Ireland's decision to join the euro in the 1990s forced it to eliminate its chronic budget deficits and gave it the windfall of super-low interest rates, set for a European economy dragged down by Germany's struggles with reunification. Ireland offered a low-cost English-speaking labor force at a time when U.S. high-tech companies were looking for a springboard into European markets. Even today, Ireland is highly dependent on U.S. corporations, which account, directly or indirectly, for 300,000 jobs. Should the United States go protectionist, or should it inflate, which for Ireland's purposes would amount to the same thing, Ireland would be in trouble. On his St. Patrick's Day visit to Washington, D.C., the Irish taoiseach (prime minister), Brian Cowen, is said to have received an assurance from Barack Obama that the president didn't see Ireland as a tax haven.
This makes Ireland sound like a northern equivalent of a maquiladora economy, like Mexico in the years immediately after NAFTA. The Irish are sensitive about this imputation. "Our natural resource is brainpower," says one Dublin personnel consultant. That is true enough. It is probably not a coincidence that the biggest beneficiaries of the Celtic Tiger were the first generation of Irish born after the institution of universal public education in the 1960s. But education is not a commodity that can be monopolized. As labor costs have risen (by a third in real terms in the past decade), international companies have discovered that there are other, cheaper workforces that can also perform new-economy tasks. Jobs have left for Latin America, Southeast Asia, and Eastern Europe. That Arklow breast-enhancement business wound up in Costa Rica.
So how has Ireland continued to grow at staggering rates for the last decade? Mostly thanks to a housing bubble, which was like the American one on steroids. Run-of-the-mill three-bedroom houses in provincial towns were selling for 1.5 million euros. Prices in Dublin were up to seven times as high as in similar U.S. urban markets.
There seemed to be good fundamental reasons for a steep rise in house prices, starting with a rate of home-ownership that approached 80 percent. On top of that there was immigration, the return of Irish exiles, a growing demand for vacation homes, and the new phenomenon of widespread divorce (making two homes necessary where one used to suffice). There were also government incentives for real estate developers and for the building of vacation homes in depressed areas. The most glamorous part of new, swingin' Ireland was deeply implicated in this speculation. Harry Crosbie, real estate-and-rock-music mogul, conceived--and, stranger yet, got financing for--a billion-euro construction project along the River Liffey. It would have included two skyscrapers, including a "U2 Tower," in which the band had invested heavily, and an Ozymandian 15-story sculpture of a giant man overlooking the Liffey. It was a narrow escape for Dublin architecture when Crosbie abandoned the project for lack of funds.
The result of the bubble was that, by the time of the U.S. subprime collapse, Ireland already had as many as 100,000 vacant houses. It also has empty golf courses, empty hotels, and empty shopping malls. Every last developable acre in the country, it seems, has been bought up (and bid up) by speculators. The bad loans attached to this overbuilding might reach 20 billion euros, or 10 percent of GDP. Housing prices are predicted to drop 50 percent from their peak, and development land 70 percent. Alan Ahearne, a former U.S. Federal Reserve economist who is now an adviser to the Irish finance minister, predicted over the winter that, "with possibly one exception, this country will record the largest cumulative drop in national income in an advanced economy since the Second World War."
Partners in crime
The villains of Irish finance, unlike those in New York and the City of London, were not wizards deploying the Black-Scholes equation or the Gaussian copula to turn the dross of subprime real estate into the fool's gold of CDOs. Far from it. They were just go-get-'em businessmen who started to believe their own blarney, cross-collateralized their properties, and got in way over their heads. As the financial journalist Tom McGurk put it: "Were you to gather together all of the senior principals in the six banks and building societies that approved this outrageous behavior, and join them to the property speculators to whom they loaned the billions, they would hardly fill a good-sized bus."
A few of the developers were young and dashing. Most of them were dissolute and lecherous-looking geriatrics. You could see them in the Irish weekend newspapers, posing in front of their huge houses and at charity balls with their toothy wives. Each of the developers had a signature Croesian excess, whether personal or professional. There was Sean Dunne, who had proposed to dynamite two of Dublin's historic hotels in leafy Ballsbridge to build a billion-dollar skyscraper; there was Sean Mulryan with his stables full of racehorses and his whole fleet of helicopters; and there was Johnny Ronan, who reportedly flew several dozen friends to Italy to have Pavarotti sing to him personally.
There is a good side to this lack of sophistication, of course. Boosters of Ireland's economy are keen to point out that there is little toxicity in its real estate market. Even harsh critics of the ruling Fianna Fáil party's economic policy--like the economics spokeswoman of the Labour party, Joan Burton--will grant this. It doesn't take an MIT doctorate to figure out what's wrong with the Irish economy, so the markets have had little trouble pricing the economy's assets. Shares in the Bank of Ireland, which were at 18.83 euros in 2007, now trade for 12 cents.
But there is a dark side to having your small economic pond fouled by only a handful of big fish: The whole social, economic, and political system looks like a con. Why, people now wonder, was so much of the financing in this supposedly open economy done by local Irish banks? The easiest answer to hand--and probably the correct one--is that the bankers and developers bought the protection and indulgence of Fianna Fáil. (The two main Irish political parties don't really have ideologies, but Fianna Fáil is the more historically nationalist and machine-oriented of the two.)
This unease has been heightened by a shocking lack of accountability. The banks' top brass has hardly changed from the days when they were actually making money. One of the few executives to resign was Sean FitzPatrick, chairman of the spectacularly reckless Anglo Irish Bank, or "Anglo," but he was an extraordinary case. FitzPatrick got "director's loans" worth 83.3 million euros (which the bank's accountants, Ernst & Young, failed to notice) at a time when he was running the bank into the ground. Anglo peaked at 7.50 euros a share and was nationalized this year at 22 cents. When Bear Stearns collapsed, Anglo lost 23 percent of its value. When Lehman Brothers collapsed, Anglo threatened to take the whole Irish banking system with it. The government gave a blanket guarantee to Irish banks, using as security the country's National Pensions Reserve Fund.
Now, imagine what the reaction has been to the discovery that the National Pensions Reserve Fund has been used to underwrite golden parachutes. This spring, Michael Fingleton, the chairman of Irish Nationwide, became the face of Irish banking malfeasance in much the way that the AIG financial team became the face of the U.S. banking scandal--and for the same reason. Fingleton's last reported salary was only 2.3 million euros. But in 2007 Irish Nationwide reported that it had transferred a 27.6-million-euro pension fund on behalf of "members" (note the plural) of the plan. Turns out the plan had only one member--Fingleton. And that money constituted 85 percent of Irish Nationwide's reported pension-fund assets at the end of 2006. Fingleton agreed to resign at the end of April. He also agreed to pay back a million-euro bonus he had received after the government had used those taxpayer pensions to secure Irish Nationwide against collapse. Rather like Barack Obama faced with AIG, the Irish finance minister, Brian Lenihan, had seen that the Fingleton bonus was a disaster in the making. He had threatened to withdraw Irish Nationwide's guarantee if Fingleton kept his bonus. This led to a funny photograph on the cover of the satirical magazine the Phoenix:
Lenihan: Return the bonus, or get the sack.
Fingleton: How much is in the sack?
Real money and fake
Ireland once looked to many investors like a versatile economy, matching American dynamism with European security. But now certain commentators are warning that it might be the other way around. "Ireland is like a jockey riding two horses," said the economist and author David McWilliams in Dublin this March. "When the horses are moving together, everything works well. When they're not, the jockey can get a terrible pain in the groin."
Ordinarily, a small, export-dependent economy in which wages are becoming less competitive can regain its edge by devaluing its currency. But Ireland is in the euro, and it is having a hard time staying in. Under the EU's Stability and Growth Pact, all countries must keep their deficits below 3 percent of GDP. Until recently, no country had ever gone above 5 percent. Ireland is now at 11 percent, or 20 billion euros. The European Commission has given Ireland until 2013 to get back into conformity.
Two-thirds of companies surveyed by the accountants PricewaterhouseCoopers said they were planning on cutting jobs this year. Consumer spending is already down 20 percent. So the government is now faced with the need to raise taxes dramatically and cut spending in the face of a looming recession. On April 7, it announced its budget. Top tax rates have soared past 50 percent, and capital gains and value-added taxes will rise, too. A property tax will be added. And that will make only the merest dent in the 20-billion-euro shortfall.
A basic question remains, though: How, in the absence of leverage-creating derivatives and "toxicity," did a few real-estate yahoos' going broke cause the risk of sovereign default to loom over the previously solvent Irish state? In Ireland now, there is (as an American could predict) a lot of talk about how the past few years have been an era of greed, lacking in social solidarity. But in Ireland's case, this is not true in the slightest. There was plenty of care for the less well-off. It is just that the government got no credit for it because it delivered that care by lowering poor people's taxes.
The fiscal emergency had its roots in the generous-sounding "social partnership" model agreed on by the country's leading politicians in the late 1980s--a sort of Irish answer to Germany's social market economy. The key to the social partnership was assuring "competitiveness" in international markets. Irish workers wanted higher wages, but businesses wishing to locate in Ireland wanted lower ones. How do you square that circle? Through government. In return for workers' moderating wage demands, government would make sure they paid very little in income taxes. Half the income tax in Ireland is paid by people earning over 100,000 euros, and 750,000 people--a third of the workforce--pay no income tax at all. Where does the money come from, then? From transaction taxes ("stamp" duties, capital gains taxes, corporate profits tax) that were paid mostly by the real-estate speculators who were making money hand over fist. Everyone seemed content with this system, even the speculators. It is, after all, easier to tax people's fake money than their real money.
The government could thus stimulate consumption (and consumerism) through low income taxes without having to stint on entitlements. This explains how, for a while, the ruling Fianna Fáil party managed to become the party of both the upper-middle class and the working class--much as U.S. Republicans did by a superficially different but essentially similar shell game.
We can see now that this arrangement was a time bomb. We can also see that the politically connected developers did a good deal to wreck the economy. But there is no denying that, by golly, they paid a lot of taxes. All this meant, though, was that the state became just as dependent on the housing bubble as the private sector. When more money came in, the government just spent it. The featherbedding patronage state is about the only Irish tradition that the global economy did not kill. Government employees make 25 percent more than equivalently situated private employees, according to the Dublin-based Economic and Social Research Institute. Government employees got very powerful in the process. So over the winter, when politicians decided to cut public-sector pay, they weren't exactly forthright--they proposed lopping 7.5 percent off the top and called it a "pension levy."
The head of the Labour party, Eamon Gilmore, recently attacked the finance ministry on the grounds that its new, trimmed budget requires more austerity than the country at large can handle. Gilmore has called instead for an Obama-style stimulus package. This is far more difficult to pull off when you are dependent on foreign investors and don't control the currency in which your debts are denominated--but at least it promises a continuation of something-for-nothing! Unsurprisingly, Gilmore is now the most popular politician in the country.
Prisoners of the open economy
There is lots of unrest and anger in Ireland now, and it is finding various outlets. Government statistics envision unemployment rising to 500,000 this year (in a labor force of 2.2 million). In protest against the pension levy, 150,000 people demonstrated in Dublin in March. Police unions and parts of the armed forces have also taken to the streets, which is supposed to be illegal. A representative of the army wrote to the defense minister for reassurance that it would not be used to break strikes, although it has often been called upon to do just that. Until recently, Irish workers had little to strike about. The country's largest union body, the Irish Congress of Trade Unions, negotiated a 6 percent pay raise on the very eve of the banking collapse, and some businesses are even honoring it.
Today unions are united. They are irate. Workers are having their wages cut unilaterally by the government and via negotiations with private business. You would expect upheaval. But even under the circumstances, unions' power to shake the government or win themselves a better deal appears to be meager. Ambitious plans for a national strike fizzled out when Impact, the largest public-sector union, failed to get the necessary votes.
Part of the unions' problem is that the economy is configured so that striking will do them little good. Ireland is now abjectly dependent on the global economy. Paradoxical though it may sound, this makes its workers dependent on government. The social partnership model has an iron logic. Once income tax rates fall to a certain level, Irish workers' disposable income can rise only through transfer payments. If it rises through wage increases, foreign investors will be driven off and there will be no pie to divide. So Ireland must run its economy in a two-step process: First, attract the direct investment from the global market. Then, if you think the workers deserve a better shake, use government to redistribute the income at home.
People are coming to understand how appallingly little there is to redistribute. Waterford Crystal, the mainstay of a market town that has been making glass since the 16th century, recently ran into trouble. The company employed 3,500 people in the 1980s, but under the leadership of Tony O'Reilly--ex-rugby star, ex-chairman of Heinz, ex-billionaire--it screwed things up royally. It over-manufactured items for the millennium: The public turned out not to want the year 2000 commemorated in glass. It went on an acquisition binge. The death blow was Waterford's acquisition of the German porcelain-maker Rosenthal and its tardy discovery that Rosenthal had German-sized pension-and-benefits liabilities. And then came the bad luck of the falling dollar and pound. The company wound up with 400 million euros in debts and a share price of a tenth of a cent. Waterford went into receivership; the Waterford name is now owned by KPS, a private equity company in New York.
On a Friday afternoon in January, KPS announced it would shut down manufacturing at Waterford's Kilbarry plant. Two hundred workers staged an occupation of the visitors' center, which was once ("once" meaning last year) Ireland's fourth-largest tourist attraction, getting 350,000 visitors. The occupation won widespread sympathy in the town of Waterford itself. And why shouldn't it have? Much of the town's economy--hotels and mini-museums, sweater shops and espresso bars--is configured around those foreigners who want to see how Waterford glass gets made. Local women brought meals to the factory canteen to feed the strikers, and teenagers volunteered for janitorial work.
The town will wait in vain for the plant's reopening. After an eight-week sit-in, the company offered the 800 workers left at the plant a 10-million-euro payment--roughly 12,500 euros apiece--and they jumped at it, 90 percent of them voting for the settlement. A hundred-odd workers have been allowed to stay on for another six months. Some labor movement.
Much of the plant's production will be transferred to the Czech Republic, and there is justice in that. Waterford is indeed an ancient glassmaking city, but its name became an international eponym for beautiful crystal only after 1947, when Charles Bacik, a war refugee from Czechoslovakia, revamped the place with the help of a number of fellow Czech glassblowers. One was Miroslav Havel, who developed the beloved Lismore pattern--the one you probably registered for when you got married.
On the other hand, since the factory will remain in the town, and since all these extraordinary Irish craftsmen will, too, it is hard not to share the puzzlement of trade union rep Macdara Doyle of the ICTU that a national treasure, cultural symbol, and economic behemoth like Waterford could just evaporate this way. "What's puzzling and annoying us is this," he said in March. "If you are going to re-grow your economy, surely you have to do high-end manufacturing. We're not all going to get rich pushing strange financial products. Waterford is to Ireland what BMW is to Germany. It's by no means a busted flush."
I visited the factory with John Stenson, a 56-year-old master wedge cutter who took early retirement last year, after working at the plant for 40 years. Stenson is still owed tens of thousands of euros from his severance payment, money he expects never to see. He apprenticed at the company for five years in the time of "Mister Havel," blowing glass and cutting it on old ceramic wheels. It was quickly apparent he was a gifted glassmaker. After three years of further training, he was certified a master, and ran his own "bench" of six artisans. He toured the United States explaining and demonstrating to Americans the craft of glass-cutting as practiced by artisans of Waterford. He handmade six serving dishes for Prince Charles. He designed, sculpted, and cut the crystal grandfather clock that is (or was) the central exhibit in the Waterford visitors' center.
"He is a man of many talents!" said one of his admiring former colleagues, also unemployed after working at the plant for decades, who had managed to get a temp job as a security guard there.
"Well, I'm on the dole now," Stenson said.
Christopher Caldwell is a senior editor at THE WEEKLY STANDARD. His book Reflections on the Revolution in Europe: Immigration, Islam and the West will be published in July.
The luck of the Irish runs out.
The Weekly Standard, May 11, 2009, Volume 014, Issue 32
More than any other country over the past two decades--more even than China--Ireland has given up its traditional culture for the global economy. In a quarter century, it went from being a little, poverty-stricken, priest-ridden agricultural backwater to a swingin', low-tax, wide-open, unregulated global-economy entrepôt. Last year, on paper, it was the seventh-richest country, per capita, in the world, ahead of the United States and trailing only a few oil exporters and tax havens. In the decade up to 2007, Ireland's GDP increased 350 percent. House prices quintupled.
At the same time, Ireland abandoned the "backward" parts of its culture. Partly through a string of sex scandals in the 1990s, but largely through its hostility to consumerism, the Catholic Church was discredited, and the culture built on it faded. (One small illustration: There are placards on public garbage cans all over Dublin bearing the catchy but not very Christian sentiment "Litter is disgusting--so are those responsible.") Ireland is not prudish anymore, either. A couple of decades ago, 1 in 60 Irish babies was born out of wedlock; today 1 in 3 is. The country has some of the most liberal gay-rights and environmental laws in Europe. Nor is Ireland provincial. Its economy draws immigrants. There is a whole wall of books at the Waterstone's on Dawson Street in Dublin marked "Polskie Ksiazki." Dublin has numerous mosques. Tiny Waterford (pop. 45,775) has an African Women's Forum, not to mention two "adult stores" (in case you're ever in Waterford and need to buy an adult).
This is all very exciting for the Irish, but there is nothing particularly Irish about it. Irish identity has often been--explicitly and officially--a matter of protecting citizens from both the temptations of modernity and the vicissitudes of prosperity. In 1927 a Manchester Guardian journalist asked Eamon de Valera, the father of the modern Irish state, whether he understood that closing Ireland off from trade, the better to protect its culture, would mean a lower standard of living. De Valera replied,
You say "lower" when you ought to say a less costly standard of living. I think it quite possible that a less costly standard of living is desirable and that it would prove, in fact, to be a higher standard of living. I am not satisfied that the standard of living and the mode of living in Western Europe is a right or proper one.
De Valera's Irish Republic was organized around the idea that money doesn't matter that much. This may have been a noble aspiration, it may have been sanctimony and foolishness, but there was at the very least something bold and, as Yeats would say, indomitable about it. Next to De Valera's uncompromising Christian renunciation, those two something-for-nothing ideologies, modern capitalism and modern socialism, are practically indistinguishable. Over the last 20 years, Ireland found riches a good substitute for its traditional culture. But now the country has been harder hit by the financial downturn than any country in Western Europe. We may be about to discover what happens when a traditionally poor country returns to poverty without its culture.
Tiger in the tank
Until around the time the dot-com bubble burst, the Irish described their economy as the Celtic Tiger, after the high-tech and pharmaceutical companies that opened European offices there in the 1980s and 1990s. One senses De Valera wouldn't have liked these places. Much of the world's Viagra is made by Pfizer in the western village of Ringaskiddy. Botox comes from the elegant town of Westport, and one of the largest silicone-breast-implant factories in the world was until recently located in Arklow. Reductil, the slimming drug sold on the Internet, comes from Sligo. Google's European offices are in Dublin. Intel and Dell are still Ireland's two largest high-tech employers. But neither of those employs more than 5,000 people, and Dell laid off over 2,000 of them this winter.
The Celtic Tiger was partly the result of global economic conditions and partly the result of the country's policies. Ireland's decision to join the euro in the 1990s forced it to eliminate its chronic budget deficits and gave it the windfall of super-low interest rates, set for a European economy dragged down by Germany's struggles with reunification. Ireland offered a low-cost English-speaking labor force at a time when U.S. high-tech companies were looking for a springboard into European markets. Even today, Ireland is highly dependent on U.S. corporations, which account, directly or indirectly, for 300,000 jobs. Should the United States go protectionist, or should it inflate, which for Ireland's purposes would amount to the same thing, Ireland would be in trouble. On his St. Patrick's Day visit to Washington, D.C., the Irish taoiseach (prime minister), Brian Cowen, is said to have received an assurance from Barack Obama that the president didn't see Ireland as a tax haven.
This makes Ireland sound like a northern equivalent of a maquiladora economy, like Mexico in the years immediately after NAFTA. The Irish are sensitive about this imputation. "Our natural resource is brainpower," says one Dublin personnel consultant. That is true enough. It is probably not a coincidence that the biggest beneficiaries of the Celtic Tiger were the first generation of Irish born after the institution of universal public education in the 1960s. But education is not a commodity that can be monopolized. As labor costs have risen (by a third in real terms in the past decade), international companies have discovered that there are other, cheaper workforces that can also perform new-economy tasks. Jobs have left for Latin America, southeast Asia, and Eastern Europe. That Arklow breast-enhancement business wound up in Costa Rica.
So how has Ireland continued to grow at staggering rates for the last decade? Mostly thanks to a housing bubble, which was like the American one on steroids. Run-of-the-mill three-bedroom houses in provincial towns were selling for 1.5 million euros. Prices in Dublin were up to seven times as high as in similar U.S. urban markets.
There seemed to be good fundamental reasons for a steep rise in house prices, starting with a rate of home-ownership that approached 80 percent. On top of that there was immigration, the return of Irish exiles, a growing demand for vacation homes, and the new phenomenon of widespread divorce (making two homes necessary where one used to suffice). There were also government incentives for real estate developers and for the building of vacation homes in depressed areas. The most glamorous part of new, swingin' Ireland was deeply implicated in this speculation. Harry Crosbie, real estate-and-rock-music mogul, conceived--and, stranger yet, got financing for--a billion-euro construction project along the River Liffey. It would have included two skyscrapers, including a "U2 Tower," in which the band had invested heavily, and an Ozymandian 15-story sculpture of a giant man overlooking the Liffey. It was a narrow escape for Dublin architecture when Crosbie abandoned the project for lack of funds.
The result of the bubble was that, by the time of the U.S. subprime collapse, Ireland already had as many as 100,000 vacant houses. It also has empty golf courses, empty hotels, and empty shopping malls. Every last developable acre in the country, it seems, has been bought up (and bid up) by speculators. The bad loans attached to this overbuilding might reach 20 billion euros, or 10 percent of GDP. Housing prices are predicted to drop 50 percent from their peak, and development land 70 percent. Alan Ahearne, a former U.S. Federal Reserve economist who is now an adviser to the Irish finance minister, predicted over the winter that, "with possibly one exception, this country will record the largest cumulative drop in national income in an advanced economy since the Second World War."
Partners in crime
The villains of Irish finance, unlike those in New York and the City of London, were not wizards deploying the Black-Scholes equation or the Gaussian copula to turn the dross of subprime real estate into the fool's gold of CDOs. Far from it. They were just go-get-'em businessmen who started to believe their own blarney, cross-collateralized their properties, and got in way over their heads. As the financial journalist Tom McGurk put it: "Were you to gather together all of the senior principals in the six banks and building societies that approved this outrageous behavior, and join them to the property speculators to whom they loaned the billions, they would hardly fill a good-sized bus."
A few of the developers were young and dashing. Most of them were dissolute and lecherous-looking geriatrics. You could see them in the Irish weekend newspapers, posing in front of their huge houses and at charity balls with their toothy wives. Each of the developers had a signature Croesian excess, whether personal or professional. There was Sean Dunne, who had proposed to dynamite two of Dublin's historic hotels in leafy Ballsbridge to build a billion-dollar skyscraper; there was Sean Mulryan with his stables full of racehorses and his whole fleet of helicopters; and there was Johnny Ronan, who reportedly flew several dozen friends to Italy to have Pavarotti sing to him personally.
There is a good side to this lack of sophistication, of course. Boosters of Ireland's economy are keen to point out that there is little toxicity in its real estate market. Even harsh critics of the ruling Fianna Fáil party's economic policy--like the economics spokeswoman of the Labour party, Joan Burton--will grant this. It doesn't take an MIT doctorate to figure out what's wrong with the Irish economy, so the markets have had little trouble pricing the economy's assets. Shares in the Bank of Ireland, which were at 18.83 euros in 2007, now trade for 12 cents.
But there is a dark side to having your small economic pond fouled by only a handful of big fish: The whole social, economic, and political system looks like a con. Why, people now wonder, was so much of the financing in this supposedly open economy done by local Irish banks? The easiest answer to hand--and probably the correct one--is that the bankers and developers bought the protection and indulgence of Fianna Fáil. (The two main Irish political parties don't really have ideologies, but Fianna Fáil is the more historically nationalist and machine-oriented of the two.)
This unease has been heightened by a shocking lack of accountability. The banks' top brass has hardly changed from the days when they were actually making money. One of the few executives to resign was Sean FitzPatrick, chairman of the spectacularly reckless Anglo-Irish Bank, or "Anglo," but he was an extraordinary case. FitzPatrick got "director's loans" worth 83.3 million euros (which the bank's accountants, Ernst and Young, failed to notice) at a time when he was running the bank into the ground. Anglo peaked at 7.50 euros a share and was nationalized this year at 22 cents. When Bear Stearns collapsed, Anglo lost 23 percent of its value. When Lehman Brothers collapsed, Anglo threatened to take the whole Irish banking system with it. The government gave a blanket guarantee to Irish banks, using as security the country's National Pension Reserve Fund.
Now, imagine what the reaction has been to the discovery that the National Pensions Reserve Fund has been used to underwrite golden parachutes. This spring, Michael Fingleton, the chairman of Irish Nationwide, became the face of Irish banking malfeasance in much the way that the AIG financial team became the face of the U.S. banking scandal--and for the same reason. Fingleton's last reported salary was only 2.3 million euros. But in 2007 Irish Nationwide reported that it had transferred a 27.6-million-euro pension fund on behalf of "members" (note the plural) of the plan. Turns out the plan had only one member--Fingleton. And that money constituted 85 percent of Irish Nationwide's reported pension-fund assets at the end of 2006. Fingleton agreed to resign at the end of April. He also agreed to pay back a million-euro bonus he had received after the government had used those taxpayer pensions to secure Irish Nationwide against collapse. Rather like Barack Obama faced with AIG, the Irish finance minister, Brian Lenihan, had seen that the Fingleton bonus was a disaster in the making. He had threatened to withdraw Irish Nationwide's guarantee if Fingleton kept his bonus. This led to a funny photograph on the cover of the satirical magazine the Phoenix:
Lenihan: Return the bonus, or get the sack.
Fingleton: How much is in the sack?
Real money and fake
Ireland once looked to many investors like a versatile economy, matching American dynamism with European security. But now certain commentators are warning that it might be the other way around. "Ireland is like a jockey riding two horses," said the economist and author David McWilliams in Dublin this March. "When the horses are moving together, everything works well. When they're not, the jockey can get a terrible pain in the groin."
Ordinarily, a small, export-dependent economy in which wages are becoming less competitive can regain its edge by devaluing its currency. But Ireland is in the euro, and it is having a hard time staying in. According to the European Stability and Growth Pact, all countries must keep their deficits below 3 percent of GDP. No country until recently has ever gone above 5 percent. Ireland is now at 11 percent, or 20 billion euros. The European Central Bank has given Ireland until 2013 to get back into conformity.
Two-thirds of companies surveyed by the accountants PricewaterhouseCoopers said they were planning on cutting jobs this year. Consumer spending is already down 20 percent. So the government is now faced with the need to raise taxes dramatically and cut spending in the face of a looming recession. On April 7, it announced its budget. Top tax rates have soared past 50 percent, and capital gains and value-added taxes will rise, too. A property tax will be added. And that will make only the merest dent in the 20-billion-euro shortfall.
A basic question remains, though: How, in the absence of leverage-creating derivatives and "toxicity," did a few real-estate yahoos' going broke cause the risk of sovereign default to loom over the previously solvent Irish state? In Ireland now, there is (as an American could predict) a lot of talk about how the past few years have been an era of greed, lacking in social solidarity. But in Ireland's case, this is not true in the slightest. There was plenty of care for the less well-off. It is just that the government got no credit for it because it delivered that care by lowering poor people's taxes.
The fiscal emergency had its roots in the generous-sounding "social partnership" model agreed on by the country's leading politicians in the late 1980s--a sort of Irish answer to Germany's social market economy. The key to the social partnership was assuring "competitiveness" in international markets. Irish workers wanted higher wages, but businesses wishing to locate in Ireland wanted lower ones. How do you square that circle? Through government. In return for workers' moderating wage demands, government would make sure they paid very little in income taxes. Half the income tax in Ireland is paid by people earning over 100,000 euros, and 750,000 people--a third of the workforce--pay no income tax at all. Where does the money come from, then? From transaction taxes ("stamp" duties, capital gains taxes, corporate profits tax) that were paid mostly by the real-estate speculators who were making money hand over fist. Everyone seemed content with this system, even the speculators. It is, after all, easier to tax people's fake money than their real money.
The government could thus stimulate consumption (and consumerism) through low income taxes without having to stint on entitlements. This explains how, for a while, the ruling Fianna Fáil party managed to become the party of both the upper-middle class and the working class--much as U.S. Republicans did by a superficially different but essentially similar shell game.
We can see now that this arrangement was a time bomb. We can also see that the politically connected developers did a good deal to wreck the economy. But there is no denying that, by golly, they paid a lot of taxes. All this meant, though, was that the state became just as dependent on the housing bubble as the private sector. When more money came in, the government just spent it. The featherbedding patronage state is about the only Irish tradition that the global economy did not kill. Government employees make 25 percent more than equivalently situated private employees, according to the Dublin-based Economic and Social Research Institute. Government employees got very powerful in the process. So over the winter, when politicians decided to cut public-sector pay, they weren't exactly forthright--they proposed lopping 7.5 percent off the top and called it a "pension levy."
The head of the Labour party, Eamon Gilmore, recently attacked the finance ministry on the grounds that its new, trimmed budget requires more austerity than the country at large can handle. Gilmore has called instead for an Obama-style stimulus package. This is far more difficult to pull off when you are dependent on foreign investors and don't control the currency in which your debts are denominated--but at least it promises a continuation of something-for-nothing! Unsurprisingly, Gilmore is now the most popular politician in the country.
Prisoners of the open economy
There is lots of unrest and anger in Ireland now, and it is finding various outlets. Government statistics envision unemployment rising to 500,000 this year (in a labor force of 2.2 million). In protest against the pension levy, 150,000 people demonstrated in Dublin in March. Police unions and parts of the armed forces have also taken to the streets, which is supposed to be illegal. A representative of the army wrote to the defense minister for reassurance that it would not be used to break strikes, although it has often been called upon to do just that. Until recently, Irish workers had little to strike about. The country's largest union body, the Irish Congress of Trade Unions, negotiated a 6 percent pay raise on the very eve of the banking collapse, and some businesses are even honoring it.
Today unions are united. They are irate. Workers are having their wages cut unilaterally by the government and via negotiations with private business. You would expect upheaval. But even under the circumstances, unions' power to shake the government or win themselves a better deal appears to be meager. Ambitious plans for a national strike fizzled out when Impact, the largest public-sector union, failed to get the necessary votes.
Part of the unions' problem is that the economy is configured so that striking will do them little good. Ireland is now abjectly dependent on the global economy. Paradoxical though it may sound, this makes its workers dependent on government. The social partnership model has an iron logic. Once income tax rates fall to a certain level, Irish workers' disposable income can rise only through transfer payments. If it rises through wage increases, foreign investors will be driven off and there will be no pie to divide. So Ireland must run its economy in a two-step process: First, attract the direct investment from the global market. Then, if you think the workers deserve a better shake, use government to redistribute the income at home.
People are coming to understand how appallingly little there is to redistribute. Waterford Crystal, the mainstay of a market town that has been making glass since the 16th century, recently ran into trouble. The company employed 3,500 people in the 1980s, but under the leadership of Tony O'Reilly--ex-rugby star, ex-chairman of Heinz, ex-billionaire--it screwed things up royally. It over-manufactured items for the millennium: The public turned out not to want the year 2000 commemorated in glass. It went on an acquisition binge. The death blow was Waterford's acquisition of the German porcelain-maker Rosenthal and its tardy discovery that Rosenthal had German-sized pension-and-benefits liabilities. And then came the bad luck of the falling dollar and pound. The company wound up with 400 million euros in debts and a share price of a tenth of a cent. Waterford went into receivership; the Waterford name is now owned by KPS, a private equity company in New York.
On a Friday afternoon in January, KPS announced it would shut down manufacturing at Waterford's Kilbarry plant. Two hundred workers staged an occupation of the visitors' center, which was once ("once" meaning last year) Ireland's fourth-largest tourist attraction, getting 350,000 visitors. The occupation won widespread sympathy in the town of Waterford itself. And why shouldn't it have? Much of the town's economy--hotels and mini-museums, sweater shops and espresso bars--is configured around those foreigners who want to see how Waterford glass gets made. Local women brought meals to the factory canteen to feed the strikers, and teenagers volunteered for janitorial work.
The town will wait in vain for the plant's reopening. After an eight-week sit-in, the company offered the 800 workers left at the plant a 10-million-euro payment--1,200 euros apiece--and they jumped at it, 90 percent of them voting for the settlement. A hundred some-odd workers have been allowed to stay on for another six months. Some labor movement.
Much of the plant's production will be transferred to the Czech Republic, and there is justice in that. Waterford is indeed an ancient glassmaking city, but its name became an international eponym for beautiful crystal only after 1947, when Charles Bacik, a war refugee from Czechoslovakia, revamped the place with the help of a number of fellow Czech glassblowers. One was Miroslav Havel, who developed the beloved Lismore pattern--the one you probably registered for when you got married.
On the other hand, since the factory will remain in the town, and since all these extraordinary Irish craftsmen will, too, it is hard not to share the puzzlement of trade union rep Macdara Doyle of the ICTU that a national treasure, cultural symbol, and economic behemoth like Waterford could just evaporate this way. "What's puzzling and annoying us is this," he said in March. "If you are going to re-grow your economy, surely you have to do high-end manufacturing. We're not all going to get rich pushing strange financial products. Waterford is to Ireland what BMW is to Germany. It's by no means a busted flush."
I visited the factory with John Stenson, a 56-year-old master wedge cutter who took early retirement last year, after working at the plant for 40 years. Stenson is still owed tens of thousands out of his severance payment that he expects never to see. He apprenticed at the company for five years in the time of "Mister Havel," blowing glass and cutting it on old ceramic wheels. It was quickly apparent he was a gifted glassmaker. After three years of further training, he was certified a master, and ran his own "bench" of six artisans. He toured the United States explaining and demonstrating to Americans the craft of glass-cutting as practiced by artisans of Waterford. He handmade six serving dishes for Prince Charles. He designed, sculpted, and cut the crystal grandfather clock that is (or was) the central exhibit in the Waterford visitors' center.
"He is a man of many talents!" said one of his admiring former colleagues, also unemployed after working at the plant for decades, who had managed to get a temp job as a security guard there.
"Well, I'm on the dole now," Stenson said.
Christopher Caldwell is a senior editor at THE WEEKLY STANDARD. His book Reflections on the Revolution in Europe: Immigration, Islam and the West will be published in July.
Tuesday, May 5, 2009
Famine-monger Lester Brown still gets it wrong after all these years
Never Right, But Never in Doubt. By Ronald Bailey
Reason, May 5, 2009
"Could food shortages bring down civilization?," asks environmental activist Lester Brown in the current issue of Scientific American. Not surprisingly, Brown's answer is an emphatic yes. He claims that for years he has "resisted the idea that food shortages could bring down not only individual governments but also our global civilization." Now, however, Brown says, "I can no longer ignore that risk." Balderdash. Brown, head of the Earth Policy Institute, has been a prominent and perennial predictor of imminent global famine for more than 45 years. Why should we believe him now?
For instance, back in 1965, when Brown was a young bureaucrat in the U.S. Department of Agriculture, he declared, "the food problem emerging in the less-developing regions may be one of the most nearly insoluble problems facing man over the next few decades." In 1974, Brown maintained that farmers "can no longer keep up with rising demand; thus the outlook is for chronic scarcities and rising prices." In 1981, Brown stated that "global food insecurity is increasing," and further claimed that "the slim excess of growth in food production over population is narrowing." In 1989, Brown contended that "population growth is exceeding the farmer's ability to keep up," concluding that, "our oldest enemy, hunger, is again at the door." In 1995, Brown starkly warned, "Humanity's greatest challenge may soon be just making it to the next harvest." In 1997, Brown again proclaimed, "Food scarcity will be the defining issue of the new era now unfolding."
But this time it's different, right? After all, Brown claims that "when the 2008 harvest began, world carryover stocks of grain (the amount in the bin when the new harvest begins) were at 62 days of consumption, a near record low." But Brown has played this game before with world grain stocks. As the folks at the pro-life Population Research Institute (PRI) report, Brown claimed in 1974 that there were only 26 days of grain reserves left, but later he upped that number to 61 days. In 1976, reserves were supposed to have fallen to just 31 days, but again Brown raised that number in 1988 to 79 days. In 1980, only a 40-day supply was allegedly on hand, but a few years later he changed that estimate to 71 days. The PRI analysts noted that Brown has repeatedly issued differing figures for 1974: 26 or 27 days (1974); 33 days (1975); 40 days (1981); 43 days (1987); and 61 days (1988). In 2004, Brown claimed that the world's grain reserves had fallen to only 59 days of consumption, the lowest level in 30 years.
In any case, Brown must know that the world's farmers produced a bumper crop last year. Stocks of wheat are at a six-year high and increases in other stocks of grains are not far off. This jump in reserves is not at all surprising considering the steep run-up in grain prices last year, which encouraged farmers around the world to plant more crops. By citing pre-2008 harvest reserves, Brown evidently hopes to frighten gullible Scientific American readers into thinking that the world's food situation is really desperate this time.
Brown argues that the world's food economy is being undermined by a troika of growing environmental calamities: falling water tables, eroding soils, and rising temperatures. He acknowledges that the application of scientific agriculture produced vast increases in crop yields in the 1960s and 1970s, but insists that "the technological fix" won't work this time. But Brown is wrong, again.
It is true that water tables are falling in many parts of the world as farmers drain aquifers in India, China, and the United States. Part of the problem is that water for irrigation is often subsidized by governments who encourage farmers to waste it. However, the proper pricing of water will rectify that by encouraging farmers to transition to drip irrigation, switch from thirsty crops like rice to dryland ones like wheat, and help crop breeders to develop more drought-tolerant crop varieties. In addition, crop biotechnologists are now seeking to transfer the C4 photosynthetic pathway into rice, which currently uses the less efficient C3 pathway. This could boost rice yields by 50 percent while reducing water use.
To support his claims about the dangers of soil erosion, Brown cites studies in impoverished Haiti and Lesotho. To be sure, soil erosion is a problem for poor countries whose subsistence farmers have no secure property rights. However, one 1995 study concluded that soil erosion would reduce U.S. agriculture production by 3 percent over the next 100 years. Such a reduction would be swamped by annual crop productivity increases of 1 to 2 percent per year—which has been the average rate for many decades. A 2007 study by European researchers found "it highly unlikely that erosion may pose a serious threat to food production in modern societies within the coming centuries." In addition, modern biotech herbicide-resistant crops make it possible for farmers to practice no-till agriculture, thus dramatically reducing soil erosion.
Brown's final fear centers on the effects of man-made global warming on agriculture. There is an ongoing debate among experts on this topic. For example, University of California, Santa Barbara economist Olivier Deschenes and Massachusetts Institute of Technology economist Michael Greenstone calculated that global warming would increase the profits of U.S. farmers by 4 percent, concluding that "large negative or positive effects are unlikely." Other researchers have recently disputed Deschenes' and Greenstone's findings, arguing that the impact of global warming on U.S. agriculture is "likely to be strongly negative." Fortunately, biotechnology research—the very technology fix dismissed by Brown—is already finding new ways to make crops more heat and drought tolerant.
On the other hand, Brown is right about two things in his Scientific American article: the U.S. should stop subsidizing bioethanol production (turning food into fuel) and countries everywhere should stop banning food exports in a misguided effort to lower local prices. Of course these policy prescriptions have been made by far more knowledgeable and trustworthy commentators than Brown.
Given that Brown's dismal record as a prognosticator of doom is so well-known, it is just plain sad to see a respectable publication like Scientific American lending its credibility to this old charlatan.
Ronald Bailey is Reason magazine's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.
The Best Judges Obama Can't Pick. By Benjamin Wittes
Brookings, May 3, 2009
What do Merrick Garland, David Tatel and Jose Cabranes have in common?
All are sitting federal court of appeals judges who were nominated by Democratic presidents. All three are deeply admired by their colleagues and are among a small group of the very finest federal judges in the country. And all three have names you probably won't hear often in public discussions about whom President Obama should tap to replace retiring Justice David H. Souter.
Garland: white guy. Tatel: white guy and, at 67, too old. Cabranes: Hispanic, sure, but even older.
I have nothing against the people whose names have so far been floated as possible nominees (some of them are excellent), and I'm not against diversity on the high court. Far from it: It's important to have a court that looks like America, and it is particularly important that following Sandra Day O'Connor's retirement in 2005 an additional woman join the high court.
That said, there are significant costs to the nominating system that we have developed, in which gender, ethnicity and age have, from the very start of the search for Souter's replacement, placed off-limits many lawyers and judges whose colleagues regard as some of the best in their profession. The dirty little secret is that the conservative talent pool on the federal courts these days is larger and deeper than the liberal one, mainly because Republicans have been in power far longer than Democrats recently and have therefore had more opportunity to cultivate a strong bench on the bench.
While both parties feel pressure to keep the bench diverse, Democrats have less latitude for bucking these expectations in judicial nominations than Republicans do. The core constituency that Republicans must satisfy in high court nominations is the party's social conservative base, which fundamentally cares about issues, not diversity, and has accepted white men who practice the judging it admires. By contrast, identity-oriented groups are part of the core Democratic coalition, so it's not enough for a Democrat to appoint a liberal. At least some of the time, it will have to be a liberal who also satisfies certain diversity categories.
The age issue has particularly striking consequences. It used to be commonplace for presidents to appoint justices who were well into their 60s. Lewis Powell, Earl Warren, Charles Evans Hughes (the second time around), William Howard Taft and Oliver Wendell Holmes, for example, were at least 60 when nominated, as was Justice Ruth Bader Ginsburg when President Clinton nominated her in 1993. Older judges brought experience to the table, and because life tenure is shorter for them than for younger judges, the stakes are lower in their confirmations.
Yet the ever-escalating political war over the courts has put a premium on youth -- on justices who can hang around for decades as members of rival ideological camps. Judge J. Harvie Wilkinson III, one of the most esteemed conservative jurists in the country, might well be on the Supreme Court today, for example, had he not had the temerity to be 60 when O'Connor retired and opened up a slot. Nor is Obama likely to follow Clinton's lead in declining to discriminate against the late-middle-aged. After all, if conservatives only appoint relative youngsters such as John G. Roberts and Samuel Alito (50 and 55, respectively, at the time of their nominations), it's unilateral disarmament for a liberal to do otherwise.
The result is a strange conversation about who should replace Souter -- one that self-consciously omits many of the judges whose work is most actively studied by those who engage day-to-day with the courts. This may well be a reasonable price to pay for a diverse bench, and for those who don't read judicial opinions, it is in any event an invisible price. But let's be candid about paying it.