Deregulation and the Financial Panic, by Phil Gramm
Loose money and politicized mortgages are the real villains.
WSJ, Feb 20, 2009
The debate about the cause of the current crisis in our financial markets is important because the reforms implemented by Congress will be profoundly affected by what people believe caused the crisis.
If the cause was an unsustainable boom in house prices and irresponsible mortgage lending that corrupted the balance sheets of the world's financial institutions, reforming the housing credit system and correcting attendant problems in the financial system are called for. But if the fundamental structure of the financial system is flawed, a more profound restructuring is required.
I believe that a strong case can be made that the financial crisis stemmed from a confluence of two factors. The first was the unintended consequences of a monetary policy, developed to combat inventory-cycle recessions in the last half of the 20th century, that was not well suited to the speculative bubble recession of 2001. The second was the politicization of mortgage lending.
The 2001 recession was brought on when a speculative bubble in the equity market burst, causing investment to collapse. But unlike previous postwar recessions, consumption and the housing industry remained strong at the trough of the recession. Critics of Federal Reserve Chairman Alan Greenspan say he held interest rates too low for too long, and in the process overstimulated the economy. That criticism does not capture what went wrong, however. The consequences of the Fed's monetary policy lay elsewhere.
In the inventory-cycle recessions experienced in the last half of the 20th century, an involuntary buildup of inventories produced retrenchment in the production chain. Workers were laid off, and investment and consumption, including the housing sector, slumped.
In the 2001 recession, however, consumption and home building remained strong as investment collapsed. The Fed's sharp, prolonged reduction in interest rates stimulated a housing market that was already booming -- triggering six years of double-digit increases in housing prices during a period when the general inflation rate was low.
Buyers bought houses they couldn't afford, believing they could refinance in the future and benefit from the ongoing appreciation. Lenders assumed that even if everything else went wrong, properties could still be sold for more than they cost and the loan could be repaid. This mentality permeated the market from the originator to the holder of securitized mortgages, from the rating agency to the financial regulator.
Meanwhile, mortgage lending was becoming increasingly politicized. Community Reinvestment Act (CRA) requirements led regulators to foster looser underwriting and encouraged the making of more and more marginal loans. Looser underwriting standards spread beyond subprime to the whole housing market.
As Mr. Greenspan testified last October at a hearing of the House Committee on Oversight and Government Reform, "It's instructive to go back to the early stages of the subprime market, which has essentially emerged out of CRA." It was not just that CRA and federal housing policy pressured lenders to make risky loans -- but that they gave lenders the excuse and the regulatory cover.
Countrywide Financial Corp. cloaked itself in righteousness and silenced any troubled regulator by being the first mortgage lender to sign a HUD "Declaration of Fair Lending Principles and Practices." Given privileged status by Fannie Mae as a reward for "the most flexible underwriting criteria," it became the world's largest mortgage lender -- until it became the first major casualty of the financial crisis.
The 1992 Housing Bill set quotas or "targets" that Fannie and Freddie were to achieve in meeting the housing needs of low- and moderate-income Americans. In 1995 HUD raised the primary quota for low- and moderate-income housing loans from the 30% set by Congress in 1992 to 40% in 1996 and to 42% in 1997.
By the time the housing market collapsed, Fannie and Freddie faced three quotas. The first was for mortgages to individuals with below-average income, set at 56% of their overall mortgage holdings. The second targeted families with incomes at or below 60% of area median income, set at 27% of their holdings. The third targeted geographic areas deemed to be underserved, set at 35%.
The results? In 1994, 4.5% of the mortgage market was subprime and 31% of those subprime loans were securitized. By 2006, 20.1% of the entire mortgage market was subprime and 81% of those loans were securitized. The Congressional Budget Office now estimates that GSE losses will cost $240 billion in fiscal year 2009. If this crisis proves nothing else, it proves you cannot help people by lending them more money than they can pay back.
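Those two pairs of numbers compound. A small illustrative calculation (mine, not the op-ed's): multiplying the subprime share of the market by the fraction securitized gives the securitized-subprime slice of the entire mortgage market.

# Securitized subprime loans as a share of the whole mortgage market,
# using only the figures quoted above.
subprime_1994, securitized_1994 = 0.045, 0.31
subprime_2006, securitized_2006 = 0.201, 0.81

share_1994 = subprime_1994 * securitized_1994   # ~1.4% of the whole market
share_2006 = subprime_2006 * securitized_2006   # ~16.3% of the whole market
print(f"1994: {share_1994:.1%}  2006: {share_2006:.1%}  growth: {share_2006 / share_1994:.0f}x")
# ~12-fold growth in the securitized-subprime slice of the market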
Blinded by the experience of the postwar period, where aggregate housing prices had never declined on an annual basis, and using the last 20 years as a measure of the norm, rating agencies and regulators viewed securitized mortgages, even subprime and undocumented Alt-A mortgages, as embodying little risk. It was not that regulators were not empowered; it was that they were not alarmed.
With near universal approval of regulators world-wide, these securities were injected into the arteries of the world's financial system. When the bubble burst, the financial system lost the indispensable ingredients of confidence and trust. We all know the rest of the story.
The principal alternative to the politicization of mortgage lending and bad monetary policy as causes of the financial crisis is deregulation. How deregulation caused the crisis has never been specifically explained. Nevertheless, two laws are most often blamed: the Gramm-Leach-Bliley (GLB) Act of 1999 and the Commodity Futures Modernization Act of 2000.
GLB repealed part of the Great Depression-era Glass-Steagall Act and allowed banks, securities companies and insurance companies to affiliate under a Financial Services Holding Company. If GLB had been the problem, the crisis would have been expected to originate in Europe, which never had Glass-Steagall requirements to begin with. Also, the financial firms that failed in this crisis, like Lehman, were the least diversified, while the ones that survived, like J.P. Morgan, were the most diversified.
Moreover, GLB didn't deregulate anything. It established the Federal Reserve as a superregulator, overseeing all Financial Services Holding Companies. All activities of financial institutions continued to be regulated on a functional basis by the regulators that had regulated those activities prior to GLB.
When no evidence was ever presented to link GLB to the financial crisis -- and when former President Bill Clinton gave a spirited defense of this law, which he signed -- proponents of the deregulation thesis turned to the Commodity Futures Modernization Act (CFMA), and specifically to credit default swaps.
Yet it is amazing how well the market for credit default swaps has functioned during the financial crisis. That market has never lost liquidity and the default rate has been low, given the general state of the underlying assets. In any case, the CFMA did not deregulate credit default swaps. All swaps were given legal certainty by clarifying that swaps were not futures, but remained subject to regulation just as before based on who issued the swap and the nature of the underlying contracts.
In reality, the financial "deregulation" of the last two decades has been greatly exaggerated. As the housing crisis mounted, financial regulators had more power, larger budgets and more personnel than ever. And yet, with the notable exception of Mr. Greenspan's warning about the risk posed by the massive mortgage holdings of Fannie and Freddie, regulators seemed unalarmed as the crisis grew. There is absolutely no evidence that, if financial regulators had had more resources or more authority, anything would have been different.
Since politicization of the mortgage market was a primary cause of this crisis, we should be especially careful to prevent the politicization of the banks that have been given taxpayer assistance. Did Citi really change its view on mortgage cram-downs or was it pressured? How much pressure was really applied to force Bank of America to go through with the Merrill acquisition?
Restrictions on executive compensation are good fun for politicians, but they are just one step removed from politicians telling banks who to lend to and for what. We have been down that road before, and we know where it leads.
Finally, it should give us pause in responding to the financial crisis of today to realize that this crisis itself was in part an unintended consequence of the monetary policy we employed to deal with the previous recession. Surely, unintended consequences are a real danger when the monetary base has been bloated by a doubling of the Federal Reserve's balance sheet, and the federal deficit seems destined to exceed $1.7 trillion.
Mr. Gramm, a former U.S. Senator from Texas, is vice chairman of UBS Investment Bank. This op-ed is adapted from a recent paper he delivered at the American Enterprise Institute.
Friday, February 20, 2009
Memo to Bandwagon Obama Fans: Get Tough!
Memo to Bandwagon Obama Fans: Get Tough! By Maegan Carberry
Huffington Post, February 20, 2009 02:10 AM (EST)
If I hear one more person declare that Obama's "honeymoon is over" or that the Republican response to the stimulus proves that his quest for a bipartisan America was naïve and ineffectual, I will surely scream. What I'd really like to know is: Where were these wise naysayers circa summer of 2007? Toting Hillary signs? Blathering about Barack's enormous potential, but (voice lowers, candidly) proclaiming the nation just wasn't ready for a black president?
See, a couple years ago, some of us were hard at work executing the inklings of an ambitious vision that the majority said was impossible. It's a good thing my friends who spent A YEAR knocking on the doors of every home in Council Bluffs, Iowa, connecting on a person-to-person level and exchanging new ideas that inspired a community, then a state, then a party, then a nation to adopt a new mindset and make history didn't listen to contrarians who could only echo simple-minded soundbites. They understood, like most forward-thinking leaders, that real, lasting problem solving happens at the root cause and is built by meticulously gaining the trust and support of invested parties. (Including, dare we acknowledge it!, the Republican party.) Any other approach is subject to the whimsical ebb and flow of partisan politics, resulting in hard-fought legislation undone each election cycle. Who wants to bleed and sweat for change that isn't going to endure?
The limited vision some of my close pals, favorite pundits and fellow Obamamaniacs have displayed post-inauguration is beyond disappointing. It's as though, in our collective gloom about the economy and the dilapidated state of our nation's affairs, we've forgotten that the prize we sought during the election was always going to be another staggering, indefinite uphill battle. How many times did you say in conversation during the campaign, "Do you think whoever wins this thing is really going to want the job when he finally gets it?"
For the last couple weeks I have wanted to shake some people and remind them of the early days, when we were the only ones who believed in the senator from Illinois. I can remember recruiting people to attend events in LA in the spring of 2007 and being blown off and told I was a dreamer. I wore my campaign buttons religiously to spark conversations, most of which centered around Obama's supposedly far-fetched viability as a presidential contender. I recall being advised by mentors to jump off the hope train and position myself more strategically in alliances with Clinton staffers.
We had to have those conversations defending Barack every day for a year before Super Tuesday, when we rounded a corner and people started to open their minds and hear the message. Eventually, those conversations turned even Ohio and Florida from red to blue.
Of course I'm pissed that the Republicans, desperately in need of displaying a united front after getting their asses kicked, decided to err on the side of belligerence. It wasn't a particularly bold way to lead. In fact, their lackluster stimulus performance is reminiscent of a lil vote in 2002, in which no Democrats could be found to prevent the obviously ill-advised invasion of Iraq. I'd love to call them cowards and tell them there are now plenty of vacancies at the Hotel Guantanamo Bay if they'd like to secede from the union and start their own backwards society. That would not be helpful.
I will never forget how disempowering it was after the 2000 election, when George Bush took office and started systematically slashing the accomplishments of the Clinton administration and undercutting the hard work that a generation of progressives put into the comparably glorious 1990s era. I was appalled that a leader could be so divisive, and I was amazed at the fleeting nature of political power.
I chose to support Barack Obama because he built his coalition for America's future from the bottom up. He did not focus on party politics, but encouraged us to find common interests and work together whenever possible. He addressed the root cause of apathy in our disengaged collective citizenship, convincing individuals through the most successful grassroots viral marketing campaign in American history that they could be leaders in their own communities. The combined choices of everyone who, as a result of his leadership, has decided to be a solution-oriented person and act on his beliefs are the real power of this administration. I did not traipse around Des Moines, Portland, Seattle, Denver, Austin, Chicago and Los Angeles with "Change We Can Believe In" signs just to throw my hands in the air, exasperated by a media-infused political squabble a month into this thing, and give up on the mission we signed up to execute. We have not even scratched the surface of what we can do yet.
The MSM continues to cover what politicians and other pundits are saying and doing. That's why the early days of Obama's presidency are being told in a narrative framed by a politics-as-usual perspective. Just like they missed the stories of what was really happening on the ground as the Obama campaign's dynamic field teams enlisted supporter after supporter, they're getting this story wrong too.
What's really happening is that the hype and bandwagon support that characterized the home stretch of the Obama campaign is scaling back to its dedicated core. The people who bought into the mania were destined to crash as abruptly as they clung to us when it was the hip thing to do. Secondly, the core is tired. We worked our asses off just to realize we have to do it again. Some dove into the new administration with as much energy as could be mustered, and others of us have just needed to mentally check out for a couple months and regroup. You know, think about our own long-neglected lives for a change instead of phone banking and knocking on doors every weekend.
If you fall in the latter half of the burned-out Barackers (and I certainly do), it's probably time to crawl out of hiding and come back to work. We were always the stewards of this undertaking, and our fearless leader needs us out in the field. As exhausting as it is watching this circus of cynics try to take him down after building him up, we can't possibly be as tired as he is.
Thursday, February 19, 2009
Conservative views: Common Article 3 of the Geneva Conventions and US Detainee Policy
Common Article 3 of the Geneva Conventions and U.S. Detainee Policy. By David Rivkin, Lee Casey, and Charles Stimson
Heritage, Feb 19, 2009
Full article w/references here.
The Geneva Conventions loom large over U.S. terrorist detainee policy—even when the conventions may not strictly, as a matter of law, apply. In addition to their legal force, the conventions carry the weight of moral authority. It is no small matter, then, to question whether U.S. detention efforts fall short of the standards of Article 3—an article that is common to all four Geneva Conventions (hence its designation as "Common Article 3," or CA3). But that was the implication when President Barack Obama ordered the secretary of defense to conduct an immediate 30-day review of the conditions of detention in Guantanamo to "ensure full compliance" with CA3.
What exactly such compliance requires is open to debate.
CA3: Already in Force
From the military's point of view, Common Article 3 has been in full force for over two and a half years at Guantanamo. In June 2006, the United States Supreme Court ruled in the case of Hamdan v. Rumsfeld that America's armed conflict with al-Qaeda was non-international in character and, as such, was governed by CA3.[1] Within a week of that ruling, Deputy Secretary of Defense Gordon England issued a department-wide memorandum requiring all Department of Defense components to comply with CA3. Shortly thereafter, all components of the Department of Defense reported that they were in full compliance; this included the Joint Task Force in charge of detention operations at Guantanamo Bay, Cuba.
On September 6, 2006, the Department of Defense issued a department-wide directive applicable to all detainees in DOD custody or effective control. That directive incorporated verbatim CA3 of the Geneva Conventions and required the entire Department of Defense, including Guantanamo, to comply with CA3.
Whether this September 2006 directive marks the end of the story depends on what the text of CA3 means. And that is not so straightforward an inquiry.
Defining CA3
Common Article 3 is the third article common to each of the four Geneva Conventions. The Geneva Conventions codify much, albeit not all, of the law regulating armed conflict and the humane treatment of persons detained during armed conflict. The four conventions, as most recently revised and expanded in 1949, comprise a system of safeguards that attempt to regulate the ways wars are fought and to provide protections for individuals during wartime. The conventions themselves were a response to the horrific atrocities of World War II. The first convention covers soldiers wounded on the battlefield, the second covers sailors wounded and shipwrecked at sea, the third covers prisoners of war, and the fourth covers civilians taken by an enemy military or otherwise impacted.
What CA3 precisely requires and what it forbids is subject to debate. According to the actual language of CA3, detainees "shall in all circumstances be treated humanely," but the term humanely is never defined. "[O]utrages upon personal dignity, in particular humiliating and degrading treatment," are strictly prohibited, whatever they may be. Also prohibited are "cruel treatment and torture," but again, there is no definition of these terms. CA3 is a good statement of principles, but aside from banning murder and hostage-taking, it provides no concrete guidance to anyone actually holding detainees.
Nonetheless, CA3 is a part of U.S. treaty and criminal law. Congress, in the 1999 amendments to the War Crimes Act, made it a crime to violate CA3. For certain acts, such as murder, taking hostages, and obvious acts of torture, the prohibited conduct should be clear, since Congress has defined the elements necessary to prove these crimes in statutory law.
But what exactly constitutes "outrages upon personal dignity, in particular humiliating and degrading treatment"? No universal or even national consensus as to the definition of these terms exists. There is, however, no doubt that what constitutes humiliation or degradation, as distinct from acceptable treatment, is highly context-specific and culture-dependent. For example, any custodial interrogation or incarceration entails elements of humiliation that would be unacceptable in other contexts. Likewise, some societies find placing women in a position of authority, as guards or interrogators, over detained individuals unacceptable; for other cultures that believe in basic gender equality, these practices are not even remotely humiliating. Even Jean Pictet, the world-renowned human rights attorney who helped draft the Geneva Conventions and led the International Committee of the Red Cross, noted that with respect to CA3, the drafters wanted to target those acts that "world public opinion finds particularly revolting." This is a highly uncertain guide.
Pictet also stated that the outrages upon personal dignity referenced by the treaty were of a sort "committed frequently during the Second World War." This too gives little guidance. Presumably, the prohibition would include forcing ethnic or religious minorities to wear insignia for purposes of identification, such as the infamous yellow star imposed by the Nazi regime on the Jewish population of Germany and occupied Europe. What else it may include is very much open to debate; the Axis Powers were ingenious in the area of humiliating and degrading treatment.
Principles of CA3
In interpreting this important provision, the United States would be justified in following some basic principles inferred from CA3.
First, CA3 imposes obligations on the parties to a conflict. This suggests that to violate the provision, the conduct must be both of a sort that world opinion finds "particularly revolting" and systemic, undertaken as a matter of policy rather than simply the actions of individual miscreants or criminals. Thus, although the treatment of some detainees by a few guards may have been outrageous, humiliating and degrading—and perhaps criminal—it would not violate CA3 unless it was ordered as a matter of policy or the responsible authorities failed to suppress and punish the conduct once it became known to them. All allegations of mistreatment are required to be investigated as a matter of written order.
Likewise, the use of the law of war paradigm cannot, by definition, be a violation of CA3, even if its specific application produces a less than ideal result. For example, detaining individuals believed to be enemy combatants is no violation of CA3, even if subsequent review concludes that their status classification was erroneous and they were not, in fact, enemy combatants. Under the same logic, and despite some oft-invoked but misguided criticisms of the U.S. detention policy, detaining captured enemy combatants for the duration of hostilities and not charging them with specific criminal offenses does not violate CA3.
Second, the purpose of CA3 was to compel compliance with the most basic requirements in the context of a civil war or other internal conflict, where it was acknowledged that the other provisions of the four conventions would not apply. Thus, it is a fair assumption that CA3 should not be interpreted as simply incorporating those other Geneva Convention provisions into the conflicts to which CA3 is applicable. Outrages upon personal dignity would not, therefore, include simply denying captives the rights and privileges of honorable prisoners of war under the third convention or of civilian persons under the fourth.
Third, CA3, like any other specific treaty provision, should be construed in the context of the overall treaty regime of which it is a part. In this regard, the overarching purpose of the 1949 Conventions (and all of the other law-of-war treaty norms) has been to ensure that the popular passions aroused by war and even the consideration of military necessity do not vitiate the fundamental requirements of humane treatment. To suggest that, for example, the wartime standards of treatment should be fundamentally superior to the peacetime standards would turn this logic upside down and is untenable. Accordingly, such incarceration-related practices as single-cell confinement and involuntary feeding—which, subject of course to appropriate safeguards, are used in civilian penal institutions of many Western democracies—cannot, by definition, infringe CA3.
There is no doubt that the intentions reflected in CA3 are laudable, but it is a less than perfect standard for the law of war, which must provide real and precise answers to an entire range of specific questions. Indeed, CA3's language is ambiguous, capacious, and difficult to apply in some circumstances. Fortunately, U.S. detention operations in general, and post-2006 in particular, have featured conditions for detainees that—structured in ways that provide more than sufficient compliance with CA3—compare favorably with any detention facilities in the history of warfare.
David Rivkin and Lee Casey are partners in the Washington, D.C., office of Baker and Hostetler LLP and served in the Justice Department during the Reagan and George H. W. Bush Administrations. Cully Stimson is a Senior Legal Fellow at The Heritage Foundation and served as deputy assistant secretary of defense for detainee affairs from 2006 to 2007.
References
[1] Hamdan v. Rumsfeld, 126 S. Ct. 2749 (2006). It is worth noting that, insofar as the Hamdan case dealt with the legality of military commissions, and the Court's observations about the applicability of CA3 were raised in that context, the Bush Administration could have opted to read the case holding narrowly. However, the Administration and the Department of Defense chose to construe Hamdan's teaching broadly and applied CA3 across the entire range of detention operations.
Libertarian: "CO2-Capture Coal Plants: A Ban by Another Name"
CO2-Capture Coal Plants: A Ban by Another Name, by Marlo Lewis
February 19, 2009
The top agenda item for many climate activists (James Hansen, for example) is stopping the construction of new coal-fired power plants. Coal is the most carbon-intensive fuel, and the carbon dioxide (CO2) emissions from new coal plants at various planning stages could swamp, by as much as 5 to 1, all the emissions reductions the European Union, Russia, and Japan might achieve under the Kyoto Protocol. Either climate activists kill coal, or coal will bury Kyoto.
Al Gore and his comrades at “We Can Solve It” go even further. They urge policymakers to “re-power” America with zero-carbon, emission-free electricity by 2018. In 2007, the favored renewables—wind, solar, geothermal, municipal waste, and biomass—produced 72.4 billion kilowatt-hours of electricity in the U.S. power sector (see Table A16 of EIA’s 2009 Annual Energy Outlook Summary Reference Case Tables). In contrast, total power-sector generation in 2007 was 3,827 billion kilowatt-hours. So the “We Can Solve It” crowd wants energy sources that supply less than 2% of U.S. electric generation today to supplant the coal- and gas-fired power plants that provide almost 70% of current generation, all in 10 years.
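As a quick check of that share, here is a minimal sketch using only the two EIA figures quoted above:

# Renewables' share of 2007 U.S. power-sector generation (billion kWh),
# per the EIA figures cited above.
renewables_2007 = 72.4
total_2007 = 3827.0
print(f"Share: {renewables_2007 / total_2007:.1%}")   # ~1.9%, i.e. "less than 2%"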
If seriously pursued, this agenda would lead to hyperinflation of electricity prices (because demand for renewable electricity, ramped up by mandates, would vastly exceed supply), turn out the lights (a transition that big and that fast would not be smooth), and crash the economy. It would also set a world record for government waste, because hundreds of coal and natural gas power plants would be de-commissioned long before the end of their useful lives.
The proposal is so cockamamie I would feel silly blogging about it were it not the brainchild of a former Vice President, Nobel Prize Winner, New York Times best-seller-list author, and Academy Award film star.
The more urbane climate activists don’t talk about tearing down coal plants or even banning them. Instead, they call for a “moratorium” on new coal plants until such time as the technology and infrastructure are deployed to capture and store the CO2 emissions.
What prompts these observations is an article earlier this week in Greenwire summarizing a study by Emerging Energy Research (EER) on the pace and funding of carbon capture and storage (CCS) demonstration projects around the world. According to the study (which costs $3,750 to download in PDF, so I am relying on Greenwire’s review and a report outline posted on EER’s Web site), governments have earmarked about $20 billion for CCS projects. How much of that will actually be spent is anyone’s guess.
Neither Greenwire nor EER’s Web site provides what would seem to be the most relevant datum for potential investors, policymakers, and consumers—namely, how much it costs per ton for a coal-fired power plant to capture and store CO2.
According to the Department of Energy, carbon capture of CO2 from coal electric generating units costs about $150 per ton. That is more than twice the EIA-estimated cost of carbon permits in 2030 under the Lieberman-Warner Climate Security Act (S. 2191), a bill the U.S. Senate did not see fit to pass.
In 2007, EIA analyzed the National Commission on Energy Policy’s cap-and-trade proposal featuring a “safety valve” price initially set at $7.00 per ton CO2-equivalent and increasing by 5% annually above inflation. EIA concluded that the proposed carbon penalty was not steep enough to make CCS economical (see p. 16 of the EIA report). That’s hardly surprising, if CCS costs $150 per ton. So all we need to do is increase the carbon penalty, and then we’ll get lots of investment in CCS, right?
Well, maybe not. In the same EIA report (p. 11), coal generation in the reference case (no cap-and-trade) increases from 2,505 billion kWh in 2006 to 3,381 billion kWh in 2030—a 34% increase. In contrast, under the NCEP cap-and-trade program with a gradually increasing safety valve price initially set at $7.00 per ton, coal generation increases from 2,505 billion kWh in 2006 to only 2,530 billion kWh in 2030—a 0.9% increase. That’s if auctioning of carbon permits is phased in. If all permits are auctioned from the get-go, coal generation actually declines slightly—from 2,505 billion kWh in 2006 to 2,500 billion kWh in 2030.
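A minimal sketch of that scenario comparison, using only the rounded generation figures quoted above (the 34% and 0.9% in the text presumably come from unrounded source data):

# Percent change in U.S. coal generation, 2006 -> 2030, for the three
# EIA scenarios quoted above (figures in billion kWh).
base_2006 = 2505
scenarios = {
    "reference case (no cap-and-trade)": 3381,       # ~ +35% with these rounded figures
    "NCEP cap-and-trade, phased-in auction": 2530,   # ~ +1.0%
    "NCEP cap-and-trade, full auction": 2500,        # ~ -0.2%, a slight decline
}
for name, gen_2030 in scenarios.items():
    change = (gen_2030 - base_2006) / base_2006
    print(f"{name}: {change:+.1%}")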
If a relatively small carbon penalty can essentially block new coal generation, a large carbon penalty might just as well lead to capital flight from coal rather than to a surge of investment in CCS.
A recent news item may be relevant to this discussion. Just two days after EPA Administrator Lisa Jackson said she would consider interpreting the Clean Air Act to require coal power plants applying for Prevention of Significant Deterioration (PSD) pre-construction permits to install best available control technology (BACT) for CO2, “AES Corporation, one of the world’s largest power companies with almost $14 billion in revenues in 2007, announced it would withdraw an application to build a new coal-fired power plant in Oklahoma,” the Huffington Post reports.
BACT for CO2 from coal power plants would most likely take the form of process efficiency upgrades, not anything nearly as costly or experimental as CCS. Yet, apparently, the risk of potential BACT regulation of CO2 was enough to deter AES from investing in a new coal power plant.
Experts I have spoken with say it could take a decade of research just to determine whether CCS would be economical under a range of carbon penalties, another decade to build the infrastructure of pipelines and storage facilities at an industrial scale (the CO2 pipeline network would rival the U.S. natural gas and petroleum pipeline networks in size; see p. ix of MIT’s CCS study), years to work out the regulatory and liability issues, and years to overcome NIMBY opposition.
So the so-called moratorium on new coal plants lacking CCS is just a ban by another name. What are the risks?
U.S. electricity demand is growing (or at least was growing before the recession), and coal is the fuel of choice in many markets. EIA projects that between 2006 and 2030, coal will provide 42% of all new electric power-sector generation in the United States, with new coal power plants providing 8% of all U.S. power-sector generation by 2030. Banning that much new coal generation would, at a minimum, drive up electricity and natural gas prices. The effectual ban could also create significant imbalances between supply and demand, increasing the risk of local or regional energy crises.
Industry Views: The Washington Post Again Calls for Higher Energy Taxes
The Washington Post Again Calls for Higher Energy Taxes
IER, February 19, 2009
For the third time in the past five months, The Washington Post has called for new taxes on energy. This time the Post is calling for a carbon tax because cap-and-trade regimes for greenhouse gas emissions are flawed. According to the Post:

“Cap-and-trade regimes have advantages, notably the ability to set a limit on emissions and to integrate with other countries. But they are complex and vulnerable to lobbying and special pleading, and they do not guarantee success. The experience of the European Union is Exhibit A.”
The Post’s answer to the flaws with cap-and-trade schemes is to implement a carbon tax instead. Cap-and-trade and carbon taxes have a similar goal—increase the price of energy to encourage conservation. Carbon taxes increase the price of anything that uses oil, coal, or natural gas as an input. This includes nearly all goods and services in the United States because 85 percent of the energy we use comes from coal, oil, or natural gas.
Increasing the costs of doing business in America makes it harder for American businesses to compete with foreign companies. The high price of natural gas in the United States has already contributed to the loss of 3.1 million manufacturing jobs since 2000.[1] Higher energy taxes will drive still more businesses overseas and make life more difficult for American consumers struggling to make ends meet.
It is not clear what The Washington Post hopes to accomplish with a carbon tax. The earth has warmed over the past 30 years, but not as much as the climate models predict. Climate alarmists point to the models as evidence of catastrophic warming, even though there has been no warming trend since 2001 according to the satellite data.
One thing that is clear is that a unilateral carbon tax imposed on U.S. citizens will do little to nothing about global warming. Global carbon dioxide emissions are not driven by the United States, but by the developing world. According to data from the Global Carbon Project, between 2000 and 2007 U.S. carbon dioxide emissions increased 3% while China’s increased 98%, making China the world’s largest emitter of carbon dioxide. By way of comparison, from 2000 to 2007 India’s carbon dioxide emissions increased 36%, the global total increased 26%, Russia’s increased 10%, and Japan’s increased 3%. These data are displayed in the graph below:
[see graph here]
The U.S. will emit a smaller and smaller share of the world’s total greenhouse gas emissions[2] making unilateral efforts, such as a domestic carbon tax, ineffective at influencing climate. If the U.S. were to completely cease using fossil fuels, the increase from the rest of the world would replace U.S. emissions in less than eight years.[3]
The Washington Post says that “A carbon tax, by contrast, is simple and sure in its effects.” This is correct. A carbon tax is simple, and we can indeed be sure of its effects: it will harm America economically with few corresponding environmental benefits.
References
[1] Paul N. Cicio, Testimony of Paul N. Cicio, President of Industrial Energy Consumers of America before the House of Representatives, Dec. 6, 2007, http://www.ieca-us.com/documents/IECAHouseTestimony-NaturalGas_12.06.07.pdf.
[2] According to the Global Carbon project in 2007 China emitted 21% of the world’s carbon equivalent and the U.S. emitted 19%.
[3] Calculated using the emission data from the Global Carbon Project. According to these data, the U.S. emitted 1,586,213 GgC in 2007. Without the U.S., the world’s emissions were 5,203,987 GgC in 2000, increasing to 6,884,787 GgC in 2007.
IER, February 19, 2009
For the third time in the past 5 months The Washington Post has called for new taxes on energy. This time the Post is calling for a carbon tax because cap-and-trade regimes for greenhouse gas emissions are flawed. According to the Post:
Cap-and-trade regimes have advantages, notably the ability to set a limit on emissions and to integrate with other countries. But they are complex and vulnerable to lobbying and special pleading, and they do not guarantee success. The experience of the European Union is Exhibit A.
The Post’s answer to the flaws with cap-and-trade schemes is to implement a carbon tax instead. Cap and trade and carbon taxes have a similar goal—increase the price of energy to encourage conservation. Carbon taxes increase the price of anything that uses oil, coal, or natural gas as an input. This includes nearly all goods and services in the United States because 85 percent of the energy we use comes from coal, oil, or natural gas.
Increasing the costs of doing business in America makes it harder for American businesses to compete with foreign companies. The high price of natural gas in the United States has already contributed to the loss of 3.1 million manufacturing jobs since 2000.[1] Higher energy taxes will drive more businesses overseas and make life more difficult for American consumers struggling to make ends meet.
It is not clear what The Washington Post hopes to accomplish with a carbon tax. The earth has warmed over the past 30 years, but not as much as the climate models predict. Climate alarmists point to the models as evidence of catastrophic warming, even though there has been no warming trend since 2001 according to the satellite data.
One thing that is clear is that a unilateral carbon tax imposed on U.S. citizens will do little to nothing about global warming. Global carbon dioxide emissions are not driven by the United States, but by the developing world. According to data from the Global Carbon Project, between 2000 and 2007 U.S. carbon dioxide emissions increased 3% while China’s increased 98%, making China the world’s largest emitter of carbon dioxide. By way of comparison, from 2000 to 2007 India’s carbon dioxide emissions increased 36%, the global total increased 26%, Russia’s increased 10%, and Japan’s increased 3%. These data are displayed in the graph below:
[see graph here]
The U.S. will emit a smaller and smaller share of the world’s total greenhouse gas emissions,[2] making unilateral efforts, such as a domestic carbon tax, ineffective at influencing the climate. If the U.S. were to completely cease using fossil fuels, the increase from the rest of the world would replace U.S. emissions in less than eight years.[3]
The Washington Post says that “A carbon tax, by contrast, is simple and sure in its effects.” This is correct. A carbon tax is simple, and we can indeed be sure of its effects: it will harm America economically with few corresponding environmental benefits.
References
[1] Paul N. Cicio, Testimony of Paul N. Cicio, President of Industrial Energy Consumers of America before the House of Representatives, Dec. 6, 2007, http://www.ieca-us.com/documents/IECAHouseTestimony-NaturalGas_12.06.07.pdf.
[2] According to the Global Carbon Project, in 2007 China emitted 21% of the world’s carbon equivalent and the U.S. emitted 19%.
[3] Calculated using the emission data from the Global Carbon Project. According to these data, the U.S. emitted 1,586,213 GgC in 2007. Without the U.S., the world’s emissions were 5,203,987 GgC in 2000, increasing to 6,884,787 GgC in 2007.
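The "less than eight years" claim in footnote [3] can be reproduced from the quoted figures. The short Python sketch below is illustrative only; it assumes the rest of the world keeps growing at its average 2000–2007 rate, an assumption of this illustration rather than a claim from the Global Carbon Project.

import math

# Global Carbon Project figures quoted in footnote [3], in GgC (gigagrams of carbon)
us_2007  = 1_586_213   # U.S. emissions in 2007
row_2000 = 5_203_987   # rest-of-world emissions in 2000
row_2007 = 6_884_787   # rest-of-world emissions in 2007

# Average annual growth rate of rest-of-world emissions over 2000-2007
growth = (row_2007 / row_2000) ** (1 / 7) - 1   # roughly 4.1% per year

# Years t until rest-of-world growth alone adds one U.S.-worth of emissions:
# solve row_2007 * ((1 + growth)**t - 1) = us_2007 for t
t = math.log(1 + us_2007 / row_2007) / math.log(1 + growth)
print(f"growth ~ {growth:.1%}/yr; replacement time ~ {t:.1f} years")  # ~5.2 years

Under that constant-growth assumption the answer comes out near five years, comfortably inside the article's "less than eight years."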
Experts from Brookings, Heritage, and others: Statement on the Fiscal Responsibility Summit
Statement on the Fiscal Responsibility Summit. Experts from Brookings, Heritage, and others.
February 19, 2009
President Obama’s intention to convene a fiscal responsibility summit is a very welcome development. It offers a valuable opportunity to focus public attention on our nation’s unsustainable budget outlook and to highlight various approaches to meaningful action.
As a group of budget analysts and former senior budget officials, we view this summit as the first step to addressing the enormous long-term fiscal problem facing the United States. Without decisive action this problem will lead to serious harm to our economy and a huge financial burden on our children and grandchildren.
Tackling these problems will require a degree of sacrifice impossible under the existing policy process, which discourages bipartisan compromise and encourages procrastination and obstructionism. Unless those procedures are modified, and the American people are engaged in the process, future legislative attempts to address the looming fiscal crisis will almost certainly fail.
In our view, the American people are ready to confront the challenge. For the last three years several of us have traveled around the country as a group, discussing these issues with thousands of Americans in dozens of cities, in a bipartisan effort known as the Fiscal Wake-Up Tour. We have found that when Americans are given the facts and options in a neutral and bipartisan way, they want action and are willing to make difficult trade-offs.
We therefore urge the President to lead a major public engagement effort – beyond a one-day summit – to inform Americans of the scale and nature of the long-term fiscal crisis, explain the consequences of inaction and discuss the options for solving the problem. This effort should be bipartisan, and involve a serious conversation with Americans to help guide action in Washington. As a group with some experience in this domain, we stand ready to assist if needed.
We also believe that for this policy commitment to produce tangible results, the President and others who share the goal of fiscal responsibility must address the fact that the regular political process has been incapable of dealing with long-term fiscal issues. We see no alternative but to create an independent and truly bipartisan commission or other mechanism capable of bringing about decisive action that has broad public support. We therefore urge the President to support such a commission. For this commission or some other mechanism to break through the legislative logjam it will need four key elements:
• It must be truly bipartisan and develop solutions that command wide support.
• It must have a broad mandate to address all aspects of the fiscal problem while fostering strong economic growth.
• There must be no preconditions to the deliberations. All options must be on the table for discussion. Nobody should be required to agree in advance to any option.
• Recommendations must go before Congress for an up-or-down vote with few if any amendments. Such a game-changing process is not without precedents; controversial military base closings or the ratification of international trade agreements, for example, have long been governed by special rules along these lines, not by business as usual.
We are deeply worried about the long-term fiscal imbalance and the dangers it carries for the economy and for our children and grandchildren. We know the President is concerned as well, as are many Members of Congress in both political parties. We are ready to help in building public understanding of the problem and the options, and in crafting an approach that will enable the legislative process to deal with the problem.
This statement is offered by members of the Brookings-Heritage Fiscal Seminar. The views expressed are those of the individuals involved and should not be interpreted as representing the views of their respective institutions. For purposes of identification, the affiliation of each signatory is listed.
Signatories:
Joe Antos - American Enterprise Institute
Robert Bixby - Concord Coalition
Stuart Butler - Heritage Foundation
Alison Fraser - Heritage Foundation
William Galston - Brookings Institution
Ron Haskins - Brookings Institution
Julia Isaacs - Brookings Institution
Will Marshall - Progressive Policy Institute
Pietro Nivola - Brookings Institution
Rudolph Penner - Urban Institute
Robert Reischauer - Urban Institute
Alice M. Rivlin - Brookings Institution
Isabel Sawhill - Brookings Institution
C. Eugene Steuerle - Peter G. Peterson Foundation
New Insecticides Are Crucial in Battle Against Malaria
New Insecticides Are Crucial in Battle Against Malaria. By Roger Bate
AEI, Thursday, February 19, 2009
Insecticides are the most important preventative tool against malaria, dengue and filariasis. Even for yellow fever, where a vaccine exists, insecticides are needed to control common outbreaks.
Characteristics that make good public health insecticides (PHIs) are not necessarily shared by modern agricultural or industrial insecticides. For example, DDT (Dichloro-Diphenyl-Trichloroethane) was dropped as an agrochemical in the 1970s because it biodegrades slowly and accumulates in biological systems when widely used in agriculture. Yet its persistence makes it a powerful PHI. When sprayed inside houses, DDT protects all inhabitants from malaria for approximately one year--no alternative lasts as long at the same cost. Even in places where mosquitoes are resistant to DDT's toxic actions, it deters mosquitoes from entering houses sprayed with the chemical.
Insecticide-treated nets (ITNs) primarily protect only the people sleeping under them and are less effective than generally assumed because ITN distribution does not equate with consistent use. And while the mass distribution of ITNs has undoubtedly saved thousands of lives, mosquitoes are becoming increasingly resistant to the insecticides used on them, partly because the same chemicals are used in farming. Recent studies have identified growing resistance in Benin and Uganda, where free net distribution has occurred for a decade.
We urgently need new insecticides that combine repellency and persistence with strong binding properties (for use with ITNs).
Stumbling blocks
Yet, while government donors and organisations such as the Gates Foundation and the World Bank spend billions on research into new vaccines and drugs, they invest very little in the search for new PHIs. The newest compound recommended by the WHO for malaria control, etofenprox, was made available in 1986.
The private sector has developed every major insecticide since 1940--for agricultural or industrial purposes. Public funds played a part in early-stage research, but private markets drove product development. The market for PHIs is about 1.3 per cent of the total insecticide market (US$400 million out of US$30 billion). So while most large companies that make insecticides--such as Bayer and Syngenta--are pleased to sell PHIs, they lack financial incentives to develop products specifically for that sector.
And, according to a Boston Consulting Group report backed by the Gates Foundation, the cost of research and development in the agrochemical industry has risen 500 per cent over the past 20 years.
Opposition from environmentalists makes investing in the search for new PHIs even less attractive.
In 1993, the Pan American Health Organization banned the use of pesticides identified by influential environmentalists and in 1997, a World Health Assembly resolution called on WHO member states to reduce reliance on insecticides for controlling vector-borne diseases.
Environmental groups, led by Pesticide Action Network, regularly champion replacing insecticides with "environmentally sound and socially just alternatives". This often means reducing breeding opportunities for mosquitoes and using larvivorous fish to eat mosquito larvae, which can work--but only under specific circumstances. Exclusively relying on these techniques addresses the risks of man-made chemicals, but gives scant consideration to the far more dangerous disease threats. Yet aware of the implacable environmental community, aid groups have been reluctant to defend insecticide spraying.
Current efforts
The WHO has offered no leadership in the struggle to identify and invest in more effective insecticides. Its Pesticide Evaluation Scheme remains underfunded and slow in responding to the needs of the public health community.
The UN Stockholm Convention on Persistent Organic Pollutants and the Global Environmental Facility at the World Bank have paid lip service to finding new insecticides. But the Gates Foundation is the only organisation to have seriously supported the effort. In 2005, it gave US$50 million to the Innovative Vector Control Consortium (IVCC)--an international partnership between five research institutions in South Africa, United Kingdom and the United States--which aims to "develop new and better ways to control the transmission of insect-borne disease".
Promisingly, the initiative has begun investigating compounds to replace the current crop of insecticides, working with major manufacturers (notably Bayer and Syngenta) to bring new products to the market. But public documents indicate that the IVCC is not investigating repellency, which seems misguided.
Other than this initiative--which is dwarfed by the dollars donors spend on bed nets and treatment of insect-borne diseases every year--very little research is being done.
To encourage research, rich countries must urgently adopt measures similar to the Orphan Drug Act and priority review vouchers, both of which have helped attract investment in medicines for which there is no profitable market.
The credit crunch has made investment and risk-taking difficult. But if malaria deaths are to end, politicians, businessmen, celebrities and activists must devote their efforts to creating an environment that encourages investment in new PHIs.
Roger Bate is a resident fellow at AEI.
Kyoto II as Congressional-Executive Agreement: The Emerging Strategy?
Kyoto II as Congressional-Executive Agreement: The Emerging Strategy? By Christopher C. Horner
The Federalist Society, February 16, 2009
The new Obama administration generated great expectations, not least among those seeking to craft a successor treaty to the 1997 Kyoto Protocol on “global warming.” Yet early signals that an Obama administration had no plans to join Europe in agreeing to a successor pact next December, as expected, indicated that this issue would instead prove a stunning disappointment to its champions. Now, however, it appears that Kyoto will be the subject of a controversial effort to sharply revise U.S. environmental treaty practice.
Those parties with high hopes include a broad array of interests: rent-seeking companies, countries, pressure groups and supranational organizations. European Union countries, for example, seek to maintain certain treaty advantages similar to those in the original pact, offering possible avoidance of pain and even financial reward. These included artifices such as the 1990 baseline year against which performance is measured, to give them credit for Kyoto-based greenhouse gas (GHG) emission reductions actually resulting from the post-1990 economic collapse in Central and Eastern Europe. Combined with the ability to pool emissions, these provisions gave the EU a less onerous path to complying with Kyoto’s terms.
Others include exempt, major emitting parties such as China, India, Mexico, Indonesia, Brazil and South Korea. And, of course, Russia hopes to perpetuate wealth transfers helping it modernize its energy infrastructure in the name of combating what remains the subject of a scientifically controversial theory. All of these factors contributed to the Bush administration declining to pursue Kyoto ratification. Kyoto covers GHG emissions among 37 countries from 2008 to 2012. To avoid a gap between regimes, it is generally agreed that a new pact must be adopted at the scheduled meeting in Copenhagen in December 2009. That event was expected to serve as the platform for a Kyoto II symbolizing Obama’s break with the Bush foreign policy.
The long-held presumption was that whoever succeeded Bush would accept U.S. participation in a “binding” pact, even one which still selectively covers only a handful of wealthy nations. This, the story went, would remove the one remaining obstacle to a successful international regime, namely U.S. opposition. The latter assumption is largely a myth, given that the U.S. signed the 1997 Kyoto treaty but neither the Clinton nor Bush administration sought ratification. Also, there is nothing in the Constitution or statute requiring a president to formally ask the Senate to ratify a signed agreement, only tradition. No senator has shown any interest in forcing matters, likely due to the absence of the two-thirds vote necessary for such an agreement under Article II, Section 2 of the Constitution.
EU officials readily admit a commitment to the issue of “climate change” as a means to establish a “European” political identity. As such, their self-described “world leadership” is at stake, and the tremendous “loss of face” that Europe risks in the event that no new treaty is agreed is an express consideration. Unwelcome realities—such as the initial treaty’s failure to reduce emissions, even at great cost, including massive wealth transfers and capital flight (“carbon leakage” to exempt nations)—rarely intrude.
So, the narrative has thrived—Bush “rejected” or “withdrew from” Kyoto in some stark contrast to the Clinton administration—and set the stage for a dramatic gesture by his successor. U.S. acceptance of Kyoto’s successor was to revive Kyoto as a viable international political and legal enterprise, even if the U.S. was the sole additional country to volunteer to bind its economic fortunes under this pact since it was hatched more than a decade ago.
Full paper here.
What to Do With the Uighurs?
What to Do With the Uighurs? By Matthew J. Franck
Bench Memos/NRO, Thursday, February 19, 2009
Back in October, Judge Ricardo Urbina of the federal district court in D.C. ruled, on a habeas petition, that 17 Uighurs (Muslims of Turkic ethnicity from western China) held at Guantanamo must be released into the United States. The government no longer considers the Uighurs to be enemy combatants—i.e. included among those against whom military force was authorized in 2001—but neither does it wish to see them come stateside, for very good reasons (like their membership in a group our State Department lists as a terrorist organization). And sending them back to China looks like a sentence to torture and death at the hands of the Chinese government.
So clearly the Bush administration—and now its successor, the Obama administration—had a dilemma about the final disposition of the Uighurs. But it was judicial activism of the worst sort for a judge to order the release of these men into the U.S. Thus it was good news yesterday when a three-judge panel of the D.C. Circuit Court of Appeals unanimously overturned the district court ruling. One of the three appellate judges, though, Judith Rogers, concurred in the judgment on grounds that Judge Urbina had merely acted hastily; Judge Rogers would appear to be ready to release the Uighurs into the U.S. after a couple more questions are answered.
Thank goodness, then, for Judges Arthur Randolph and Karen Henderson, who held (in Randolph's opinion for the court) that there is no power presently in the hands of federal judges to admit aliens to the United States whom the political branches of government have not seen fit to admit under relevant immigration laws and procedures. The Uighurs, Randolph pointed out, haven't even applied for admission to the United States under any immigration rubric. And never in our history has a federal court fashioned an extra-statutory ground for ordering the entry into the U.S. of an alien kept out of the country by the government.
Judge Rogers argues that the majority vitiates the Supreme Court's Boumediene ruling granting the Guantanamo detainees the privilege of habeas petitions in federal courts. She might better take that complaint to Justice Anthony Kennedy and his colleagues in the Boumediene majority, who created this mess. As Judge Randolph notes, it is one thing to say that a habeas-wielding court can order someone's release from detention—as Boumediene seemed to say of Gitmo detainees. But it's another thing to say that a court using habeas powers can order someone's entry into the U.S. in defiance of our immigration laws and of the policymakers who implement them. Not for the first time, the D.C. Circuit finds itself dealing with a crapstorm originating above it in the Supreme Court.
Does this mean Boumediene is a dead letter? One can hope so, but I doubt it. It does mean that while this matter is left to courts, easy answers won't be forthcoming. And it means that President Obama—who can enjoy this moment as the named respondent in Kiyemba v. Obama—has the ball in his court now. Close Gitmo? Sure. Then what?
Maybe he can brainstorm up a solution with that other legal genius, Anthony Kennedy.
U.S. Ahead of Moscow Treaty Schedule in Reducing Its Nuclear Arsenal
U.S. Ahead of Moscow Treaty Schedule in Reducing Its Nuclear Arsenal. By Walter Pincus
Washington Post, Friday, February 13, 2009; Page A03
The United States is more than two years ahead of the schedule set under the Moscow Treaty in reducing the number of its nuclear warheads operationally deployed on strategic missiles and bombers, according to congressional and administration sources.
There are fewer than 2,200 deployed warheads, the goal originally set to be reached by Dec. 31, 2012.
"The reduction was initially planned to be met in 2012, then 2010, but was achieved a few days ago," said Hans Kristensen, director of the nuclear information project of the Federation of American Scientists, who first disclosed the information on his Web site.
While not giving the exact number of deployed warheads, Rep. Ellen O. Tauscher (D-Calif.), chairman of the House Armed Services subcommittee on strategic forces, said Wednesday, "We are in compliance with the Moscow Treaty." She said reaching that goal early could support an effort by the Obama administration to get the Russians to go to even lower levels.
The total U.S. nuclear stockpile remains above 5,000 warheads, with the majority held in strategic reserve but available for deployment if necessary. There are probably 3,000 to 4,000 more warheads in storage awaiting dismantling, according to Kristensen.
Tauscher noted that the weakness of the 2002 Moscow Treaty was that while it limited the number of operationally deployed warheads, it left out those not connected to delivery systems and in storage. She said she expects the United States to push for improvements in the Moscow agreement. "We need to broaden the definitions and work with the Russians to account for everything," she said.
Kristensen said modest reductions in deployed warheads that began during the Clinton administration were expanded under President George W. Bush. "In the past four years, they have overhauled the strategic war plan," he said. "Strategic Command believes it can adequately meet the White House guidance with far less weapons."
Some experts think the Bush administration does not get enough credit for the reductions it has made in nuclear weapons. Robert S. Norris, a senior research associate at the Natural Resources Defense Council, said yesterday, "It is little appreciated or known that the two Bush presidencies have gotten rid of three-quarters of the U.S. nuclear stockpile."
According to Norris, the United States had about 22,000 strategic and tactical nuclear warheads at the end of the Cold War. In 1991, President George H.W. Bush ordered the withdrawal of all tactical weapons and signed the Strategic Arms Reduction Treaty (START), cutting the total to approximately 11,000. "His son cut it in half again by the end of his administration," Norris said, "and this will be the baseline for further reductions during the Obama administration."
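Norris's "three-quarters" follows directly from the article's round numbers. A minimal Python check, using those approximations (the exact warhead counts are classified, so these are the article's figures, not official ones):

# Approximate U.S. stockpile figures cited in the article
cold_war_end  = 22_000                # warheads at the end of the Cold War
after_bush_41 = 11_000                # after the 1991 withdrawals and START
after_bush_43 = after_bush_41 / 2     # "cut it in half again" -> ~5,500

reduction = 1 - after_bush_43 / cold_war_end
print(f"cumulative reduction: {reduction:.0%}")   # 75%, i.e. three-quarters

The resulting ~5,500 warheads is also consistent with the article's statement that the current stockpile "remains above 5,000."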
As the Bush administration was reducing deployed warheads, it was pressing Congress to approve funding for development of a new warhead under the Reliable Replacement Warhead program. The RRW was to be based on an old, tested design with no new testing needed before being deployed. It was to be more secure and reliable over the next decade than today's aging Cold War nuclear warheads, even those that had been refurbished. Congress, however, eliminated funding for the RRW in fiscal 2009, with members saying they would await results of the Obama administration's nuclear posture review.
Meanwhile, the Obama administration has proposed an aggressive arms-control agenda that includes reductions in nuclear weapons, Senate ratification of the Comprehensive Test Ban Treaty and renewal of the verification procedures of START, which run out at the end of the year.
Most US late-stage human clinical trials done abroad
Most Clinical Trials Done Abroad, by Shirley S Wang
WSJ, Feb 19, 2009
Most testing for the U.S. drug industry's late-stage human trials is now done at sites outside the country, where results often can be obtained cheaper and faster, according to a study.
The study found that 13,521 of 24,206 sites being used in November 2007 for studies sponsored by the 20 largest U.S. drug makers were international, and that the number of countries conducting testing has doubled over the past 10 years. The study was published in Wednesday's New England Journal of Medicine.
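For scale, the study's own counts put the international share just over half, which is what the headline "most" rests on. A one-line Python check (illustrative only):

# Site counts from the NEJM study cited above
international_sites = 13_521
total_sites = 24_206
print(f"share of trial sites outside the U.S.: {international_sites / total_sites:.1%}")
# prints: share of trial sites outside the U.S.: 55.9%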
The findings add to concerns about the ethical treatment of participants and the integrity of the research data produced in developing countries. Experts also say patients in developing countries may be taken advantage of because they are poorer and less familiar with the research process.
In November, Indian drug regulators halted a trial of a Wyeth vaccine after an infant died, in order to investigate whether babies were properly screened before being enrolled in the study. In a Polish study in 2007 of a bird-flu vaccine being developed by Novartis AG, two elderly patients died who should have been excluded based on age. Other companies faced criticism earlier in the decade about testing drugs in populations that couldn't afford the medicines.
"Clearly there are major challenges both in terms of ethical oversight of the research and the scientific rigor," said Seth Glickman, an assistant professor of emergency medicine at the University of North Carolina-Chapel Hill who was first author of the study.
Helping to make overseas trials cheaper and faster, patients in developing countries are often more willing to enroll in studies because of lack of alternative treatment options, and often they aren't taking other medicines. Such "drug-naive" patients can be sought after because it is easier to show that experimental treatments are better than placebos, rather than trying to show an improvement over currently available drugs.
Reviewing published studies, the authors of the journal article found that proper research oversight and adequate informed consent for participants were inconsistent in developing countries. In one study reviewed, only 56% of 670 researchers who conducted trials in those countries said their studies had been approved by ethics boards or health officials. In another study, only 18% of researchers appropriately informed participants about the study before enrolling them.
The article comes at a time when some say the U.S. is moving in the wrong direction with regard to ethical treatment of study participants. Last year, the Food and Drug Administration updated its guidelines for conducting international clinical trials, adopting a standard used by many countries and organizations known as "good clinical practices."
The shift has been controversial. Critics believe the updated guidelines are less ethically rigorous and more industry friendly compared to the former guidelines, known as the Declaration of Helsinki. One version of that declaration forbade placebo-controlled trials and had provisions in it about companies' obligations to provide access to medicines to those populations in which the treatments had been tested.
It "set a higher standard," said Sonal Singh, an assistant professor at Wake Forest who studies international clinical trials and wasn't involved in Wednesday's report. "You're kind of dispensing with an ethical document," he said of the updated guidelines.
The FDA says that placebo-controlled trials are necessary under certain circumstances and that it also encourages post-market medication access to be discussed during protocol design. The new standards ensure protection for participants by mandating that studies be reviewed by international ethics committees and that informed consent be obtained from all participants, the agency says.
"Good clinical practice is meant to assure quality clinical trials, as well as the implementation of high-level clinical trial ethics," said David Lepay, senior adviser for clinical science at the FDA. "We do not see that there are fundamental differences between [good clinical practice] and other ethical standards in assuring the fundamental rights of subjects."
The authors of the new report also suggest that bureaucracy and regulatory hurdles in the U.S. are partly responsible for making going abroad so enticing. The requirements stretch out the amount of time it takes to complete a study and can add to costs as well. "Many of the policies in regards to the regulatory framework are well intentioned," said Dr. Glickman. "They have the unintended effect of being very onerous from the administrative standpoint."
In the U.S., each site seeking to conduct a study must have its ethics board approve it. But many studies these days are considered "multisite," where one company or sponsor runs the same trial at different centers and pools the data. The U.S. review process means redundant effort for such studies, according to Kevin Schulman, a professor of medicine and business administration at Duke University and another study author.
A Prenatal Link to Alzheimer's?
A Prenatal Link to Alzheimer's? By Ron Winslow
Researchers Propose Process in Fetal Development Is Reactivated Later in Life
WSJ, Feb 19, 2009
New research at Genentech Inc. is challenging conventional thinking about Alzheimer's disease, providing a provocative theory about its cause and suggesting potential new targets for therapies to treat it.
The researchers propose that a normal process in which excess nerve cells and nerve fibers are pruned from the brain during prenatal development is somehow reactivated in the adult brain and "hijacked" to cause the death of such cells in Alzheimer's patients.
The dominant view of Alzheimer's disease today is that it is caused by deposits called beta amyloid that accumulate in the brain because of bad luck or other unknown reasons, degrading and destroying nerve cells and robbing victims of their memory.
The new findings offer evidence that "Alzheimer's is not just bad luck, but rather it is the activation of a pathway that is there for development purposes," says Marc Tessier-Lavigne, executive vice president, research drug discovery, at Genentech. "It suggests a different way of looking at Alzheimer's disease."
[photos in original article]
The report, being published Thursday in the journal Nature, is based on laboratory and mouse experiments, and further study is needed to validate the hypothesis.
Genentech, a South San Francisco, Calif., biotech company, says it has identified potential drug candidates based on the findings, but even if they prove promising, it would take several years for any potential treatment to be developed.
Beta amyloid, a fragment of a larger molecule called amyloid precursor protein, or APP, has long been the dominant focus of Alzheimer's research. Many drug companies have compounds in development that are intended to block or clear the buildup of beta amyloid plaques in the brain. But the track record for developing effective drugs has been unimpressive so far. Moreover, some people accumulate beta amyloid in the brain without any apparent effect on memory, adding to confusion about its role in Alzheimer's.
During human development, the prenatal brain makes about twice the number of nerve cells it needs, Dr. Tessier-Lavigne explained. Those neurons, in turn, each make hundreds of nerve fibers that seek to make connections with other cells. The cells and nerve fibers that succeed in making connections survive -- while those that fail to connect trigger a natural self-destruction mechanism called apoptosis that clears out the unneeded cells.
"We make too many, and then we prune back," Dr. Tessier-Lavigne said. "The system gets sculpted so you have the right set of connections."
What he and his colleagues, including scientists from the Salk Institute, La Jolla, Calif., found is that the amyloid precursor protein linked to Alzheimer's also plays a critical role in triggering the prenatal pruning process. But the beta amyloid that appears to kill nerve cells in Alzheimer's patients isn't involved in the developing embryo. Instead, the pruning is sparked by another fragment of APP called N-APP, causing a cascade of events that results in the death of excess nerve cells and nerve fibers.
"This suggests that APP may go through a novel pathway rather than beta amyloid to cause Alzheimer's disease," says Paul Greengard, a scientist at Rockefeller University, New York, and a Nobel laureate who wasn't involved in the research. He called the paper "an important step" in understanding the pathology of Alzheimer's -- something that is necessary to develop better drugs.
Don Nicholson, a Merck & Co. vice president and author of a commentary that accompanies the Tessier-Lavigne study in Nature, said the paper doesn't rule out a role for beta amyloid. He added that given the intense focus on the role of beta amyloid in the disease, the finding that another part of the precursor protein may be important in Alzheimer's is "unexpected biology."
Exactly what triggers the reappearance in the adult brain of a process fundamental to its early prenatal development isn't clear and is the subject of continuing research, Dr. Tessier-Lavigne said. Meantime, there are several steps in the cascade of events that lead to the death of the developing neurons and nerve fibers. If the process reflects the unwanted death of such cells in Alzheimer's, it presents several places where a drug could block or affect the process to possibly prevent the damage.
"We've identified a mechanism of nerve-cell death and degeneration involving amyloid precursor protein in the embryo," he said. "What Alzheimer's is doing is hijacking not only the molecule but the whole mechanism of degeneration."
Meantime, a second paper published last month by a team including researchers at Buck Institute for Age Research, Novato, Calif., reported that a protein called netrin-1 appears to regulate production of beta amyloid. The finding, which appeared in the journal Cell Death and Differentiation, is behind the authors' belief that Alzheimer's is the result of normal processes going awry.
Together the papers add to intriguing evidence that beta amyloid is perhaps only part of the Alzheimer's story. "What we're seeing is a completely different view of the disease," said Dale Bredesen, a Buck Institute researcher and co-author of the paper. The brain has to make connections and break connections all the time. Alzheimer's, he suggests, is the result when that process is out of balance.
Wednesday, February 18, 2009
Embrace Canadian Energy
IER: Embrace Canadian Energy
The Institute for Energy Research, Feb 18, 2009
Washington, D.C. – The Institute for Energy Research (IER) today released the following fact sheet on the important energy trade relationship between the United States and Canada in advance of the President’s trip there tomorrow. President Obama should resist the call from some organizations to antagonize Canada, our largest and most stable trading partner. Today’s economic climate should instead reinforce the need to strengthen our important economic ties with our northern neighbors.
We import more energy from Canada than any other country:
The United States imports more natural gas, refined gasoline, and oil from Canada than any other nation in the world—17 percent of our oil and 18 percent of our natural gas.
Nearly 100 percent of Canada’s energy exports go to the United States.
Canada supplies 2.5 million barrels of oil for the U.S. each day, which is roughly the equivalent of what we import from Saudi Arabia and Nigeria combined.
Oil sands make up 97 percent of Canada’s total proven reserves and 13 percent of total U.S. imports.
The United States imports 36 percent more oil from Canada than Saudi Arabia and 320 percent more than Iraq.
We export more goods to Canada than any other country:
According to the U.S. State Department, the United States and Canada share the largest energy trading relationship in the world.
Among other products, the United States exports 18.4 million short tons of coal to Canada.
In 2007, 65 percent of Canada’s imports came from the United States.
Our shared resources could provide both nations with an energy and economic boon:
Canada’s total oil sands resources could be as large as 2.6 trillion barrels.
According to the Department of Energy, if developed, U.S. oil shale resources—which could total 2.1 trillion barrels—combined with Canada’s tar sands, could allow the U.S. and Canada to claim the largest oil reserves in the world.
NOTE: The important energy trade relationship between the United States and Canada is indisputable. Any attempt to impede that relationship would add to the economic uncertainty in the United States.
Industry Views: Low Carbon Fuel Standards: Recipes for Higher Gasoline Prices and Greater Reliance on Middle Eastern Oil
Low Carbon Fuel Standards: Recipes for Higher Gasoline Prices and Greater Reliance on Middle Eastern Oil
IER, Feb 18, 2009
Last December, California released a draft low carbon fuel standard (LCFS) which calls for a 10.5 percent reduction in the carbon intensity of gasoline and a 10 percent reduction for diesel. Following California’s lead, representatives of 11 Northeastern states recently signed an agreement to pursue a region-wide low-carbon fuel standard.
The proponents of LCFS claim the plan’s goal is to reduce emissions from motor vehicles and home-heating fuels. But as this analysis shows, an LCFS is another tax on transportation. An LCFS increases the price of gasoline and home heating oil, leads to more oil imports from the Middle East, and penalizes oil imports from our largest trading partner and biggest oil supplier—Canada.
What is a Low Carbon Fuel Standard?
For all practical purposes, an LCFS is a new tax on gasoline and heating oil. It is a new regulation designed to reduce the greenhouse gas emissions from fuels. The goal of these regulations is to take into account all of the greenhouse gas emissions from the production (including land-use changes), manufacture, transportation and combustion of these fuels, and then to reduce those emissions.
According to the letter of intent signed by 11 states (Connecticut, Delaware, Maine, Maryland, Massachusetts, New York, New Hampshire, New Jersey, Pennsylvania, Rhode Island and Vermont) participating in the Northeastern LCFS scheme, an LCFS is a “market-based, technologically neutral policy to address the carbon content of fuels by requiring reductions in the average lifecycle GHG [greenhouse gas] emissions per unit of useful energy.”
Despite the assertions of LCFS proponents, an LCFS is not market-based—it’s a classic top-down regulation. Nor is it entirely technology neutral—in practice it obviously penalizes certain fuel-producing technologies. More importantly, it does not address the difficulty, and possible impracticality, of accurately calculating “lifecycle GHG emissions.”
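To make the accounting concrete, here is a minimal sketch of the compliance arithmetic an LCFS implies. Only the 10.5 percent reduction target comes from the California draft discussed above; every carbon-intensity figure below is a hypothetical placeholder, not data from any regulatory filing.

```python
# Toy LCFS compliance check. All carbon-intensity (CI) numbers are
# hypothetical placeholders, not regulatory data.

def carbon_intensity(stage_emissions_g, useful_energy_mj):
    """Lifecycle CI: total gCO2e across production, land-use change,
    refining, transport, and combustion per MJ of useful energy."""
    return sum(stage_emissions_g) / useful_energy_mj

# Hypothetical gasoline baseline: per-MJ emissions by lifecycle stage.
baseline_ci = carbon_intensity([19.0, 0.0, 13.0, 4.0, 60.0], 1.0)

# California's draft calls for a 10.5% cut in gasoline carbon intensity.
target_ci = baseline_ci * (1 - 0.105)

# A fuel provider complies if the volume-weighted average CI of what it
# sells falls at or below the target. Shares and CIs here are invented.
blend = [(0.90, baseline_ci), (0.10, 55.0)]  # (market share, CI)
average_ci = sum(share * ci for share, ci in blend)

print(f"baseline {baseline_ci:.1f}, target {target_ci:.1f}, "
      f"blend {average_ci:.1f} -> compliant: {average_ci <= target_ci}")
```

The point made in the text is that every term in the numerator, especially the land-use-change term, must be estimated rather than measured, which is where the claimed precision of “lifecycle GHG emissions” breaks down.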
Seven Reasons Why LCFS Schemes are Flawed:
LCFS are based on the Field of Dreams principle—if you mandate it, it will come. LCFS schemes are expensive, harmful to consumers, and divert resources away from more productive investments. Breakthroughs in technology occur in the marketplace, not in government committee rooms. Policymakers are free to set standards and goals—such as 10 percent less carbon intensity or a manned mission to Mars—but that does not mean the technology to economically achieve those goals will immediately follow. For example, a couple of years ago, many people thought we could economically have low-carbon fuels by merely increasing the biofuel content of gasoline. The majority of the science, however, does not support this belief (see bullet point 4 below).
Biofuel production increases the price of food and makes life more difficult for the world’s poor. Biofuels are “a crime against humanity” in the words of Jean Ziegler, the UN special rapporteur on the right to food. Biofuel production takes land that has been used for food crops and replaces the food crops with fuel crops, unnecessarily taking food out of the mouths of the world’s poor. Increased ethanol production has helped increase food prices and has led to great hardships around the world, including food riots. Next-generation biofuels are supposed to somewhat relieve this problem by using non-food crops, such as switchgrass or miscanthus, to produce biofuel, but these crops will still compete for arable land and agricultural resources.
A nationwide LCFS would dramatically increase the price of gasoline. CRA International found that an LCFS of 8 percent by 2015 would cause motor fuel prices to increase by 140 percent in 2015.[1] An LCFS would reduce motor fuel supplies or cause fuel producers to purchase carbon dioxide offsets.
Many biofuels emit more greenhouse gases than gasoline. According to a recent study published in Science by researchers from the Nature Conservancy and the University of Minnesota, many biofuels emit more greenhouse gases than gasoline. The study’s authors stated that many biofuels produce “17 to 420 times more carbon dioxide than the fossil fuels they replace.” Other research has come to similar conclusions. The Energy and Resources Group at the University of California, Berkeley found that “if indirect emissions [resulting from the production of ethanol] are applied to the ethanol that is already in California’s gasoline, the carbon intensity of California’s gasoline increases by 3% to 33%.” Corn-based ethanol production not only emits more greenhouse gases than gasoline, but it may also be worse for air quality.[2]
An LCFS discriminates against oil production from oil sands in Canada and favors oil from the Middle East. The U.S. gets more oil from Canada than from any other foreign country, and much of Canada’s production comes from oil sands. Producing oil from oil sands requires more energy (and emits more carbon dioxide) than producing crude in the Middle East. As a result, an LCFS favors oil from the Middle East and penalizes our friends to the North.
An LCFS discriminates against coal-to-liquids technology and oil shale technologies. The United States has vast reserves of coal and oil shale. These sources are not yet economically competitive with other sources of oil, but if prices were to return to last summer’s highs, these technologies would be cost-competitive. One possible source of fuel is coal-to-liquids technology. The U.S. has the world’s largest reserves of coal; at current usage rates, we have 200-250 years of demonstrated coal reserves. Coal-to-liquids could give the U.S. much larger reserves of petroleum fuels. The U.S. also has massive reserves of oil locked in oil shale—at least 800 billion recoverable barrels, nearly three times as much oil as Saudi Arabia has in reserves. Because recovering these sources takes more energy than producing light crude, an LCFS discriminates against these domestic resources.
If the United States implemented and somehow complied with a nationwide LCFS of 10.5 percent today, the American reduction in emissions would be offset by emissions increases from the rest of the world in less than 80 days.[3] Global warming is a global issue. What matters are not just emissions from the United States, but emissions worldwide. Unilateral changes by the United States alone will not have much of an impact, especially when we are talking about very small reductions in one sector. Because developing countries are dramatically increasing their carbon dioxide emissions, the U.S. will emit a smaller and smaller share of the world’s total greenhouse gas emissions.[4] According to data from the Global Carbon Project, from 2000 through 2007, global total greenhouse gas emissions increased 26 percent. During that same period, China’s carbon dioxide emissions increased 98 percent, India’s increased 36 percent and Russia’s increased 10 percent, while the U.S. increase was a mere 3 percent.[5] Because of these increases from developing countries, unilateral actions by the U.S., such as implementation of a nationwide LCFS, will have little to no effect on the global climate. Actions taken by California or 11 Northeastern states will have even less impact.
Conclusion: An LCFS is Another Tax on Transportation
An LCFS, either nationwide or at the state level, would damage the economy without having an impact on global temperatures. The technology to implement an LCFS does not currently exist. If an LCFS resulted in increased biofuel use, it would be very harmful to the world’s poor. Finally, for those worried about energy security, an LCFS would favor Middle Eastern oil over Canadian and domestic fuels.
References
[1] CRA International, Economic Analysis of the Lieberman-Warner Climate Security Act of 2007 Using CRA’s MRN-NEEM Model (Apr. 8, 2008) p. 29, cited in Larry Parker & Brent Yacobucci, CRS Report for Congress: Climate Change: Costs and Benefits of S. 2191, (Mar. 15, 2008) p. CRS-56.
[2] The study will soon be published in the Proceedings of the National Academy of Sciences.
[3] Calculated using the emissions data from the Global Carbon Project. According to EPA, the GHG emissions from the transportation sector totaled 28 percent of total U.S. emissions in 2006. Environmental Protection Agency, Regulating Greenhouse Gas Emissions Under the Clean Air Act; Proposed Rule, 73 Fed. Reg. 44354, 44403 (July 30, 2008). Twenty-eight percent of the U.S.’s 2006 carbon dioxide emissions is 436,141 GgC. A nationwide LCFS for the entire transportation sector, if it followed California’s example, would reduce transportation emissions by 10.5 percent, or 45,795 GgC per year. From 2006 to 2007, the world’s carbon dioxide emissions (excluding the United States) increased by 213,436 GgC. At this rate of change, the 10.5 percent LCFS-forced reduction in U.S. transportation emissions would be replaced in 78.3 days; this arithmetic is reproduced in the sketch after note [5].
[4] According to the Global Carbon Project, in 2007 China emitted 21 percent of the world’s carbon equivalent and the U.S. emitted 19 percent.
[5] Calculated using the emission data from the Global Carbon Project. In 2000, China emitted 910,950 GgC, India 316,804 GgC, Russia 391,652 GgC, and the U.S. 1,541,013 GgC. By 2007, China emitted 1,801,932 GgC, India 429,601 GgC, Russia 432,486 GgC, and the U.S. 1,586,213 GgC.
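The figures quoted in notes [3] and [5] are enough to reproduce the arithmetic directly; a minimal check using only those numbers:

```python
# Reproduce note [3]: days for rest-of-world emissions growth to offset
# a 10.5% cut in U.S. transportation emissions. Figures are from the notes.
transport_ggc = 436_141                 # 28% of U.S. 2006 emissions, in GgC
lcfs_cut_ggc = transport_ggc * 0.105    # ~45,795 GgC per year
row_growth_ggc = 213_436                # 2006-2007 increase, world ex-U.S.
print(f"offset in {lcfs_cut_ggc / row_growth_ggc * 365:.1f} days")  # 78.3

# Reproduce note [5]: 2000-2007 growth rates from the GgC figures.
emissions = {"China": (910_950, 1_801_932), "India": (316_804, 429_601),
             "Russia": (391_652, 432_486), "U.S.": (1_541_013, 1_586_213)}
for country, (y2000, y2007) in emissions.items():
    print(f"{country}: +{(y2007 / y2000 - 1) * 100:.0f}%")  # 98, 36, 10, 3
```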
Slaying of two dissidents, Stanislav Markelov and Anastasia Baburova
Murder in Moscow, by Stephen Schwartz
Press criticism, KGB-style.
The Weekly Standard, Feb 23, 2009, Volume 014, Issue 22
Vice President Joseph Biden has told the Europeans that the new administration wishes to "reset" relations with Vladimir Putin's Russia. But the January 19 slaying of two dissidents, 34-year-old human rights lawyer Stanislav Markelov and journalism student Anastasia Baburova, 25, on a Moscow street is one of several recent reminders that Americans cannot be comfortable in Putin's embrace.
Markelov, head of the Institute for the Supremacy of Law, may well have been murdered as a result of the release from custody, one week before, of Russian army colonel Yuri Budanov, who had been sent to prison for crimes he committed while serving in Chechnya. Markelov had been crucial to Budanov's 2003 conviction in the kidnapping, torture, multiple sexual assault, and murder of an 18-year-old Chechen girl, Elza Kheda Kungaeva. Budanov, although he admitted his guilt and was sentenced to 10 years' imprisonment, had benefited from an early release.
On the day he perished, Markelov delivered a statement to the press. Representing the family of the Chechen female victim, he accused the Russian authorities of improperly arranging for Budanov to be let go. He then walked to a metro station near the Kremlin with Baburova. The killer, wearing a ski mask, approached from behind and shot Markelov in the back of the head. Baburova pursued the shooter, who turned and fired into her forehead. She died several hours later.
Anticipating her graduation from journalism school, Baburova was working for the daily Novaya Gazeta, which has employed a distinguished roster of liquidated investigative journalists. Novaya Gazeta is co-owned by Alexander Lebedev, an ex-KGB official and billionaire turned political reformer, who purchased the ailing London Evening Standard on January 21, only two days after Baburova's death.
As the largest individual shareholder in Novaya Gazeta--he owns 39 percent--Lebedev is responsible for a publication that has experienced the high-profile killing of several of the country's leading reporters. Anna Politkovskaya, murdered in the elevator of her apartment building in 2006, was his top staffer; she too had exposed atrocities in Chechnya, and Markelov was her lawyer. Igor Domnikov was killed in a brutal beating in 2000. His colleague Yury Shchekochikhin was poisoned in 2003.
Indeed, the poison cabinet seems to have become a favored anti-dissident weapon of the Russian state, as it was under Stalin. Politkovskaya herself was poisoned (though not fatally) in 2004 when she tried to travel to Beslan during the hostage crisis there. And less than two months after her eventual murder, Alexander Litvinenko, another former KGB agent critical of the Putin regime, was killed in a highly unusual poisoning in London.
In the aftermath of the Markelov-Baburova assassinations, the U.S.-based Committee to Protect Journalists reported that Lebedev, perhaps spurred by his KGB experiences, had announced the intention of Novaya Gazeta journalists to petition to arm themselves if necessary. Novaya Gazeta editor Dmitry Muratov denounced the Russian government for its inability to protect the press and asserted, "We have three options. The first one--to leave and turn off the lights. . . . The second--to stop writing about the special services, corruption, drugs, fascists; to stop investigating the crimes of the powerful. . . . The third option is to somehow defend ourselves."
Russian political life has increasingly assumed a pogrom atmosphere. Markelov had extended his investigation of human rights violations from Chechnya to the central Russian republic of Bashkortostan, which has a Turkic Muslim majority, but has not been the scene of Chechen-style rebellion against Russian rule. At the end of 2004, local police beat up to 1,000 people in Bashkortostan over a period of four days. Markelov had warned against "the spread of the Chechnya syndrome throughout other regions of Russia" and exposed the existence of a secret "order number 870" issued by the Ministry of Internal Affairs in 2003, which authorized the police to declare states of emergency without informing the public and to follow them up with repressive actions.
One of his closest friends, an academic named Vladislav Bugera, described Markelov as a perhaps naïve product of the old Soviet way of life. Writing in the online periodical Johnson's Russia List, Bugera called the dead lawyer a "socialist and an internationalist" whose many causes included an independent labor union, but whose socialism was "moderate . . . and reformist. . . . He was a reliable person. You could always be sure of him. . . . He is my hero."
Needless to say, a return to socialist ideals would stand no chance of protecting human rights from state abuse. Russia has been through its dark eras of internal strife and compulsory social experiment; Putinism, now aggravated by the global economic crisis, represents an attempt to revive aspects of both. The staggering challenge before Russian supporters of democracy is to find a way to construct a new and unburdened system of individual rights, secured by due process. Russian democrats and those abroad who would help them can ill afford to look away from the blood of Russian lawyers and journalists shed in the street.
Stephen Schwartz is a frequent contributor to The Weekly Standard.
Ethanol & Greenhouse Gas Emissions - Reconsidering the University of Nebraska Study
Ethanol & Greenhouse Gas Emissions - Reconsidering the University of Nebraska Study. By Jerry Taylor
Master Resource, February 18, 2009
The debate about the environmental impact of ethanol rages on. Last month, the most recent study on the greenhouse gas (GHG) emissions associated with ethanol use was published by researchers from the University of Nebraska (Liska et al.). That analysis used the most recent data available on individual facility operations and emissions, observed corn yields, nitrogen fertilizer emissions profiles, and co-product use, all of which prove important because of the improved energy efficiencies associated with ethanol production over the past several years. The authors found that the total life-cycle GHG emissions from the most common type of ethanol processing facility in operation today are 48-59 percent lower than gasoline’s, one of the highest savings reported in the literature. Even without subtracting out the GHG emissions associated with ethanol co-products (which accounted for 19-38 percent of total system emissions), ethanol would still present GHG advantages relative to gasoline. The ethanol lobby went wild.
This may be the best study on the subject, but it is not the final word. There are three fundamental problems with the analysis.
First, the study examines only a subset of corn production operations and ethanol processing facilities: dry-mill ethanol processors fired by natural gas in six corn-belt states. Together, those facilities accounted for 23 percent of US ethanol production in 2006. While this approach makes the study stronger because the authors are not forced to rely as heavily on estimates and aggregated analysis, the downside is that the study ignores a large number of older, less efficient ethanol processing facilities and thus cannot be used to assess the GHG balance of the ethanol industry as a whole. While the findings may well point to where the industry will be in the future as older, less efficient facilities lose market share and are upgraded or retired, the bankruptcies that are shutting down many newer facilities at present caution against certainty on this point.
Second, estimates regarding emissions are still relied on to some degree, and one of those estimates in particular – the estimate pertaining to the release of nitrous oxide (N2O) from fertilizer use in corn production – is problematic. While the study comports with convention in that it relies on emission estimates offered by the Intergovernmental Panel on Climate Change, a recent study finds that the IPCC estimates pertaining to N2O release from fertilizer do not comport with the observed data (Crutzen et al., 2007). That study finds that N2O emissions from fertilizers used in biofuels production are 3-5 times greater than assumed by the IPCC and that, if we plug those higher emissions into the ethanol life-cycle models, “the outcome is that the production of commonly used biofuels, such as biodiesel from rapeseed and bioethanol from corn (maize), can contribute as much or more to global warming by N2O emissions than cooling by fossil fuel savings.” Given that the lead author of the study – Paul Crutzen – is a Nobel laureate chemist who has specialized in fields related to atmospheric science, his findings cannot be lightly dismissed.
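To see why the Crutzen finding matters quantitatively, consider a simple sensitivity check: scale the fertilizer N2O term of an ethanol life-cycle model by the 3-5x range he reports and watch the headline savings shrink. Every gCO2e/MJ figure below is a hypothetical placeholder chosen only so the baseline matches a roughly 50 percent savings over gasoline; none of the numbers come from Liska et al. or Crutzen et al.

```python
# Sensitivity of ethanol's life-cycle GHG savings to the fertilizer N2O
# term. All gCO2e/MJ values are hypothetical placeholders, not data from
# either study; only the 3-5x multiplier range comes from Crutzen et al.
GASOLINE_CI = 96.0  # hypothetical gasoline baseline, gCO2e/MJ

ethanol_stages = {
    "farming_non_n2o": 15.0,
    "fertilizer_n2o": 14.0,   # the contested IPCC-based estimate
    "processing": 16.0,
    "transport": 3.0,
}

def savings_vs_gasoline(stages, n2o_multiplier=1.0):
    # Scale only the N2O term; leave the other stages unchanged.
    ci = sum(stages.values()) + stages["fertilizer_n2o"] * (n2o_multiplier - 1)
    return (GASOLINE_CI - ci) / GASOLINE_CI

for mult in (1.0, 3.0, 5.0):  # IPCC baseline vs. Crutzen's 3x and 5x
    print(f"N2O x{mult:.0f}: savings {savings_vs_gasoline(ethanol_stages, mult):+.0%}")
```

With the multiplier at the top of Crutzen’s range, the hypothetical advantage flips sign, which is exactly the “as much or more to global warming” possibility quoted above.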
Third, the study acknowledges the importance of the impact that ethanol production has on crop prices and, thus, on global land-use patterns, but it does not account for the GHG emissions associated with those changes. Those emissions are substantial and no life-cycle analysis of ethanol can credibly ignore them.
A worldwide agricultural model constructed by Searchinger et al. (2008) finds that the increases in crop prices that follow from the increased demand for ethanol will induce a global change in the pattern of land use. Those land use changes produce a surge in GHG emissions that is only dissipated by conventional life-cycle emissions savings many decades hence. Although Searchinger et al. modeled ethanol production increases that were beyond those mandated in existing law, “the emissions from land-use change per unit of ethanol would be similar regardless of the ethanol increase analyzed.”
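The “many decades hence” claim is amortization arithmetic: an upfront land-use-change emissions pulse repaid by the annual per-unit savings. A minimal sketch with hypothetical numbers (Searchinger et al. report their own, much-debated figures, which are not reproduced here):

```python
# Carbon-debt payback time. Both numbers are hypothetical placeholders.
land_use_debt = 3_000.0   # one-time gCO2e pulse per MJ of annual output
annual_saving = 48.0      # gCO2e/MJ saved vs. gasoline each year
print(f"payback: {land_use_debt / annual_saving:.0f} years")  # ~62 years
```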
Critics of Searchinger et al. are right to point out that the agricultural model employed in the study was crude, that much is unknown about the factors that influence global land-use decisions, that improved yields are reducing the amount of land necessary to meet global crop demands, and that additions to crop production need not come from forests or other robust carbon-sequestration sinks. None of those observations, however, is sufficient to reject the basic insight forwarded in that study. If ethanol demand increases corn and other crop prices beyond where they otherwise would have been, profit incentives will induce investors to increase crop production beyond where it would otherwise have been. If that increased production comes in part from land-use changes relative to the baseline, then significant volumes of GHG will likely be released, and those emissions threaten to swamp the GHG savings found elsewhere in the life-cycle analysis. Even if the upward pressure on crop prices that is a consequence of ethanol consumption is more than offset by downward price pressures from other factors, crop acreage retirement will not be as large as it otherwise would have been, and terrestrial sequestration will be lower as a consequence. Every link in that chain of logic is unassailable.
This is but one of the many impacts that ethanol might have on hundreds of industrial sectors worldwide. Searchinger et al. is ultimately unsatisfying because it is only a crude and partial consideration of those impacts, many of which might indirectly affect global land-use patterns. For instance, if ethanol consumption reduces the demand for – and thus the price of – crude oil in global markets, how much of those “booked” reductions in oil consumption will be offset by increased demand induced elsewhere by the lower global crude oil prices that follow (known as a “rebound effect” in economics)? How might that increase in global demand for crude oil in response to lower prices affect all sorts of GHG emission vectors? None of these questions is asked in ethanol GHG life-cycle analyses, but they are clearly crucial to the analysis.
To summarize, the narrow, conventional consideration of the GHG emissions associated with ethanol from Liska et al. suggests that ethanol reduces climate-change harms relative to gasoline. If the IPCC has underestimated N2O emissions from fertilizer – as appears to be the case – then ethanol is probably at best a wash with regard to GHG emissions. Even if that’s not the case, consideration of secondary and tertiary emissions impacts strongly suggests that most if not all of the advertised GHG gains are lost in the changes in land-use patterns that follow from increases in ethanol production relative to the baseline. Other changes in anthropogenic emissions – positive and negative – would almost certainly follow as well, but existing models do not bother to search for them, and thus we do not know enough to say much beyond this with confidence.
Based on what we know now, it would be hard to make a compelling case that ethanol is preferable to gasoline with regard to total greenhouse gas emissions - and last month’s study out of the University of Nebraska does not change that.