Unconventional monetary policies: an appraisal. By Claudio Borio and Piti Disyatat
BIS Working Papers No 292
November 2009
Abstract:
The recent global financial crisis has led central banks to rely heavily on "unconventional" monetary policies. This alternative approach to policy has generated much discussion and a heated and at times confusing debate. The debate has been complicated by the use of different definitions and conflicting views of the mechanisms at work. This paper sets out a framework for classifying and thinking about such policies, highlighting how they can be viewed within the overall context of monetary policy implementation. The framework clarifies the differences among the various forms of unconventional monetary policy, provides a systematic characterisation of the wide range of central bank responses to the crisis, helps to underscore the channels of transmission, and identifies some of the main policy challenges. In the process, the paper also addresses a number of contentious analytical issues, notably the role of bank reserves and their inflationary consequences.
JEL Classification Numbers: E40, E50, E52, E58, E60
Keywords: unconventional monetary policy, balance sheet policy, credit policy, quantitative easing, credit easing, monetary policy implementation, transmission mechanism, interest rates
Monday, November 30, 2009
Ten propositions about liquidity crises
Ten propositions about liquidity crises. By Claudio Borio
BIS Working Papers No 293
November 2009
Abstract:
What are liquidity crises? And what can be done to address them? This short paper brings together some personal reflections on this issue, largely based on previous work. In the process, it questions a number of commonly held beliefs that have become part of the conventional wisdom. The paper is organised around ten propositions that cover the following issues: the distinction between idiosyncratic and systematic elements of liquidity crises; the growing reliance on funding liquidity in a market-based financial system; the role of payment and settlement systems; the need to improve liquidity buffers; the desirability of putting in place (variable) speed limits in the financial system; the proper role of (retail) deposit insurance schemes; the double-edged sword nature of liquidity provision by central banks; the often misunderstood role of "monetary base" injections in addressing liquidity disruptions; the need to develop principles for the provision of central bank liquidity; and the need to reconsider the preventive role of monetary (interest rate) policy.
JEL Classification Numbers: E50, E51, E58, G10, G14, G18, G28
Keywords: market and funding liquidity, liquidity crises, deposit insurance, central bank operations, monetary base
How to Break Up the Banks - Solving the "too big to fail" problem in the future structure of the global financial system
How to Break Up the Banks. By Adrian Blundell-Wignall
Solving the "too big to fail" problem in the future structure of the global financial system.
The Wall Street Journal, page A13
The financial crisis that sparked the worst recession in decades is in abeyance, but not yet over. Nonperforming loans and other assets of doubtful quality still weigh on many banks. Financial reform to date has focused on improving capital rules and processes. What has not yet been addressed is the future structure of the global financial system.
Contagion risk and counterparty failure have been the main hallmarks of the crisis. While some large diversified banks that focused mainly on commercial banking survived very well, other smaller and less diversified banks, particularly those focused on mortgages, and financial conglomerates that built on investment banking, the structuring of complex derivatives and proprietary trading as the main drivers of growth, suffered crippling losses. In principle, sound corporate governance and a strong risk-management culture should enable banks to avoid excess leverage and risk taking. But human nature being what it is, there are likely always to be some players eager to push complex products and trading beyond the sensible needs of industry and long-term investors in order to drive profits. Indeed, right now such activity is driving the rapid profit growth of some banks, with little having been learned from the past.
As the system will always be hostage to the "gung-ho" few, the question is whether there is a better way to structure large conglomerates in order to isolate commercial banking functions from such high-risk activities. In discussions at the OECD, we have been reviewing possible options. One proposal, which we now submit for consideration, is that banking and financial service groups could be structured under a variant of non-operating holding companies (NOHCs), in all countries.
Under such a structure, the parent would be non-operating, raising capital on the stock exchange and investing it transparently and without any double-gearing in its operating subsidiaries—say a bank and a securities firm that would be separate legal entities with their own governance. The subsidiaries would pay dividends through the parent to shareholders out of profits. The non-operating parent would have no legal basis to shift capital between affiliates in a crisis, and it would not be able to request "special dividends" in order to do so.
These structures allow separation insofar as prudential risk and the use of capital are concerned, without the full divestment required under Glass-Steagall or in response to the recently expressed concerns of Paul Volcker and Mervyn King—such extreme solutions should remain the proper focus of competition authorities. With an NOHC structure, technology platforms and back office functions would still be shared, permitting synergies and economies of scale and scope. Such a transparent structure would make it easier for regulators and market players to see potential weaknesses. Mark-to-market and fair value accounting would affect those affiliates most associated with securities businesses, while longer-term cost amortization would dominate for commercial banking. It would create a tougher, non-subsidized environment for securities firms, but a safer one for investors.
If a securities firm under this structure had access to limited "siloed" capital and could not share with other subsidiaries, and this were clear to the market, this would be priced into the cost of capital and reflected in margins for derivative transactions. The result would likely be smaller securities firms that are more careful in risk-taking than has been the case under the "double gearing" scenarios seen in mixed or universal bank groups.
Finally, if a securities affiliate were to fail under such a structure, the regulator could shut it down without affecting its commercial banking sister firm in a critical way—obviating the need for "living wills." Resolution mechanisms for smaller, legally separate entities would be more credible than those needed in the recent past for large mixed conglomerates—helping to deal with the "too big to fail" issue. To protect consumers, deposit insurance and other guarantees could apply to the bank without being extended to the legally separate securities firm.
The world is still waiting for a full reassessment of what banks do and how they compete. Until now, the implementation of regulatory standards and accounting rules has been eased. Fiscal policy has supported the economy and interest rates are being kept low to support the underlying earnings of banks and their ability to issue new equity in rising markets. This strategy may work in the short term. But it can't go on forever. Sooner or later we will have to exit from the extraordinary measures that have used trillions of taxpayer dollars to save the institutions that took the world economy to the brink of another Great Depression.
The structure of organizations and how they compete will be critical to future stability. Going forward, the aim must be to keep the "credit culture" and the "equity culture" separate so that government implicit and explicit insurance does not extend to cross-subsidizing high-risk market activity, and so that contagion and counterparty risk can be reduced. The right balance must also be struck between sufficient size conducive to diversification and strong competition to meet consumer needs at reasonable costs.
The capital and derivative markets are inherently interconnected globally, so counterparty risk looms large. Under present structures, if one participant fails, everyone is in trouble. We can't let the world go through that turmoil again.
Mr. Blundell-Wignall is deputy director of financial and enterprise affairs at the OECD.
Monday, November 9, 2009
"[C]reating a new entitlement program, which, once established, will be virtually impossible to rescind"
Confessions of an ObamaCare Backer. WSJ Editorial
A liberal explains the political calculus.
The Wall Street Journal, page A24
The typical argument for ObamaCare is that it will offer better medical care for everyone and cost less to do it, but occasionally a supporter lets the mask slip and reveals the real political motivation. So let's give credit to John Cassidy, part of the left-wing stable at the New Yorker, who wrote last week on its Web site that "it's important to be clear about what the reform amounts to." [http://www.newyorker.com/online/blogs/johncassidy/2009/11/some-vaguely-heretical-thoughts-on-health-care-reform.html]
Mr. Cassidy is more honest than the politicians whose dishonesty he supports. "The U.S. government is making a costly and open-ended commitment," he writes. "Let's not pretend that it isn't a big deal, or that it will be self-financing, or that it will work out exactly as planned. It won't. What is really unfolding, I suspect, is the scenario that many conservatives feared. The Obama Administration . . . is creating a new entitlement program, which, once established, will be virtually impossible to rescind."
Why are they doing it? Because, according to Mr. Cassidy, ObamaCare serves the twin goals of "making the United States a more equitable country" and furthering the Democrats' "political calculus." In other words, the purpose is to further redistribute income by putting health care further under government control, and in the process making the middle class more dependent on government. As the party of government, Democrats will benefit over the long run.
This explains why Nancy Pelosi is willing to risk the seats of so many Blue Dog Democrats by forcing such an unpopular bill through Congress on a narrow, partisan vote: You have to break a few eggs to make a permanent welfare state. As Mr. Cassidy concludes, "Putting on my amateur historian's cap, I might even claim that some subterfuge is historically necessary to get great reforms enacted."
No wonder many Americans are upset. They know they are being lied to about ObamaCare, and they know they are going to be stuck with the bill.
Guidance to Assess the Systemic Importance of Financial Institutions, Markets and Instruments: Initial Considerations Report & Background Paper
Nov 07, 2009
Prepared by staff of the International Monetary Fund, the Bank for International Settlements and the Financial Stability Board and submitted to G20 Finance Ministers and Governors.
November 7, 2009
The report and background paper respond to a request made by the G20 Leaders in April 2009 to develop guidance for national authorities to assess the systemic importance of financial institutions, markets and instruments. The report outlines conceptual and analytical approaches to the assessment of systemic importance and discusses a possible form for general guidelines.
The report recognizes that current knowledge and concerns about moral hazard limit the extent to which very precise guidance can be developed. Assessments of systemic importance will necessarily involve a high degree of judgment; they will likely be time-varying and state-dependent, and they will reflect the purpose of the assessment. The report does not pre-judge the policy actions to which such assessments could be an input.
The report suggests that the guidelines could take the form of high-level principles that would be sufficiently flexible to apply to a broad range of countries and circumstances, and it outlines the possible coverage of such guidelines. A set of such high-level principles, appropriate for a variety of policy uses, could be developed further by the IMF, BIS and FSB, taking account of experience with the application of the conceptual and analytical approaches described here.
There are a number of policy issues where an assessment of systemic importance would be useful. One critical issue is the ongoing work to reduce the moral hazard posed by systemically important institutions. The FSB and the international standard setters are developing measures that can be taken to reduce the systemic risks these institutions pose, and the attached papers will provide a useful conceptual and analytical framework to inform policy discussions. A second area is the work to address information gaps that were exposed by the recent crisis (the subject of a separate report to the G20 from IMF staff and the FSB Secretariat), where assessments of systemic importance can help to inform data collection needs. A third area is helping to identify sources of financial sector risk that could have serious macroeconomic consequences. We will keep you informed on our respective future policy work in these important areas.
Report to G-20 Finance Ministers and Central Bank Governors (PDF 29 pages, 679 kb) & background paper (PDF 46 pages, 955 kb) downloadable @ http://www.bis.org/publ/othp07.htm?sent=091109
Saturday, November 7, 2009
At the Ends of the World: Projects at Remote Locations
At the Ends of the World: Projects at Remote Locations. By Fabio Teixeira de Melo, PMP
PMI eNews, Nov 06, 2009
We have all heard that the world is getting smaller and smaller. However, some projects challenge that view: namely, those performed at remote locations.
For project management, a remote location is a place where:
• Access to resources is more difficult;
• Both public and private sectors have less presence, or no presence at all;
• Local communities have little connection with the “civilized world.”
Successfully executing a project at these locations requires a specific approach to some of the unique challenges you’ll face. Here are a few suggestions:
Logistics:
You should creatively explore what alternatives are available for supplying materials and consumables, and know the risks of each one. You have to consider natural factors, such as flood and dry seasons and their impact on site access, as well as frozen, blocked and/or dangerous access roads.
Consulting local communities is vital for gaining knowledge on alternatives, potential risks and contingency plans. Keep in mind that it is not only about bringing equipment in: it is about feeding and supporting your site team.
Communication:
Communication depends heavily on wireless phone and internet access. These options facilitate working at remote locations, but they do not always function properly. Between thunderstorms, heavy rain, power outages and frozen equipment, many things can go wrong.
Communicating through traditional, hard-copy mail is safe and reliable, but takes more time. Consider adding redundancy—exchanging data electronically but also sending hard copies through traditional mail—to the communications management plan, logistics plan and schedule. It can make the difference between taking advantage of wireless communication and suffering from the lack of it.
Local Community:
With very few exceptions, remote locations are inhabited, usually by poor and unassisted communities living in a subsistence economy. They often lack proper authorities, which is an invitation to drug producers, smugglers and others who interact with the local community. You should consider these groups part of that community – in fact, sometimes they even act as the “local authority.”
Base your approach on the core values of respect and honesty. Show interest in the community and try to build trust without interfering in its relationship with potential outlaw groups. For those groups, try to negotiate your relationship on the basis of non-interference, but consider their presence in your risk management: it’s not unheard of for project managers to be kidnapped by local gangs or terror groups.
Social Responsibility:
Your project will probably impact the local community. Hiring its people is a good way to inject money into the local economy, but you have to be cautious as to how many people will be employed and what jobs they will take.
Resist the temptation to hire everybody, since they will have to continue to make a living after you demobilize. If you train them to work on your project—for example, to operate your bulldozers—when you finish they will either be unemployed or have to leave the region in search of a job.
Instead, give them insight and training on how to improve and market what they currently produce for their living. Help them get more productive and organized. Your project will certainly bring them closer to “civilization,” and you should help them make that encounter more of an opportunity than a risk.
When you plan for a project at a remote location, don’t associate the challenge with logistics only. Remember that the communications and stakeholder management for these projects have particular requirements which, if not properly handled, can be as harmful to your project as a natural disaster.
Fabio Teixeira de Melo, PMP, is a Site Manager working for Odebrecht, a Brazilian multinational construction company with projects in over 20 countries. An LI ’04 graduate with more than 15 years of experience in construction project planning and management, he was founder and first President of the PMI Pernambuco – Brazil Chapter, participated in the elaboration of the Construction Extension to the PMBOK® Guide, served a 5-year term as DPC SIG Latin America Chair and contributed articles to the SIG’s newsletter. You can contact him by writing a comment on this post.
A ground-breaking study shows that New York City's calorie labeling law is ineffective
After Calorie Warnings, Diners Order More Calories. By ALLYSIA FINLEY
A ground-breaking study shows that New York City's calorie labeling law is ineffective.
WSJ, Nov 06, 2009
Before food czars get any more punch-happy on their own Kool-Aid, they need to be purged of the illusion that their laws are actually working. Last month, New York University and Yale medical professors published a ground-breaking study, which shows that New York City's law requiring fast food chains to post calories on their menus doesn't reduce their customers' caloric intake.
Lawmakers everywhere should take note. Efforts to require fast food restaurants to post nutritional information on their menus have been gaining ground across the country. Sixteen states and municipalities, including California, Seattle, and Portland, have passed laws similar to NYC's, and the Menu Education and Labeling Act, which would impose labeling regulations nationwide, is pending in Congress. The bill would extend the Nutrition Labeling and Education Act of 1990, which requires food manufacturers to include nutritional information on their packaging, to restaurants. We all know how effective that law was. Since 1990, obesity has more than doubled.
Published online in the journal Health Affairs, the NYU and Yale study is noteworthy because it considers the practical significance of food labeling laws. The researchers examined 1,100 restaurant receipts from McDonald's, Wendy's, Burger King and KFC franchises in low-income, high-minority neighborhoods where obesity is most prevalent. They found that the poor fast-food customers that the law intended to help weren't affected.
Only half of the customers said they noticed the caloric information, and only about 15% said they used the information. But the researchers' most striking finding was that customers actually ordered more caloric items after the law went into effect than before, despite the fact that nine out of ten customers who reported using the information said they made healthier choices as a result of the law. This disconnect can partly be explained by response bias in which people tell surveyors what they think the surveyors want to hear.
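To make the arithmetic behind these survey figures explicit, here is a minimal back-of-the-envelope sketch. It uses only the percentages quoted above; the variable names are purely illustrative and not from the study itself.

# Back-of-the-envelope arithmetic for the survey figures quoted above.
# All inputs are the percentages reported in the article; names are illustrative only.
noticed = 0.50            # about half of customers said they noticed the calorie postings
used = 0.15               # about 15% said they used the information
claimed_healthier = 0.90  # nine in ten of those who used it said they chose more healthily

noticed_but_ignored = noticed - used        # saw the numbers but did nothing with them
acted_on_labels = used * claimed_healthier  # used the numbers and claimed a healthier order

print(f"Noticed but ignored the labels: {noticed_but_ignored:.0%}")    # roughly 35%
print(f"Claim to have ordered more healthily: {acted_on_labels:.1%}")  # roughly 13.5%

On these self-reported figures, only about one customer in seven even claims to have changed an order because of the labels.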
But the problem may also be more complex. It's possible that people who are less educated may actually think they are eating more healthily than they are notwithstanding the calorie numbers staring them in the face. Calories as a measure of food intake (or more precisely, energy consumption and output) may be as foreign to them as the metric system is to many Americans.
The poor are also extremely price sensitive, especially in a bad economy. Give them the choice between a $2 double quarter pounder with cheese and a $5 chicken salad, and they'll make an economically rational decision and order the $2 burger. And with the extra three bucks saved, they'll order a side of fries and a Coke. Why should they care how many calories they're eating if they're getting good value?
Under pressure to counter the NYU and Yale study, the New York City Health Department last week came out with its own report, which it nicely packaged in a press release and PowerPoint presentation (evidently, the Department didn't want to confuse the media with an actual scientific study). Though the Department's results are equivocal, New York City lawmakers are using the data to argue the efficacy of the law.
The Department is boasting that 56% of customers saw the caloric information and that 15% said they used it. But these figures demonstrate the law's failure, not its success. Despite the fact that people were readily presented with the nutritional information, 85% of them ignored it.
The lawmakers who enacted the calorie posting regulations succumbed to the fallacy that everyone thinks like them. They probably reasoned that because they would make healthier choices if presented with nutritional information, everyone else would as well. But maybe what consumers actually want is a delicious meal at a low price.
While information is important, even fully informed people won't always act as lawmakers think they should, especially if it's economically irrational. Any public health legislation won't significantly change people's behavior unless it 1) provides proper incentives for people to put their long-term well-being above temporary gratification and 2) takes into account the economic rationality of people's behavior.
Unfortunately, many lawmakers refuse to swallow this inconvenient truth, preferring the taste of their Kool-Aid.
Ms. Finley is Assistant Editor of OpinionJournal.com
Wednesday, November 4, 2009
When Regulators Fail - 'Systemic risk' is not only for banks
When Regulators Fail. WSJ Editorial
'Systemic risk' is not only for banks.
WSJ, Nov 04, 2009
Financial Services Authority chief Adair Turner has finally stopped attacking bankers for their paychecks and started talking about the real issue—what to do about the banks deemed too-big-to-fail. Unfortunately, he's still worrying too much about how to prevent failure and not enough about how to facilitate it.
In his speech Monday to an international group of central and private bankers, Lord Turner identified three possible approaches to the problem:
• Make failure less likely by increasing capital requirements;
• Make banks smaller or less "systemic" by either narrowing what they can do or making them less interconnected;
• Or, finally, make failure easier by developing bankruptcy procedures or other "resolution" mechanisms for large financial institutions.
Of these, the last is the most important for reducing the moral hazard that did so much to contribute to the financial panic, as Bank of England Governor Mervyn King has persuasively argued. Even before the panic, systemically important banks enjoyed considerable advantages over their less "important" rivals, and many of these advantages were created by or made more acute by government regulation and rules.
As Lord Turner noted Monday, the Basel II standards on bank capital actually allowed large financial firms to hold less capital than their smaller brethren, on the theory that large meant diversified and sophisticated and so less risky. Looking back, this was clearly a crazy policy—but it's worth recalling that it was propagated by the same luminaries who are now proposing to prevent the next crisis by tinkering with the regime that contributed to the last one. At a minimum, this should be an occasion of some humility from the wise men of bank regulation.
We now know that this presumption of safety in size was false. We also know that the costs of being wrong about such things—both for the public fisc and the real economy as a whole—are much greater than was commonly assumed before the panic.
So the price that large banks pay for the privileges of size should be a great deal higher than it was before. Whether banks benefit from the explicit guarantees of deposit insurance or the implicit protection of being too-big-to-fail, or both, governments have a right to demand that banks not ride free on the backs of taxpayers.
But whether it's less leverage, more capital, or restrictions on banking activities, no one should be under any illusion that the same people who failed to detect the last bubble and crash will be able to design a system capable of catching the next one in time. The relative risks of being too lax or too restrictive may be hard to gauge, but either way the odds of getting it wrong are substantial if not overwhelming.
This is why putting the risk of failure back into the system should be the sine qua non of any effort at reform. If regulators around the world get nothing else right, the final backstop has to be bankruptcy and/or dissolution for firms that have earned it.
So it's too bad Lord Turner spent precious little time on this particular question, preferring to ruminate on the relative merits of really narrow banking vs. moderately narrow banking, and how to make capital requirements more countercyclical.
We understand that regulators find it uncomfortable to ponder what should happen when all their best laid plans fail. The bankruptcy of a systemically important bank is, necessarily, also a failure of the regulators who were overseeing it.
Tuesday, November 3, 2009
Implementing US Gov't Wildlife Surveillance Project to Detect and Predict Emerging Infectious Diseases
Implementing USAID Wildlife Surveillance Project to Detect and Predict Emerging Infectious Diseases
USAID, November 3, 2009
[There is a collection of articles on this. This is one of them, titled Implementing USAID Wildlife Surveillance Project to Detect and PREDICT Emerging Infectious Diseases, using PREDICT as an acronym.]
Washington, D.C. - The United States Agency for International Development (USAID) Bureau for Global Health is pleased to announce a partnership with UC Davis to monitor for, and to increase local capacity in "geographic hot spots" to identify, the emergence of new infectious diseases in high-risk wildlife such as bats, rodents, and non-human primates that could pose a major threat to human health. UC Davis leads a coalition of leading experts in wildlife surveillance including the Wildlife Conservation Society, Wildlife Trust, the Smithsonian Institution, and Global Viral Forecasting, Inc. This is a five-year cooperative agreement with a ceiling of $75 million.
This project, named PREDICT, is part of the USAID Emerging Pandemic Threats Program - a specialized set of projects that build on the successes of the Agency's 30 years of work in disease surveillance, training and outbreak response. PREDICT will focus on expanding USAID's current monitoring of wild birds for H5N1 influenza to more broadly address the role played by wildlife in the spread of new disease threats.
PREDICT will be active in global hot spots where important wildlife host species have significant interaction with domestic animals and high-density human populations. In these regions, the team will focus on detecting disease-causing organisms in wildlife before they lead to human infection or death. Among the 1,461 pathogens recognized to cause diseases in humans, at least 60 percent are of animal origin. Predicting where these new diseases may emerge, and detecting viruses and other pathogens before they spread to people, holds the greatest potential to prevent new pandemics.
PREDICT will be led by Dr. Stephen S. Morse of Columbia University Mailman School of Public Health, a leading emerging disease authority. Other key staff include Dr. Jonna Mazet, the project's Deputy Director; Dr. William Karesh, Senior Technical Advisor; Dr. Peter Daszak, Technical Expert; and Dr. Nathan Wolfe, Technical Expert.
America's Natural Gas Revolution - A 'shale gale' of unconventional and abundant U.S. gas is transforming the energy market
America's Natural Gas Revolution. By DANIEL YERGIN AND ROBERT INESON
A 'shale gale' of unconventional and abundant U.S. gas is transforming the energy market.
The biggest energy innovation of the decade is natural gas—more specifically what is called "unconventional" natural gas. Some call it a revolution.
Yet the natural gas revolution has unfolded with no great fanfare, no grand opening ceremony, no ribbon cutting. It just crept up. In 1990, unconventional gas—from shales, coal-bed methane and so-called "tight" formations—was about 10% of total U.S. production. Today it is around 40%, and growing fast, with shale gas by far the biggest part.
The potential of this "shale gale" only really became clear around 2007. In Washington, D.C., the discovery has come later—only in the last few months. Yet it is already changing the national energy dialogue and overall energy outlook in the U.S.—and could change the global natural gas balance.
From the time of the California energy crisis at the beginning of this decade, it appeared that the U.S. was headed for an extended period of tight supplies, even shortages, of natural gas.
While gas has many favorable attributes—as a clean, relatively low-carbon fuel—abundance did not appear to be one of them. Prices had gone up, but increased drilling failed to bring forth additional supplies. The U.S., it seemed, was destined to become much more integrated into the global gas market, with increasing imports of liquefied natural gas (LNG).
But a few companies were trying to solve a perennial problem: how to liberate shale gas—the plentiful natural gas supplies locked away in the impermeable shale. The experimental lab was a sprawling area called the Barnett Shale in the environs of Fort Worth, Texas.
The companies were experimenting with two technologies. One was horizontal drilling. Instead of merely drilling straight down into the resource, horizontal wells go sideways after a certain depth, opening up a much larger area of the resource-bearing formation.
The other technology is known as hydraulic fracturing, or "fraccing." Here, the producer injects a mixture of water and sand at high pressure to create multiple fractures throughout the rock, liberating the trapped gas to flow into the well.
The critical but little-recognized breakthrough was early in this decade—finding a way to meld together these two increasingly complex technologies to finally crack the shale rock, and thus crack the code for a major new resource. It was not a single eureka moment, but rather the result of incremental experimentation and technical skill. The success freed the gas to flow in greater volumes and at a much lower unit cost than previously thought possible.
In the last few years, the revolution has spread into other shale plays, from Louisiana and Arkansas to Pennsylvania and New York State, and British Columbia as well.
The supply impact has been dramatic. In the lower 48 states, thought to be in decline as a natural gas source, production surged an astonishing 15% from the beginning of 2007 to mid-2008. This increase is more than most other countries produce in total.
Equally dramatic is the effect on U.S. reserves. Proven reserves have risen to 245 trillion cubic feet (Tcf) in 2008 from 177 Tcf in 2000, despite having produced nearly 165 Tcf during those years. The recent increase in estimated U.S. gas reserves by the Potential Gas Committee, representing both academic and industry experts, is in itself equivalent to more than half of the total proved reserves of Qatar, the new LNG powerhouse. With more drilling experience, U.S. estimates are likely to rise dramatically in the next few years. At current levels of demand, the U.S. has about 90 years of proven and potential supply—a number that is bound to go up as more and more shale gas is found.
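The reserve arithmetic in the paragraph above can be made concrete with a quick back-of-the-envelope calculation. The Python sketch below reproduces the implied gross reserve additions from the figures quoted in the article; the annual U.S. consumption figure (roughly 23 Tcf per year) and the roughly 2,000 Tcf proven-plus-potential resource base behind the "about 90 years" claim are outside assumptions added here for illustration, not numbers taken from the article.

```python
# Back-of-the-envelope sketch of the reserve arithmetic quoted above.
# Proved-reserve and production figures come from the article; the annual
# consumption (~23 Tcf/yr) and the ~2,000 Tcf proven-plus-potential resource
# base are assumptions used only for illustration.

proved_2000 = 177.0          # Tcf, proved U.S. reserves in 2000
proved_2008 = 245.0          # Tcf, proved U.S. reserves in 2008
produced_2000_2008 = 165.0   # Tcf produced over the same period

# Gross additions needed to raise proved reserves while also replacing
# everything produced along the way.
gross_additions = (proved_2008 - proved_2000) + produced_2000_2008
print(f"Gross reserve additions 2000-2008: {gross_additions:.0f} Tcf")  # ~233 Tcf

# Years of supply at current demand, consistent with the article's "about
# 90 years" figure under the assumptions noted above.
resource_base = 2_000.0      # Tcf, assumed proven + potential supply
annual_demand = 23.0         # Tcf/yr, assumed current U.S. consumption
print(f"Years of supply at current demand: {resource_base / annual_demand:.0f}")  # ~87 years
```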
To have the resource base suddenly expand by this much is a game changer. But what is getting changed?
It transforms the debate over generating electricity. The U.S. electric power industry faces very big questions about fuel choice and what kind of new generating capacity to build. In the face of new climate regulations, the increased availability of gas will likely lead to more natural gas consumption in electric power because of gas's relatively lower CO2 emissions. Natural gas power plants can also be built more quickly than coal-fired plants.
Some areas like Pennsylvania and New York, traditionally importers of the bulk of their energy from elsewhere, will instead become energy producers. It could also mean that more buses and truck fleets will be converted to natural gas. Energy-intensive manufacturing companies, which have been moving overseas in search of cheaper energy in order to remain globally competitive, may now stay home.
But these industrial users and the utilities with their long investment horizons—both of which have been whipsawed by recurrent cycles of shortage and surplus in natural gas over several decades—are inherently skeptical and will require further confirmation of a sustained shale gale before committing.
More abundant gas will have another, not so well recognized effect—facilitating renewable development. Sources like wind and solar are "intermittent." When the wind doesn't blow and the sun doesn't shine, something has to pick up the slack, and that something is likely to be natural-gas fired electric generation. This need will become more acute as the mandates for renewable electric power grow.
So far only one serious obstacle to development of shale resources across the U.S. has appeared—water. The most visible concern is the fear in some quarters that hydrocarbons or chemicals used in fraccing might flow into aquifers that supply drinking water. However, in most instances, the gas-bearing and water-bearing layers are widely separated by thousands of vertical feet, as well as by rock, with the gas being much deeper.
Therefore, the hydraulic fracturing of gas shales is unlikely to contaminate drinking water. The risks of contamination from surface handling of wastes, common to all industrial processes, require continued care. While fraccing uses a good deal of water, it is actually less water-intensive than many other types of energy production.
Unconventional natural gas has already had a global impact. With the U.S. market now oversupplied, and storage filled to the brim, there's been much less room for LNG. As a result more LNG is going into Europe, leading to lower spot prices and talk of modifying long-term contracts.
But is unconventional natural gas going to go global? Preliminary estimates suggest that shale gas resources around the world could be equivalent to or even greater than current proven natural gas reserves. Perhaps much greater. But here in the U.S., our independent oil and gas sector, open markets and private ownership of mineral rights facilitated development. Elsewhere development will require negotiations with governments, and potentially complex regulatory processes. Existing long-term contracts, common in much of the natural gas industry outside the U.S., could be another obstacle. Extensive new networks of pipelines and infrastructure will have to be built. And many parts of the world still have ample conventional gas to develop first.
Yet interest and activity are picking up smartly outside North America. A shale gas revolution in Europe and Asia would change the competitive dynamics of the globalized gas market, altering economic calculations and international politics.
This new innovation will take time to establish its global credentials. The U.S. is really only beginning to grapple with the significance. It may be half a decade before the strength of the unconventional gas revolution outside North America can be properly assessed. But what has begun as the shale gale in the U.S. could end up being an increasingly powerful wind that blows through the world economy.
Mr. Yergin, author of the Pulitzer Prize-winning "The Prize: The Epic Quest for Oil, Money, & Power" (Free Press, new edition, 2009) is chairman of IHS CERA. Mr. Ineson is senior director of global gas for IHS CERA.
Monday, November 2, 2009
CIT's Bankruptcy Lesson - Treasury proves it can't identify systemic risk
CIT's Bankruptcy Lesson. WSJ Editorial
Treasury proves it can't identify systemic risk.
The Wall Street Journal, page A20
The $2.3 billion of Troubled Asset Relief Program money that will likely be lost in the bankruptcy of commercial lender CIT is hard to swallow, but it may be the most instructive loss taxpayers absorb all year.
Just as the Treasury Department is urging Congress to junk the bankruptcy process and hand over virtually unlimited bailout authority to the executive branch, CIT is proving two things: Bankruptcy works—even for financial firms—and the U.S. Treasury judges systemic risk out of its political hip pocket.
Treasury provided the $2.3 billion TARP injection last December. Then when CIT was on the ropes last July, Treasury urged the Federal Deposit Insurance Corp. to provide debt guarantees to help the company raise capital. Treasury made the case that a CIT failure posed a systemic risk given the number of small and medium-sized companies that rely on CIT for short-term financing.
We argued against this in July. More importantly, FDIC Chair Sheila Bair rejected it. Since her wise decision, CIT has been providing a laboratory to observe the recuperative pain of bankruptcy in an experiment uncontrolled by politicians.
With no federal lifeline coming, the company's major bondholders quickly agreed to a $3 billion secured loan facility and the company began restructuring its liabilities. It became clear that bankruptcy would be necessary and the company recently gained the support of almost 90% of its voting debt holders for a prepackaged reorganization plan that could allow the lender to emerge from Chapter 11 by the end of the year.
While the holding company declared bankruptcy on Sunday, its operating subsidiaries remain outside Chapter 11 and continue to serve customers. Some are choosing to continue with CIT, others are choosing to go with a competitor. Armageddon it is not.
Bankruptcy is a process under the rule of law that is demanded by the Constitution. "Markets not ministers," says former SEC Chairman Richard Breeden in summing up his preference for bankruptcy guided by judges over interventions crafted by politicians. Let's hope CIT's example brings the former back into fashion.