Women, Welch Clash at Forum. By John Bussey
Wall Street Journal, May 4, 2012, page B1
http://online.wsj.com/article/SB10001424052702303877604577382321364803912.html
Is Jack Welch a timeless seer or an out-of-touch warhorse?
The former Master and Commander of General Electric still writes widely on business strategy. He's also influential on the speaking circuit.
On Wednesday, Mr. Welch and his wife and writing partner, Suzy Welch, told a gathering of women executives from a range of industries that, in matters of career track, it is results and performance that chart the way. Programs promoting diversity, mentorships and affinity groups may or may not be good, but they are not how women get ahead. "Over deliver," Mr. Welch advised. "Performance is it!"
Angry murmurs ran through the crowd. The speakers asked: Were there any questions?
"We're regaining our consciousness," one woman executive shot back.
Mr. Welch had walked into a spinning turbine fan blade.
"Of course women need to perform to advance," Alison Quirk, an executive vice president at the investment firm State Street Corp., said later. "But we can all do more to help people understand their unconscious biases."
"He showed no recognition that the culture shapes the performance metrics, and the culture is that of white men," another executive said.
Dee Dee Myers, a former White House press secretary who is now with Glover Park Group, a communications firm, added: "While he seemed to acknowledge the value of a diverse workforce, he didn't seem to think it was necessary to develop strategies for getting there—and especially for taking a cold, hard look at some of the subtle barriers to women's advancement that still exist. If objective performance measures were enough, more than a handful of Fortune 500 senior executives would already be women."
"This meritocracy fiction may be the single biggest obstacle to women's advancement," added Lisa Levey, a consultant who heard Mr. Welch speak.
Mr. Welch has sparked controversy in the past with his view of the workplace. In 2009, he told a group of human-resources managers: "There's no such thing as work-life balance." Instead, "there are work-life choices, and you make them, and they have consequences." Step out of the arena to raise kids, and don't be surprised if the promotion passes you by.
Of the Fortune 500 companies, only 3% have a female CEO today. Female board membership is similarly sparse. A survey of 60 major companies by McKinsey shows women occupying 53% of entry-level positions, 40% of manager positions, and only 19% of C-suite jobs.
The reasons for this are complex and aren't always about child rearing. A separate McKinsey survey showed that among women who have already reached the status of successful executive, 59% don't aspire to one of the top jobs. The majority of these women have already had children.
"Their work ethic—these people are doing it all," said Dominic Barton of McKinsey. "They say, 'I'm the person turning off the lights'" at the end of the day.
Instead, Mr. Barton said, it's "the soft stuff, the culture" that's shaping their career decisions.
The group of women executives who wrestled with Mr. Welch were at a conference on Women in the Economy held by The Wall Street Journal this week. Among other things, they tackled the culture questions—devising strategies to get more high-performing women to the top, keep women on track during childbearing years, address bias, and make the goals of diversity motivating to employees. They also discussed the sexual harassment some women still experience in the workplace. (A report on the group's findings will be published in the Journal Monday.)
The realm of the "soft stuff" may not be Mr. Welch's favored zone. During his remarks, he referred to human resources as "the H.R. teams that are out there, most of them for birthdays and picnics." He mentioned a women's forum inside GE that he says attracted 500 participants. "The best of the women would come to me and say, 'I don't want to be in a special group. I'm not in the victim's unit. I'm a star. I want to be compared with the best of your best.'"
And then he addressed the audience: "Stop lying about it. It's true. Great women get upset about getting into the victim's unit."
Individual mentoring programs, meanwhile, are "one of the worst ideas that ever came along," he said. "You should see everyone as a mentor."
He had this advice for women who want to get ahead: Grab tough assignments to prove yourself, get line experience, and embrace serious performance reviews and the coaching inherent in them.
"Without a rigorous appraisal system, without you knowing where you stand...and how you can improve, none of these 'help' programs that were up there are going to be worth much to you," he said. Mr. Welch said later that the appraisal "is the best way to attack bias" because the facts go into the document, which both parties have to sign.
Mr. Welch championed the business philosophy of "Six Sigma" at GE, a strategy that seeks to expunge defects from production through constant review and improvement. It appears to work with machines and business processes.
But applying that clinical procedure to the human character, as Mr. Welch seems to want to do, is a stickier proposition.
"His advice was not tailored to how women can attain parity in today's male-dominated workplace," said one female board member of a Fortune 500 company. Indeed, a couple of women walked out in frustration during his presentation.
Friday, May 4, 2012
Wednesday, May 2, 2012
Dynamic Loan Loss Provisioning: Simulations on Effectiveness and Guide to Implementation
Dynamic Loan Loss Provisioning: Simulations on Effectiveness and Guide to Implementation. By Torsten Wezel, Jorge A. Chan Lau, and Francesco Columba
IMF Working Paper No. 12/110
May 01, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25885.0
Summary: This simulation-based paper investigates the impact of different methods of dynamic provisioning on bank soundness and shows that this increasingly popular macroprudential tool can smooth provisioning costs over the credit cycle and lower banks’ probability of default. In addition, the paper offers an in-depth guide to implementation that addresses pertinent issues related to data requirements, calibration and safeguards as well as accounting, disclosure and tax treatment. It also discusses the interaction of dynamic provisioning with other macroprudential instruments such as countercyclical capital.
Excerpts:
Introduction
Reducing the procyclicality of the banking sector by way of macroprudential policy instruments has become a policy priority. The recent crisis has illustrated how excessive procyclicality of the banking system may activate powerful macro-financial linkages that amplify the business cycle and how increased financial instability can have large negative spillover effects onto the real sector. Moreover, research has shown that crises that included banking turmoil are among the longest and most severe of all crises.
Although there is no consensus yet on the very definition of macroprudential policy, an array of such tools, especially those of countercyclical nature, has been applied in many countries for years. But it was only during the financial crisis that powerful macro-financial linkages played out on a global scale, conveying a sense of urgency.
In the wake of the crisis, policymakers therefore intensified their efforts to gear the macroprudential approach to financial stability towards improving banks’ capacity to absorb shocks—a consultative process that culminated in the development of the Basel III framework in December 2010 to be phased in over the coming years. In addition to improving the quality of bank capital and liquidity as well as imposing a minimum leverage ratio, this new regulatory standard introduces countercyclical capital buffers and lends support to forward-looking loan loss provisioning, which comprises dynamic provisioning (DP).
The new capital standard promotes the build-up of capital buffers in good times that can be drawn down in periods of stress, in the form of a capital conservation requirement to increase the banking sector’s resilience entering into a downturn. Part of this conservation buffer would be a countercyclical buffer that is to be activated only when there is excess credit growth so that the sector is not destabilized in the downturn. Such countercyclical capital has also been characterized as potentially cushioning the economy’s real output during a crisis (IMF, 2011). Similarly, dynamic provisioning requires banks to build a cushion of generic provisions during an upswing that can be used to cover rising specific provisions linked to loan delinquencies during the subsequent downturn.
Both countercyclical capital and DP have been applied in practice. Some countries have adjusted capital regulations in different phases of the cycle to give them a more potent countercyclical impact: Brazil has used a formula to smooth capital requirements for interest rate risk in times of extreme volatility, China introduced a countercyclical capital requirement similar to the countercyclical buffer under Basel III, and India has made countercyclical adjustments in risk weights and in provisioning. DP was first introduced by Spain in 2000 and subsequently adopted in Uruguay, Colombia, Peru, and Bolivia, while other countries such as Mexico and Chile switched to provisioning based on expected loan loss. Peru is the only country to explicitly use both countercyclical instruments in combination.
The concept of DP examined in this paper is intriguing. By gradually building a countercyclical loan loss reserve in good times and then using it to cover losses as they arise in bad times, DP is able to greatly smooth provisioning costs over the cycle and thus insulate banks’ profit and loss statements in this regard. Therefore, DP may usefully complement other policies targeted more at macroeconomic aggregates. The implementation of DP can, however, be a delicate balancing exercise. The calibration is typically challenging because it requires specific data, and even if these are available, it may still be inaccurate if the subsequent credit cycle differs substantially from the previous one(s) on which the model is necessarily predicated. Over-provisioning may ensue in particular instances. This said, a careful calibration that tries to incorporate as many of the stylized facts of past credit developments as possible goes a long way in providing a sizeable cushion for banks to withstand periodic downswings.
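The smoothing mechanics can be illustrated with a toy simulation. The sketch below applies a Spanish-style DP rule — a generic charge of alpha on new credit plus the gap between average expected losses (beta times the credit stock) and current specific provisions — to an invented credit cycle. The parameters and the loss path are purely illustrative assumptions and are not the calibrations used in the paper:

```python
# Illustrative Spanish-style dynamic provisioning rule over a stylized
# credit cycle. alpha, beta, and the data are invented for illustration.

alpha, beta = 0.015, 0.008   # hypothetical DP parameters

# Stylized 10-period cycle: credit grows in the boom and stalls in the bust,
# while specific provisions (actual loan losses) spike in the downturn.
credit   = [100, 110, 121, 133, 146, 150, 150, 148, 150, 155]
specific = [0.3, 0.3, 0.4, 0.4, 0.5, 2.0, 3.5, 3.0, 1.5, 0.6]

reserve, total_cost = 0.0, []
prev_c = credit[0]
for c, spec in zip(credit, specific):
    # Generic (countercyclical) charge: alpha on new credit, plus the gap
    # between average losses (beta * stock) and current specific provisions.
    generic = alpha * (c - prev_c) + (beta * c - spec)
    generic = max(generic, -reserve)   # the reserve cannot go negative
    reserve += generic
    total_cost.append(spec + generic)  # P&L impact with DP in place
    prev_c = c

print([round(x, 2) for x in total_cost])   # smoother than `specific`
print(round(reserve, 2))
```

Without DP the P&L bears `specific` directly; with DP the charges cluster in a much narrower band. Note that in this run the reserve is exhausted late in the downturn and one period's cost jumps — exactly the under-provisioning risk from miscalibration that the paragraph above describes.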
This paper provides strong support for DP as a tool for countercyclical banking policies. Our contribution to this strand of the literature is threefold. We first recreate a hypothetical path of provisions under different DP systems based on historical data of an emerging banking market and compare the outcome to the actual situation without DP. These counterfactual simulations suggest that a well-calibrated system of DP mitigates procyclicality in provisioning costs and thus earnings and capital. Second, using Monte-Carlo simulations we show that the countercyclical buffer that DP builds typically lowers a bank’s probability of default. Finally, we offer a guide to implementation of the DP concept that seeks to clarify issues related to data requirements, choice of formula, parametrization, accounting treatment, and recalibration.
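The second contribution — that the buffer built by DP lowers a bank's probability of default — can be sketched in miniature with Monte Carlo draws. The loss distribution, capital level, and buffer size below are invented assumptions and bear no relation to the paper's model:

```python
# Toy Monte-Carlo sketch: a DP reserve that absorbs losses before capital
# is hit lowers the simulated probability of default (PD). All figures
# are hypothetical, not the paper's calibration.
import random

random.seed(1)
capital, dp_reserve = 8.0, 2.0     # hypothetical capital and DP buffer
trials = 100_000

defaults_plain = defaults_dp = 0
for _ in range(trials):
    # heavy-tailed credit-loss draw standing in for a downturn scenario
    loss = random.lognormvariate(0.5, 0.8)
    if loss > capital:                 # no buffer: losses hit capital directly
        defaults_plain += 1
    if loss - dp_reserve > capital:    # DP reserve absorbs the first losses
        defaults_dp += 1

pd_without = defaults_plain / trials
pd_with = defaults_dp / trials
print(pd_without, pd_with)    # the buffered PD is lower
```

Because both PDs are computed on the same draws, the comparison isolates the effect of the buffer alone.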
Other studies that have used counterfactual simulations based on historical data to assess the hypothetical performance under DP include Balla and McKenna (2009), Fillat and Montoriol-Garriga (2010), both using U.S. bank data, and Wezel (2010), using data for Uruguay. All studies find support for the notion that DP, when properly calibrated, can help absorb rising loan losses in a downturn and thus be a useful macroprudential tool in this regard. Some other studies (Lim et al., 2011; Peydró-Alcalde et al., 2011) even find that DP is effective in mitigating swings in credit growth, although this should not be expected of DP in general.
Conclusion
This paper has provided a thorough analysis of the merits and challenges associated with dynamic provisioning—a macroprudential tool that deserves attention from policymakers and regulators for its capacity to distribute the burden of loan impairment evenly over the credit cycle and so quench an important source of procyclicality in banking. Our simulations that apply the Spanish and Peruvian DP formulas to a full cycle of banking data of an advanced emerging market leave little doubt that the countercyclical buffer built under DP not only smoothes costs but actually bolsters financial stability by lowering banks’ PD in severe downturn conditions. We also show that for best countercyclical results DP should be tailored to the different risk exposures of individual banks and the specific circumstances of banking sectors, presenting measures such as bank-specific rates or hybrid systems combining the virtues of formulas.
While the simple concept of providing in good times for lean years is intuitive, it has its operational challenges. When calibrating a DP system great care must be taken to keep countercyclical reserves in line with expected loan losses and so avoid insufficient buffers or excessive coverage. As many of the features and needed restrictions are not easily understood or operationalized, we offer a comprehensive primer for regulators eager to implement one of the variants of DP analyzed in the paper. The discussion of practical challenges also includes thorny issues like compliance with accounting standards. In fact, policymakers have long tended to dismiss DP on grounds that it is not legitimate from an accounting perspective and therefore focused on other tools such as countercyclical capital. To remedy this problem, we propose ways to recalibrate the formula periodically and so keep it in line with expected loan loss. Further, while recognizing that countercyclical capital has its definite place in the macroprudential toolkit, we argue that DP acts as a first line of defense by directly shielding bank profits, thereby lowering the degree to which other countercyclical instruments are needed. However, given DP's limited impact in restraining excessive credit growth, there should be no doubt that complacency in supervision due to DP buffers must be avoided and that DP needs to be accompanied by other macroprudential tools aimed at mitigating particular systemic risks.
Clearly, further research is needed on the interaction between DP and countercyclical capital as well as other macroprudential tools to answer the question of how they can complement one another in providing an integrated countercyclical buffer. As an early example, Saurina (2011) analyzes DP and countercyclical capital side-by-side but not their possible interaction. Another area of needed research is the impact of DP on credit cycles and other macroeconomic aggregates. Newer studies (e.g., Peydró-Alcalde et al., 2011; Chan-Lau, 2012) evaluate the implications of DP for credit availability, yet broader-based results are certainly warranted. The ongoing efforts by a number of countries towards adopting DP systems and other forms of forward-looking provisioning will provide a fertile ground for such future research.
Tuesday, May 1, 2012
Pharma: New Tufts Report Shows Academic-Industry Partnerships Are Mutually Beneficial
http://www.innovation.org/index.cfm/NewsCenter/Newsletters?NID=200
April 30, 2012
According to a new study by the Tufts Center for the Study of Drug Development, collaboration among organizations is becoming increasingly important to advancing basic research and developing new medicines. This study specifically explores the breadth and nature of partnerships between biopharmaceutical companies and academic medical centers (AMCs)[1] which are likely to play an increasingly important role in making progress in treating unmet medical needs.
In the study, researchers examine a subset of public-private partnerships, including more than 3,000 grants to AMCs from approximately 450 biopharmaceutical company sponsors that were provided through 22 medical schools. Findings show that while it is generally accepted that these partnerships have become an increasingly common approach both to promote public health objectives and to produce healthcare innovations, it is anticipated that their nature will continue to evolve over time and their full potential is yet to be realized.
Tufts researchers also found that the nature of these relationships is varied, ever-changing, and expanding. They often involve company and AMC scientists and other researchers working side-by-side on cutting-edge science, applying advanced tools and resources. This type of innovative research has enabled the United States to advance biomedical research in a number of areas, such as the development of personalized medicines and the understanding of rare diseases.
The report outlines the 12 primary models of academic-industry collaborations and highlights other emerging models, which reflect a shift in the nature of academic-industry relationships toward more risk- and resource-sharing partnerships. While unrestricted research support has generally represented the most common form of academic-industry collaboration, Tufts research found that this model is becoming less frequently used. A range of innovative partnership models are emerging, from corporate venture capital funds to pre-competitive research centers to increasingly used academic drug discovery centers.
These collaborations occur across all aspects of drug discovery and the partnerships benefit both industry and academia since they provide the opportunity for the leading biomedical researchers in both sectors to work together to explore new technologies and scientific discoveries. Such innovation in both the science and technology has the potential to treat the most challenging diseases and conditions facing patients today.
According to Tufts, “[t]he industry is funding and working collaboratively with the academic component of the public sector on basic research that contributes broadly across the entire spectrum of biomedical R&D, not just for products in its portfolio.” In conclusion, the report notes that in the face of an increasingly challenging R&D environment and overall global competition, we are likely to witness the continued proliferation of AMC-industry partnerships.
[1] C.P. Milne, et al., “Academic-Industry Partnerships for Biopharmaceutical Research & Development: Advancing Medical Science in the U.S.,” Tufts Center for the Study of Drug Development, April 2012.
Tuesday, April 24, 2012
From Bail-out to Bail-in: Mandatory Debt Restructuring of Systemic Financial Institutions
From Bail-out to Bail-in: Mandatory Debt Restructuring of Systemic Financial Institutions. By Zhou, Jian-Ping; Rutledge, Virginia; Bossu, Wouter; Dobler, Marc; Jassaud, Nadege; Moore, Michael
IMF Staff Discussion Notes No. 12/03
April 24, 2012
ISBN/ISSN: 978-1-61635-392-6 / 2221-030X
Stock No: SDNETEA2012003
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25858.0
Excerpts
Executive Summary
Large-scale government support of the financial institutions deemed too big or too important to fail during the recent crisis has been costly and has potentially increased moral hazard. To protect taxpayers from exposure to bank losses and to reduce the risks posed by too-big-to-fail (TBTF), various reform initiatives have been undertaken at both national and international levels, including expanding resolution powers and tools.
One example is bail-in, which is a statutory power of a resolution authority (as opposed to contractual arrangements, such as contingent capital requirements) to restructure the liabilities of a distressed financial institution by writing down its unsecured debt and/or converting it to equity. The statutory bail-in power is intended to achieve a prompt recapitalization and restructuring of the distressed institution. This paper studies its effectiveness in restoring the viability of distressed institutions, discusses potential risks when a bail-in power is activated, and proposes design features to mitigate these risks. The main conclusions are:
1. As a going-concern form of resolution, bail-in could mitigate the systemic risks associated with disorderly liquidations, reduce deleveraging pressures, and preserve asset values that might otherwise be lost in a liquidation. With a credible threat of stock elimination or dilution by debt conversion and assumption of management by resolution authorities, financial institutions may be incentivized to raise capital or restructure debt voluntarily before the triggering of the bail-in power.
2. However, if the use of a bail-in power is perceived by the market as a sign of the concerned institution’s insolvency, it could trigger a run by short-term creditors and aggravate the institution’s liquidity problem. Ideally, therefore, bail-in should be activated when a capital infusion is expected to restore a distressed financial institution to viability, with official liquidity support as a backstop until the bank is stabilized.
3. Bail-in is not a panacea and should be considered as one element of a comprehensive solution to the TBTF problem. It should supplement, not replace, other resolution tools that would allow for an orderly closure of a failed institution.
4. Most importantly, the bail-in framework needs to be carefully designed to ensure its effective implementation.
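To make the mechanics concrete, here is a stylized arithmetic example of the write-down-and-convert sequence described above. The balance sheet figures and the 8% target capital ratio are hypothetical illustrations, not drawn from the paper:

```python
# Stylized bail-in arithmetic on an invented balance sheet (the figures
# and the 8% target capital ratio are hypothetical, not from the paper).

assets = 100.0
secured_debt = 40.0
unsecured_debt = 55.0
equity = assets - secured_debt - unsecured_debt   # 5.0 of equity pre-crisis

# Downturn: credit losses exceed equity, leaving the bank insolvent.
loss = 8.0
assets -= loss
equity -= loss            # equity is now negative (-3.0)

# Statutory bail-in, step 1: write down unsecured debt to absorb the
# shortfall (secured creditors are untouched).
write_down = -equity
unsecured_debt -= write_down
equity = 0.0

# Step 2: convert a further slice of unsecured debt into new equity until
# the target capital ratio is restored; creditors become shareholders.
target_ratio = 0.08
conversion = target_ratio * assets
unsecured_debt -= conversion
equity += conversion

print(equity / assets)    # recapitalized to the target without public funds
```

The balance sheet stays balanced throughout: the recapitalization is funded entirely by the bank's own unsecured creditors, which is the sense in which bail-in protects taxpayers while keeping the institution a going concern.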
Conclusions
Bail-in power needs to be considered as an additional and complementary tool for the resolution of SIFIs. Bail-in is a statutory power of a resolution authority, as opposed to contractual arrangements, such as contingent capital requirements. It involves recapitalization through relatively straightforward mandatory debt restructuring and could therefore avoid some of the operational and legal complexities that arise when using other tools (such as P&A transactions), which require transferring assets and liabilities between different legal entities and across borders. By restoring the viability of a distressed SIFI, the pressure on the institution to post more collateral, for example against their repo contracts, could be significantly reduced, thereby minimizing liquidity risks and preventing runs by short-term creditors.
The design and implementation of a bail-in power, however, need to take into careful consideration its potential market impact and its implications for financial stability. It is especially important that the triggering of a bail-in power is not perceived by the market as a sign of the concerned institution’s non-viability, a perception that could trigger a run by short-term creditors and aggravate the institution’s liquidity problem. An effective bail-in framework generally includes the following key design elements:
IMF Staff Discussion Notes No. 12/03
April 24, 2012
ISBN/ISSN: 978-1-61635-392-6 / 2221-030X
Stock No: SDNETEA2012003
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25858.0
Excerpts
Executive Summary
Large-scale government support of the financial institutions deemed too big or too important to fail during the recent crisis has been costly and has potentially increased moral hazard. To protect taxpayers from exposure to bank losses and to reduce the risks posed by too-big-to-fail (TBTF) institutions, various reform initiatives have been undertaken at both national and international levels, including expanding resolution powers and tools.
One example is bail-in, which is a statutory power of a resolution authority (as opposed to contractual arrangements, such as contingent capital requirements) to restructure the liabilities of a distressed financial institution by writing down its unsecured debt and/or converting it to equity. The statutory bail-in power is intended to achieve a prompt recapitalization and restructuring of the distressed institution. This paper studies its effectiveness in restoring the viability of distressed institutions, discusses potential risks when a bail-in power is activated, and proposes design features to mitigate these risks. The main conclusions are:
1. As a going-concern form of resolution, bail-in could mitigate the systemic risks associated with disorderly liquidations, reduce deleveraging pressures, and preserve asset values that might otherwise be lost in a liquidation. With a credible threat of stock elimination or dilution by debt conversion and assumption of management by resolution authorities, financial institutions may be incentivized to raise capital or restructure debt voluntarily before the triggering of the bail-in power.
2. However, if the use of a bail-in power is perceived by the market as a sign of the concerned institution’s insolvency, it could trigger a run by short-term creditors and aggravate the institution’s liquidity problem. Ideally, therefore, bail-in should be activated when a capital infusion is expected to restore a distressed financial institution to viability, with official liquidity support as a backstop until the bank is stabilized.
3. Bail-in is not a panacea and should be considered as one element of a comprehensive solution to the TBTF problem. It should supplement, not replace, other resolution tools that would allow for an orderly closure of a failed institution.
4. Most importantly, the bail-in framework needs to be carefully designed to ensure its effective implementation.
- The triggers for bail-in power should be consistent with those used for other resolution tools. They should be set at the point when a firm would have breached the regulatory minima but before it became balance-sheet insolvent. To make bail-in a transparent tool, its scope should be limited to (i) elimination of existing equity shares as a precondition for a bail-in; and (ii) conversion and haircut to subordinated and unsecured senior debt. Debt restructuring under a bail-in should take into account the order of priorities applicable in a liquidation.
- A clear and coherent legal framework for bail-in is essential. The legal framework needs to be designed to establish an appropriate balance between the rights of private stakeholders and the public policy interest in preserving financial stability. Debt restructuring ideally would not be subject to creditor consent, but a “no creditor worse off” test may be introduced to safeguard creditors’ and shareholders’ interests. The framework also needs to provide mechanisms for addressing issues associated with the bail-in of debt issued by an entity of a larger banking group and with the cross-border operations of that entity or banking group.
- The contribution of new capital will come from debt conversion and/or an issuance of new equity, with an elimination or significant dilution of the pre-bail-in shareholders. Bail-in will need to be accompanied by mechanisms to ensure the suitability of new shareholders. Some measures (e.g., a floor price for debt/equity conversion) might be necessary to reduce the risk of a “death spiral” in share prices.
- It may be necessary to impose minimum requirements on banks for issuing unsecured debt or to set limits on the encumbrance of assets (which have been introduced by many advanced countries). This would help reassure the market that a bail-in would be sufficient to recapitalize the distressed institution, thus forestalling potential runs by short-term creditors and averting a downward share-price spiral. The framework should also include measures to mitigate contagion risks to other systemic financial institutions, for example, by limiting their cross-holding of unsecured senior debt.
Conclusions
Bail-in power needs to be considered as an additional and complementary tool for the resolution of SIFIs. Bail-in is a statutory power of a resolution authority, as opposed to contractual arrangements, such as contingent capital requirements. It involves recapitalization through relatively straightforward mandatory debt restructuring and could therefore avoid some of the operational and legal complexities that arise when using other tools (such as P&A transactions), which require transferring assets and liabilities between different legal entities and across borders. By restoring the viability of a distressed SIFI, the pressure on the institution to post more collateral, for example against their repo contracts, could be significantly reduced, thereby minimizing liquidity risks and preventing runs by short-term creditors.
The design and implementation of a bail-in power, however, need to take into careful consideration its potential market impact and its implications for financial stability. It is especially important that the triggering of a bail-in power is not perceived by the market as a sign of the concerned institution’s non-viability, a perception that could trigger a run by short-term creditors and aggravate the institution’s liquidity problem. An effective bail-in framework generally includes the following key design elements:
- The scope of the statutory power should be limited to (i) eliminating or diluting existing shareholders; and (ii) writing down or converting, in the following order, any contractual contingent capital instruments, subordinated debt, and unsecured senior debt, accompanied by the power of the resolution authority to change bank management.
- The triggers for bail-in power should be consistent with those used for other resolution tools and set at the point when an institution would have breached the regulatory minima but before it became balance-sheet insolvent, to allow for a prompt response to a SIFI’s financial distress. The intervention criteria (a combination of quantitative and qualitative assessments) need to be as transparent and predictable as possible to avoid market uncertainty.
- It may be necessary to require banks or bank holding companies to maintain a minimum amount of unsecured liabilities (as a percentage of total liabilities) beforehand, which could be subject to bail-in afterwards. This would help reassure the market that bail-in is sufficient to recapitalize the distressed institution and restore its viability, thus reducing the risk of runs by short-term creditors.
- To fund potential liquidity outflows, and given the probable temporary loss of market access, bail-in may need to be coupled with adequate official liquidity assistance.
- Bail-in needs to be considered as one element of a comprehensive framework that includes effective supervision to reduce the likelihood of bank failures and an effective overall resolution framework that allows for an orderly resolution of a failed SIFI, facilitated by up-to-date recovery and resolution plans. In general, statutory bail-in should be used in instances where a capital infusion is likely to restore a distressed financial institution to viability because, apart from its lack of capital, the institution has a sound business model and good risk-management systems. Otherwise, bail-in capital could simply delay the inevitable failure.
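The write-down and conversion sequence described in these excerpts amounts to a simple waterfall over the liquidation order of priorities. The sketch below is a minimal illustration, not part of the paper: the liability classes, the amounts outstanding, the shortfall figure, and the function name are all hypothetical, and the elimination of existing shareholders is assumed to have already occurred, as the framework requires.

```python
# Hypothetical liability classes and amounts, ordered so that the most
# junior claims absorb losses / convert to equity first, mirroring the
# order of priorities applicable in a liquidation.
LIQUIDATION_ORDER = [
    ("contingent_capital", 5.0),
    ("subordinated_debt", 10.0),
    ("senior_unsecured_debt", 30.0),
]

def bail_in(capital_shortfall):
    """Convert debt to equity, class by class, until the shortfall is met.

    Existing equity is assumed already eliminated as a precondition.
    Returns the amount converted per class and any residual shortfall.
    """
    converted = {}
    remaining = capital_shortfall
    for name, outstanding in LIQUIDATION_ORDER:
        amount = min(outstanding, remaining)
        if amount > 0:
            converted[name] = amount
            remaining -= amount
    return converted, remaining

converted, shortfall = bail_in(capital_shortfall=12.0)
print(converted)   # {'contingent_capital': 5.0, 'subordinated_debt': 7.0}
print(shortfall)   # 0.0
```

Varying `capital_shortfall` shows the progression the excerpts describe: small shortfalls are absorbed entirely by the junior classes, and senior unsecured debt is converted only once contingent capital and subordinated debt are exhausted. A nonzero residual shortfall corresponds to the case the paper warns about, where the stock of bail-in-able liabilities is too small to restore viability.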
Central Bank Independence and Macro-prudential Regulation. By Kenichi Ueda & Fabian Valencia
IMF Working Paper No. 12/101
Apr 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25872.0
Summary: We consider the optimality of various institutional arrangements for agencies that conduct macro-prudential regulation and monetary policy. When a central bank is in charge of price and financial stability, a new time inconsistency problem may arise. Ex-ante, the central bank chooses the socially optimal level of inflation. Ex-post, however, the central bank chooses inflation above the social optimum to reduce the real value of private debt. This inefficient outcome arises when macro-prudential policies cannot be adjusted as frequently as monetary policy. Importantly, this result arises even when the central bank is politically independent. We then consider the role of political pressures in the spirit of Barro and Gordon (1983). We show that if either the macro-prudential regulator or the central bank (or both) are not politically independent, separation of price and financial stability objectives does not deliver the social optimum.
Excerpts
Introduction
A growing literature based on models where pecuniary externalities reinforce shocks in the aggregate advocates the use of macro-prudential regulation (e.g. Bianchi (2010), Bianchi and Mendoza (2010), Jeanne and Korinek (2010), and Jeanne and Korinek (2011)). Most research in this area has focused on understanding the distortions that lead to financial amplification and to assess their quantitative importance. The natural next question is how to implement macro-prudential regulation.
Implementing macro-prudential policy requires, among other things, figuring out the optimal institutional design. In this context, there is an intense policy debate about the desirability of formally assigning the central bank responsibility for financial stability. This debate has spurred interest in studying the interactions between monetary and macro-prudential policies with the objective of understanding the conflicts and synergies that may arise from different institutional arrangements.
This paper contributes to this debate by exploring the circumstances under which it may be suboptimal to have the central bank in charge of macro-prudential regulation. We differ from a rapidly expanding literature on macro-prudential and monetary interactions, including De Paoli and Paustian (2011) and Quint and Rabanal (2011), mainly in that our focus is on the potential time-inconsistency problems that can arise, which are not addressed in existing work. Our departure point is the work pioneered by Kydland and Prescott (1977) and Barro and Gordon (1983), who studied how time-inconsistency problems and political pressures distort the monetary authority’s incentives under various institutional arrangements. In our model, there are two stages: in the first stage, the policymaker (possibly a single institution or several institutions) makes simultaneous monetary policy and macro-prudential regulation decisions. In the second stage, monetary policy decisions can be revised or “fine-tuned” after the realization of a credit shock. This setup captures the fact that macro-prudential regulation is intended to be used preemptively; once a credit shock (boom or bust) has taken place, it can do little to change the stock of debt. Monetary policy, on the other hand, can be used ex-ante and ex-post.
The key finding of the paper is that a dual-mandate central bank is not socially optimal. In this setting, a time inconsistency problem arises. While it is ex-ante optimal for the dual-mandate central bank to deliver the socially optimal level of inflation, it is not so ex-post. This central bank has the ex-post incentive to reduce the real burden of private debt through inflation, similar to the incentives to monetize public sector debt studied in Calvo (1978) and Lucas and Stokey (1983). This outcome arises because ex-post the dual-mandate central bank has only one tool, monetary policy, to achieve financial and price stability.
We then examine the role of political factors with a simple variation of our model in the spirit of Barro and Gordon (1983). We find that the above result prevails if policy is conducted by politically independent institutions. However, when institutions are not politically independent (the central bank, the macro-prudential regulator, or both), neither separate institutions nor a combination of objectives in a single institution delivers the social optimum. As in Barro and Gordon (1983), the non-independent institution will use the policy tool at hand to try to generate economic expansions. The non-independent central bank will use monetary policy for this purpose and the non-independent macro-prudential regulator will use regulation. Which arrangement generates lower welfare losses in the case of non-independence depends on parameter values. A calibration of the model using parameter values from the literature suggests, however, that a regime with a non-independent dual-mandate central bank almost always delivers a worse outcome than a regime with a non-independent but separate macro-prudential regulator.
Finally, if the only distortion of concern is political interference (i.e., ignoring the time-inconsistency problem highlighted earlier), all that is needed to achieve the social optimum is political independence, with separation or combination of objectives yielding the same outcome. From a policy perspective, our analysis suggests that a conflict between price and financial stability objectives may arise if pursued by a single institution. Our results also extend the earlier findings by Barro and Gordon (1983) and many others on political independence of the central bank to show that these results are also applicable to a macro-prudential regulator. We should note that we have abstracted from considering the potential synergies that may arise in having dual-mandate institutions. For instance, benefits from information sharing and use of central bank expertise may mitigate the welfare losses we have shown may arise (see Nier, Osinski, Jácome and Madrid (2011)), although information sharing would also benefit fiscal and monetary interactions. However, we have also abstracted from other aspects that could exacerbate the welfare loss, such as loss of reputation.
Conclusions
We consider macro-prudential regulation and monetary policy interactions to investigate the welfare implications of different institutional arrangements. In our framework, monetary policy can re-optimize following a realization of credit shocks, but macro-prudential regulation cannot be adjusted immediately after the credit shock. This feature of the model captures the fact that monetary policy can be adjusted more frequently than macro-prudential regulation, because macro-prudential regulation is an ex-ante tool, whereas monetary policy can be used ex-ante and ex-post. In this setting, a central bank with a price and financial stability mandate does not deliver the social optimum because of a time-inconsistency problem. This central bank finds it optimal ex-ante to deliver the socially optimal level of inflation, but it does not do so ex-post. This is because the central bank finds it optimal ex-post to let inflation rise to repair private balance sheets, since ex-post it has only monetary policy to do so. Achieving the social optimum in this case requires separating the price and financial stability objectives.
We also consider the role of political independence of institutions, as in Barro and Gordon (1983). Under this extension, separation of price and financial stability objectives delivers the social optimum only if both institutions are politically independent. If the central bank or the macro-prudential regulator (or both) are not politically independent, they would not achieve the social optimum. Numerical analysis of our model suggests, however, that in most cases a non-independent macro-prudential regulator (with an independent monetary authority) delivers a better outcome than a non-independent central bank in charge of both price and financial stability.
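The ex-ante/ex-post wedge at the heart of the paper can be illustrated with a toy calculation. Everything below is an assumption for illustration, not the paper's actual model: the quadratic price-stability loss, the debt-erosion term, and all parameter values are made up. The point is only the mechanism the excerpts describe: once a real-debt term enters the ex-post objective, the inflation rate the central bank chooses exceeds the social optimum.

```python
import numpy as np

# Stylized ex-post loss for a dual-mandate central bank (illustrative):
# a price-stability term plus a financial-stability term in which
# inflation erodes the real burden of private debt.
PI_STAR = 0.02   # socially optimal inflation rate (assumed)
DEBT = 1.0       # real private debt outstanding (assumed)
WEIGHT = 0.05    # weight on the debt-burden term (assumed)

def ex_post_loss(pi):
    price_term = (pi - PI_STAR) ** 2
    debt_term = WEIGHT * DEBT / (1.0 + pi)  # inflation shrinks real debt
    return price_term + debt_term

# Ex-ante (committed) choice minimizes the price term alone: pi = PI_STAR.
# Ex-post, the debt term tilts the minimizer above PI_STAR.
grid = np.linspace(0.0, 0.20, 20001)
pi_ex_post = grid[np.argmin(ex_post_loss(grid))]

print(f"social optimum: {PI_STAR:.3f}")
print(f"ex-post choice: {pi_ex_post:.3f}")  # strictly above the optimum
```

Setting `WEIGHT` to zero removes the financial-stability motive and the ex-post choice collapses back to the social optimum, which is the sense in which separating the two objectives restores the committed outcome in this toy setup.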
Wednesday, April 18, 2012
Principles for financial market infrastructures, assessment methodology and disclosure framework
CPSS Publications No 101
April 2012
Final version of the Principles for financial market infrastructures
The report Principles for financial market infrastructures contains new and more demanding international standards for payment, clearing and settlement systems, including central counterparties. Issued by the CPSS and the International Organization of Securities Commissions (IOSCO), the new standards (called "principles") are designed to ensure that the infrastructure supporting global financial markets is more robust and thus well placed to withstand financial shocks.
The principles apply to all systemically important payment systems, central securities depositories, securities settlement systems, central counterparties and trade repositories (collectively "financial market infrastructures"). They replace the three existing sets of international standards set out in the Core principles for systemically important payment systems (CPSS, 2001); the Recommendations for securities settlement systems (CPSS-IOSCO, 2001); and the Recommendations for central counterparties (CPSS-IOSCO, 2004). CPSS and IOSCO have strengthened and harmonised these three sets of standards by raising minimum requirements, providing more detailed guidance and broadening the scope of the standards to cover new risk-management areas and new types of FMIs.
The principles were issued for public consultation in March 2011. The finalised principles being issued now have been revised in light of the comments received during that consultation.
CPSS and IOSCO members will strive to adopt the new standards by the end of 2012. Financial market infrastructures (FMIs) are expected to observe the standards as soon as possible.
Consultation versions of an assessment methodology and disclosure framework
At the same time as publishing the final version of the principles, CPSS and IOSCO have issued two related documents for public consultation, namely an assessment methodology and a disclosure framework for these new principles.
Comments on these two documents are invited from all interested parties and should be sent by 15 June 2012 to both the CPSS secretariat (cpss@bis.org) and the IOSCO secretariat (fmi@iosco.org).
The comments will be published on the websites of the Bank for International Settlements (BIS) and IOSCO unless commentators request otherwise. After the consultation period, the CPSS and IOSCO will review the comments received and publish final versions of the two documents later in 2012.
Other documents
A cover note that explains the background to the three documents above and sets out some specific points on the two consultation documents on which the committees are seeking comments during the public consultation period is also available.
A summary note that provides background on the report and an overview of its contents is also available.
Saturday, April 14, 2012
America's Voluntary Standards System--A "Best Practice" Model for Innovation Policy?
America's Voluntary Standards System--A "Best Practice" Model for Innovation Policy? By Dieter Ernst
East-West Center, Apr 2012
http://www.eastwestcenter.org/publications/americas-voluntary-standards-system-best-practice-model-innovation-policy
For its proponents, America's voluntary standards system is a "best practice" model for innovation policy. Foreign observers, however, are concerned about possible drawbacks of a standards system that is largely driven by the private sector. There are doubts, especially in Europe and China, whether the American system can balance public and private interests in times of extraordinary national and global challenges to innovation. To assess the merits of these conflicting perceptions, the paper reviews the historical roots of the American voluntary standards system, examines its current defining characteristics, and highlights its strengths and weaknesses. On the positive side, a tradition of decentralized local self-government has given voice to diverse stakeholders in innovation, avoiding the pitfalls of top-down government-centered standards systems. However, a lack of effective coordination of multiple stakeholder strategies tends to constrain effective and open standardization processes, especially in the management of essential patents and in the timely provision of interoperability standards. To correct these drawbacks of the American standards system, the government has an important role to play as an enabler, coordinator, and, if necessary, an enforcer of the rules of the game in order to prevent abuse of market power by companies with large accumulated patent portfolios. The paper documents the ups and downs of the Federal Government's role in standardization, and examines current efforts to establish robust public-private standards development partnerships, focusing on the Smart Grid Interoperability project coordinated by the National Institute of Standards and Technology (NIST). In short, countries that seek to improve their standards systems should study the strengths and weaknesses of the American system. However, persistent differences in economic institutions, levels of development and growth models are bound to limit convergence to a US-style market-led voluntary standards system.
East-West Center, Apr 2012
http://www.eastwestcenter.org/publications/americas-voluntary-standards-system-best-practice-model-innovation-policy
For its proponents, America's voluntary standards system is a "best practice" model for innovation policy. Foreign observers however are concerned about possible drawbacks of a standards system that is largely driven by the private sector. There are doubts, especially in Europe and China, whether the American system can balance public and private interests in times of extraordinary national and global challenges to innovation. To assess the merits of these conflicting perceptions, the paper reviews the historical roots of the American voluntary standards system, examines its current defining characteristics, and highlights its strengths and weaknesses. On the positive side, a tradition of decentralized local self-government, has given voice to diverse stakeholders in innovation, avoiding the pitfalls of top-down government-centered standards systems. However, a lack of effective coordination of multiple stakeholder strategies tends to constrain effective and open standardization processes, especially in the management of essential patents and in the timely provision of interoperability standards. To correct these drawbacks of the American standards system, the government has an important role to play as an enabler, coordinator, and, if necessary, an enforcer of the rules of the game in order to prevent abuse of market power by companies with large accumulated patent portfolios. The paper documents the ups and downs of the Federal Government’s role in standardization, and examines current efforts to establish robust public-private standards development partnerships, focusing on the Smart Grid Interoperability project coordinated by the National Institute of Standards and Technology (NIST). In short, countries that seek to improve their standards systems should study the strengths and weaknesses of the American system. 
However, persistent differences in economic institutions, levels of development, and growth models are bound to limit convergence toward a US-style, market-led voluntary standards system.
BCBS: Implementation of stress testing practices by supervisors
Implementation of stress testing practices by supervisors: Basel Committee publishes peer review
http://www.bis.org/press/p120413.htm
The Basel Committee on Banking Supervision has today published a peer review of the implementation by national supervisory authorities of the Basel Committee's principles for sound stress testing practices and supervision.
Stress testing is an important tool used by banks to identify the potential for unexpected adverse outcomes across a range of risks and scenarios. In 2009, the Committee reviewed the performance of stress testing practices during the financial crisis and published recommendations for banks and supervisors entitled Principles for sound stress testing practices and supervision. The guidance set out a comprehensive set of principles for the sound governance, design and implementation of stress testing programmes at banks, as well as high-level expectations for the role and responsibilities of supervisors.
As part of its mandate to assess the implementation of standards across countries and to foster the promotion of good supervisory practice, the Committee's Standards Implementation Group (SIG) conducted a peer review during 2011 of supervisory authorities' implementation of the principles. The review found that stress testing has become a key component of the supervisory assessment process as well as a tool for contingency planning and communication. Countries are, however, at varying stages of maturity in the implementation of the principles; as a result, more work remains to be done to fully implement the principles in many countries.
Overall, the review found the 2009 stress testing principles to be generally effective. The Committee, however, will continue to monitor implementation of the principles and determine whether, in the future, additional guidance might be necessary.
BCBS
April 13, 2012