Quantifying Structural Subsidy Values for Systemically Important Financial Institutions. By Kenichi Ueda and Beatrice Weder
IMF Working Paper No. 12/128
Summary: Claimants to SIFIs receive transfers when governments are forced into bailouts. Ex ante, the bailout expectation lowers daily funding costs. This funding cost differential reflects both the structural level of government support and the time-varying market valuation of such support. Using a large worldwide sample of banks, we estimate structural subsidy values by exploiting expectations of state support embedded in credit ratings and by using the long-run average value of the rating bonus. The subsidy was already sizable, about 60 basis points, as of end-2007, before the crisis, and it increased to about 80 basis points by end-2009.
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25928.0
Excerpts
Introduction
One of the most troubling legacies of the financial crisis is the problem of “too-systemically-important-to-fail” financial institutions. Public policy had long recognized the dangers that systemically relevant institutions pose for the financial system and for public sector balance sheets, but in practice the problem was not deemed especially pressing. It was mainly dealt with by creating some uncertainty (constructive ambiguity) about governments' willingness to intervene in a crisis.
The crisis that began in 2008 provided a real-life test of that willingness. After governments proved willing to extend large-scale support, constructive ambiguity gave way to near certainty that sufficiently large or complex institutions will not be allowed to fail. Thus, countries have emerged from the financial crisis with an even larger problem: many banks are larger than before, and so are the implicit government guarantees. It also became clear that these guarantees are not limited to large institutions. In Europe, smaller institutions with a high degree of interconnectedness, complexity, or political importance were also considered too important to fail.
The international community is addressing the problem of SIFIs with a two-pronged approach. On the one hand, the probability of SIFI failure is to be reduced through higher capital buffers and tighter supervision. On the other hand, SIFIs are to be made more “resolvable” by subjecting them to special resolution regimes (e.g., living wills and CoCos). A number of countries have already adopted special regimes at the national level or are in the process of doing so. However, it remains highly doubtful whether these regimes would be operable across borders. This regulatory coordination failure implies that creditors of SIFIs continue to enjoy implicit guarantees.
Subsidies arising from size and complexity create incentives for banks to become even larger and more complex. Hence, eliminating the value of the implicit structural subsidy to SIFIs should contribute to reducing both the probability and the magnitude of (future) financial crises. Market participants tend to dismiss these concerns, arguing that such effects may exist in theory but are very small in practice. Quantifying the value of state subsidies to SIFIs therefore requires an empirical study, which is the aim of this paper.
How can we estimate the value of structural state guarantees? Because institutions with state backing are safer, investors demand a lower risk premium, taking into account the expected future transfers from the government. Therefore, before a crisis, the expected value of state guarantees is the difference in funding costs between a privileged bank and a non-privileged bank. A caveat to this reasoning is that the distortion might affect the competitive behavior and market shares of both subsidized and non-subsidized financial institutions. The difference in observed funding costs may therefore include indirect effects in addition to the direct subsidy to SIFIs.
We estimate the value of the structural subsidy using expectations of government support embedded in credit ratings. Overall ratings (and funding costs) of financial institutions have two constituent parts: their own financial strength and the expected amount of external support. External support can be provided by a parent company or by the government. Some rating agencies (e.g., Fitch) provide regular quantitative estimates of the probability that a particular financial institution would receive external support in case of crisis. We isolate the government support component and provide estimates of the value of this subsidy as of end-2007 and end-2009.
We find that the structural subsidy value was already sizable as of end-2007 and increased substantially by end-2009, after key governments confirmed bailout expectations. On average, banks in major countries enjoyed credit rating bonuses of 1.8-3.4 notches at end-2007 and 2.5-4.2 notches at end-2009. This translates into a funding cost advantage of roughly 60 bp and 80 bp, respectively.
The use of ratings might be considered problematic because rating agencies have been known to make mistakes in their judgments. For instance, they came under heavy criticism in the wake of the financial crisis for overrating structured products. However, whether rating agencies assess default risks correctly is not important for the question at hand. All that matters is that markets use ratings in pricing debt instruments and that those ratings influence funding costs, which has been the case. Therefore, we can use the difference in overall credit ratings of banks as a proxy for the difference in their structural funding costs. Our empirical approach is to extract the value of the structural subsidy from support ratings, while taking into account bank-specific factors that determine banks' own financial strength as well as country-specific factors that determine governments' fiscal ability to offer support.
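To make the approach concrete, here is a minimal sketch of this kind of rating regression. It is only an illustration of the method described above, not the authors' actual specification: the file name, column names, and numeric rating scales are hypothetical assumptions.

```python
# Illustrative sketch of the rating-regression approach (NOT the authors'
# actual code). File name, column names, and rating scales are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cross-section: one row per bank, with Fitch-style ratings
# mapped to numeric scales (overall long-term rating in notches; support
# rating on its 1-5 scale; a stand-alone "individual" strength rating).
banks = pd.read_csv("bank_ratings_2007.csv")

# Overall rating regressed on the support rating, controlling for the bank's
# own financial strength and for the sovereign's fiscal capacity to support.
model = smf.ols(
    "overall_rating ~ support_rating + individual_rating"
    " + sovereign_rating + log_assets",
    data=banks,
).fit()

# The coefficient on support_rating is the estimated rating bonus per unit
# of expected government support (0.55-0.9 notches in 2007, per the paper).
print(model.params["support_rating"])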
A related study by Baker and McArthur (2009) obtains a somewhat lower value of the subsidy, ranging from 9 bp to 49 bp. The difference in results can be explained by different empirical strategies: Baker and McArthur use the change in the difference in funding costs between small and large U.S. banks before and after TARP. With this technique, they identify the change in the value of the SIFI subsidy, which is assumed to be created by the government bailout intervention. However, they cannot account for a level of bailout expectations that may have been embedded in prices long before the financial crisis. This is a drawback of all studies that use bailout events to quantify the value of the subsidy: they can be quite precise in estimating the change in the subsidy due to a particular intervention, but they will underestimate its total level if the support is positive even in tranquil times. In other words, they cannot establish the value of funding cost advantages accruing from expected state support before the crisis.
This is the distinct advantage of the rating approach: it allows us to estimate not only the change in the subsidy during the crisis but also its total value before the crisis. As far as we are aware, only a few previous papers use ratings. Soussa (2000), Rime (2005), and Morgan and Stiroh (2005) used similar approaches to back out the value of the subsidy. However, our study is more comprehensive, covering a larger set of banks and countries as well as the 2008 financial crisis.
Assuming that equity values are not much affected by bailouts but debt values are, time-varying estimates of the value of government guarantees can be calculated using standard option pricing theory. However, the funding cost advantage in a crisis reflects two components: first, the structural government support and, second, a larger risk premium due to market turmoil. If we calculated the value of one rating bonus only in crisis times, that value would be larger because of the latter effect. When designing a corrective levy, however, the value of the government support should not be affected by such short-run market movements. For this reason, the long-run average value of one rating bonus, used here to calculate the total value of the structural government support, is more suitable as a basis for a corrective levy than real-time estimates of the market value of the government guarantees.
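For intuition on the option-pricing remark: under these assumptions the guarantee can be viewed as a put option on the bank's assets with a strike at the face value of debt (a Merton-style view). A stylized sketch, with all inputs hypothetical; this is not the paper's calculation.

```python
# Stylized illustration: value a government guarantee as a European put on
# bank assets with strike equal to the face value of debt (Merton-style).
# All inputs are hypothetical.
from math import exp, log, sqrt
from scipy.stats import norm

def guarantee_value(assets, debt, sigma, r, t):
    """Black-Scholes put on assets, strike = debt due at horizon t.
    The guarantor makes creditors whole when assets fall short of debt,
    so the guarantee is worth exactly this put."""
    d1 = (log(assets / debt) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return debt * exp(-r * t) * norm.cdf(-d2) - assets * norm.cdf(-d1)

# Example: assets 100, debt 90, 5% asset volatility, one-year horizon.
print(guarantee_value(assets=100.0, debt=90.0, sigma=0.05, r=0.02, t=1.0))
```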
Interpretation and conclusion
Section III provided estimates of the value of the subsidy to SIFIs in terms of overall ratings. Using the range of our estimates, we can summarize that a one-unit increase in government support for banks in advanced economies had an impact equivalent to 0.55 to 0.9 notches on the overall long-term credit rating at end-2007. This effect increased to 0.8 to 1.23 notches by end-2009 (Summary Table 8). At end-2009, the effect of government support was almost identical between advanced and developing countries; before the crisis, governments in advanced economies played a smaller role in boosting banks' long-term ratings. These results are robust to a number of sample selection tests, such as testing for differential effects across developing and advanced countries and across listed and non-listed banks, and to correcting for bank parental support and alternative estimates of an individual bank's strength.
In interpreting these results, it is important to check whether the averages mask large differences across countries. In fact, the overall rating bonuses in a selection of large countries are remarkably similar (Summary Table 9). For instance, the mean support rating of Japanese banks was unchanged at 3.9 in 2007 and 2009. Based on regressions that do not distinguish between advanced and developing countries, this implies that overall ratings of systemically relevant banks profited by 2.9-3.5 notches from expected government support in 2007, with the value of this support increasing to 3.4-4.2 notches in 2009. For the top 45 U.S. banks, the mean support rating increased from 3.2 in 2007 to 4.1 in 2009. This translates into a 2.4-2.9 notch overall rating bonus for supported banks in 2007 and a much higher 3.6-4.5 notch impact in 2009. In Germany, the mean support rating started high at 4.4 in 2007 and increased slightly to 4.6 in 2009, suggesting a 3.3-4.0 notch overall rating advantage for supported banks in 2007 and a 4.1-5.1 notch rating bonus in 2009.
For selected countries that have large banking centers and/or were affected by the financial crisis, government support ratings average about 3.6 in 2007 and 3.8 in 2009 (see Table 2; the U.S. figures are based on the top 45 banks). The overall rating bonuses for supported banks in this sample of countries are thus 2.7-3.2 notches in 2007 and 3.4-4.2 notches in 2009.
Our three-notch impact, on average, for advanced countries in 2007 is comparable to the results of Soussa (2000) and Rime (2005), although their studies are less rigorous and based on smaller samples. In addition, Soussa (2000) reports structural annualized interest rate differentials across credit ratings based on average cumulative default rates for 1920-1999, calculated by Moody's. According to his conversion table, when issuing a five-year bond, a three-notch rating increase translates into a funding advantage of 5 bp to 128 bp, depending on the riskiness of the institution. At the mid-point, this is 66.5 bp for a three-notch improvement, or about 22 bp per notch. Using this and the overall rating bonuses described in the previous paragraph, we can evaluate the overall funding cost advantage of SIFIs at around 60 bp in 2007 and 80 bp in 2009.
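A back-of-the-envelope check of this conversion, using only the figures quoted above:

```python
# Notch-to-basis-point conversion, using the figures quoted in the text.
bp_per_notch = 66.5 / 3          # Soussa mid-point: 66.5 bp per three notches
bonus_2007 = (2.7 + 3.2) / 2     # average overall rating bonus, 2007
bonus_2009 = (3.4 + 4.2) / 2     # average overall rating bonus, 2009
print(round(bonus_2007 * bp_per_notch))  # ~65 bp, i.e. "around 60 bp" in 2007
print(round(bonus_2009 * bp_per_notch))  # ~84 bp, i.e. "around 80 bp" in 2009
```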
This information is helpful, for example, for designing a corrective levy on banks that extracts the value of the subsidy. The funding cost advantage can be decomposed into the level of government support and a time-varying risk premium. A corrective levy should not be affected by short-run market movements but should reflect only the long-run average value of rating bonuses, used here to calculate the total value of the structural government support. As discussed above, we find that the level of structural government support increased in most countries between 2007 and 2009. Still, we note that our estimate of the value of government support is lower than its real-time market value during the crisis.
Our estimate may also overstate the tax rate required to neutralize the (implicit) SIFI subsidy, since the competitive advantage of a guaranteed firm over a non-guaranteed firm can be magnified (the former gains market share and the latter loses it). Suppose, for instance, that the advantages and disadvantages are equally distributed between the two firms. Then the levy rate that would eliminate the competitive distortion is smaller than the estimated difference in funding costs; in this simple example, it would be half the values given above. Nevertheless, the corrective tax required to offset the distortion of government support would remain sizable.
Wednesday, May 16, 2012
BCBS: Models and tools for macroprudential analysis
BCBS Working Papers No 21
May 2012
The Basel Committee's Research Task Force Transmission Channel project aimed at generating new research on various aspects of the credit channel linkages in the monetary transmission mechanism. Under the credit channel view, financial intermediaries play a critical role in the allocation of credit in the economy. They are the primary source of credit for consumers and businesses that do not have direct access to capital markets. Among more traditional macroeconomic modelling approaches, the credit view is unique in its emphasis on the health of the financial sector as a critically important determinant of the efficacy of monetary policy.
The final products of the project are two working papers that summarise the findings of the many individual research projects that were undertaken and discussed in the course of the project. The first working paper, Basel Committee Working Paper No 20, "The policy implications of transmission channels between the financial system and the real economy", analyses the link between the real economy and the financial sector, and channels through which the financial system may transmit instability to the real economy. The second working paper, Basel Committee Working Paper No 21, "Models and tools for macroprudential analysis", focuses on the methodological progress and modelling advancements aimed at improving financial stability monitoring and the identification of systemic risk potential. Because both working papers are summaries, they touch only briefly on the results and methods of the individual research papers that were developed during the course of the project. Each working paper includes comprehensive references with information that will allow the interested reader to contact any of the individual authors and acquire the most up-to-date version of the research that was summarised in each of these working papers.
http://www.bis.org/publ/bcbs_wp21.htm
Tuesday, May 15, 2012
Changes in U.S. water use and implications for the future
It is interesting to see some data in Water Reuse: Expanding the Nation's Water Supply Through Reuse of Municipal Wastewater (http://www.nap.edu/catalog.php?record_id=13303), a National Research Council publication.
See, for example, Figure 1-6, p. 17, on changes in U.S. water use and implications for the future.
Monday, May 14, 2012
Do Dynamic Provisions Enhance Bank Solvency and Reduce Credit Procyclicality? A Study of the Chilean Banking System
Do Dynamic Provisions Enhance Bank Solvency and Reduce Credit Procyclicality? A Study of the Chilean Banking System. By Jorge A. Chan-Lau
IMF Working Paper No. 12/124
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25912.0
Summary: Dynamic provisions could help to enhance the solvency of individual banks and reduce procyclicality. Accomplishing these objectives depends on country-specific features of the banking system, business practices, and the calibration of the dynamic provisions scheme. In the case of Chile, a simulation analysis suggests Spanish dynamic provisions would improve banks' resilience to adverse shocks but would not reduce procyclicality. To address the latter, other countercyclical measures should be considered.
Excerpts
Introduction
It has long been acknowledged that procyclicality could pose risks to financial stability, as noted in the academic and policy discussion centered on Basel II, accounting practices, and financial globalization. Recently, much attention has focused on regulatory dynamic provisions (or statistical provisions). Under dynamic provisioning, banks set aside provisions against future losses as they build up their loan portfolios during an economic expansion.
The use of dynamic provisions raises two questions bearing on financial stability. First, do dynamic provisions reduce insolvency risk? Second, do they reduce procyclicality? In theory the answer to both questions is yes. Provided loss estimates are roughly accurate, bank solvency is enhanced, since buffers are built in advance of the realization of large losses. Regulatory dynamic provisions could also discourage overly rapid credit growth during the expansionary phase of the cycle, as they help prevent a relaxation of provisioning practices.
However, when real data are brought to bear on these questions, the answers can diverge from what theory implies. This paper attempts to answer them for the specific case of Chile. It finds that the adoption of dynamic provisions could help enhance bank solvency but would not help reduce procyclicality. Successful implementation of dynamic provisions requires careful calibration to match or exceed current provisioning practices, and reliance on past data could lead to a false sense of security, as loan losses are fat-tailed events. Finally, since dynamic provisions may not be sufficient to counter procyclicality, alternative measures should be considered, such as the countercyclical capital buffers proposed in Basel III and the countercyclical provision rule Peru implemented in 2008.
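For reference, a stylized sketch of a Spanish-style dynamic provisioning rule of the kind examined in the paper. The parameter values are illustrative assumptions, not the paper's calibration.

```python
# Stylized Spanish-style dynamic provisioning rule (illustrative parameters,
# not the paper's calibration): alpha covers latent losses in new lending;
# beta proxies the average specific-provision rate over a full credit cycle.
def dynamic_provision_flow(credit, credit_prev, specific_flow,
                           alpha=0.01, beta=0.004):
    """General (dynamic) provision flow for the period: positive during
    expansions, so the buffer builds; negative in downturns, when high
    specific provisions draw the buffer down."""
    delta_credit = credit - credit_prev
    return alpha * delta_credit + (beta * credit - specific_flow)

# Expansion: fast credit growth and low specific provisions build the buffer.
print(dynamic_provision_flow(credit=110.0, credit_prev=100.0,
                             specific_flow=0.2))   # > 0
# Downturn: flat credit and high specific provisions release the buffer.
print(dynamic_provision_flow(credit=100.0, credit_prev=100.0,
                             specific_flow=1.5))   # < 0
```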
Conclusions
At the policy level, the case for regulatory dynamic provisions has been advanced on the grounds that they help reduce the risk of bank insolvency and dampen credit procyclicality. In the case of Chile, the data appear to partly validate these claims.
A simulation analysis suggests that, under the Spanish dynamic provisions rule, provision buffers against losses would be higher than those accumulated under current practices. The analysis also suggests that calibration based on historical data may not be adequate to deal with the presence of fat tails in realized loan losses. Implementing dynamic provisions therefore requires careful calibration of the regulatory model and stress testing of internal loan-loss models.
Dynamic provision rules appear not to dampen procyclicality in Chile. Results from a VECM analysis indicate that the credit cycle does not respond to the level of, or changes in, aggregate provisions. In light of this result, it may be worth exploring other measures to address procyclicality, such as countercyclical capital requirements, as proposed by the Basel Committee on Banking Supervision (2010a and b), or the countercyclical provision rule introduced in Peru in 2008. Under the Basel countercyclical capital requirements, the build-up and release of additional capital buffers are conditioned on deviations of the credit-to-GDP ratio from its long-run trend. The Peruvian rule, in contrast to standard dynamic provision rules, requires banks to accumulate countercyclical provisions when GDP growth exceeds potential. Both measures, by tying capital or provision accumulation to cyclical indicators, could be more effective in reducing procyclicality.
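As an illustration of the Basel buffer guide mentioned above, the following sketch computes a one-sided credit-to-GDP gap and maps it to a buffer add-on. The data series and expanding-window start are hypothetical; the 2pp/10pp thresholds and 2.5% cap follow the BCBS (2010) guidance.

```python
# Sketch of the Basel countercyclical buffer guide: map the gap between the
# credit-to-GDP ratio and its one-sided HP trend to a buffer add-on.
# Data are hypothetical; thresholds follow BCBS (2010) guidance.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

def credit_gap(credit_to_gdp, lamb=400_000):
    """One-sided gap: the HP trend is re-estimated on an expanding window,
    so only information available at each date is used."""
    gaps = []
    for t in range(12, len(credit_to_gdp)):
        cycle, _trend = hpfilter(credit_to_gdp[: t + 1], lamb=lamb)
        gaps.append(cycle[-1])
    return np.array(gaps)

def buffer_add_on(gap, low=2.0, high=10.0, cap=2.5):
    """0% below a 2pp gap, 2.5% above a 10pp gap, linear in between."""
    return float(np.clip((gap - low) / (high - low), 0.0, 1.0) * cap)

credit_to_gdp = np.linspace(80.0, 120.0, 60)  # hypothetical quarterly series
print(buffer_add_on(credit_gap(credit_to_gdp)[-1]))
```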
Sunday, May 13, 2012
What supporters of Tokyo's Governor think
A Japanese correspondent wrote about what supporters of Tokyo's Governor think (edited):
I'll reply to one of your questions about Tokyo's governor. He is a famous writer in Japan. It used to be said in Japan that "money can move politics." Tokyo's governor, once a lawmaker, was a leading politician who provided private funds to friends in politics.
He earned his funds through his writing, has made radical statements since he was young, and has always been clearly different from other influential politicians. He was independent. He has always striven to influence politicians with his great ability.
Thus, he was on the side of the populace.
However, he has become arrogant now. Still, the Japanese people expect great things from him, particularly people living in the capital, Tokyo. He always talks about "Changing Japan from Tokyo."
We think that he can do it.
Yours,
Nakaki
Friday, May 11, 2012
IMF Policy Papers: Enhancing Financial Sector Surveillance in Low-Income Countries Series
IMF Policy Paper: Enhancing Financial Sector Surveillance in Low-Income Countries - Background Paper
Summary: This note provides an overview of the literature on the challenges posed by shallow financial systems for macroeconomic policy implementation. Countries with shallow markets are more likely to choose fixed exchange rates and less likely to use indirect instruments of monetary policy or to implement effective countercyclical fiscal policies. But causation appears to work in both directions, as policy stances can themselves affect financial development. Drawing on recent FSAP reports, the note also shows that shallow financial markets tend to increase foreign exchange, liquidity management, and concentration risks, posing risks to financial stability.
http://www.imf.org/external/pp/longres.aspx?id=4650
---
IMF Policy Paper: Enhancing Financial Sector Surveillance in Low-Income Countries - Financial Deepening and Macro-Stability
Summary: This paper aims to widen the lens through which surveillance is conducted in LICs, to better account for the interplay between financial deepening and macro-financial stability as called for in the 2011 Triennial Surveillance Review. Reflecting the inherent risk-return tradeoffs associated with financial deepening, the paper seeks to shed light on the policy and institutional impediments in LICs that have a bearing on the effectiveness of macroeconomic policies, macro-financial stability, and growth. The paper focuses attention on the role of enabling policies in facilitating sustainable financial deepening. In framing the discussion, the paper draws on a range of conceptual and analytical tools, empirical analyses, and case studies.
http://www.imf.org/external/pp/longres.aspx?id=4649
---
IMF Policy Paper: Enhancing Financial Sector Surveillance in Low-Income Countries - Case Studies
Summary: This supplement presents ten case studies, which highlight the roles of targeted policies to facilitate sustainable financial deepening in a variety of country circumstances, reflecting historical experiences that parallel a range of markets in LICs. The case studies were selected to broadly capture efforts by countries to increase reach (e.g., financial inclusion), depth (e.g., financial intermediation), and breadth of financial systems (e.g., capital market, cross-border development). The analysis in the case studies highlights the importance of a balanced approach to financial deepening. A stable macroeconomic environment is vital to instill consumer, institutional, and investor confidence necessary to encourage financial market activity. Targeted public policy initiatives (e.g., collateral, payment systems development) can be helpful in removing impediments and creating infrastructure for improved market operations, while ensuring appropriate oversight and regulation of financial markets, to address potential sources of instability and market failures.
http://www.imf.org/external/pp/longres.aspx?id=4651
Tuesday, May 8, 2012
Some scholars argue that top rates can be raised drastically with no loss of revenue
Of Course 70% Tax Rates Are Counterproductive. By Alan Reynolds
Some scholars argue that top rates can be raised drastically with no loss of revenue. Their arguments are flawed. WSJ, May 7, 2012
http://online.wsj.com/article/SB10001424052702303916904577376041258476020.html
President Obama and others are demanding that we raise taxes on the "rich," and two recent academic papers that have gotten a lot of attention claim to show that there will be no ill effects if we do.
The first paper, by Peter Diamond of MIT and Emmanuel Saez of the University of California, Berkeley, appeared in the Journal of Economic Perspectives last August. The second, by Mr. Saez, along with Thomas Piketty of the Paris School of Economics and Stefanie Stantcheva of MIT, was published by the National Bureau of Economic Research three months later. Both suggested that federal tax revenues would not decline even if the rate on the top 1% of earners were raised to 73%-83%.
Can the apex of the Laffer Curve—which shows that the revenue-maximizing tax rate is not the highest possible tax rate—really be that high?
The authors arrive at their conclusion through an unusual calculation of the "elasticity" (responsiveness) of taxable income to changes in marginal tax rates. According to a formula devised by Mr. Saez, if the elasticity is 1.0, the revenue-maximizing top tax rate would be 40% including state and Medicare taxes. That means the elasticity of taxable income (ETI) would have to be an unbelievably low 0.2 to 0.25 if the revenue-maximizing top tax rates were 73%-83% for the top 1%. The authors of both papers reach this conclusion with creative, if wholly unpersuasive, statistical arguments.
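For readers who want to check these numbers: the formula in question sets the revenue-maximizing top rate at t* = 1/(1 + a·e), where a is the Pareto parameter of the top income distribution (taken below as 1.5 for the U.S., an assumption) and e is the elasticity of taxable income. A quick sketch reproducing the figures cited in this article:

```python
# Check of the Saez revenue-maximizing-rate formula t* = 1 / (1 + a * e),
# where a is the Pareto parameter of the top income distribution (assumed
# 1.5 for the U.S.) and e is the elasticity of taxable income.
def revenue_maximizing_top_rate(elasticity, pareto_a=1.5):
    return 1.0 / (1.0 + pareto_a * elasticity)

print(revenue_maximizing_top_rate(1.0))   # 0.400 -> the 40% cited for e = 1.0
print(revenue_maximizing_top_rate(0.25))  # 0.727 -> the ~73% claim needs e ~ 0.25
print(revenue_maximizing_top_rate(1.3))   # 0.339 -> the 33.9% implied by e = 1.3
```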
Most of the older elasticity estimates are for all taxpayers, regardless of income. Thus a recent survey of 30 studies by the Canadian Department of Finance found that "The central ETI estimate in the international empirical literature is about 0.40."
But the ETI for all taxpayers is going to be lower than for higher-income earners, simply because people with modest incomes and modest taxes are not willing or able to vary their income much in response to small tax changes. So the real question is the ETI of the top 1%.
Harvard's Raj Chetty observed in 2009 that "The empirical literature on the taxable income elasticity has generally found that elasticities are large (0.5 to 1.5) for individuals in the top percentile of the income distribution." In that same year, Treasury Department economist Bradley Heim estimated that the ETI is 1.2 for incomes above $500,000 (the top 1% today starts around $350,000).
A 2010 study by Anthony Atkinson (Oxford) and Andrew Leigh (Australian National University) about changes in tax rates on the top 1% in five Anglo-Saxon countries came up with an ETI of 1.2 to 1.6. In a 2000 book edited by University of Michigan economist Joel Slemrod ("Does Atlas Shrug?"), Robert A. Moffitt (Johns Hopkins) and Mark Wilhelm (Indiana) estimated an elasticity of 1.76 to 1.99 for gross income. And at the bottom of the range, Mr. Saez in 2004 estimated an elasticity of 0.62 for gross income for the top 1%.
A midpoint between the estimates would be an elasticity for gross income of 1.3 for the top 1%, and presumably an even higher elasticity for taxable income (since taxpayers can claim larger deductions if tax rates go up).
But let's stick with an ETI of 1.3 for the top 1%. This implies that the revenue-maximizing top marginal rate would be 33.9% for all taxes, and below 27% for the federal income tax.
To avoid reaching that conclusion, Messrs. Diamond and Saez's 2011 paper ignores all studies of elasticity among the top 1%, and instead chooses a midpoint of 0.25 between one uniquely low estimate of 0.12 for gross income among all taxpayers (from a 2004 study by Mr. Saez and Jonathan Gruber of MIT) and the 0.40 ETI norm from 30 other studies.
That made-up estimate of 0.25 is the sole basis for the claim by Messrs. Diamond and Saez in their 2011 paper that tax rates could reach 73% without losing revenue.
The Saez-Piketty-Stantcheva paper does not confound a lowball estimate for all taxpayers with a midpoint estimate for the top 1%. On the contrary, the authors say that "the long-run total elasticity of top incomes with respect to the net-of-tax rate is large."
Nevertheless, to cut this "large" elasticity down, the authors begin by combining the U.S. with 17 other affluent economies, telling us that elasticity estimates for top incomes are lower for Europe and Japan. The resulting mélange—an 18-country "overall elasticity of around 0.5"—has zero relevance to U.S. tax policy.
Still, it is twice as large as the ETI of Messrs. Diamond and Saez, so the three authors appear compelled to further pare their 0.5 estimate down to 0.2 in order to predict a "socially optimal" top tax rate of 83%. Using "admittedly only suggestive" evidence, they assert that only 0.2 of their 0.5 ETI can be attributed to real supply-side responses to changes in tax rates.
The other three-fifths of ETI can just be ignored, according to Messrs. Saez and Piketty, and Ms. Stantcheva, because it is the result of, among other factors, easily-plugged tax loopholes resulting from lower rates on corporations and capital gains.
Plugging these so-called loopholes, they say, requires "aligning the tax rates on realized capital gains with those on ordinary income" and enacting "neutrality in the effective tax rates across organizational forms." In plain English: Tax rates on U.S. corporate profits, dividends and capital gains must also be 83%.
This raises another question: At that level, would there be any profits, capital gains or top incomes left to tax?
"The optimal top tax," the three authors also say, "actually goes to 100% if the real supply-side elasticity is very small." If anyone still imagines the proposed "socially optimal" tax rates of 73%-83% on the top 1% would raise revenues and have no effect on economic growth, what about that 100% rate?
Mr. Reynolds is a senior fellow with the Cato Institute and the author of "Income and Wealth" (Greenwood Press, 2006).
Bank Capitalization as a Signal. By Daniel C. Hardy
IMF Working Paper No. 12/114
May 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25894.0
Summary: The level of a bank's capitalization can effectively transmit information about its riskiness and therefore support market discipline, but asymmetric information may induce exaggerated or distortionary behavior: banks may vie with one another to signal confidence in their prospects by keeping capitalization low, and banks' creditors often cannot distinguish among them - tendencies that can be seen across banks and across time. Prudential policy is warranted to help offset these tendencies.
Friday, May 4, 2012
Women, Welch Clash at Forum - "Great women get upset about getting into the victim's unit"
Women, Welch Clash at Forum. By John Bussey
Wall Street Journal, May 4, 2012, page B1
http://online.wsj.com/article/SB10001424052702303877604577382321364803912.html
Is Jack Welch a timeless seer or an out-of-touch warhorse?
The former Master and Commander of General Electric still writes widely on business strategy. He's also influential on the speaking circuit.
On Wednesday, Mr. Welch and his wife and writing partner, Suzy Welch, told a gathering of women executives from a range of industries that, in matters of career track, it is results and performance that chart the way. Programs promoting diversity, mentorships and affinity groups may or may not be good, but they are not how women get ahead. "Over deliver," Mr. Welch advised. "Performance is it!"
Angry murmurs ran through the crowd. The speakers asked: Were there any questions?
"We're regaining our consciousness," one woman executive shot back.
Mr. Welch had walked into a spinning turbine fan blade.
"Of course women need to perform to advance," Alison Quirk, an executive vice president at the investment firm State Street Corp., said later. "But we can all do more to help people understand their unconscious biases."
"He showed no recognition that the culture shapes the performance metrics, and the culture is that of white men," another executive said.
Dee Dee Myers, a former White House press secretary who is now with Glover Park Group, a communications firm, added: "While he seemed to acknowledge the value of a diverse workforce, he didn't seem to think it was necessary to develop strategies for getting there—and especially for taking a cold, hard look at some of the subtle barriers to women's advancement that still exist. If objective performance measures were enough, more than a handful of Fortune 500 senior executives would already be women. "
"This meritocracy fiction may be the single biggest obstacle to women's advancement," added Lisa Levey, a consultant who heard Mr. Welch speak.
Mr. Welch has sparked controversy in the past with his view of the workplace. In 2009, he told a group of human-resources managers: "There's no such thing as work-life balance." Instead, "there are work-life choices, and you make them, and they have consequences." Step out of the arena to raise kids, and don't be surprised if the promotion passes you by.
Of the Fortune 500 companies, only 3% have a female CEO today. Female board membership is similarly spare. A survey of 60 major companies by McKinsey shows women occupying 53% of entry-level positions, 40% of manager positions, and only 19% of C-suite jobs.
The reasons for this are complex and aren't always about child rearing. A separate McKinsey survey showed that among women who have already reached the status of successful executive, 59% don't aspire to one of the top jobs. The majority of these women have already had children.
"Their work ethic—these people are doing it all," said Dominic Barton of McKinsey. "They say, 'I'm the person turning off the lights'" at the end of the day.
Instead, Mr. Barton said, it's "the soft stuff, the culture" that's shaping their career decisions.
The group of women executives who wrestled with Mr. Welch were at a conference on Women in the Economy held by The Wall Street Journal this week. Among other things, they tackled the culture questions—devising strategies to get more high-performing women to the top, keep women on track during childbearing years, address bias, and make the goals of diversity motivating to employees. They also discussed the sexual harassment some women still experience in the workplace. (A report on the group's findings will be published in the Journal Monday.)
The realm of the "soft stuff" may not be Mr. Welch's favored zone. During his remarks, he referred to human resources as "the H.R. teams that are out there, most of them for birthdays and picnics." He mentioned a women's forum inside GE that he says attracted 500 participants. "The best of the women would come to me and say, 'I don't want to be in a special group. I'm not in the victim's unit. I'm a star. I want to be compared with the best of your best.'"
And then he addressed the audience: "Stop lying about it. It's true. Great women get upset about getting into the victim's unit."
Individual mentoring programs, meanwhile, are "one of the worst ideas that ever came along," he said. "You should see everyone as a mentor."
He had this advice for women who want to get ahead: Grab tough assignments to prove yourself, get line experience, and embrace serious performance reviews and the coaching inherent in them.
"Without a rigorous appraisal system, without you knowing where you stand...and how you can improve, none of these 'help' programs that were up there are going to be worth much to you," he said. Mr. Welch said later that the appraisal "is the best way to attack bias" because the facts go into the document, which both parties have to sign.
Mr. Welch championed the business philosophy of "Six Sigma" at GE, a strategy that seeks to expunge defects from production through constant review and improvement. It appears to work with machines and business processes.
But applying that clinical procedure to the human character, as Mr. Welch seems to want to do, is a stickier proposition.
"His advice was not tailored to how women can attain parity in today's male-dominated workplace," said one female board member of a Fortune 500 company. Indeed, a couple of women walked out in frustration during his presentation.
Wednesday, May 2, 2012
Dynamic Loan Loss Provisioning: Simulations on Effectiveness and Guide to Implementation
Dynamic Loan Loss Provisioning: Simulations on Effectiveness and Guide to Implementation. By Torsten Wezel, Jorge A. Chan-Lau, and Francesco Columba
IMF Working Paper No. 12/110
May 01, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25885.0
Summary: This simulation-based paper investigates the impact of different methods of dynamic provisioning on bank soundness and shows that this increasingly popular macroprudential tool can smooth provisioning costs over the credit cycle and lower banks’ probability of default. In addition, the paper offers an in-depth guide to implementation that addresses pertinent issues related to data requirements, calibration and safeguards as well as accounting, disclosure and tax treatment. It also discusses the interaction of dynamic provisioning with other macroprudential instruments such as countercyclical capital.
Excerpts:
Introduction
Reducing the procyclicality of the banking sector by way of macroprudential policy instruments has become a policy priority. The recent crisis has illustrated how excessive procyclicality of the banking system may activate powerful macro-financial linkages that amplify the business cycle and how increased financial instability can have large negative spillover effects onto the real sector. Moreover, research has shown that crises that included banking turmoil are among the longest and most severe of all crises.
Although there is no consensus yet on the very definition of macroprudential policy, an array of such tools, especially those of a countercyclical nature, has been applied in many countries for years. But it was only during the financial crisis that powerful macro-financial linkages played out on a global scale, conveying a sense of urgency.
In the wake of the crisis, policymakers therefore intensified their efforts to gear the macroprudential approach to financial stability towards improving banks’ capacity to absorb shocks—a consultative process that culminated in the development of the Basel III framework in December 2010 to be phased in over the coming years. In addition to improving the quality of bank capital and liquidity as well as imposing a minimum leverage ratio, this new regulatory standard introduces countercyclical capital buffers and lends support to forward-looking loan loss provisioning, which comprises dynamic provisioning (DP).
The new capital standard promotes the build-up of capital buffers in good times that can be drawn down in periods of stress, in the form of a capital conservation requirement to increase the banking sector’s resilience entering into a downturn. Part of this conservation buffer would be a countercyclical buffer that is to be activated only when there is excess credit growth so that the sector is not destabilized in the downturn. Such countercyclical capital has also been characterized as potentially cushioning the economy’s real output during a crisis (IMF, 2011). Similarly, dynamic provisioning requires banks to build a cushion of generic provisions during an upswing that can be used to cover rising specific provisions linked to loan delinquencies during the subsequent downturn.
Both countercyclical capital and DP have been applied in practice. Some countries have adjusted capital regulations in different phases of the cycle to give them a more potent countercyclical impact: Brazil has used a formula to smooth capital requirements for interest rate risk in times of extreme volatility, China introduced a countercyclical capital requirement similar to the countercyclical buffer under Basel III, and India has made countercyclical adjustments in risk weights and in provisioning. DP was first introduced by Spain in 2000 and subsequently adopted in Uruguay, Colombia, Peru, and Bolivia, while other countries such as Mexico and Chile switched to provisioning based on expected loan loss. Peru is the only country to explicitly use both countercyclical instruments in combination.
The concept of DP examined in this paper is intriguing. By gradually building a countercyclical loan loss reserve in good times and then using it to cover losses as they arise in bad times, DP is able to greatly smooth provisioning costs over the cycle and thus insulate banks’ profit and loss statements in this regard. Therefore, DP may usefully complement other policies targeted more at macroeconomic aggregates. The implementation of DP can, however, be a delicate balancing exercise. The calibration is typically challenging because it requires specific data, and even if these are available, it may still be inaccurate if the subsequent credit cycle differs substantially from the previous one(s) on which the model is necessarily predicated. Over-provisioning may ensue in particular instances. This said, a careful calibration that tries to incorporate as many of the stylized facts of past credit developments as possible goes a long way in providing a sizeable cushion for banks to withstand periodic downswings.
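For readers who want the mechanics, the flow rule behind the Spanish system, on which the paper's simulations draw, is simple enough to sketch in a few lines. The snippet below is a minimal illustration, not the paper's calibration: the parameters alpha and beta and the stylized credit path are assumed values chosen only to show the buffer building in the upswing and draining in the downturn.

```python
# Hedged sketch of a Spanish-style dynamic provisioning rule, in which the
# generic (countercyclical) provision charge for a period is
#   alpha * (new credit) + (beta * credit stock - specific provisions),
# with alpha and beta proxying through-the-cycle average loss and
# provisioning rates. All numbers below are illustrative assumptions.

def dp_flow(alpha, beta, credit, d_credit, specific_prov):
    """Generic provision charge for one period."""
    return alpha * d_credit + (beta * credit - specific_prov)

alpha, beta = 0.01, 0.004
credit, fund = 100.0, 0.0

# Stylized cycle: credit growth and specific provisions move in opposite
# directions, as in a typical boom-bust sequence.
cycle = [(8.0, 0.2), (10.0, 0.2), (4.0, 0.5), (-2.0, 1.2), (-4.0, 1.6)]

for d_credit, specific in cycle:
    credit += d_credit
    flow = dp_flow(alpha, beta, credit, d_credit, specific)
    fund = max(0.0, fund + flow)  # the reserve is drawn down, never negative
    print(f"credit={credit:6.1f}  DP charge={flow:+.3f}  fund={fund:.3f}")
```

Run over this toy cycle, the generic charge is positive while credit expands and turns negative once specific provisions surge, which is exactly the smoothing of provisioning costs the paper describes.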
This paper provides strong support for DP as a tool for countercyclical banking policies. Our contribution to this strand of the literature is threefold. We first recreate a hypothetical path of provisions under different DP systems based on historical data of an emerging banking market and compare the outcome to the actual situation without DP. These counterfactual simulations suggest that a well-calibrated system of DP mitigates procyclicality in provisioning costs and thus earnings and capital. Second, using Monte-Carlo simulations we show that the countercyclical buffer that DP builds typically lowers a bank’s probability of default. Finally, we offer a guide to implementation of the DP concept that seeks to clarify issues related to data requirements, choice of formula, parametrization, accounting treatment, and recalibration.
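The second exercise can likewise be conveyed with a toy example. The sketch below is not the authors' model or data; the loss distribution, capital ratio, and income level are assumptions picked only to show why a pre-built provisioning buffer lowers the simulated probability of default.

```python
import numpy as np

rng = np.random.default_rng(42)

def default_prob(buffer, n_paths=20_000, horizon=20,
                 capital=0.08, income=0.015):
    """Share of simulated paths on which cumulative loan losses exhaust
    capital plus a countercyclical buffer. All parameters illustrative."""
    defaults = 0
    for _ in range(n_paths):
        equity = capital + buffer
        for _ in range(horizon):
            loss = rng.lognormal(mean=np.log(0.01), sigma=0.9)  # fat-tailed
            equity += income - loss
            if equity < 0:
                defaults += 1
                break
    return defaults / n_paths

print(f"PD without a DP buffer: {default_prob(0.00):.2%}")
print(f"PD with a 2% DP buffer: {default_prob(0.02):.2%}")
```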
Other studies that have used counterfactual simulations based on historical data to assess the hypothetical performance under DP include Balla and McKenna (2009), Fillat and Montoriol-Garriga (2010), both using U.S. bank data, and Wezel (2010), using data for Uruguay. All studies find support for the notion that DP, when properly calibrated, can help absorb rising loan losses in a downturn and thus be a useful macroprudential tool in this regard. Some other studies (Lim et al., 2011; Peydró-Alcalde et al., 2011) even find that DP is effective in mitigating swings in credit growth, although this should not be expected of DP in general.
Conclusion
This paper has provided a thorough analysis of the merits and challenges associated with dynamic provisioning—a macroprudential tool that deserves attention from policymakers and regulators for its capacity to distribute the burden of loan impairment evenly over the credit cycle and so quench an important source of procyclicality in banking. Our simulations that apply the Spanish and Peruvian DP formulas to a full cycle of banking data of an advanced emerging market leave little doubt that the countercyclical buffer built under DP not only smoothes costs but actually bolsters financial stability by lowering banks’ PD in severe downturn conditions. We also show that for best countercyclical results DP should be tailored to the different risk exposures of individual banks and the specific circumstances of banking sectors, presenting measures such as bank-specific rates or hybrid systems combining the virtues of formulas.
While the simple concept of providing in good times for lean years is intuitive, it has its operational challenges. When calibrating a DP system, great care must be taken to keep countercyclical reserves in line with expected loan losses and so avoid insufficient buffers or excessive coverage. As many of the features and needed restrictions are not easily understood or operationalized, we offer a comprehensive primer for regulators eager to implement one of the variants of DP analyzed in the paper. The discussion of practical challenges also includes thorny issues like compliance with accounting standards. In fact, policymakers have long tended to dismiss DP on grounds that it is not legitimate from an accounting perspective and have therefore focused on other tools such as countercyclical capital. To remedy this problem, we propose ways to recalibrate the formula periodically and so keep it in line with expected loan losses. Further, while recognizing that countercyclical capital has its definite place in the macroprudential toolkit, we argue that DP acts as a first line of defense by directly shielding bank profits, thereby lowering the degree to which other countercyclical instruments are needed. However, there should be no doubt that, given DP's limited impact in restraining excessive credit growth, supervisors must not grow complacent because of DP buffers, and that DP needs to be accompanied by other macroprudential tools aimed at mitigating particular systemic risks.
Clearly, further research is needed on the interaction between DP and countercyclical capital as well as other macroprudential tools to answer the question in what ways they can complement one another in providing an integrated countercyclical buffer. As an early example, Saurina (2011) analyzes DP and countercyclical capital side-by-side but not their possible interaction. Another area of needed research is the impact of DP on credit cycles and other macroeconomic aggregates. Newer studies (e.g., Peydró-Alcalde et al., 2011; Chan-Lau, 2012) evaluate the implications of DP for credit availability, yet broader-based results are certainly warranted. The ongoing efforts by a number of countries towards adopting DP systems and other forms of forward-looking provisioning will provide a fertile ground for such future research.
Tuesday, May 1, 2012
Pharma: New Tufts Report Shows Academic-Industry Partnerships Are Mutually Beneficial
http://www.innovation.org/index.cfm/NewsCenter/Newsletters?NID=200
April 30, 2012
According to a new study by the Tufts Center for the Study of Drug Development, collaboration among organizations is becoming increasingly important to advancing basic research and developing new medicines. This study specifically explores the breadth and nature of partnerships between biopharmaceutical companies and academic medical centers (AMCs)[1] which are likely to play an increasingly important role in making progress in treating unmet medical needs.
In the study, researchers examine a subset of public-private partnerships, including more than 3,000 grants to AMCs from approximately 450 biopharmaceutical company sponsors that were provided through 22 medical schools. Findings show that while it is generally accepted that these partnerships have become an increasingly common approach both to promote public health objectives and to produce healthcare innovations, it is anticipated that their nature will continue to evolve over time and their full potential is yet to be realized.
Tufts researchers also found that the nature of these relationships is varied, ever-changing, and expanding. They often involve company and AMC scientists and other researchers working side-by-side on cutting-edge science, applying advanced tools and resources. This type of innovative research has enabled the United States to advance biomedical research in a number of areas, such as the development of personalized medicines and the understanding of rare diseases.
The report outlines the 12 primary models of academic-industry collaborations and highlights other emerging models, which reflect a shift in the nature of academic-industry relationships toward more risk- and resource-sharing partnerships. While unrestricted research support has generally represented the most common form of academic-industry collaboration, Tufts research found that this model is becoming less frequently used. A range of innovative partnership models are emerging, from corporate venture capital funds to pre-competitive research centers to increasingly used academic drug discovery centers.
These collaborations occur across all aspects of drug discovery and the partnerships benefit both industry and academia since they provide the opportunity for the leading biomedical researchers in both sectors to work together to explore new technologies and scientific discoveries. Such innovation in both the science and technology has the potential to treat the most challenging diseases and conditions facing patients today.
According to Tufts, “[t]he industry is funding and working collaboratively with the academic component of the public sector on basic research that contributes broadly across the entire spectrum of biomedical R&D, not just for products in its portfolio.” In conclusion, the report notes that in the face of an increasingly challenging R&D environment and overall global competition, we are likely to witness the continued proliferation of AMC-industry partnerships.
[1] C.P. Milne, et al., “Academic-Industry Partnerships for Biopharmaceutical Research & Development: Advancing Medical Science in the U.S.,” Tufts Center for the Study of Drug Development, April 2012.
Tuesday, April 24, 2012
From Bail-out to Bail-in: Mandatory Debt Restructuring of Systemic Financial Institutions
From Bail-out to Bail-in: Mandatory Debt Restructuring of Systemic Financial Institutions. By Zhou, Jian-Ping; Rutledge, Virginia; Bossu, Wouter; Dobler, Marc; Jassaud, Nadege; Moore, Michael
IMF Staff Discussion Notes No. 12/03
April 24, 2012
ISBN/ISSN: 978-1-61635-392-6 / 2221-030X
Stock No: SDNETEA2012003
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25858.0
Excerpts
Executive Summary
Large-scale government support of the financial institutions deemed too big or too important to fail during the recent crisis has been costly and has potentially increased moral hazard. To protect taxpayers from exposure to bank losses and to reduce the risks posed by too-big-to-fail (TBTF), various reform initiatives have been undertaken at both national and international levels, including expanding resolution powers and tools.
One example is bail-in, which is a statutory power of a resolution authority (as opposed to contractual arrangements, such as contingent capital requirements) to restructure the liabilities of a distressed financial institution by writing down its unsecured debt and/or converting it to equity. The statutory bail-in power is intended to achieve a prompt recapitalization and restructuring of the distressed institution. This paper studies its effectiveness in restoring the viability of distressed institutions, discusses potential risks when a bail-in power is activated, and proposes design features to mitigate these risks. The main conclusions are:
1. As a going-concern form of resolution, bail-in could mitigate the systemic risks associated with disorderly liquidations, reduce deleveraging pressures, and preserve asset values that might otherwise be lost in a liquidation. With a credible threat of stock elimination or dilution by debt conversion and assumption of management by resolution authorities, financial institutions may be incentivized to raise capital or restructure debt voluntarily before the triggering of the bail-in power.
2. However, if the use of a bail-in power is perceived by the market as a sign of the concerned institution’s insolvency, it could trigger a run by short-term creditors and aggravate the institution’s liquidity problem. Ideally, therefore, bail-in should be activated when a capital infusion is expected to restore a distressed financial institution to viability, with official liquidity support as a backstop until the bank is stabilized.
3. Bail-in is not a panacea and should be considered as one element of a comprehensive solution to the TBTF problem. It should supplement, not replace, other resolution tools that would allow for an orderly closure of a failed institution.
4. Most importantly, the bail-in framework needs to be carefully designed to ensure its effective implementation.
- The triggers for bail-in power should be consistent with those used for other resolution tools. They should be set at the point when a firm would have breached the regulatory minima but before it became balance-sheet insolvent. To make bail-in a transparent tool, its scope should be limited to (i) elimination of existing equity shares as a precondition for a bail-in; and (ii) conversion and haircut to subordinated and unsecured senior debt. Debt restructuring under a bail-in should take into account the order of priorities applicable in a liquidation.
- A clear and coherent legal framework for bail-in is essential. The legal framework needs to be designed to establish an appropriate balance between the rights of private stakeholders and the public policy interest in preserving financial stability. Debt restructuring ideally would not be subject to creditor consent, but a “no creditor worse off” test may be introduced to safeguard creditors’ and shareholders’ interests. The framework also needs to provide mechanisms for addressing issues associated with the bail-in of debt issued by an entity of a larger banking group and with the cross-border operations of that entity or banking group.
- The contribution of new capital will come from debt conversion and/or an issuance of new equity, with an elimination or significant dilution of the pre-bail-in shareholders. Bail-in will need to be accompanied by mechanisms to ensure the suitability of new shareholders. Some measures (e.g., a floor price for debt/equity conversion) might be necessary to reduce the risk of a “death spiral” in share prices.
- It may be necessary to impose minimum requirements on banks for issuing unsecured debt or to set limits on the encumbrance of assets (which have been introduced by many advanced countries). This would help reassure the market that a bail-in would be sufficient to recapitalize the distressed institution, thus forestalling potential runs by short-term creditors and averting a downward share price spiral. The framework should also include measures to mitigate contagion risks to other systemic financial institutions, for example, by limiting their cross-holding of unsecured senior debt.
Conclusions
Bail-in power needs to be considered as an additional and complementary tool for the resolution of SIFIs. Bail-in is a statutory power of a resolution authority, as opposed to contractual arrangements, such as contingent capital requirements. It involves recapitalization through relatively straightforward mandatory debt restructuring and could therefore avoid some of the operational and legal complexities that arise when using other tools (such as P&A transactions), which require transferring assets and liabilities between different legal entities and across borders. By restoring the viability of a distressed SIFI, the pressure on the institution to post more collateral, for example against its repo contracts, could be significantly reduced, thereby minimizing liquidity risks and preventing runs by short-term creditors.
The design and implementation of a bail-in power, however, need to take into careful consideration its potential market impact and its implications for financial stability. It is especially important that the triggering of a bail-in power is not perceived by the market as a sign of the concerned institution’s non-viability, a perception that could trigger a run by short-term creditors and aggravate the institution’s liquidity problem. An effective bail-in framework generally includes the following key design elements:
- The scope of the statutory power should be limited to (i) eliminating or diluting existing shareholders; and (ii) writing down or converting, in the following order, any contractual contingent capital instruments, subordinated debt, and unsecured senior debt, accompanied by the power of the resolution authority to change bank management.
- The triggers for bail-in power should be consistent with those used for other resolution tools and set at the point when an institution would have breached the regulatory minima but before it became balance-sheet insolvent, to allow for a prompt response to a SIFI’s financial distress. The intervention criteria (a combination of quantitative and qualitative assessments) need to be as transparent and predictable as possible to avoid market uncertainty.
- It may be necessary to require banks or bank holding companies to maintain a minimum amount of unsecured liabilities (as a percentage of total liabilities) beforehand, which could be subject to bail-in afterwards. This would help reassure the market that bail-in is sufficient to recapitalize the distressed institution and restore its viability, thus reducing the risk of runs by short-term creditors.
- To fund potential liquidity outflows, and given the probable temporary loss of market access, bail-in may need to be coupled with adequate official liquidity assistance.
- Bail-in needs to be considered as one element of a comprehensive framework that includes effective supervision to reduce the likelihood of bank failures and an effective overall resolution framework that allows for an orderly resolution of a failed SIFI, facilitated by up-to-date recovery and resolution plans. In general, statutory bail-in should be used in instances where a capital infusion is likely to restore a distressed financial institution to viability, that is, where the institution, apart from its lack of capital, is viable and has a decent business model and good risk-management systems. Otherwise, bail-in capital could simply delay the inevitable failure.
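To make the loss-absorption order in these design elements concrete, here is a hedged sketch of the conversion waterfall: equity is eliminated first, then contingent capital, subordinated debt, and unsecured senior debt are written down or converted in turn. The layer names and amounts are hypothetical, and a real statutory bail-in involves valuation, conversion pricing, and the legal safeguards discussed above, none of which are modeled here.

```python
def bail_in(shortfall, equity, layers):
    """Apply a capital shortfall down a hypothetical creditor hierarchy:
    existing equity absorbs losses first, then each debt layer is written
    down or converted to equity, in order, until the shortfall is met."""
    absorbed_by_equity = min(equity, shortfall)
    remaining = shortfall - absorbed_by_equity
    conversions = []
    for name, amount in layers:
        take = min(amount, remaining)
        if take > 0:
            conversions.append((name, take))  # written down or converted
        remaining -= take
    return absorbed_by_equity, conversions, remaining

# Hypothetical balance sheet: a shortfall of 12 against 5 of equity.
hierarchy = [("contingent capital", 3.0),
             ("subordinated debt", 4.0),
             ("unsecured senior debt", 10.0)]
print(bail_in(12.0, 5.0, hierarchy))
# -> (5.0, [('contingent capital', 3.0), ('subordinated debt', 4.0)], 0.0):
#    equity is wiped out, CoCos and subordinated debt are fully converted,
#    and senior debt is left untouched in this example.
```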
Central Bank Independence and Macro-prudential Regulation. By Kenichi Ueda & Fabian Valencia
IMF Working Paper No. 12/101
Apr 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25872.0
Summary: We consider the optimality of various institutional arrangements for agencies that conduct macro-prudential regulation and monetary policy. When a central bank is in charge of price and financial stability, a new time inconsistency problem may arise. Ex-ante, the central bank chooses the socially optimal level of inflation. Ex-post, however, the central bank chooses inflation above the social optimum to reduce the real value of private debt. This inefficient outcome arises when macro-prudential policies cannot be adjusted as frequently as monetary policy. Importantly, this result arises even when the central bank is politically independent. We then consider the role of political pressures in the spirit of Barro and Gordon (1983). We show that if either the macro-prudential regulator or the central bank (or both) are not politically independent, separation of price and financial stability objectives does not deliver the social optimum.
Excerpts
Introduction
A growing literature based on models where pecuniary externalities reinforce shocks in the aggregate advocates the use of macro-prudential regulation (e.g. Bianchi (2010), Bianchi and Mendoza (2010), Jeanne and Korinek (2010), and Jeanne and Korinek (2011)). Most research in this area has focused on understanding the distortions that lead to financial amplification and to assess their quantitative importance. The natural next question is how to implement macro-prudential regulation.
Implementing macro-prudential policy requires, among other things, figuring out the optimal institutional design. In this context, there is an intense policy debate about the desirability of assigning the central bank formally with the responsibility of financial stability. This debate has spurred interest in studying the interactions between monetary and macro-prudential policies with the objective of understanding the conflicts and synergies that may arise from different institutional arrangements.
This paper contributes to this debate by exploring the circumstances under which it may be suboptimal to have the central bank in charge of macro-prudential regulation. We differ from a rapidly expanding literature on macro-prudential and monetary interactions, including De Paoli and Paustian (2011) and Quint and Rabanal (2011), mainly in that our focus is on the potential time-inconsistency problems that can arise, which are not addressed in existing work. Our departure point is the work pioneered by Kydland and Prescott (1977) and Barro and Gordon (1983), who studied how time-inconsistency problems and political pressures distort the monetary authority’s incentives under various institutional arrangements. In our model, there are two stages. In the first stage, the policymaker (possibly a single institution or several institutions) makes simultaneous monetary policy and macro-prudential regulation decisions. In the second stage, monetary policy decisions can be revised or “fine-tuned” after the realization of a credit shock. This setup captures the fact that macro-prudential regulation is intended to be used preemptively; once a credit shock (boom or bust) has taken place, it can do little to change the stock of debt. Monetary policy, on the other hand, can be used ex-ante and ex-post.
The key finding of the paper is that a dual-mandate central bank is not socially optimal. In this setting, a time inconsistency problem arises. While it is ex-ante optimal for the dual-mandate central bank to deliver the socially optimal level of inflation, it is not so ex-post. This central bank has the ex-post incentive to reduce the real burden of private debt through inflation, similar to the incentives to monetize public sector debt studied in Calvo (1978) and Lucas and Stokey (1983). This outcome arises because ex-post the dual-mandate central bank has only one tool, monetary policy, to achieve financial and price stability.
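The mechanism can be seen in a stylized one-period loss function. This is an expository simplification in the Barro-Gordon tradition, not the paper's actual model; d, kappa, and lambda are assumed stand-ins for the private debt burden, the sensitivity of that burden to inflation, and the weight on financial stability.

```latex
% Stylized illustration (not the paper's model): a dual-mandate central
% bank minimizes squared inflation plus the squared residual debt burden.
L(\pi) \;=\; \tfrac{1}{2}\,\pi^{2} \;+\; \tfrac{\lambda}{2}\,\bigl(d - \kappa\pi\bigr)^{2}
% Ex ante (under commitment) it promises the socially optimal \pi = 0.
% Ex post, with private debt d > 0 already on the books, the first-order
% condition \pi - \lambda\kappa\,(d - \kappa\pi) = 0 gives
\pi_{\text{ex-post}} \;=\; \frac{\lambda\kappa\,d}{1 + \lambda\kappa^{2}} \;>\; 0,
% an inflation bias that grows with the stock of private debt, mirroring
% the time-inconsistency result described above.
```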
We then examine the role of political factors with a simple variation of our model in the spirit of Barro and Gordon (1983). We find that the above result prevails if policy is conducted by politically independent institutions. However, when institutions are not politically independent (the central bank, the macro-prudential regulator, or both), neither separate institutions nor a combination of objectives in a single institution delivers the social optimum. As in Barro and Gordon (1983), the non-independent institution will use its policy tool at hand to try to generate economic expansions. The non-independent central bank will use monetary policy for this purpose and the non-independent macro-prudential regulator will use regulation. Which arrangement generates lower welfare losses in the case of non-independence depends on parameter values. A calibration of the model using parameter values from the literature suggests, however, that a regime with a non-independent dual-mandate central bank almost always delivers a worse outcome than a regime with a non-independent but separate macro-prudential regulator.
Finally, if the only distortion of concern is political interference (i.e., ignoring the time-inconsistency problem highlighted earlier), all that is needed to achieve the social optimum is political independence, with separation or combination of objectives yielding the same outcome. From a policy perspective, our analysis suggests that a conflict between price and financial stability objectives may arise if pursued by a single institution. Our results also extend the earlier findings by Barro and Gordon (1983) and many others on political independence of the central bank to show that these results are also applicable to a macro-prudential regulator. We should note that we have abstracted from considering the potential synergies that may arise in having dual-mandate institutions. For instance, benefits from information sharing and use of central bank expertise may mitigate the welfare losses we have shown may arise (see Nier, Osinski, Jácome and Madrid (2011)), although information sharing would also benefit fiscal and monetary interactions. However, we have also abstracted from other aspects that could exacerbate the welfare loss, such as loss of reputation.
Conclusions
We consider macro-prudential regulation and monetary policy interactions to investigate the welfare implications of different institutional arrangements. In our framework, monetary policy can re-optimize following a realization of credit shocks, but macro-prudential regulation cannot be adjusted immediately after the credit shock. This feature of the model captures the ability to adjust monetary policy more frequently than macro-prudential regulation, because macro-prudential regulation is an ex-ante tool, whereas monetary policy can be used ex-ante and ex-post. In this setting, a central bank with a price and financial stability mandate does not deliver the social optimum because of a time-inconsistency problem. This central bank finds it optimal ex-ante to deliver the socially optimal level of inflation, but it does not do so ex-post. This is because the central bank finds it optimal ex-post to let inflation rise to repair private balance sheets, as ex-post it has only monetary policy to do so. Achieving the social optimum in this case requires separating the price and financial stability objectives.
We also consider the role of political independence of institutions, as in Barro and Gordon (1983). Under this extension, separation of price and financial stability objectives delivers the social optimum only if both institutions are politically independent. If the central bank or the macro-prudential regulator (or both) are not politically independent, they would not achieve the social optimum. Numerical analysis of our model suggests, however, that in most cases a non-independent macro-prudential regulator (with an independent monetary authority) delivers a better outcome than a non-independent central bank in charge of both price and financial stability.
Wednesday, April 18, 2012
Principles for financial market infrastructures, assessment methodology and disclosure framework
CPSS Publications No 101
April 2012
Final version of the Principles for financial market infrastructures
The report Principles for financial market infrastructures contains new and more demanding international standards for payment, clearing and settlement systems, including central counterparties. Issued by the CPSS and the International Organization of Securities Commissions (IOSCO), the new standards (called "principles") are designed to ensure that the infrastructure supporting global financial markets is more robust and thus well placed to withstand financial shocks.
The principles apply to all systemically important payment systems, central securities depositories, securities settlement systems, central counterparties and trade repositories (collectively "financial market infrastructures"). They replace the three existing sets of international standards set out in the Core principles for systemically important payment systems (CPSS, 2001); the Recommendations for securities settlement systems (CPSS-IOSCO, 2001); and the Recommendations for central counterparties (CPSS-IOSCO, 2004). CPSS and IOSCO have strengthened and harmonised these three sets of standards by raising minimum requirements, providing more detailed guidance and broadening the scope of the standards to cover new risk-management areas and new types of FMIs.
The principles were issued for public consultation in March 2011. The finalised principles being issued now have been revised in light of the comments received during that consultation.
CPSS and IOSCO members will strive to adopt the new standards by the end of 2012. Financial market infrastructures (FMIs) are expected to observe the standards as soon as possible.
Consultation versions of an assessment methodology and disclosure framework
At the same time as publishing the final version of the principles, CPSS and IOSCO have issued two related documents for public consultation, namely an assessment methodology and a disclosure framework for these new principles.
Comments on these two documents are invited from all interested parties and should be sent by 15 June 2012 to both the CPSS secretariat (cpss@bis.org) and the IOSCO secretariat (fmi@iosco.org).
The comments will be published on the websites of the Bank for International Settlements (BIS) and IOSCO unless commentators request otherwise. After the consultation period, the CPSS and IOSCO will review the comments received and publish final versions of the two documents later in 2012.
Other documents
A cover note that explains the background to the three documents above and sets out some specific points on the two consultation documents on which the committees are seeking comments during the public consultation period is also available.
A summary note that provides background on the report and an overview of its contents is also available.
Saturday, April 14, 2012
America's Voluntary Standards System--A "Best Practice" Model for Innovation Policy?
America's Voluntary Standards System--A "Best Practice" Model for Innovation Policy? By Dieter Ernst
East-West Center, Apr 2012
http://www.eastwestcenter.org/publications/americas-voluntary-standards-system-best-practice-model-innovation-policy
For its proponents, America's voluntary standards system is a "best practice" model for innovation policy. Foreign observers, however, are concerned about possible drawbacks of a standards system that is largely driven by the private sector. There are doubts, especially in Europe and China, about whether the American system can balance public and private interests in times of extraordinary national and global challenges to innovation. To assess the merits of these conflicting perceptions, the paper reviews the historical roots of the American voluntary standards system, examines its current defining characteristics, and highlights its strengths and weaknesses. On the positive side, a tradition of decentralized local self-government has given voice to diverse stakeholders in innovation, avoiding the pitfalls of top-down, government-centered standards systems. However, a lack of effective coordination of multiple stakeholder strategies tends to constrain effective and open standardization processes, especially in the management of essential patents and in the timely provision of interoperability standards. To correct these drawbacks of the American standards system, the government has an important role to play as an enabler, coordinator, and, if necessary, an enforcer of the rules of the game in order to prevent abuse of market power by companies with large accumulated patent portfolios. The paper documents the ups and downs of the Federal Government's role in standardization, and examines current efforts to establish robust public-private standards development partnerships, focusing on the Smart Grid Interoperability project coordinated by the National Institute of Standards and Technology (NIST). In short, countries that seek to improve their standards systems should study the strengths and weaknesses of the American system. However, persistent differences in economic institutions, levels of development and growth models are bound to limit convergence to a US-style market-led voluntary standards system.
BCBS: Implementation of stress testing practices by supervisors
Implementation of stress testing practices by supervisors: Basel Committee publishes peer review
http://www.bis.org/press/p120413.htm
The Basel Committee on Banking Supervision has today published a peer review of the implementation by national supervisory authorities of the Basel Committee's principles for sound stress testing practices and supervision.
Stress testing is an important tool used by banks to identify the potential for unexpected adverse outcomes across a range of risks and scenarios. In 2009, the Committee reviewed the performance of stress testing practices during the financial crisis and published recommendations for banks and supervisors entitled Principles for sound stress testing practices and supervision. The guidance set out a comprehensive set of principles for the sound governance, design and implementation of stress testing programmes at banks, as well as high-level expectations for the role and responsibilities of supervisors.
As part of its mandate to assess the implementation of standards across countries and to foster the promotion of good supervisory practice, the Committee's Standards Implementation Group (SIG) conducted a peer review during 2011 of supervisory authorities' implementation of the principles. The review found that stress testing has become a key component of the supervisory assessment process as well as a tool for contingency planning and communication. Countries are, however, at varying stages of maturity in the implementation of the principles; as a result, more work remains to be done to fully implement the principles in many countries.
Overall, the review found the 2009 stress testing principles to be generally effective. The Committee, however, will continue to monitor implementation of the principles and determine whether, in the future, additional guidance might be necessary.
BCBS
April 13, 2012
Friday, April 13, 2012
Conference on macrofinancial linkages and their policy implications
Bank of Korea - Bank for International Settlements - International Monetary Fund: joint conference concludes on macrofinancial linkages and their policy implications
The Bank of Korea, the Bank for International Settlements and the International Monetary Fund have today brought to a successful conclusion their joint conference on "Macrofinancial linkages: Implications for monetary and financial stability policies".
Held on April 10-11 in Seoul, Korea, the event brought together central bankers, regulators and researchers to discuss a variety of topics related to interactions between the financial system and the real economy. The goal of the conference was to promote a continuing dialogue on the policy implications of recent research findings.
April 12, 2012
http://www.bis.org/press/p120412.pdf
The conference programme included the presentation and discussion of research on the following issues:
- Banks, shadow banks and the macroeconomy;
- Bank liquidity regulation;
- The macroeconomic impact of regulatory measures;
- Macroprudential policies in theory and in practice;
- Monetary policy and financial stability.
The conference concluded with a panel discussion chaired by Stephen Cecchetti (BIS), and including Jun Il Kim (Bank of Korea), Jan Brockmeijer (IMF), Hiroshi Nakaso (Bank of Japan), and David Fernandez (JP Morgan). The panel discussion focused on the lessons or guideposts for the formulation and implementation of macroprudential and monetary policies that can be drawn from the intensive research efforts on macrofinancial issues in recent years, as well as on the empirical evidence on the effectiveness of policy measures. The roundtable also included a discussion of weaknesses in our understanding of macrofinancial linkages and touched on priorities for future research, analysis, and continuing cooperation between central banks, regulatory authorities, international organisations and academics.
Introducing the conference, Choongsoo Kim, Governor of the Bank of Korea, said, "Since major countries' measures to reform financial regulations, including Basel III of the BCBS, focus mostly on the prevention of crisis recurrence, we need to continuously monitor and track how these measures will affect the sustainability of world economic growth in the medium- and long-term. In doing so, we should be careful so that the strengthening of financial regulation does not weaken the benign function of finance, which is to drive the growth of the real economy through seamless financial intermediation. Moreover, in today's more closely interconnected world economy, the strengthening of financial regulation with a primary focus on advanced countries does not equally affect the financial system in emerging market countries with their significantly different financial structure. Hence, in examining the implementation of regulations, an in-depth analysis should be conducted of how these regulations will affect the financial industries of emerging market countries and all other countries other than the advanced economies and their careful monitoring is called for."
Stephen Cecchetti, Economic Adviser and Head of the BIS Monetary and Economic Department, remarked that "It is important that we continue to learn about the mechanisms through which financial regulation helps to stabilize the economic and financial system. We are not only exploring the effectiveness of existing tools, but also working to fashion new ones. Doing this means refining the intellectual framework, including both the theoretical models and empirical analysis, that forms the basis for macroprudential policy and microprudential policy, as well as conventional and unconventional monetary policy. The papers presented and discussed in this conference are part of the foundation of this new and essential stability-oriented policy framework."
Jan Brockmeijer, Deputy Director of the IMF Monetary and Capital Markets Department, added that "All the institutions involved in developing macroprudential policy frameworks are on a learning curve both with regard to monitoring systemic risks and in using tools to limit such risks. In such circumstances, sharing of views and experiences is crucial to identifying best practices and moving up the learning curve quickly. The Fund is eager to help its members in this regard, and the conference co-organised by the Fund is one way to serve this purpose."
Wednesday, April 11, 2012
IMF Global Financial Stability Report: Risks of stricter prudential regulations
IMF Global Financial Stability Report
Apr 2012
http://www.imf.org/External/Pubs/FT/GFSR/2012/01/index.htm
Chapter 3 of the April 2012 Global Financial Stability Report probes the implications of recent reforms in the financial system for market perception of safe assets. Chapter 4 investigates the growing public and private costs of increased longevity risk from aging populations.
Excerpts from Ch. 3, Safe Assets: Financial System Cornerstone?:
In the future, there will be rising demand for safe assets, but fewer of them will be available, increasing the price for safety in global markets. In principle, investors evaluate all assets based on their intrinsic characteristics. In the absence of market distortions, asset prices tend to reflect their underlying features, including safety. However, factors external to asset markets—including the required use of specific assets in prudential regulations, collateral practices, and central bank operations—may preclude markets from pricing assets efficiently, distorting the price of safety. Before the onset of the global financial crisis, regulations, macroeconomic policies, and market practices had encouraged the underpricing of safety. Some safety features are more accurately reflected now, but upcoming regulatory and market reforms and central bank crisis management strategies, combined with continued uncertainty and a shrinking supply of assets considered safe, will increase the price of safety beyond what would be the case without such distortions.
The magnitude of the rise in the price of safety is highly uncertain [...]
However, it is clear that market distortions pose increasing challenges to the ability of safe assets to fulfill all their various roles in financial markets. [...] For banks, the common application of zero percent regulatory risk weights on debt issued by their own sovereigns, irrespective of risks, created perceptions of safety detached from underlying economic risks and contributed to the buildup of demand for such securities. [...]
[...] Although regulatory reforms to make institutions safer are clearly needed, insufficient differentiation across eligible assets to satisfy some regulatory requirements could precipitate unintended cliff effects—sudden drops in the prices—when some safe assets become unsafe and no longer satisfy various regulatory criteria. Moreover, the burden of mispriced safety across types of investors may be uneven. For instance, prudential requirements could lead to stronger pressures in the markets for shorter-maturity safe assets, with greater impact on investors with higher potential allocations at shorter maturities, such as banks.
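The "cliff effect" described above lends itself to a one-line illustration. The sketch below is ours, with hypothetical numbers, not the GFSR's: an asset trades at fundamentals plus a premium for regulatory eligibility, and a marginal downgrade across the eligibility threshold removes the entire premium at once.

def price(fundamentals, rating_notch, eligibility_threshold=3, regulatory_premium=5.0):
    # The premium is paid only while the asset still qualifies for
    # regulatory use (e.g., its rating sits at or above the threshold).
    eligible = rating_notch >= eligibility_threshold
    return fundamentals + (regulatory_premium if eligible else 0.0)

print(price(100.0, rating_notch=3))  # 105.0: eligible, premium included
print(price(99.5, rating_notch=2))   # 99.5: one notch lower -> premium gone

A half-point deterioration in fundamentals produces a 5.5-point fall in price; the discontinuity comes from the regulatory criterion, not from the asset's underlying risk.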
Money and Collateral, by Manmohan Singh & Peter Stella
Money and Collateral, by Manmohan Singh & Peter Stella
IMF Working Paper No. 12/95
Apr 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25851.0
Summary: Between 1980 and the onset of the recent crisis, the ratio of financial market debt to liquid assets rose exponentially in the U.S. (and in other financial markets), reflecting in part the greater use of securitized assets to collateralize borrowing. The subsequent crisis has reduced the pool of assets considered acceptable as collateral, resulting in a liquidity shortage. When trying to address this, policy makers will need to consider concepts of liquidity besides the traditional metric of excess bank reserves and do more than merely substitute central bank money for collateral that currently remains highly liquid.
Excerpts:
Introduction
In the traditional view of a banking system, credit and money are largely counterparts to each other on different sides of the balance sheet. In the process of maturity transformation, banks are able to create liquid claims on themselves, namely money, which is the counterpart to the less liquid loans or credit. Owing to the law of large numbers, banks have, for centuries, been able to conduct this business safely with relatively little in the way of liquid reserves, as long as basic confidence in the soundness of the bank portfolio is maintained.
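The law-of-large-numbers point can be made concrete with a toy calculation (our own, with assumed figures): if each of n unit depositors withdraws independently with probability p, the reserves needed to cover withdrawals with roughly 99 percent confidence shrink, as a share of deposits, toward p as n grows.

import math

p = 0.1   # assumed probability a given depositor withdraws today
z = 2.33  # ~99th percentile of the standard normal distribution

for n in (10, 1_000, 100_000):
    mean = n * p
    sd = math.sqrt(n * p * (1 - p))
    reserves = mean + z * sd          # normal approximation to the binomial
    print(n, round(reserves / n, 3))  # reserve share of deposits

The printed shares fall from about 0.32 at n = 10 toward 0.10 at n = 100,000: with a large, diversified deposit base, modest liquid reserves suffice.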
In recent decades, with the advent of securitization and electronic means of trading and settlement, it became possible to greatly expand the scope of assets that could be transformed directly, through their use as collateral, into highly liquid or money-like assets. The expansion in the scope of the assets that could be securitized was in part facilitated by the growth of the shadow financial system, which was largely unregulated, and by the ability to borrow from non-deposit sources. This meant deposits no longer equaled credit (Schularick and Taylor, 2008). The justification for light-touch or no regulation of this new market was that collateralization was sufficient (and of high quality) and that market forces would ensure appropriate risk taking and dispersion among those educated investors best able to bear those risks, which were often tailor-made to their demands. Where regulation fell short was in failing to recognize the growing interconnectedness of the shadow and regulated sectors, and the growing tail risk that sizable leverage entailed (Gennaioli, Shleifer and Vishny, 2011).
Post-Lehman, there has been a disintermediation process leading to a fall in the money multiplier, related to the shortage of collateral (Singh, 2011). This has real effects: deleveraging is more pronounced because less collateral is available. Section II of the paper focuses on money as legal tender and the money multiplier; we then introduce the adjusted money multiplier. Section III discusses collateral, including tail-risk collateral. Section IV attempts to bridge the money and collateral aspects from a "safe assets" angle. Section V introduces collateral chains and describes the economics behind the private pledged collateral market. Section VI brings the monetary and collateral issues together under an overall financial lubrication framework. In our conclusion (Section VII) we offer a useful basis for understanding monetary policy in the current environment.
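As a rough illustration of how collateral re-use changes the picture relative to the traditional money multiplier, consider the sketch below. It is a stylized calculation with invented figures, not the authors' exact metric: an "adjusted" multiplier adds pledged collateral, scaled by its re-use rate (the length of the collateral chain), to broad money.

monetary_base = 2.5       # central bank money (hypothetical, trillions)
broad_money = 10.0        # e.g., M2 (hypothetical)
source_collateral = 3.0   # collateral available for re-pledging (hypothetical)
reuse_rate = 2.5          # average collateral-chain length (hypothetical)

traditional = broad_money / monetary_base
adjusted = (broad_money + source_collateral * reuse_rate) / monetary_base

print(round(traditional, 2))  # 4.0
print(round(adjusted, 2))     # 7.0

# If the re-use rate falls post-crisis (say to 1.5), the adjusted multiplier
# drops even with base money and M2 unchanged: the collateral-shortage effect.
print(round((broad_money + source_collateral * 1.5) / monetary_base, 2))  # 5.8

On this toy accounting, a shorter collateral chain tightens overall financial lubrication even when the conventional multiplier is unchanged, which is the sense in which excess bank reserves alone understate the liquidity squeeze.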
Conclusion
“Monetary” policy is currently being undertaken in uncharted territory and may change some fundamental assumptions that link monetary and macro-financial policies. Central banks are considering whether and how to augment the apparently “failed” transmission mechanism and, in so doing, will need to consider the role that collateral plays as financial lubrication (see also Debelle, 2012). Swaps of “good” for “bad” collateral may become part of the standard toolkit. If so, the fiscal aspects and risks associated with such policies, which are virtually nil in conventional QE swaps of central bank money for treasuries, are important and cannot be ignored. Furthermore, the issue of institutional accountability and authority to engage in such operations touches at the heart of central bank independence in a democratic society.
These fundamental questions concerning new policy tools and institutional design have arisen at the same time as developed countries have issued massive amounts of new debt. Although the traditional bogeyman of pure seigniorage financing, that is, massive monetary purchases of government debt, may have disappeared from the dark corners of central banks, this does not imply that inflation has been forever arrested. A central bank may thus “stand firm” yet witness rises in the price level that occur to “align the market value of government debt to the value of its expected real backing.” Hence current concerns about the potential limitations that fiscal policy places on monetary policy are well founded, and indeed are novel only to those unfamiliar with similar concerns raised for decades in emerging and developing countries, as well as in the “mature” markets before World War II.
Thursday, April 5, 2012
IMF Background Material for its Assessment of China under the Financial Sector Assessment Program
IMF Releases Background Material for its Assessment of China under the Financial Sector Assessment Program
Press Release No. 12/123
April 5, 2012
A joint International Monetary Fund (IMF) and World Bank assessment of China's financial system was undertaken during 2010 under the Financial Sector Assessment Program (FSAP). The Financial System Stability Assessment (FSSA) report, which is the main IMF output of the FSAP process, was discussed by the Executive Board of the IMF at the time of the annual Article IV discussion in July 2011.
The FSSA report was published on Monday, November 14, 2011. As background for the FSSA, the FSAP mission undertook comprehensive assessments of the financial regulatory infrastructure, and Detailed Assessment Reports of China's observance of international financial standards were prepared during the FSAP exercise. At the request of the Chinese authorities, these five reports are being released today.
The documents published are as follows:
Detailed Assessment of Observance Reports
- Observance of Basel Core Principles for Effective Banking Supervision
- Observance of IAIS Insurance Core Principles
- Observance of IOSCO Objectives and Principles of Securities Regulation
- Observance of CPSS Core Principles for Systemically Important Payment Systems
- Observance of CPSS-IOSCO Recommendations for Securities Settlement Systems and Central Counterparties
The FSAP is a comprehensive and in-depth analysis of a country’s financial sector. The FSAP findings provide inputs to the IMF’s broader surveillance of its member countries’ economies, known as Article IV consultations. The focus of the FSAP assessments is to gauge the stability of the financial sector and to assess its potential contribution to growth. To assess financial stability, an FSAP examines the soundness of the banks and other financial institutions, conducts stress tests, rates the quality of financial regulation and supervision against accepted international standards, and evaluates the ability of country authorities to intervene effectively in case of a financial crisis. Assessments in developing and emerging market countries are done by the IMF jointly with the World Bank; those in advanced economies are done by the IMF alone.
This is the first time the Chinese financial system has undergone an FSAP assessment.
Since the FSAP was launched in 1999, more than 130 countries have volunteered to undergo these examinations (many countries more than once), with another 35 or so currently underway or in the pipeline. Following the recent global financial crisis, demand for FSAP assessments has been rising, and all G-20 countries have made a commitment to undergo regular assessments.
For additional information on the program, see the Factsheet and FAQs.
Original link: http://www.imf.org/external/np/sec/pr/2012/pr12123.htm
Management Tips from the Wall Street Journal
Management Tips from the Wall Street Journal
Developing a Leadership Style
Leadership Styles
What do Managers do?
Leadership in a Crisis – How To Be a Leader
What are the Common Mistakes of New Managers?
What is the Difference Between Management and Leadership?
How Can Young Women Develop a Leadership Style?
Managing Your People
How to Motivate Workers in Tough Times
Motivating Employees
How to Manage Different Generations
How to Develop Future Leaders
How to Reduce Employee Turnover
Should I Rank My Employees?
How to Keep Your Most Talented People
Should I Use Email?
How to Write Memos
Recruiting, Hiring and Firing
Conducting Employment Interviews – Hiring How To
How to Hire New People
How to Make Layoffs
What are Alternatives to Layoffs?
How to Reduce Employee Turnover
Should I Rank My Employees?
How to Keep Your Most Talented People
Building a Workplace Culture
How to Increase Workplace Diversity
How to Create a Culture of Candor
How to Change Your Organization’s Culture
How to Create a Culture of Action in the Workplace
Strategy
What is Strategy?
How to Set Goals for Employees
What Management Strategy Should I Use in an Economic Downturn?
What is Blue Ocean Strategy?
Execution
What are the Keys to Good Execution?
How to Create a Culture of Action in the Workplace
Innovation
How to Innovate in a Downturn
How to Change Your Organization’s Culture
What is Blue Ocean Strategy?
Managing Change
How to Motivate Workers in Tough Times
Leadership in a Crisis – How To Be a Leader
What Management Strategy Should I Use in an Economic Downturn?
How to Change Your Organization’s Culture
guides.wsj.com/management/