Overview: Tue, May 14

Jeremy Stein

Thu, October 11, 2012
Brookings Institution

Some observers have argued that a long period of low rates can create incentives among market participants (such as banks, insurance companies, and pension funds) to reach for yield by taking on higher levels of risk with adverse consequences for stability. These concerns should be taken very seriously, and a lot of work at the Fed is devoted to monitoring such risks. A short summary would be that there is some qualitative evidence of reaching-for-yield behavior in certain segments of the market, but that we are not seeing anything quantitatively alarming at this point. Of course, the worry is that one often sees only the tip of the iceberg in these kinds of situations, so one needs to be cautious in interpreting the data.

Thu, October 11, 2012
Brookings Institution

I believe that the LSAP component of the statement helped bolster the credibility of the forward guidance component by pairing a declaration about future intentions with an immediate and concrete set of actions. And I suspect that this complementarity helps explain the strong positive reaction of the stock market to the release of the statement.

Fri, November 30, 2012
Boston University/Boston Fed Conference on Macro-Finance Linkages

The bottom line is that I suspect that mortgage purchases may confer more macroeconomic stimulus dollar-for-dollar than Treasury purchases.

Mon, December 17, 2012
Global Research Forum, International Finance and Macroeconomics, Sponsored by the European Central Bank

[Our] analysis underscores that the Federal Reserve's temporary dollar liquidity swap lines with the European Central Bank and other central banks are an effective response to stresses in dollar funding markets. Last week, the FOMC approved the extension of these swap lines through February 1, 2014. These lines have helped avert fire sales of dollar assets and maintain the flow of credit to U.S. households and firms.

Thu, February 07, 2013
Federal Reserve Bank of St. Louis

Let me suggest three factors that can contribute to {credit market} overheating. The first is financial innovation. While financial innovation has provided important benefits to society, the institutions perspective warns of a dark side, which is that innovation can create new ways for agents to write puts that are not captured by existing rules…

The second closely related factor on my list is changes in regulation. New regulation will tend to spur further innovation, as market participants attempt to minimize the private costs created by new rules. And it may also open up new loopholes, some of which may be exploited by variants on already existing instruments.

The third factor that can lead to overheating is a change in the economic environment that alters the risk-taking incentives of agents making credit decisions. For example, a prolonged period of low interest rates, of the sort we are experiencing today, can create incentives for agents to take on greater duration or credit risks, or to employ additional financial leverage, in an effort to "reach for yield."

Thu, February 07, 2013
Federal Reserve Bank of St. Louis

It is sometimes argued that … policymakers should follow what might be called a decoupling approach. That is, monetary policy should restrict its attention to the dual mandate goals of price stability and maximum employment, while the full battery of supervisory and regulatory tools should be used to safeguard financial stability. There are several arguments in favor of this approach. First, monetary policy can be a blunt tool for dealing with financial stability concerns…

A related concern is that monetary policy already has its hands full with the dual mandate, and that if it is also made partially responsible for financial stability, it will have more objectives than instruments at its disposal and won't do as well with any of its tasks…

Nevertheless, as we move forward, I believe it will be important to keep an open mind and avoid adhering to the decoupling philosophy too rigidly. In spite of the caveats I just described, I can imagine situations where it might make sense to enlist monetary policy tools in the pursuit of financial stability. Let me offer three observations in support of this perspective. First, despite much recent progress, supervisory and regulatory tools remain imperfect in their ability to promptly address many sorts of financial stability concerns...

Second, while monetary policy may not be quite the right tool for the job, it has one important advantage relative to supervision and regulation--namely that it gets in all of the cracks. The one thing that a commercial bank, a broker-dealer, an offshore hedge fund, and a special purpose ABCP vehicle have in common is that they all face the same set of market interest rates. To the extent that market rates exert an influence on risk appetite, or on the incentives to engage in maturity transformation, changes in rates may reach into corners of the market that supervision and regulation cannot.

Third, in response to concerns about numbers of instruments, we have seen in recent years that the monetary policy toolkit consists of more than just a single instrument…

One of the most difficult jobs that central banks face is in dealing with episodes of credit market overheating that pose a potential threat to financial stability. We ought to be open-minded in thinking about how to best use the full array of instruments at our disposal. Indeed, in some cases, it may be that the only way to achieve a meaningfully macroprudential approach to financial stability is by allowing for some greater overlap in the goals of monetary policy and regulation.

Thu, February 07, 2013
Federal Reserve Bank of St. Louis

Continuing on with the theme of maturity transformation, the next brief stop on the tour is the agency mortgage real estate investment trust (REIT) sector. These agency REITs buy agency mortgage-backed securities (MBS), fund them largely in the short-term repo market in what is essentially a levered carry trade, and are required to pass through at least 90 percent of the net interest to their investors as dividends. As shown in exhibit 7, they have grown rapidly in the past few years, from $152 billion at year-end 2010 to $398 billion at the end of the third quarter of 2012.

One interesting aspect of this business model is that its economic viability is sensitive to conditions in both the MBS market and the repo market. If MBS yields decline, or the repo rate rises, the ability of mortgage REITs to generate current income based on the spread between the two is correspondingly reduced.
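
To make that spread sensitivity concrete, here is a minimal numerical sketch of a levered agency-REIT carry trade. The balance-sheet figures, yields, and leverage ratio are hypothetical assumptions chosen for illustration, not numbers from the speech.

    # Hypothetical levered carry trade: buy agency MBS, fund most of the position
    # with short-term repo.  All numbers are illustrative assumptions.

    def reit_net_interest(equity, leverage, mbs_yield, repo_rate):
        """Annual net interest income for a REIT funding MBS with repo.

        equity    -- shareholder capital, in dollars
        leverage  -- total assets / equity
        mbs_yield -- yield earned on the agency MBS (decimal)
        repo_rate -- short-term repo funding cost (decimal)
        """
        assets = equity * leverage
        debt = assets - equity
        return assets * mbs_yield - debt * repo_rate

    base = reit_net_interest(equity=1e9, leverage=8, mbs_yield=0.030, repo_rate=0.005)
    squeezed = reit_net_interest(equity=1e9, leverage=8, mbs_yield=0.025, repo_rate=0.010)

    print(f"wide spread:   ${base / 1e6:,.0f} million of net interest")      # about $205 million
    print(f"narrow spread: ${squeezed / 1e6:,.0f} million of net interest")  # about $130 million
    print(f"minimum 90% payout on the narrow spread: ${0.9 * squeezed / 1e6:,.0f} million")

With eight turns of leverage in this sketch, a 50-basis-point fall in the MBS yield combined with a 50-basis-point rise in the repo rate cuts distributable income by more than a third.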

Wed, April 17, 2013
International Monetary Fund

I should note at the outset that solving the TBTF [Too Big to Fail] problem has two distinct aspects. First, and most obviously, one goal is to get to the point where all market participants understand with certainty that if a large SIFI [Systemically Important Financial Institution] were to fail, the losses would fall on its shareholders and creditors, and taxpayers would have no exposure. However, this is only a necessary condition for success, but not a sufficient one. A second aim is that the failure of a SIFI must not impose significant spillovers on the rest of the financial system, in the form of contagion effects, fire sales, widespread credit crunches, and the like. Clearly, these two goals are closely related. If policy does a better job of mitigating spillovers, it becomes more credible to claim that a SIFI will be allowed to fail without government bailout.

...

Still, we are quite a way from having fully solved the policy problems associated with SIFIs. For one thing, the market still appears to attach some probability to the government bailing out the creditors of a SIFI; this can be seen in the ratings uplift granted to large banks based on the ratings agencies' assessment of the probability of government support. While this uplift seems to have shrunk to some degree since the passage of Dodd-Frank, it is still significant. All else equal, this uplift confers a funding subsidy to the largest financial firms.

Moreover, as I noted earlier, even if bailouts were commonly understood to be a zero-probability event, the problem of spillovers remains. It is one thing to believe that a SIFI will be allowed to fail without government support; it is another to believe that such failure will not inflict significant damage on other parts of the financial system. In the presence of such externalities, financial firms may still have excessive private incentives to remain big, complicated, and interconnected, because they reap any benefits--for example, in terms of economies of scale and scope--but don't bear all the social costs.

Wed, April 17, 2013
International Monetary Fund

How can we do better? Some have argued that the current policy path is not working, and that we need to take a fundamentally different approach. Such an alternative approach might include, for example, outright caps on the size of individual banks, or a return to Glass-Steagall-type activity limits.

My own view is somewhat different. While I agree that we have a long way to go, I believe that the way to get there is not by abandoning the current reform agenda, but rather by sticking to its broad contours and ratcheting up its forcefulness on a number of dimensions. In this spirit, two ideas merit consideration: (1) an increase in the slope of the capital-surcharge schedule that is applied to large complex firms, and (2) the imposition at the holding company level of a substantial senior debt requirement to facilitate resolution under Title II of Dodd-Frank. In parallel with the approach to capital surcharges, a senior debt requirement could also potentially be made a function of an institution's systemic footprint.

Wed, April 17, 2013
International Monetary Fund

Let me briefly mention another piece of the puzzle that I think is sometimes overlooked, but strikes me as having the potential to play an important complementary role in efforts to address the TBTF problem--namely, corporate governance. Suppose we do everything right with respect to capital regulation, and set up a system of capital surcharges that imposes a strong incentive to shrink on those institutions that don't create large synergies. How would the adjustment process actually play out? The first step would be for shareholders, seeing an inadequate return on capital, to sell their shares, driving the bank's stock price down. And the second step would be for management, seeking to restore shareholder value, to respond by selectively shedding assets.

But as decades of research in corporate finance have taught us, we shouldn't take the second step for granted. Numerous studies across a wide range of industries have documented how difficult it is for managers to voluntarily downsize their firms, even when the stock market is sending a clear signal that downsizing would be in the interests of outside shareholders. Often, change of this sort requires the application of some external force, be it from the market for corporate control, an activist investor, or a strong and independent board. As we move forward, we should keep these governance mechanisms in mind, and do what we can to ensure that they support the broader regulatory strategy.

Wed, April 17, 2013
International Monetary Fund

Where do we stand with respect to fixing the problem of "too big to fail" (TBTF)? Are we making satisfactory progress, or is it time to think about further measures?

...

Suppose instead we attack the problem by imposing capital requirements that are an increasing function of bank size. This price-based approach creates some incentive for all three banks to shrink, but lets them balance this incentive against the scale benefits that they realize by staying big. In this case, we would expect A, with its significant scale economies, to absorb the tax hit and choose to remain large, while B and C, with more modest scale economies, would be expected to shrink more radically. In other words, price-based regulation is more flexible, in that it leaves the size decision to bank managers, who can then base their decision on their own understanding of the synergies--or lack thereof--in their respective businesses.
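
A toy calculation can make the flexibility point concrete. In the sketch below, each bank picks the size that maximizes its scale benefits net of a size-based surcharge cost; the functional forms and parameter values are made-up assumptions, not anything from the speech.

    # Toy price-based regulation: a capital surcharge whose cost rises with size.
    # Each bank trades scale benefits off against the surcharge.  Functional forms
    # and parameters are illustrative assumptions only.
    import numpy as np

    def chosen_size(scale_benefit, surcharge_rate, sizes=np.linspace(1.0, 100.0, 2000)):
        """Size that maximizes scale benefits net of the surcharge cost."""
        net_value = scale_benefit * np.log(sizes) - surcharge_rate * sizes**2
        return sizes[np.argmax(net_value)]

    surcharge_rate = 0.01  # a steeper schedule would push every bank smaller
    for name, benefit in [("Bank A (large scale economies)", 50.0),
                          ("Bank B (modest scale economies)", 10.0),
                          ("Bank C (modest scale economies)", 5.0)]:
        print(f"{name}: chooses size ~{chosen_size(benefit, surcharge_rate):.0f}")

Under these assumed numbers, A chooses to remain large while B and C choose to be considerably smaller, which is the sense in which the price-based approach leaves the size decision to the banks themselves.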


Fri, June 28, 2013
Council on Foreign Relations

However, a key point is that as we approach an FOMC meeting where an adjustment decision looms, it is appropriate to give relatively heavy weight to the accumulated stock of progress toward our labor market objective and to not be excessively sensitive to the sort of near-term momentum captured by, for example, the last payroll number that comes in just before the meeting.

In part, this principle just reflects sound statistical inference--one doesn't want to put too much weight on one or two noisy observations. But there is more to it than that. Not only do FOMC actions shape market expectations, but the converse is true as well: Market expectations influence FOMC actions. It is difficult for the Committee to take an action at any meeting that is wholly unanticipated because we don't want to create undue market volatility. However, when there is a two-way feedback between financial conditions and FOMC actions, an initial perception that noisy recent data play a central role in the policy process can become somewhat self-fulfilling and can itself be the cause of extraneous volatility in asset prices.

Thus both in an effort to make reliable judgments about the state of the economy, as well as to reduce the possibility of an undesirable feedback loop, the best approach is for the Committee to be clear that in making a decision in, say, September, it will give primary weight to the large stock of news that has accumulated since the inception of the program and will not be unduly influenced by whatever data releases arrive in the few weeks before the meeting--as salient as these releases may appear to be to market participants. I should emphasize that this would not mean abandoning the premise that the program as a whole should be both data-dependent and forward looking. Even if a data release from early September does not exert a strong influence on the decision to make an adjustment at the September meeting, that release will remain relevant for future decisions. If the news is bad, and it is confirmed by further bad news in October and November, this would suggest that the 7 percent unemployment goal is likely to be further away, and the remainder of the program would be extended accordingly.

Fri, June 28, 2013
Council on Foreign Relations

"Seven percent is an indicative goal," Mr. Stein said to the Council on Foreign Relations in New York. "On the one hand, we'd like some ability to have some specificity, and to do that you have to pick a number ... but it's an attempt to provide clarity. It doesn't mean we're going to shut out other relevant data on the labor market."

Thu, September 26, 2013
Center for Financial Studies

An alternative hypothesis is that our policies were indeed responsible for the very low level of long-term rates, but in part through a more indirect channel. According to this view, real and nominal term premiums were low not just because we were buying long-term bonds, but because our policies induced an outward shift in the demand curve of other investors, which led them to do more buying on our behalf--because we both gave them an incentive to reach for yield, and at the same time provided a set of implicit assurances that tamped down volatility and made it feel safer to lever aggressively in pursuit of that extra yield. In the spirit of my earlier comments, let's call this the "Fed recruitment" view.

I take the events of the past few months to be evidence in favor of the recruitment view…

Again, the existence of this recruitment channel is helpful; without it, I suspect that our policies would have considerably less potency and, therefore, less ability to provide needed support to the real economy. At the same time, an understanding of this channel highlights the uncertainties that inevitably accompany it. If the Fed's control of long-term rates depends in substantial part on the induced buying and selling behavior of other investors, our grip on the steering wheel is not as tight as it otherwise might be. Even if we make only small changes to the policy parameters that we control directly, long-term rates can be substantially more volatile. And if we push the recruits very hard--as we arguably have over the past year or so--it is probably more likely that we are going to see a change in their behavior and hence a sharp movement in rates at some point. Thus, if it is a goal of policy to push term premiums far down into negative territory, one should be prepared to accept that this approach may bring with it an elevated conditional volatility of rates and spreads.

Thu, September 26, 2013
Center for Financial Studies

I voted with the majority of the Committee to continue our asset purchase program at its current flow rate of $85 billion per month. It was a close call for me, but I did so because I continue to support our efforts to create a highly accommodative monetary environment so as to help the recovery along by using both asset purchases and our threshold-based approach to forward guidance.

How should the pace of purchases evolve going forward? The Chairman laid out a framework for winding down purchases in his June press conference. Within that framework, I would have been comfortable with the FOMC's beginning to taper its asset purchases at the September meeting. But whether we start in September or a bit later is not in itself the key issue--the difference in the overall amount of securities we buy will be modest. What is much more important is doing everything we can to ensure that this difficult transition is implemented in as transparent and predictable a manner as possible. On this front, I think it is safe to say that there may be room for improvement.

Achieving the desired transparency and predictability doesn't require that the wind-down happen in a way that is independent of incoming data. But I do think that, at this stage of the asset purchase program, there would be a great deal of merit in trying to find a way to make the link to observable data as mechanical as possible. For this reason, my personal preference would be to make future step-downs a completely deterministic function of a labor market indicator, such as the unemployment rate or cumulative payroll growth over some period. For example, one could cut monthly purchases by a set amount for each further 10 basis point decline in the unemployment rate. Obviously the unemployment rate is not a perfect summary statistic for our labor market objectives, but I believe that this approach would help to reduce uncertainty about our reaction function and the attendant market volatility. Moreover, we would still retain the flexibility to respond to other contingencies (such as declines in labor force participation) via our other more conventional policy tool--namely, the path of short-term rates.
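
A rule of the kind described, with each further 10-basis-point decline in the unemployment rate triggering a fixed cut in the monthly purchase pace, could be written down very simply. The sketch below uses a placeholder reference rate, starting pace, and step size; those parameters are assumptions for illustration, not numbers proposed in the speech.

    # Sketch of a deterministic taper rule: cut monthly purchases by a fixed amount
    # for each further 10-basis-point decline in the unemployment rate.  The
    # reference rate, starting pace, and step size are placeholder assumptions.

    def monthly_purchases(unemployment_rate,
                          reference_rate=7.3,  # rate when the rule takes effect (assumed)
                          starting_pace=85.0,  # $ billions per month
                          cut_per_tenth=5.0):  # $ billions cut per 0.1 pp decline (assumed)
        tenths_of_decline = max(0, int(round((reference_rate - unemployment_rate) / 0.1)))
        return max(0.0, starting_pace - cut_per_tenth * tenths_of_decline)

    for u in (7.3, 7.2, 7.0, 6.5):
        print(f"unemployment {u:.1f}% -> purchases ${monthly_purchases(u):.0f} billion per month")

Because the mapping from the data to the purchase pace is fully mechanical, the market would not need to guess at the Committee's reaction function; surprises would come only from the data themselves.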

Fri, October 04, 2013
Federal Reserve Bank of New York

I'm not arguing that the very low risk-based charges on repo lending in Basel III are "wrong" in any microprudential sense. After all, they are designed to solve a different problem--that of ensuring bank solvency. And if a bank holding company's broker-dealer sub makes a repo loan of short maturity that is sufficiently well-collateralized, it may be at minimal risk of bearing any losses--precisely because it operates on the premise that it can dump the collateral and get out of town before things get too ugly. The risk-averse lenders in the triparty market--who, in turn, provide financing to the dealer--operate under the same premise. As I noted earlier, these defensive reactions by providers of repo finance mean that the costs of fire sales are likely to be felt elsewhere in the financial system.



My aim here has been to survey the landscape--to give a sense of the possible tools that can be used to address the fire-sales problem in SFTs--without making any particularly pointed recommendations. I would guess that a sensible path forward might involve drawing on some mix of the latter set of instruments that I discussed: namely, capital surcharges, modifications to the liquidity regulation framework, and universal margin requirements. As we go down this path, conceptual purity may have to be sacrificed in some places to deliver pragmatic and institutionally feasible results. It is unlikely that we will find singular and completely satisfactory fixes.

Thu, October 17, 2013
National Bureau of Economic Research

The theme of this conference is, "Lessons from the Financial Crisis for Monetary Policy." Given the opportunity to speak about this topic, my first thought was that I should organize my remarks around the familiar "lean versus clean" debate. The traditional, pre-crisis framing of the question went something like this: Should policymakers rely on ex ante measures to lean against potential financial imbalances as they build up, and thereby lower the probability of a bad event ever happening, or should they do most of their work ex post, focusing on the clean-up?

Post-crisis, the emphasis in the debate has shifted. I think it's safe to assume that nobody in this room would now argue that we should be putting all our eggs in the "clean" basket.

Thu, October 17, 2013
National Bureau of Economic Research

I also believe that much of the promise of the CCAR framework lies in its potential to help us achieve a better outcome not just in normal times, but also in the important in-between times, in the early stages of a crisis. In other words, when thinking about the design of CCAR, one of the questions I keep coming back to is this: Suppose we were granted a do-over, and it was late 2007. If we had the CCAR process in place, how would things have turned out differently? Would we have seen significantly more equity issuance at this earlier date by the big firms, and hence a better outcome for the real economy?

On the one hand, there is some reason for optimism on this score. After all, the original stress tests--the Supervisory Capital Assessment Program (SCAP) in May 2009--provided the impetus for a significant recapitalization of the banking system. More than $100 billion of new common equity was raised from the private sector in the six months after the SCAP, and in many ways it was a watershed event in the course of the crisis.

Moreover, the current CCAR framework gives the Federal Reserve both the authority and the independent analytical basis to require external equity issues in the event that, under the stress scenario, a firm's post-stress, tier 1 common equity ratio is below 5 percent, and the ratio cannot be restored simply by suspending dividends and share repurchases.
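
That trigger can be sketched as a simple check: if the post-stress tier 1 common ratio is below 5 percent even after planned dividends and repurchases are suspended, the remaining shortfall would have to be met with new external equity. The balance-sheet figures below are purely hypothetical, and the calculation is a simplification of the actual CCAR mechanics.

    # Simplified sketch of the CCAR trigger described above.  Figures are in
    # billions of dollars and are hypothetical assumptions.

    MIN_POST_STRESS_RATIO = 0.05  # 5 percent tier 1 common, post-stress

    def external_equity_needed(post_stress_t1c, risk_weighted_assets, planned_payouts):
        """Equity shortfall remaining after suspending planned dividends/buybacks."""
        capital_if_payouts_suspended = post_stress_t1c + planned_payouts
        required_capital = MIN_POST_STRESS_RATIO * risk_weighted_assets
        return max(0.0, required_capital - capital_if_payouts_suspended)

    # Hypothetical firm: $40 billion of post-stress tier 1 common (net of planned
    # payouts), $1,000 billion of risk-weighted assets, $4 billion of planned payouts.
    shortfall = external_equity_needed(40.0, 1000.0, 4.0)
    print(f"new external equity required: ${shortfall:.0f} billion")  # $6 billion under these assumptions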

Sun, November 03, 2013
Workshop on Fire Sales

Governor Tarullo alluded to the possibility of liquidity-linked capital surcharges that would effectively augment the existing regime of risk-based capital requirements. Depending on how these surcharges are structured… {they might come} quite close to the Pigouvian notion of directly taxing this specific activity. As compared to relying on the leverage ratio to implement the tax, this approach has the advantage that it is more likely to treat institutions uniformly: the tax on SFTs would not be a function of the overall business model of a given firm, but rather just the characteristics of its SFT book. This is because the surcharge is embedded into the existing risk-based capital regime, which should in principle be the constraint that binds for most firms.

There are a couple of important qualifications, however. First, going this route would involve a significant conceptual departure from the notion of capital as a prudential requirement at the firm level. As noted previously, a large matched repo book may entail relatively little solvency or liquidity risk for the broker-dealer firm that intermediates this market. So, to the extent that one imposes a capital surcharge on the broker-dealer, one would be doing so with the express intention of creating a tax that is passed on to the downstream borrower (i.e., to the hedge fund, in my example).

Second, and a direct corollary of the first, imposing the tax at the level of the intermediary naturally raises the question of disintermediation. In other words, might the SFT market respond to the tax by evolving so that large hedge funds are more readily able to borrow via repo directly from money market funds and securities lenders, without having to go through broker-dealers? I can't say that I have a good understanding of the institutional factors that might facilitate or impede such an evolution. But if the market ultimately does evolve in this way, it would be hard to argue that the underlying fire-sales problem has been addressed.

Fri, January 03, 2014
American Economic Association

Let me summarize. The creation of private money--that is, safe claims that are useful for transactions purposes--is obviously central to what banks do. But safe claims can be manufactured from risky collateral in different ways, and banks are not the only type of intermediary that engages in this activity. What makes banks unique is that they use a particular combination of financial and institutional arrangements--including capital, deposit insurance, and access to a lender of last resort--as well as substantial investments in bricks and mortar, to create liabilities that are not only safe and money-like, but also relatively stable and thus unlikely to run at the first sign of trouble. This is in contrast to shadow banks, who create money-like claims more cheaply, by relying on an early exit option, and who are therefore more vulnerable to runs and the accompanying fire-sale risk. I have argued that there is a synergy between banks' stable funding model and their investing in assets that have modest fundamental risk but whose prices can fall significantly below fundamental values in a bad state of the world. This synergy helps explain both why deposit-taking banks might have a comparative advantage at making information-intensive loans and, at the same time, why they tend to hold the specific types of securities that they do.

Thu, March 20, 2014
International Research Forum on Monetary Policy

Let me preview my bottom line. I am going to try to make the case that, all else being equal, monetary policy should be less accommodative--by which I mean that it should be willing to tolerate a larger forecast shortfall of the path of the unemployment rate from its full-employment level--when estimates of risk premiums in the bond market are abnormally low. These risk premiums include the term premium on Treasury securities, as well as the expected returns to investors from bearing the credit risk on, for example, corporate bonds and asset-backed securities.

Tue, May 06, 2014
Money Marketeers of NYU

One hypothesis is that going into the May-June period, there was a wide divergence of opinion among market participants as to the future of the asset purchase program. In particular, however reasonable the median expectation, there were a number of "QE-infinity" optimists who expected our purchases to go on for a very long time. And, crucially, in asset markets, it is often the beliefs of the most optimistic investors--rather than those of the moderates--that drive prices, as they are the ones most willing to take large positions based on their beliefs. Moreover, this same optimism can motivate them to leverage their positions aggressively.

In this setting, a piece of monetary policy communication that merely "clarifies" things--that is, one that delivers the median market expectation but truncates some of the more extreme possibilities--can have powerful effects. Highly levered optimists are forced to unwind their positions, which then must be absorbed by other investors with lower valuations. This effect is likely to be amplified if the preannouncement period was one with unusually low volatility, as was the case in early May 2013, when the implied volatility on long-tenor swaptions was near historical lows. To the extent that some of the optimists are operating subject to value-at-risk constraints, low volatility is likely to induce them to take on more leverage. If volatility rises sharply in the wake of an announcement, this increase will tend to exacerbate the unwind effect.
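
The value-at-risk mechanism can be illustrated with a stylized calculation: if a levered investor sizes positions so that daily VaR stays within a fixed risk budget, the permissible position is inversely proportional to volatility, so a jump in volatility forces an unwind. The budget and volatility numbers below are illustrative assumptions.

    # Stylized VaR constraint: position sized so that 1-day, 99% VaR stays within a
    # fixed budget.  All figures are illustrative assumptions.

    Z_99 = 2.33  # one-sided 99th-percentile normal quantile

    def max_position(var_budget, daily_vol):
        """Largest notional position consistent with the VaR budget."""
        return var_budget / (Z_99 * daily_vol)

    budget = 10e6                       # $10 million daily VaR budget (assumed)
    low_vol, high_vol = 0.002, 0.006    # daily return volatility before/after the shock

    before = max_position(budget, low_vol)
    after = max_position(budget, high_vol)
    print(f"position at low volatility:  ${before / 1e9:.1f} billion")
    print(f"position at high volatility: ${after / 1e9:.1f} billion")
    print(f"forced unwind:               ${(before - after) / 1e9:.1f} billion")

In this sketch a tripling of volatility cuts the permissible position by two-thirds, which is the amplification channel the passage describes.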

Tue, May 06, 2014
Money Marketeers of NYU

In this spirit, I think the FOMC may face a similar communications challenge as the nature of the forward guidance for the path of short-term interest rates evolves over the next couple of years. The 6.5 percent unemployment threshold that we had until recently was not only quantitative in nature, but it also represented a relatively firm commitment on the part of the Committee. While this kind of commitment was entirely appropriate at the zero lower bound, as policy eventually normalizes, guidance will necessarily take a different form; it will be both more qualitative as well as less deterministic. So, for example, when I fill in my "dot" for 2016 in the Survey of Economic Projections, I think of myself as writing down not a commitment for where the federal funds rate will be at that time, but only my best forecast, and one that is highly uncertain at that.

Chair Yellen made a similar point in her March press conference:

More generally, you know, the end of 2016 is a long way out. Monetary policy will be geared to evolving conditions in the economy, and the public does need to understand that as those views evolve, the Committee's views on policy will likely evolve with them. And that's a kind of uncertainty that the Committee wouldn't want to eliminate completely from its guidance because we want the policy we put in place to be appropriate to the economic conditions that will prevail years down the road.

Tue, May 06, 2014
Money Marketeers of NYU

One advantage of going with an open-ended approach is that when we rolled out QE3 in September 2012, we were able to make a forceful statement that we would continue with asset purchases until we observed, as Chairman Bernanke put it in his postmeeting press conference, a "substantial improvement in the outlook for the labor market." We were able to do so even though I suspect that, had we tried to put a number to it, there would have been considerable disagreement among Committee members as to the exact meaning of "substantial improvement." So in this case, leaving the Committee's reaction function incompletely worked out allowed us to move forward with a major policy initiative in a timely manner, which otherwise might have been very difficult. Of course, the flip side of this reaction-function incompleteness is that it becomes harder for the Committee to precisely communicate its future intentions to the market--in part because these future intentions have not yet been fully fleshed out. Rather, it makes more sense in this case to think of the Committee's reaction function as being something that is not entirely predetermined and that will naturally tend to evolve over time.

Tue, May 06, 2014
Money Marketeers of NYU

Of course, if the Committee is using asset purchases to signal its policy intentions, then the information content of purchase decisions depends importantly on what the public expects it to do. For example, if it is early 2013 and the market has somehow arrived at the belief that the Committee will continue buying assets at an $85 billion per month clip so long as monthly payroll growth does not exceed 200,000 jobs per month for three months in a row, then even a small cut down to $80 billion per month is likely to elicit a powerful market reaction--not because the $5 billion cut is consequential in and of itself, but because of the message it sends about the Committee's policy leanings more generally. But then you can see the feedback loop that arises: The more strongly the market becomes attached to this belief--even if it was initially somewhat arbitrary--the more wary the Committee must be of making an unexpected change, and this wariness further reinforces the market's initial belief. In this sense, the Committee's reaction function for the appropriate quantity of asset purchases under the QE3 program is not only evolving over time, it is coevolving along with the market's beliefs.
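
The hypothetical market belief in this passage--purchases continuing at $85 billion per month so long as payroll growth has not exceeded 200,000 for three straight months--can be written as a one-line test. The payroll histories below are invented for illustration.

    # The explicitly hypothetical belief from the passage: a taper is expected only
    # once payroll gains exceed 200,000 for three consecutive months.

    def taper_expected(recent_payrolls, threshold=200_000, run=3):
        """True if the last `run` monthly payroll gains all exceeded the threshold."""
        return len(recent_payrolls) >= run and all(p > threshold for p in recent_payrolls[-run:])

    for history in ([150_000, 185_000, 195_000], [210_000, 230_000, 205_000]):
        pace = 80.0 if taper_expected(history) else 85.0  # believed pace, $ billions/month
        print(f"payrolls {history} -> market expects ${pace:.0f} billion per month")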

Tue, May 06, 2014
Money Marketeers of NYU

Digging deeper, though, it is important to recognize that part of the reason that the bond market would react so strongly to a sharp change in the short-term policy rate is that we have settled into a self-sustaining equilibrium in which the Fed tends to act gradually, and the market has come to expect that gradualism. In other words, the market has learned that a given increase in the federal funds rate at the beginning of a tightening cycle is typically followed by many more moves in the same direction, so there is naturally a multiplier effect on long-term rates of a given change in short-term rates. And that multiplier depends on the expected degree of gradualism: The more inertia there is in Fed policy, the more significant is any small move, and hence the larger is the multiplier. Thus, an expectation of gradualism on the part of the market makes it all the more important for the Fed to adjust the policy rate gradually, thereby fulfilling the market's beliefs.

This line of reasoning can be thought of as a piece of positive economics--that is, it may shed some light on why the world is as it is. But what, if any, are its normative implications? On the one hand, as I have emphasized, a gradualist approach to monetary policy is likely to be the best way for us to deliver on our mandate at any point in time, taking as given the market's expectations for Fed behavior. As such, it would probably not make sense, in the short run, for the Committee to deviate from this approach--with an unprepared market, the result might well be an undesirable degree of market turbulence, with attendant negative effects on the real economy.

On the other hand, there is clearly a time-consistency problem lurking here; the world we are in need not be the best of all possible worlds. In particular, it is interesting to think about an alternative long-run equilibrium in which the Fed has somehow developed a reputation for worrying less about the immediate bond-market effect of its actions and is known to react more aggressively to changes in economic conditions. In this alternative equilibrium, the market would expect the Fed to behave in a less gradualist fashion, so any given move in the funds rate would have a smaller multiplier effect on long rates. Thus, it is possible that in this alternative world, market volatility would be no higher than it is in our world, but the Fed would nevertheless be able to adjust policy more nimbly when it needed to.
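
Under the expectations hypothesis, a long-term yield is roughly the average of expected future short rates, so the multiplier on an initial move depends on how many follow-on moves the market expects. The sketch below compares two invented expected paths, a gradualist one and a less inertial one; the step sizes, horizon, and terminal rate are all assumptions for illustration.

    # Stylized "gradualism multiplier" under the expectations hypothesis: the long
    # yield is approximated by the average of the expected short-rate path.  Paths,
    # step sizes, and the terminal rate are illustrative assumptions.
    import numpy as np

    QUARTERS = 40  # ten-year horizon in quarterly steps

    def long_yield(expected_short_path):
        """Expectations-hypothesis long yield: average of the expected short rates."""
        return float(np.mean(expected_short_path))

    def hike_path(step, terminal=3.0):
        """Expected path: hikes of `step` percentage points per quarter up to a terminal rate."""
        return np.minimum(step * np.arange(1, QUARTERS + 1), terminal)

    baseline = long_yield(np.zeros(QUARTERS))  # no-hike baseline path at zero

    for label, step in [("gradualist Fed, 25 bp steps", 0.25),
                        ("less inertial Fed, 75 bp steps", 0.75)]:
        response = long_yield(hike_path(step)) - baseline
        print(f"{label}: long-rate response {response:.2f} pp, "
              f"multiplier on the first move {response / step:.1f}x")

In this toy calculation the same terminal rate is reached either way, but the first move carries a far larger multiplier when the market expects many more moves behind it, which is the sense in which gradualism raises the stakes of any single adjustment.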