
Overview: Mon, May 06

Daily Agenda

Time   Indicator/Event               Comment
11:30  13- and 26-wk bill auction    $70 billion apiece
12:50  Barkin (FOMC voter)           On the economic outlook
13:00  Williams (FOMC voter)         Speaks at Milken Institute conference
15:00  STRIPS data                   April data


This Week's MMO

  • MMO for May 6, 2024


    Last week’s Fed and Treasury announcements allowed us to do a lot of forecast housekeeping.  Net Treasury bill issuance between now and the end of September appears likely to be somewhat higher on balance and far more volatile from month to month than we had previously anticipated.  In addition, we discuss the implications of the unexpected increase in the Treasury’s September 30 TGA target and the Fed’s surprising MBS reinvestment guidance. 

Economic Modeling

Loretta Mester

Thu, May 12, 2016

The precision of the forecasts, or lack thereof, needs to be kept in mind when setting monetary policy. We must be forward looking, which means we must rely on models to forecast inflation, but there is no one model that forecasts with much accuracy. The best we can do in this situation is to recognize that there is uncertainty around our forecasts. I am in favor of the FOMC providing some type of error band around its projections. Not only will it help the public understand some of the risks around our forecast, but it will also be a helpful reminder to policymakers that we constantly live with uncertainty. This shouldn’t paralyze us. Instead we should cope with it by looking at the outcomes from multiple models and alternative simulations, using techniques like model averaging, and by continually evaluating the forecasts from the models against incoming data.

The FOMC has been expanding the models it routinely examines as a part of the policymaking process — these include the Board of Governors staff’s large-scale FRB/US model and two smaller-scale DSGE models called EDO and SIGMA, as well as various models maintained and utilized at the Federal Reserve Banks. Researchers are now building model archives to aid in the systematic comparison of empirical results and policy implications across a large set of economic models as an aid to policy analysis. One such archive, The Macroeconomic Model Data Base (MMB), headed by Volker Wieland of Goethe University Frankfurt, currently includes 61 models. Given the state of our knowledge, this seems to be a promising approach to ensuring that policy actions are robust across the span of plausible models of economic dynamics and economic circumstances.
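The model-averaging technique Mester mentions can be sketched in a few lines. This is purely illustrative: the model names, forecasts, and error histories below are hypothetical, not the Fed's actual models or procedure. The sketch weights each model's forecast by the inverse of its recent mean squared error, so models with better track records count for more.

```python
# Minimal sketch of inverse-MSE forecast averaging (all numbers hypothetical).

def average_forecasts(forecasts, past_errors):
    """Combine model forecasts, weighting each by the inverse of its
    mean squared forecast error over a recent evaluation window."""
    weights = {m: 1.0 / (sum(e * e for e in errs) / len(errs))
               for m, errs in past_errors.items()}
    total = sum(weights.values())
    return sum(forecasts[m] * w / total for m, w in weights.items())

# Three hypothetical inflation forecasts (percent) and recent errors.
forecasts = {"frbus_like": 2.1, "dsge_a": 2.4, "dsge_b": 1.9}
past_errors = {"frbus_like": [0.2, -0.1, 0.3],
               "dsge_a":     [0.5, 0.4, -0.6],
               "dsge_b":     [0.1, -0.2, 0.1]}

combined = average_forecasts(forecasts, past_errors)
```

Here the most accurate hypothetical model ("dsge_b") gets the largest weight, pulling the combined forecast below the simple mean of the three.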

Jerome Powell

Tue, June 23, 2015

So I would say that I spent a lifetime working with financial models and they are essential and get you about 80 percent of the way there, but if you think the models are going to make your decision for you, then you’re not going to get very far in life. So, this was my world in the financial markets for many years, it’s really not that different.

You can’t do anything without modeling it, but at the same time, that last 20 percent of judgment and experience and understanding and risk management, the weighing of risks, is critical.

Richard Fisher

Wed, February 11, 2015

My advice is to heed Charles Kindleberger's warning that "different circumstances call for different prescriptions." "The art of economics," he said, "is to choose the right model for the given problem, and to abandon it when the problem changes shape."

Loretta Mester

Fri, May 30, 2014

[F]irms appear to change prices more frequently than predicted by standard macroeconomic models that have been calibrated to match various features of the macroeconomic business cycle. This is a troubling finding in that it suggests a disconnect between the micro data and macro models that are often used to inform monetary policy analysis. Better reconciliation of the models with the empirical facts of the micro data would be a welcome avenue of additional research, and I would expect our models to improve the more we learn.

Charles Plosser

Sat, January 04, 2014

A different conceptual approach to defining a gap is implied by a class of economic models in wide use today: the new Keynesian dynamic stochastic general equilibrium, or DSGE, models. DSGE models explicitly posit that firms have some pricing power; that is, there is imperfect competition, so that a firm can choose to sell more of its output by lowering its price or to sell less of its output by raising its price, and the firm will set its price at a markup over marginal cost to maximize its profits. DSGE models also assume that firms are able to adjust prices only infrequently. This form of sticky prices, together with imperfect competition, allows monetary policy to have real effects in the short run, while remaining neutral for the real side of the economy in the long run. The sticky prices generate distortions that mean allocations and output can be inefficient in the face of shocks.

In these models, the efficient level of output is the level of output that would prevail in the absence of the sticky prices and other market imperfections that allow deviations from perfect competition. In this framework, the relevant output gap to be addressed is the difference between the efficient level and the level generated by the distortions introduced by the sticky prices and market imperfections. The behavior of the efficient level of output is unlikely to be a smooth or slowly evolving series like the CBO concept. In fact, it could be quite volatile and may bear little or no resemblance to the traditional concept of potential used by the CBO and others. Efficient output would be altered by changes in technology that affect productivity or changes in agents' preferences.

The role of monetary policy in these models is to react to economic conditions in a way that minimizes the potential for distortions arising from the price stickiness or other market imperfections. The general policy prescription is to minimize the gap between output and the efficient level of output. In the absence of unexpected events that lead firms to change their desired markups over marginal cost, or other real rigidities like real wage rigidities, this would be equivalent to stabilizing inflation.

William Dudley

Sat, January 04, 2014

But there is much more work to do. In particular, I think we have just scratched the surface in understanding how developments in one area, such as capital and liquidity requirements for large, complex financial institutions, affect other areas, such as effective monetary policy implementation. We still don't have well-developed macro models that incorporate a realistic financial sector. We don't fully understand how large-scale asset purchase programs work to ease financial market conditions: is it the effect of the purchases on the portfolios of private investors, or alternatively is the major channel one of signaling?

Ben Bernanke

Sun, June 02, 2013

Economics is a highly sophisticated field of thought that is superb at explaining to policymakers precisely why the choices they made in the past were wrong. About the future, not so much.

Narayana Kocherlakota

Sun, March 03, 2013

In response to a question about whether he was surprised that the public felt his views had changed radically.

In some ways, yes, I was surprised.

I was surprised by the reaction in the sense that I felt I was putting a lot of weight on the price stability mandate by suggesting that even an inflation outlook—medium-term outlook, two-year outlook—that is a quarter percentage point higher than 2 percent should be viewed as a cause for concern. I’m not saying we’re going to raise rates at that point, just to be clear. But I’m saying it’s a time to consider raising rates.

I felt that I was actually being highly respectful of the price stability mandate, and properly so. With that said, I think it is true that to suggest that unemployment could get as low as 5 1/2 percent without pushing inflation above 2 1/4 percent, that was a change in my thinking relative to where I was in April. That change in my thinking came just because of the data on inflation and reading a ton of work that had been done on the factors generating high unemployment.



I gave a speech about structural unemployment in August 2010 in which I pointed to the shift in the “Beveridge curve.” This is a plot of unemployment and vacancies over time. It has shifted outward, meaning that, roughly speaking, it looks like firms are having a surprisingly hard time filling their job openings given how many people are looking for jobs. There are other interpretations of this, though, as there always are in economics. So, I laid out my concerns about that shift in August 2010. That shift is still there in the data.

But what’s changed since August 2010 is that there’s been a lot of research trying to parse out what is responsible for this shift. That work goes through a number of factors. It’s summarized in a paper that Professor Edward Lazear gave at the Kansas City Fed’s Jackson Hole Conference earlier this year [2012]. As a Fed president, I was already aware of a lot of that work because much of it has been done within the [Federal Reserve] System.

What this work usually does is look, factor by factor, at how much unemployment is caused by each structural factor. Generally, the answer is not a lot. You can get to maybe a percentage point, or point and a half, of the increase in unemployment since 2007, due to structural factors, something like that.

Those studies were very important in shaping my thinking. Another thing that happened was that inflation over the course of 2012 came in considerably lower than I had anticipated. Both of those things mattered in shaping how I thought about inflation going forward.

William Dudley

Fri, October 01, 2010

In making our assessments about next steps, we need to be a bit humble about our capacity to forecast how market participants would respond to our actions. We do not control their behavior nor have much historical experience that we can draw on to easily assess how they are likely to behave. Even viewpoints that turned out to be incorrect could persist for a long time and generate adverse consequences. It is not enough for us to be right in theory. We also have to be convincing in practice and in explaining why concerns we think are misplaced are indeed unwarranted.

Ben Bernanke

Fri, September 24, 2010

Standard macroeconomic models, such as the workhorse new-Keynesian model, did not predict the crisis, nor did they incorporate very easily the effects of financial instability. Do these failures of standard macroeconomic models mean that they are irrelevant or at least significantly flawed?  I think the answer is a qualified no. Economic models are useful only in the context for which they are designed. Most of the time, including during recessions, serious financial instability is not an issue. The standard models were designed for these non-crisis periods, and they have proven quite useful in that context. Notably, they were part of the intellectual framework that helped deliver low inflation and macroeconomic stability in most industrial countries during the two decades that began in the mid-1980s.

Narayana Kocherlakota

Tue, April 06, 2010

My own forecast calls for about 3 percent growth per year over the next two years, as opposed to the consensus view among economists, which is 3.5 percent. This pessimism derives from two sources. First, our statistical forecasting model at the Federal Reserve Bank of Minneapolis is predicting that GDP growth over this period will be around 2.5 percent per year. The model is a simple one in many ways, but its forecasting track record is surprisingly good.

Donald Kohn

Wed, November 12, 2008

Central banks should be wary of placing too much faith in model-based analyses, which are necessarily predicated on past empirical correlations and relationships. As we have seen, financial innovation can induce structural changes that can importantly alter the way financial institutions, markets, and the broader economy respond to shocks. For this reason, policymakers should take a critical approach to evaluating analyses of this sort, and should always probe to find the sensitivity of results to unstated assumptions that may no longer be valid.

Frederic Mishkin

Fri, September 21, 2007

Active, and sometimes bitter, debates about which modeling approaches are the right ones are ongoing in macroeconomics, and there often is not a consensus on the best model. As a result, central banks must express some degree of humility regarding their knowledge of the structural relationships that determine activity and prices. This humility is readily apparent in the practice at central banks, which involves looking at many different models (structural, reduced-form, general equilibrium and partial equilibrium) and continually using judgment to decide which models are most informative.

Charles Plosser

Sat, September 08, 2007

As you are no doubt aware, the monthly statistics reported on the economy are very volatile and subject to revision. The FOMC works hard to differentiate those factors that may have only a temporary impact on the economy or inflation from those of a more sustained nature...

The Committee looks at a variety of data and economic information in formulating its economic outlook. When information indicates that the outlook for economic growth and inflation has changed, one still has to ask whether it has changed enough to impede the achievement of the Fed’s goals of price stability and maximum sustainable economic growth. As I mentioned, the economy is remarkably resilient. One must also ask how much monetary policy can influence that forecast over the relevant time horizon. Thus the Committee usually does not base its decision to change monetary policy on any one number, but instead assesses the cumulative impact of all incoming data for the outlook in light of its ultimate goals.

Ben Bernanke

Tue, July 10, 2007

The Board staff employs a variety of formal models, both structural and purely statistical, in its forecasting efforts. However, the forecasts of inflation (and of other key macroeconomic variables) that are provided to the Federal Open Market Committee are developed through an eclectic process that combines model-based projections, anecdotal and other "extra-model" information, and professional judgment. In short, for all the advances that have been made in modeling and statistical analysis, practical forecasting continues to involve art as well as science.

