I'm afraid I'm at it again, defending the apparently indefensible: this time the issue in question is Greenspan's use of 'baffling tactics'. I simply love the line "We policymakers, rather than relying solely on the specific linkages expressed in our formal models, have tended to draw from broader, though less mathematically precise, hypotheses of how the world works". This is of course incredibly arrogant (it says we know better than you), but he is right to say it. What, one might ask, is this 'broader' view of how the world works? This, I suppose, is what we call savoir faire (savvy), those non-specifiable instincts that all good economists have about what could happen. Of course, if we didn't have them then the 'science' of economics could never advance, since we'd all spend our time following simple rules. (What I'm saying, by the by, is that an economic theory that cannot account for its own existence has a hole in it somewhere.)

I think here we're back with Big Arny's problem: never say never. A simple set of rules has the advantage that it is clear and easily understandable by all participants. It has the disadvantage that it lacks flexibility, and the easy-to-read part makes it difficult to influence participants' behaviour, since they too can read the rule and recursively work back from some forward-looking induction point to see what they should be doing. Hence the central banker has to read this off before applying the rule, and participants have to try to read off the central banker's reading, and so on, ad infinitum. The process is essentially indeterminate, and the rule is now not so simple. In particular, at a time of falling search and information costs, response times are likely to continue to fall, and the inherent circularity of the rule could as easily produce more volatility as less, depending on the initial situation. So I'm all with you Alan, keep 'em guessing, please.
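To make that circularity a little more concrete, here is a toy sketch of the loop I have in mind (entirely my own illustration, with made-up 'gain' parameters, nothing to do with any actual Fed model): a published rule reacts to a market variable, the market prices in its guess of the rule's output, the rule is then re-read against the new prices, and so on. Whether the guessing game settles down or amplifies depends on the feedback gain.

```python
# Toy illustration of the 'I read your rule, you read my reading' loop.
# All parameters are hypothetical; this is a sketch of the argument, not a model.

def simulate(rule_gain, market_gain, rounds=20, shock=1.0):
    """Iterate the mutual-anticipation loop between a published rule and the market.

    rule_gain   : how strongly the (hypothetical) rule moves the rate per unit
                  of the market variable
    market_gain : how strongly the market variable moves per unit of the
                  anticipated rate
    """
    anticipated_rate = 0.0
    path = []
    for _ in range(rounds):
        market_variable = shock + market_gain * anticipated_rate  # market prices in its guess
        anticipated_rate = rule_gain * market_variable            # ...then re-reads the rule
        path.append(anticipated_rate)
    return path

# Feedback gain below one: the mutual guessing converges to a fixed point.
print(simulate(rule_gain=0.5, market_gain=0.8)[-3:])
# Feedback gain above one: each round of reading-the-reading amplifies the move.
print(simulate(rule_gain=1.5, market_gain=0.9)[-3:])
```

Nothing deep there, but it is the sense in which a 'simple' rule, once everyone can read it and response times keep falling, can produce more volatility rather than less.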
Greenspan defends Fed's 'baffling' tactics
Alan Greenspan, chairman of the Federal Reserve, yesterday defended the Fed's apparently ad hoc approach to setting interest rates, calling it the best response to an intrinsically uncertain economic environment. Speaking at the annual Fed conference in Jackson Hole, Wyoming, Mr Greenspan responded to criticism that the Fed's approach to policy, which relies more on judgments and less on formal targets, was confusing. "We policymakers, rather than relying solely on the specific linkages expressed in our formal models, have tended to draw from broader, though less mathematically precise, hypotheses of how the world works," he said. "Some critics have argued that such an approach to policy is too undisciplined - judgmental, seemingly discretionary, and difficult to explain."
But he rejected any suggestion that the Fed should adopt a more formal rule for setting interest rates, such as tying them to specific outcomes of inflation and economic growth. "That any approach along these lines would lead to an improvement in economic performance, however, is extremely doubtful," Mr Greenspan said. Many economists and investors have criticised the Fed in recent months for baffling financial markets with apparently contradictory signals about its campaign to combat deflation. Currently, the markets are pricing in sharp rises in interest rates next year, in spite of the Fed's insistence that they can be left low.
This confusion has increased calls for the Fed to be more explicit in saying exactly what it is attempting to achieve. Mr Greenspan did not directly address the issue of current policy. But his robust restatement of the prevailing Fed orthodoxy suggests that such a radical change is unlikely soon. Mr Greenspan rejected the notion that flexibility led to confusion, saying that central banks should set policy in the context of an overall framework of risk-management. In particular, he said it was perfectly sensible to concentrate on eliminating small but potentially damaging risks even at the cost of ignoring the demands of the rest of the economy.
Source: Financial Times
Link
Incidentally, you can find the full text of Greenspan's speech here. Of particular note are his point about the difficulty of controlling monetary aggregates, and the argument from Knightian uncertainty. It is a well-known argument on this blog that the accelerating pace of technological change, and the increasing-returns dynamic associated with the 'new economy', are key factors - factors which, if anything, are even more important than the oft-mentioned 'geopolitical' ones - in understanding the growing fundamental uncertainty which characterises the contemporary global panorama.
Uncertainty is not just an important feature of the monetary policy landscape; it is the defining characteristic of that landscape. As a consequence, the conduct of monetary policy in the United States at its core involves crucial elements of risk management, a process that requires an understanding of the many sources of risk and uncertainty that policymakers face and the quantifying of those risks when possible. It also entails devising, in light of those risks, a strategy for policy directed at maximizing the probabilities of achieving over time our goal of price stability and the maximum sustainable economic growth that we associate with it.
Despite the extensive efforts to capture and quantify these key macroeconomic relationships, our knowledge about many of the important linkages is far from complete and in all likelihood will always remain so. Every model, no matter how detailed or how well designed conceptually and empirically, is a vastly simplified representation of the world that we experience with all its intricacies on a day-to-day basis. Consequently, even with large advances in computational capabilities and greater comprehension of economic linkages, our knowledge base is barely able to keep pace with the ever-increasing complexity of our global economy.
Given this state of flux, it is apparent that a prominent shortcoming of our structural models is that, for ease in parameter estimation, not only are economic responses presumed fixed through time, but they are generally assumed to be linear. An assumption of linearity may be adequate for estimating average relationships, but few expect that an economy will respond linearly to every aberration. Although some nonlinearities are accounted for in our modeling exercises, we cannot be certain that our simulations provide reasonable approximations of the economy's behavior in times of large idiosyncratic shocks.
Recent history has also reinforced the perception that the relationships underlying the economy's structure change over time in ways that are difficult to anticipate. This has been most apparent in the changing role of our standard measure of the money stock. Because an interest rate, by definition, is the exchange rate for money against non-monies, money obviously is central to monetary policy. However, in the past two decades, what constitutes money has been obscured by the introduction of technologies that have facilitated the proliferation of financial products and have altered the empirical relationship between economic activity and what we define as money, and in doing so has inhibited the keying of monetary policy to the control of the measured money stock.
In implementing a risk-management approach to policy, we must confront the fact that only a limited number of risks can be quantified with any confidence. And even these risks are generally quantifiable only if we accept the assumption that the future will replicate the past. Other risks are essentially unquantifiable--representing Knightian uncertainty, if you will--because we may not fully appreciate even the full range of possibilities, let alone each possibility's likelihood. As a result, risk management often involves significant judgment on the part of policymakers, as we evaluate the risks of different events and the probability that our actions will alter those risks.
For such judgment, we policymakers, rather than relying solely on the specific linkages expressed in our formal models, have tended to draw from broader, though less mathematically precise, hypotheses of how the world works. For example, inference of how market participants might respond to a monetary policy initiative may need to reference past behavior during a period only roughly comparable to the current situation.
Some critics have argued that such an approach to policy is too undisciplined--judgmental, seemingly discretionary, and difficult to explain. The Federal Reserve should, some conclude, attempt to be more formal in its operations by tying its actions solely to the prescriptions of a formal policy rule. That any approach along these lines would lead to an improvement in economic performance, however, is highly doubtful. Our problem is not the complexity of our models but the far greater complexity of a world economy whose underlying linkages appear to be in a continual state of flux.
Rules by their nature are simple, and when significant and shifting uncertainties exist in the economic environment, they cannot substitute for risk-management paradigms, which are far better suited to policymaking. Were we to introduce an interest rate rule, how would we judge the meaning of a rule that posits a rate far above or below the current rate? Should policymakers adjust the current rate to that suggested by the rule? Should we conclude that this deviation is normal variance and disregard the signal? Or should we assume that the parameters of the rule are misspecified and adjust them to fit the current rate? Given errors in our underlying data, coupled with normal variance, we might not know the correct course of action for a considerable time. Partly for these reasons, the prescriptions of formal interest rate rules are best viewed only as helpful adjuncts to policy, as indeed many proponents of policy rules have suggested.
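For reference, the kind of 'formal policy rule' the critics have in mind is usually something like the Taylor rule, which Greenspan does not name; here is a minimal sketch using the conventional textbook coefficients (my assumptions, not anything taken from the speech):

```python
# A standard example of a formal interest rate rule (the Taylor rule).
# The coefficients and the 2% neutral rate / inflation target are the usual
# textbook values, assumed here purely for illustration.

def taylor_rule(inflation, output_gap, neutral_real_rate=2.0, inflation_target=2.0):
    """Prescribed nominal rate: r* + pi + 0.5*(pi - pi*) + 0.5*(output gap)."""
    return (neutral_real_rate + inflation
            + 0.5 * (inflation - inflation_target)
            + 0.5 * output_gap)

# With inflation at 1% and a -2% output gap the rule prescribes a 1.5% rate.
print(taylor_rule(inflation=1.0, output_gap=-2.0))
```

The dilemma Greenspan poses is precisely what to do when the number such a rule spits out sits a long way from the actual rate: follow it, treat the gap as noise, or conclude the parameters are misspecified.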