Tuesday, April 10, 2012

As far as I can tell, Simon Wren-Lewis has been convinced by Paul Krugman. He now proposes parallel research projects, one of which is to be focused on fitting the data. This is exactly what Krugman advocated.

Update: clearly I couldn't see very far. In fact, as he has repeatedly written, Wren-Lewis always agreed with Krugman about what is to be done. I assume he still disagrees with Krugman about the fruits of the effort to microfound macro. In any case, I misinterpreted his new parallel research program proposal. It is the same as his original parallel research program proposal.

Also, he isn't the one who caused the Bank of England model to have an ad hoc periphery around the consistent core. That was the work of Bank of England employees. Or something. Just go to his blog for more reliably correct corrections.

Thomaists, please click this link.

6 comments:

  1. Nice post, Robert, but I think the problem of modern macroeconomics actually goes deeper than implied by your argument. I think we have to ask ourselves what it was that went wrong with our macroeconomic models, since they obviously did not foresee the financial crisis of 2007-08 or even make it conceivable.

    Exaggerated mathematization of economics, or irrational and corrupt politicians, may be part of an explanation, but the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In “modern” macroeconomics – dynamic stochastic general equilibrium, new synthesis, new classical and new Keynesian – variables are treated as if drawn from a known “data-generating process” that unfolds over time and of which we therefore have heaps of historical time series. If we do not assume that we know the “data-generating process” – if we do not have the “true” model – the whole edifice collapses. And of course it has to. I mean, who really honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

    “Modern” macroeconomics obviously did not anticipate the enormity of the problems that unregulated “efficient” financial markets created. Why? Because it builds on the myth that we know the “data-generating process” and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

    This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring along your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

    But as Keynes convincingly argued in his monumental Treatise on Probability (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10%, and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to. [A toy numerical version of this sunglasses example appears after the comments.]

    In the end this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty type. The data do not unequivocally single out one decision as the only “rational” one. Neither the economist nor the deciding individual can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

    Some macroeconomists, however, still want to be able to use their hammer. So they decide to pretend that the world looks like a nail, pretend that uncertainty can be reduced to risk, and construct their mathematical models on that assumption. The result: financial crises and economic havoc.

    How much better it would be – how much greater the chance that we would not lull ourselves into the comforting belief that we know everything, that everything is measurable, and that we have everything under control – if instead we could just admit that we often simply do not know, and that we have to live with that uncertainty as best we can.

    Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing – economic catastrophe!

    As long as macroeconomists haven't really solved how to deal with genuine uncertainty in an adequate manner, they are far from producing relevant models for solving real-world macroeconomic problems. And whether those problems are about policy questions or forecasting really ought to be of second-order interest.

  2. Anonymous, 10:22 PM

    But the single-agent, micro-founded models you are referring to aren't really micro-founded at all. Their properties don't generalize to even two-agent models, where excess demand curves can be upward sloping, there are multiple equilibria, etc. [A numerical sketch of such a two-agent economy appears after the comments.] In a minimally realistic general equilibrium model you can't do comparative statics, and "rational expectations" is ill-defined. All of this has been known by general equilibrium theorists for more than 30 years. And this is before you even consider the issues of incomplete markets, market power and increasing returns to scale. How macroeconomics was hijacked by a group of economists using toy "micro-founded" models with no hope of any connection to reality still cries out for some explanation.

  3. Anonymous, 2:12 AM

    Whatever happened to microfoundations?

    Here is a useful reminder that providing a microfoundation for macroeconomics was once a serious intellectual enterprise.

    http://www.ufrgs.br/PPGE/pcientifica/2006_05.pdf

  4. Lars P Syll: I agree. This post was about Simon Wren-Lewis, though. I think that micro founded macro is fundamentally false, basically because I don't think that people have subjective probabilities of all possible events (that is, I agree with Keynes).

    Anonymous I: Way more than 30 years. The fact that general equilibrium models can have multiple equilibria has been known for a long, long time, I think at least 40 and probably 50 years. The application of general equilibrium theory in a way denounced by every general equilibrium theorist I have ever met is a mystery.

    Anonymous II: What happened to microfoundations is that they are required in academic work. Also, models with microfoundations have all been grossly rejected by the data. The lack of influence that this rejection has had on the research project strikes me as the opposite of a serious enterprise.

    I don't know when the quest for microfoundations could have been a serious enterprise. I perceived it to be completely absurd nonsense 30 years ago.

  5. Anonymous, 6:06 PM

    No, Robert, that is not a serious position. The "microfoundations" research programme in the 1970s and early 1980s was focused on answering the fundamental questions of macroeconomics, originally posed by Keynes in the General Theory. How could a capitalist market economy settle on extremely inefficient outcomes or states, such as equilibria (or disequilibria) with high unemployment and low output? How do market economies coordinate (or fail to coordinate) in the absence of the Walrasian auctioneer? Why might a market system fail to coordinate the actions of heterogeneous individuals? Etc.

    Robert Lucas, perhaps surprisingly, put it very well in a 2008 lecture in Italy (http://faculty.chicagobooth.edu/brian.barry/igm/ditella.pdf). Referring to the 34% decline in real output in the USA from 1929 to 1933, he asks: "Why did we, collectively, choose to reduce production by 1/3, with no change in available resources?" He describes these events as "frightening precisely because they are so mysterious."

    Since the early 1980s economists have made literally no progress in understanding these fundamental questions of macroeconomics. But to dismiss the attempt as "absurd nonsense" is, well, absurd nonsense. Perhaps macroeconomics is just too complex and difficult, forcing us to rely on more or less ad hoc aggregate models which seem to have some predictive power (although we are not sure why). But should economists really give up on any attempt to reach a deeper understanding of the questions posed in the General Theory?

    PS Lars P Syll is clearly right that a big part of this is dealing with risk versus uncertainty, and moving away from the inappropriate use of Bayesian reasoning in "large world" contexts, a subject recently taken up at length in Ken Binmore's book Rational Decisions.

  6. Anonymous III: reading your comment I experienced a ragegasm. I replied using very strong language, which I hope I have managed to delete. I am still angry enough to pull the debate back to the blog. This is not flattering to you. I am very, very angry.

    Anonymouses I, II, and III: look, I don't know much about Blogger and won't spend time trying to make the comment feature here work. But you could easily sign your comments by typing a carriage return and then a name or nickname.

    You can use a nickname if you don't want to use your real name, but please stop making me address "Anonymous I". I apologise for this complaint, as I didn't in any way indicate any dislike for "anonymous" until now, but, for the future, please sign with a name or nickname.

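For concreteness, here is a toy numerical version of Lars P Syll's sunglasses example from comment 1. It is a minimal sketch in Python; the payoff numbers and the candidate probabilities are invented purely for illustration and come from no model or data set discussed above.

    # Risk versus Keynesian uncertainty, in miniature. All payoffs and
    # probabilities are hypothetical numbers chosen only for this sketch.

    def expected_utility(p_sun, u_sun, u_rain):
        """Expected utility of an action given one known probability of sun."""
        return p_sun * u_sun + (1 - p_sun) * u_rain

    # Hypothetical payoffs: sunglasses help a lot if it is sunny and are a
    # small nuisance if it rains; leaving them home is roughly the mirror image.
    ACTIONS = {
        "bring sunglasses": {"sun": 1.0, "rain": -0.2},
        "leave sunglasses": {"sun": -0.5, "rain": 0.1},
    }

    def best_action(p_sun):
        """The action with the highest expected utility at probability p_sun."""
        return max(
            ACTIONS,
            key=lambda a: expected_utility(p_sun, ACTIONS[a]["sun"], ACTIONS[a]["rain"]),
        )

    # Under risk: a single known probability (30%) singles out a best action.
    print("p = 0.30:", best_action(0.30))

    # Under Keynesian uncertainty: two equally good models say p = 0.10 and
    # p = 0.40, and the best action flips between them, so expected utility
    # alone cannot single out one choice as the uniquely rational one.
    for p in (0.10, 0.40):
        print(f"p = {p:.2f}:", best_action(p))

Run as written, it prints "bring sunglasses" at p = 0.30, but "leave sunglasses" at p = 0.10 and "bring sunglasses" at p = 0.40: once the single probability is replaced by two equally defensible ones, the either-or calculation settles nothing.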
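Similarly, here is a sketch of Anonymous I's point in comment 2 that even two-agent general equilibrium models misbehave. The economy below is a textbook-style quasilinear example, assumed purely for illustration: consumer 1 has u1 = x1 - (1/8)y1^(-8) and endowment (2, r); consumer 2 has u2 = -(1/8)x2^(-8) + y2 and endowment (r, 2). With good 1 as numeraire and p the price of good 2, the first-order conditions give y1 = p^(-1/9) and x2 = p^(1/9), so aggregate excess demand for good 1 reduces to z1(p) = r(p - 1) - p^(8/9) + p^(1/9). Choosing r = 2^(8/9) - 2^(1/9) makes z1 vanish at p = 1/2, 1, and 2: three Walrasian equilibria from two perfectly well-behaved consumers.

    # Multiple equilibria in a two-consumer, two-good exchange economy.
    # The utilities and the endowment parameter r are assumptions chosen so
    # the algebra closes; see the derivation in the paragraph above.

    r = 2 ** (8 / 9) - 2 ** (1 / 9)

    def excess_demand_good1(p):
        """Aggregate excess demand for good 1 at relative price p of good 2."""
        return r * (p - 1) - p ** (8 / 9) + p ** (1 / 9)

    # All three candidate prices clear the market (up to floating-point rounding):
    for p in (0.5, 1.0, 2.0):
        print(f"z1({p}) = {excess_demand_good1(p):+.12f}")

Near p = 1 the derivative z1'(1) = r - 7/9 is negative, so making good 2 more expensive lowers rather than raises excess demand for good 1; that perverse slope is what fits three equilibria into a single economy, and it is why comparative statics from one-agent models do not generalize.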