Complexity

I mentioned in a previous blog post that we hosted a day of complexity experts here in Cambridge. I wasn’t at the seminar but was lucky enough to be introduced to Eric Beinhocker, the author of The Origin of Wealth: Evolution, Complexity, and the Radical Remaking of Economics. This is one of the most interesting books I’ve read in years, showing the common strands among physics, economics, evolutionary biology and computer science in what is known as complexity. The book was published in 2006, before the recent search for new ideas to reform economic theory after its poor performance in predicting the financial crisis. I hope that complexity will be a contender for new research funding from, for example, George Soros’s Institute for New Economic Thinking (INET), which had its inaugural conference earlier this year at King’s College, Cambridge.

Complexity is a rich topic but one of the key ideas is that systems have properties that are only evident at the level of the whole, and cannot be deduced or observed at the level of individual agents. This is not in itself a recent idea – the invention of macroeconomics in the 1930s was partly based on the observation that what holds for the whole is not necessarily true for the sum of the parts. A classic and unfortunately timely example is Keynes’s paradox of thrift. If each individual in an economy tries to save more (as many people urge in the indebted nations of the USA and UK, for example) then the overall result is a fall in aggregate income and a possible fall in savings, as each person is forced to spend more of their income on necessities.
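The paradox of thrift can be made concrete with the simplest textbook Keynesian cross model (my own illustrative setup, not from Beinhocker's book): if consumption is a fixed fraction of income and investment spending is fixed, then raising the saving rate lowers equilibrium income while leaving aggregate saving unchanged.

```python
# Paradox of thrift, minimal Keynesian cross sketch (hypothetical numbers).
# Income Y = consumption + investment, consumption = (1 - s) * Y with
# saving rate s, investment I fixed. Solving gives Y = I / s, and
# aggregate saving S = s * Y = I, independent of s.

I = 100  # fixed investment spending (arbitrary units)

def equilibrium(s):
    """Return (equilibrium income, aggregate saving) for saving rate s."""
    Y = I / s
    return Y, s * Y

for s in (0.10, 0.20):
    Y, S = equilibrium(s)
    print(f"saving rate {s:.0%}: income {Y:.0f}, aggregate saving {S:.0f}")
# Doubling the saving rate halves income (1000 -> 500) but leaves
# aggregate saving at 100 either way.
```

Everyone tries to save a larger share, national income contracts, and total saving ends up no higher than before: the whole behaves differently from the sum of the parts.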

Something similar operates at the level of international macroeconomics. If China, Japan and Germany all seek to have high savings and grow their economies through exports, this is fine if the USA and/or the rest of Europe wants to run deficits. If not, we will have global depression. Unfortunately that is by no means out of the question.

Other examples operate at the microeconomic level, including market behaviour. Traditional economic theories of markets are mathematically very complex yet still leave a lot unexplained, and fail to predict the mostly stable but occasionally unstable behaviour that is characteristic of financial markets. Computer simulations (agent-based economics, as the approach is often called) can generate very realistic-looking behaviour from a few simple but essentially realistic assumptions about individual traders. This sort of work has been done for many years but has accelerated with cheaper computing power. Thomas Schelling, Nobel laureate and author of another of my favourite books, The Strategy of Conflict, once used a simulation to show how even very mild preferences for living near people of similar skin colour could lead to completely segregated communities, an outcome none of the individuals wanted. The Strategy of Conflict is a brilliant analysis of game theory, mainly applied to the grim subject of nuclear war (which spurred a lot of analysis in this area) but with examples and ideas that are far more widely applicable.
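Schelling's segregation model is simple enough to sketch in a few dozen lines. The version below is my own minimal reconstruction (grid size, thresholds and movement rule are all illustrative assumptions, not Schelling's original board game): agents of two types sit on a grid, and any agent with fewer than 30% like-type neighbours moves to a random empty cell. Even with that mild preference, the average share of like-type neighbours rises well above its random starting level.

```python
import random

SIZE = 20          # grid is SIZE x SIZE (assumed parameter)
EMPTY_FRAC = 0.1   # fraction of cells left empty (assumed)
THRESHOLD = 0.3    # agent is content if >= 30% of neighbours match: a mild preference

def make_grid():
    """Random grid of 'X' and 'O' agents plus empty (None) cells."""
    cells = []
    for _ in range(SIZE * SIZE):
        r = random.random()
        cells.append(None if r < EMPTY_FRAC
                     else ('X' if r < (1 + EMPTY_FRAC) / 2 else 'O'))
    return [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

def neighbours(grid, r, c):
    """Occupants of the 8 surrounding cells (wrapping at the edges)."""
    out = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            cell = grid[(r + dr) % SIZE][(c + dc) % SIZE]
            if cell is not None:
                out.append(cell)
    return out

def unhappy(grid, r, c):
    agent = grid[r][c]
    if agent is None:
        return False
    ns = neighbours(grid, r, c)
    return bool(ns) and sum(n == agent for n in ns) / len(ns) < THRESHOLD

def step(grid):
    """Move every unhappy agent to a random empty cell; return moves made."""
    moves = 0
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE)
               if grid[r][c] is None]
    for r in range(SIZE):
        for c in range(SIZE):
            if unhappy(grid, r, c) and empties:
                er, ec = empties.pop(random.randrange(len(empties)))
                grid[er][ec] = grid[r][c]
                grid[r][c] = None
                empties.append((r, c))
                moves += 1
    return moves

def similarity(grid):
    """Average fraction of like-type neighbours over all agents."""
    scores = []
    for r in range(SIZE):
        for c in range(SIZE):
            if grid[r][c] is None:
                continue
            ns = neighbours(grid, r, c)
            if ns:
                scores.append(sum(n == grid[r][c] for n in ns) / len(ns))
    return sum(scores) / len(scores)

random.seed(1)
grid = make_grid()
before = similarity(grid)
for _ in range(50):
    if step(grid) == 0:  # stop once everyone is content
        break
after = similarity(grid)
print(f"like-neighbour share: {before:.2f} -> {after:.2f}")
```

The point of the exercise is the emergent property: no agent wants segregation, each merely dislikes being in a small minority locally, yet the aggregate pattern ends up far more segregated than any individual preference would suggest.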

It is not a good idea simply to dump traditional economic theory, and it’s not going to happen in any case. But Eric’s work, and that of the increasing number of like-minded researchers around the world, shows one possible way forward for better understanding how economies actually work. Eric shows that there has long been a vigorous stream of work in economics that is consistent with, and has contributed to, complexity analysis, but that most of it has been marginalised or forgotten by orthodox economists.

And by an odd coincidence, Eric’s wife, an investment manager, was a speaker on the MFin last year.
