Some new book recommendations (2): Thinking, fast and slow

Daniel Kahneman – Thinking, fast and slow

Whenever I’m asked in future to recommend a book on behavioural finance, I shall enthusiastically suggest this one. Although it’s not limited to behavioural finance or even its wider relation behavioural economics, the book covers all of the ideas that underpin those subjects. And it does so with delightful readability.

Kahneman and his research collaborator Amos Tversky (who would have shared Kahneman’s Nobel prize in economics had he lived long enough) pioneered the application of psychological research to economics. This was sorely needed, because the model of decision making in mainstream economics is utterly unrealistic, to the point of absurdity. The rational economic beings of economics are rational only in a procedural sense: that is, they may have bizarre or antisocial preferences, but they are assumed to know exactly what those preferences are and to be hyper-rational in pursuing them.

Being unrealistic is not in itself a reason to criticise a theory, though the poverty of the economics model has invited a steady stream of criticism during the half century or so that it has been dominant. But the research done by psychologists including Kahneman and Tversky shows repeatedly that people simply aren’t rational in the economics sense of the word. People, even very clever people, are prone to various biases that arise from learned and evolved short cuts in their thinking, which may be helpful in some contexts but which produce systematic errors.

Kahneman tells us that the brain can be thought of as combining two systems of thought. System 1 is fast, intuitive and prone to error. System 2 is slower, more deliberative and can correct many of the errors of System 1, but it is lazy and easily overridden by System 1. System 1 has evolved to ensure that many critical decisions, such as whether to run away when you think you see certain patterns in the grass, are taken very quickly, ensuring you don’t get eaten by a predator.

But the very quickness of System 1 and the rules of thumb (heuristics) it uses make us prone to errors, particularly in decisions concerning risk. Humans can learn the theory of probability, but their brains are not instinctively aligned with it. This results in a whole host of mistakes that even well-trained doctors make. People are overconfident about their decisions, which are highly sensitive to the context in which they are framed. We find convenient stories into which we fit the information we have, and we rely too much on similar stories from our past.
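The sort of probability mistake at issue here is often illustrated with the base-rate problem, in which even experienced doctors tend to overestimate what a positive test result means. The sketch below uses standard textbook numbers for illustration, not figures taken from the book:

```python
# Base-rate illustration: a disease affects 1 in 1,000 people, and a
# test has a 5% false-positive rate (assume it always detects true
# cases). Intuition says a positive result is near-certain bad news;
# Bayes' theorem says otherwise.

prevalence = 0.001        # P(disease)
sensitivity = 1.0         # P(positive | disease)
false_positive = 0.05     # P(positive | no disease)

# Total probability of testing positive, with or without the disease.
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Probability of actually having the disease given a positive test.
p_disease_given_positive = sensitivity * prevalence / p_positive

print(round(p_disease_given_positive, 3))  # roughly 0.02, i.e. about 2%
```

Because healthy people vastly outnumber sick ones, the small false-positive rate produces far more false alarms than true detections, and the intuitive answer ("the test is 95% accurate, so I'm probably ill") is wildly wrong.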

These biases have been codified before but this book is a delight in explaining how they work and telling the research story behind them. The book’s lack of dogmatism is part of its charm.

There are some practical applications arising from knowing about these biases, which should make for superior decisions, if people can be disciplined and honest enough to acknowledge them. Unfortunately it is in groups and teams that the biases are most likely to lead to bad decisions and where it is hardest for people to risk admitting their errors, even though everyone else makes them too.

This book doesn’t tell you how to make money (or avoid losses) through a knowledge of behavioural finance. But, to my knowledge, no other book does either. That’s because behavioural finance doesn’t give you a nice list of predictions from which to profit (or avoid loss). It’s very good at explaining, after the event, why things didn’t work out or why an investor did well by bucking the trend.

But it just might make you a bit more aware of when you’re making a poorly grounded decision or when you should be a lot more cautious. Over-confidence is a necessary quality of entrepreneurs and, to some extent, of all successful people. And I could hardly avoid including academics in that. To quote Kahneman (p. 264):

“I have always believed that scientific research is another domain where a form of optimism is essential to success: I have yet to meet a successful scientist who lacks the ability to exaggerate the importance of what he or she is doing, and I believe that someone who lacks a delusional sense of significance will wilt in the face of repeated experiences of multiple small failures and rare successes, the fate of most researchers.”

But excessive optimism can also lead to disaster.
