Thinking, Fast & Slow (L) (Penguin Press Non-Fiction) Paperback – 28 May 2012
Format | Price |
---|---|
Kindle Edition | — |
Audible Audiobook, Unabridged | ₹0.00 (Free with your Audible trial) |
Hardcover, Illustrated | ₹600.00 |
Paperback | ₹395.00 |
Audio CD | — |
- "The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness." (Highlighted by 15,865 Kindle readers)
- "Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition." (Highlighted by 13,564 Kindle readers)
- "A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth." (Highlighted by 13,363 Kindle readers)
- "This remarkable priming phenomenon—the influencing of an action by the idea—is known as the ideomotor effect." (Highlighted by 11,768 Kindle readers)
- "One of the tasks of System 2 is to overcome the impulses of System 1. In other words, System 2 is in charge of self-control." (Highlighted by 11,490 Kindle readers)
Product description
Review
There have been many good books on human rationality and irrationality, but only one masterpiece. That masterpiece is Daniel Kahneman's Thinking, Fast and Slow. Kahneman, a winner of the Nobel Prize for economics, distils a lifetime of research into an encyclopedic coverage of both the surprising miracles and the equally surprising mistakes of our conscious and unconscious thinking. He achieves an even greater miracle by weaving his insights into an engaging narrative that is compulsively readable from beginning to end. My main problem in doing this review was preventing family members and friends from stealing my copy of the book to read it for themselves... this is one of the greatest and most engaging collections of insights into the human mind I have read -- William Easterly ― Financial Times
Absorbing, intriguing...By making us aware of our minds' tricks, Kahneman hopes to inspire individuals and organisations to identify strategies to outwit them -- Jenni Russell ― Sunday Times
Profound . . . As Copernicus removed the Earth from the centre of the universe and Darwin knocked humans off their biological perch, Mr. Kahneman has shown that we are not the paragons of reason we assume ourselves to be ― The Economist
[Thinking, Fast and Slow] is wonderful, of course. To anyone with the slightest interest in the workings of his own mind, it is so rich and fascinating that any summary would seem absurd -- Michael Lewis ― Vanity Fair
It is an astonishingly rich book: lucid, profound, full of intellectual surprises and self-help value. It is consistently entertaining and frequently touching, especially when Kahneman is recounting his collaboration with Tversky . . . So impressive is its vision of flawed human reason that the New York Times columnist David Brooks recently declared that Kahneman and Tversky's work 'will be remembered hundreds of years from now,' and that it is 'a crucial pivot point in the way we see ourselves.' They are, Brooks said, 'like the Lewis and Clark of the mind' . . . By the time I got to the end of Thinking, Fast and Slow, my skeptical frown had long since given way to a grin of intellectual satisfaction. Appraising the book by the peak-end rule, I overconfidently urge everyone to buy and read it. But for those who are merely interested in Kahneman's takeaway on the Malcolm Gladwell question it is this: If you've had 10,000 hours of training in a predictable, rapid-feedback environment (chess, firefighting, anesthesiology), then blink. In all other cases, think ― The New York Times Book Review
[Kahneman's] disarmingly simple experiments have profoundly changed the way that we think about thinking . . . We like to see ourselves as a Promethean species, uniquely endowed with the gift of reason. But Mr. Kahneman's simple experiments reveal a very different mind, stuffed full of habits that, in most situations, lead us astray -- Jonah Lehrer ― The Wall Street Journal
This is a landmark book in social thought, in the same league as The Wealth of Nations by Adam Smith and The Interpretation of Dreams by Sigmund Freud -- Nassim Nicholas Taleb, author of 'The Black Swan'
Daniel Kahneman is among the most influential psychologists in history and certainly the most important psychologist alive today...The appearance of Thinking, Fast and Slow is a major event -- Steven Pinker, author of 'The Language Instinct'
Daniel Kahneman is one of the most original and interesting thinkers of our time. There may be no other person on the planet who better understands how and why we make the choices we make. In this absolutely amazing book, he shares a lifetime's worth of wisdom presented in a manner that is simple and engaging, but nonetheless stunningly profound. This book is a must read for anyone with a curious mind -- Steven D. Levitt, co-author of 'Freakonomics'
This book is a tour de force by an intellectual giant; it is readable, wise, and deep. Buy it fast. Read it slowly and repeatedly. It will change the way you think, on the job, about the world, and in your own life -- Richard Thaler, co-author of 'Nudge'
[A] tour de force of psychological insight, research explication and compelling narrative that brings together in one volume the high points of Mr. Kahneman's notable contributions, over five decades, to the study of human judgment, decision-making and choice . . . Thanks to the elegance and force of his ideas, and the robustness of the evidence he offers for them, he has helped us to a new understanding of our divided minds and our whole selves -- Christopher F. Chabris ― The Wall Street Journal
Thinking, Fast and Slow is a masterpiece - a brilliant and engaging intellectual saga by one of the greatest psychologists and deepest thinkers of our time. Kahneman should be parking a Pulitzer next to his Nobel Prize -- Daniel Gilbert, Professor of Psychology, Harvard University, author of 'Stumbling on Happiness', host of the award-winning PBS television series 'This Emotional Life'
A major intellectual event . . . The work of Kahneman and Tversky was a crucial pivot point in the way we see ourselves -- David Brooks ― The New York Times
Kahneman provides a detailed, yet accessible, description of the psychological mechanisms involved in making decisions -- Jacek Debiec ― Nature
This book is one of the few that must be counted as mandatory reading for anyone interested in the Internet, even though it doesn't claim to be about that. Before computer networking got cheap and ubiquitous, the sheer inefficiency of communication dampened the effects of the quirks of human psychology on macro scale events. No more. We must now confront how we really are in order to make sense of our world and not screw it up. Daniel Kahneman has discovered a path to make it possible -- Jaron Lanier, author of You Are Not a Gadget
For anyone interested in economics, cognitive science, psychology, and, in short, human behavior, this is the book of the year. Before Malcolm Gladwell and Freakonomics, there was Daniel Kahneman who invented the field of behavior economics, won a Nobel...and now explains how we think and make choices. Here's an easy choice: read this ― The Daily Beast
I will never think about thinking quite the same. [Thinking, Fast and Slow] is a monumental achievement -- Roger Lowenstein ― Bloomberg/Businessweek
A terrific unpicking of human rationality and irrationality - could hardly have been published at a better moment. Kahneman is the godfather of behavioural economics, and this distillation of a lifetime's thinking about why we make bad decisions - about everything from money to love - is full of brilliant anecdote and wisdom. It is Kahneman's belief that anyone who thinks they know exactly what is going on hasn't understood the question; as such it's the perfect gift for opinionated family members everywhere. -- Tim Adams ― Observer Books of the Year
The book I most want to be given is Thinking, Fast and Slow by Daniel Kahneman. I'm a speedy thinker myself, so am hoping to be endorsed in that practice. -- Sally Vickers ― Observer Books of the Year
In this comprehensive presentation of a life's work, the world's most influential psychologist demonstrates that irrationality is in our bones, and we are not necessarily the worse for it -- 10 Best Books of 2011 ― New York Times
Selected by the New York Times as one of the 100 Notable Books of 2011 ― New York Times
Product details
- Publisher : Penguin Books Ltd (28 May 2012)
- Language : English
- Paperback : 512 pages
- ISBN-10 : 0141033576
- ISBN-13 : 978-0141033570
- Reading age : 10 years and up
- Item Weight : 340 g
- Dimensions : 12.6 x 2.4 x 19.7 cm
- Country of Origin : United Kingdom
- Best Sellers Rank: #98 in Books
- #7 in Analysis & Strategy
- #11 in Society & Social Sciences
- #37 in Personal Transformation
About the author

Daniel Kahneman (Hebrew: דניאל כהנמן, born March 5, 1934) is an Israeli-American psychologist notable for his work on the psychology of judgment and decision-making, as well as behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences (shared with Vernon L. Smith). His empirical findings challenge the assumption of human rationality prevailing in modern economic theory. With Amos Tversky and others, Kahneman established a cognitive basis for common human errors that arise from heuristics and biases (Kahneman & Tversky, 1973; Kahneman, Slovic & Tversky, 1982; Tversky & Kahneman, 1974), and developed prospect theory (Kahneman & Tversky, 1979).
In 2011, he was named by Foreign Policy magazine to its list of top global thinkers. In the same year, his book Thinking, Fast and Slow, which summarizes much of his research, was published and became a best seller. He is professor emeritus of psychology and public affairs at Princeton University's Woodrow Wilson School. Kahneman is a founding partner of TGG Group, a business and philanthropy consulting company. He is married to Royal Society Fellow Anne Treisman.
In 2015 The Economist listed him as the seventh most influential economist in the world.
Bio from Wikipedia, the free encyclopedia. Photo by see page for author [Public domain], via Wikimedia Commons.
Customer reviews

Reviewed in India on 6 June 2023
Top reviews
Top reviews from India
Kahneman’s thesis breaks our decision-making systems into two pieces, System 1 and System 2, which are the respective “fast” and “slow” of the title. System 1 provides intuitive judgements based on stimulus we might not even be conscious of receiving; it’s the snap signals that we might not even know we are acting upon. System 2 is the more contemplative, cognitively taxing counterpart that we engage for serious mental exertion. Though often oppositional in the types of decisions they produce, Kahneman is keen to emphasize that it’s not about System 1 versus System 2. Instead, he’s out to educate us about how the interplay between these systems causes us to make decisions that aren’t always rational or sensible given the statistics and evidence at hand.
Kahneman takes us through an exhaustive tour of biases and fallacies people are prone to making. He talks about the halo effect, affection bias, confirmation bias, and even regression to the mean. As a mathematician, I liked his angle on probability and statistics; as a logician, I appreciated his brief segues into the logical aspects of our contradictory decision-making processes. Lest I give the impression Kahneman gets too technical, however, I should emphasize that, despite its length, Thinking, Fast and Slow remains aggressively accessible. There are a few points where, if you don’t have a basic grasp of probability (and if Kahneman demonstrates anything, it’s that most people don’t), then you might feel talked over (or maybe it’s those less-than-infrequent, casual mentions of “and later I won a Nobel Prize”). But this book isn’t so much about science as it is about people.
There are two other things I really appreciated about this book, both of which are related to psychology. I’m a fairly easygoing person, and I don’t always like to make waves, but sometimes I like to make some trouble and argue with some of my friends about whether psychology is a science. The problem for psychology is that it’s actually a rather broad term for a series of overlapping fields of investigation into human behaviour. On one end of this continuum, you have Freud and Jung and the various psychoanalysts who, let’s face it, are one step up from astrologers and palm-readers. On the other end, you have the cutting-edge cognitive psychology informed by the neuroscience of MRIs, split-brain studies, and rat research. So claiming that psychology is or isn’t a science is a little simplistic, and I’m willing to grant that there are areas within psychology that are science. For what it’s worth, Kahneman went a long way to reinforcing this: it’s clear he and his collaborators have done decades of extensive research. (Now, yes, it’s social science, but I won’t get into that particular snobbery today.)
The other thing I liked about Thinking, Fast and Slow is its failure to mention evolutionary psychology. Once in a while, Kahneman alludes to System 1’s behaviour being the result of evolutionary adaptation—and that’s fine, because it is true, almost tautologically so. But he never quite delves into speculation about why such behaviour evolved, and I appreciate this. There’s a difference between identifying something as an adaptation and determining why it’s an adaptation, and I’m not a fan of evolutionary psychologists’ attempts to reduce everything to the trauma of trading trees for bipedalism … I’m willing to admit I have an ape brain, but culture must count for something, hmm?
I suppose it’s also worth mentioning that this book reaffirms my supercilious disregard for economics. According to Kahneman, stock brokers and investors have no idea what they are doing—and some of them know this, but most of them don’t. Economists are, for the most part, highly-trained, but they seem bent upon sustaining this theoretical fantasy land in which humans are rational creatures. Aristotle aside, the data seem to say it isn’t so. I occasionally try my hand at reading books about the economy, just so I can say I did, but they usually end up going over my head. I’m a mathematician and I don’t get numbers—but at least I’m not the only one.
So Thinking, Fast and Slow is genuinely interesting. I learned a lot from it. I would rate it higher, but I was starting to flag as I approached the finish line. Truth be told, I skipped the two articles Kahneman includes at the end that were the original publications about the theories he explains in the book. I’m sure they are fascinating for someone with more stamina, but at that point I just wanted to be done. That’s never good: one of the responsibilities of a non-fiction author is to know how to pace a book and keep its length appropriate. Too short and the book is unsatisfying—too long, and maybe it’s more so. And I think this flaw is entirely avoidable; it’s a result of Kahneman’s tendency to reiterate, to circle back around to the same discussions over and over again. He spends an entire chapter on prospect theory, then a few chapters later he’s telling us about its genesis all over again, just from a slightly different angle. Like that party guest, Kahneman is full of interesting stories, but after telling one after another for such a long period of time, it starts sounding like white noise. And he ate all those little cocktail snacks too.
I inevitably ended up comparing Thinking, Fast and Slow to How We Decide, a much slimmer volume along much the same lines as this one. Whereas Lehrer's focus is on the neurology behind decision-making, Kahneman is more interested in psychology. Both books boil down to this: we suck at automatic decision-making when statistics are involved; therefore, we behave less rationally than we believe we do. Lehrer explains why things go wrong, and Kahneman categorizes all the different ways things go wrong. In many ways the books are complementary, and if this is an area of interest for you, I'll recommend them both. For the casual reader, however, Thinking, Fast and Slow is a rather dense meal. By all means, give it a try, but take it slow.
Fonts are small but comfortable to read
Spine is well bound
Cover quality could be better
Recommended if you want to own it.




The Downsides:
The only downsides for me were the length of the book and the language. Though the author has tried his best to make it easy to read and understand, there were parts where I felt it got too academic for me, and I found it difficult to concentrate.
Apart from that, I felt the book sometimes lacked coherence in its ideas. The initial chapters were well connected and formed a coherent story, but as the book progressed there were times when one chapter did not seem the right link to the next.
The Upsides:
Keeping these subjective flaws aside, this book was a game changer for me. It gives a glimpse into human psychology, not some gimmicky kind but one based on papers, research and more.
This book beautifully explains the concepts of decision making and happiness by distinguishing two fictional selves in the brain, viz. the experiencing self and the remembering self. Further, it helped me understand the concept of rationality in a way I had never known before.
Concluding:
I usually complete a book within 10 days, and 20 days is the maximum I have spent on any non-fiction book. However, Thinking, Fast and Slow took me an entire two months. I started this book in October, and I completed it on 2nd January, 2023.
It's not just the whopping 420 pages that took more time; understanding each page, absorbing the knowledge in each word, was a task in itself. This book has knowledge that my mind, at least, took time to digest. In one sitting I read at most 10 pages, because those 10 pages were packed with more research and knowledge than a self-help productivity book would provide.
Though it took time, this book was a pleasure to read. It taught me patience; it taught me how to take it slow, as I didn't want to miss any concept. It taught me the power of slow thinking and the infinite powers of my brain's fast-thinking process. I learnt about decision making, happiness, risks, and more.
I would recommend it only if you have the patience to read and understand it, and if you are up for something a little more challenging.
I will need to come back to this book again and again to ensure an effective implementation.
Edit - Just received the replacement order and it's in mint condition, as it should be. Thank you Amazon! I'm going to attach some photographs as well.





Top reviews from other countries

I read Kahneman's 2011 book over several months because it was long (499 pages) and thoroughly repetitive.
My top-line recommendation is that you read this insightful book, but I suggest you take a chapter per day (or week) to allow yourself time to digest -- and experience -- the ideas. (Alternatively, print this review and read one note per day!)
Here are some notes on Kahneman's ideas:
Kahneman suggests that we process decisions by instinct (System 1 thinking, or "guts") or after consideration (System 2 thinking, or "brains"). The important point is that each system is right for some situations but not others. Order food that "feels right" but don't buy a car that way. A car (or job or house) decision involves many factors that will interact and develop over years. We cannot predict all these factors, but we can give them appropriate weights with care.
Salespeople appeal to your guts when they want you to trust them. You should rely on brains to evaluate their promises.
We make better gut decisions when we're happy but worse ones when we're sad or angry.
Kahneman says we often fail to look beyond "what we see is all there is" when considering a situation. This leads to misdirected gut responses. (Nassim Taleb's Fooled by Randomness addresses this bias.) People over-estimate the risk of violent death because the media loves exotic, bloody stories.
People vote for "competence" (strong, trustworthy) over "likability" when judging candidates on looks. Many voters choose candidates based on looks.
People believe that a beneficial technology's risk is lower and that low risk technology brings more benefits. This may explain why most people don't care about the risks of driving cars (far more dangerous than flying in airplanes) or using cell phones. It also suggests that policy changes (e.g., higher prices for water) will be more acceptable when they are small and reversible. After the sky does not fall, the "low risk" strategy can be expanded.
The measuring stick of risk (relative to what?) affects people's perceptions of risk.
Bayesian reasoning: (1) anchor your judgement on the probability of an outcome (given a plausible set of repetitions), then (2) question the accuracy of your belief in that probability as outcomes appear. Put differently, take a stand and reconsider it as new data arrive.
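The two-step procedure in this note is just Bayes' rule applied iteratively. A minimal Python sketch (the coin-bias example and all numbers are my own illustration, not from the book):

```python
# Bayes' rule: revise a prior belief as each new outcome arrives.
def bayes_update(prior, p_data_given_h, p_data_given_not_h):
    """Return P(H | data) given P(H) and the likelihood of the data
    under the hypothesis and under its negation."""
    numerator = prior * p_data_given_h
    return numerator / (numerator + (1 - prior) * p_data_given_not_h)

# Hypothesis H: this coin lands heads 80% of the time (vs. a fair 50%).
# Take a stand (prior = 0.5), then reconsider as three heads arrive.
belief = 0.5
for _ in range(3):
    belief = bayes_update(belief, p_data_given_h=0.8, p_data_given_not_h=0.5)
print(round(belief, 3))  # belief climbs toward the biased-coin hypothesis
```

Each head nudges the belief up; a tail would pull it back down. That loop is exactly the "take a stand and reconsider as new data arrive" stance described above.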
People will pay more for a "full set of perfect dishes" than for the same set with extra damaged dishes -- violating the "free disposal" assumption of economic theory (we can always dump excess). This bias explains why a house with freshly painted, empty rooms will sell for more than one with fresh paint but old furniture.
Stereotyping is bad from a social perspective, but we should not ignore the information included in group statistics. Looking from the other direction, people are far TOO willing to assume the group behaves as one individual. (When I was traveling, I learned "not to judge a person by their country nor a country by one person.")
Passing the buck: People "feel relieved of responsibility when they know others have heard the same request for help." This fact explains the importance of putting one person in charge, asking that person for a decision, and setting a deadline to evaluate the decision's impact.
"Regression to the mean" happens when average performance replaces a "hot streak." It's not caused by burnout. It's caused by statistics. (Try to get a "hot streak" in coin flips.)
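The coin-flip challenge is easy to run as a simulation; a quick sketch (the streak length and sample size are arbitrary choices of mine):

```python
# After a "hot streak" of five heads, a fair coin is still roughly 50/50:
# the streak carries no information about the next flip.
import random

random.seed(42)
flips = [random.random() < 0.5 for _ in range(100_000)]  # True = heads

# Collect the flip that immediately follows every run of 5 heads.
followers = [flips[i + 5] for i in range(len(flips) - 5)
             if all(flips[i:i + 5])]

rate = sum(followers) / len(followers)
print(round(rate, 2))  # hovers near 0.5: the "streak" regresses to the mean
```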
"Leaders who have been lucky are not punished for taking too much risk... they are credited with flair and foresight" [p204]. Two of three mutual funds underperform the market in any given year, but lucky managers (and their investors) cling to their "illusion of skill."
Successful stock traders find undervalued companies, not good companies whose shares may already be overpriced.
Philip Tetlock interviewed 284 people "who made their living commenting or offering advice on economic and political trends." Their predictions could have been beaten by dart-throwing monkeys -- even within their specializations. They offered excuses to explain their "bad luck" (see Note 8).
"Errors of prediction are inevitable because the world is unpredictable" [p220].
Algorithms are statistically superior to experts when it comes to diagnosing medical, psychological, criminal, financial and other events in "uncertain, unpredictable" domains. See my paper on real estate markets [pdf].
Simpler statistics are often better. Forget multivariate regressions. Use simple weights. For example: Marital stability = f (frequency of lovemaking - frequency of quarrels).
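The marital-stability formula above is a unit-weight ("improper") linear model; as code it is almost trivial, which is the point. The function name and inputs are hypothetical, for illustration only:

```python
# Unit-weight model: no fitted regression coefficients, just a signed sum
# of predictors. Inputs are hypothetical weekly frequencies.
def stability_score(lovemaking_per_week, quarrels_per_week):
    return lovemaking_per_week - quarrels_per_week

print(stability_score(3, 1))  # positive: more affection than conflict
print(stability_score(1, 4))  # negative: the reverse
```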
"Back-of-envelope is often better than an optimally weighted formula and certainly better than expert judgement" [p226].
Good (trustworthy) intuition comes from having enough time to understand the regularities in a "predictable environment," e.g., sports competition. "Intuition cannot be trusted in the absence of stable regularities in the environment" [p241].
The "planning fallacy" leads us to believe the best-case prediction even when events rarely follow the best-case path. Use less optimistic weights -- and read this book.
Overoptimism explains lawsuits, wars, scientific research and small business startups. Leaders tend to be overoptimistic, for better or worse. (Aside: I think men are more optimistic than women, which is why they discover more and die more often.)
Want to plan ahead? "Imagine it's one year in the future and the outcome of the plan was a complete disaster. Write a debrief on that disaster." This is useful because there are more ways to fail than succeed.
Our attitudes towards wealth are affected by our reference point. Start poor, and it's all up; start rich, and you may be disappointed. (If you own a house, decide if you use the purchase price or its "value" during the bubble.) You're much happier going from $100 to $200 than $800 to $900.
The asymmetry of losses/wins in prospect theory explains why it's harder for one side to "give up" the exact same amount as the other side gains. This explains the durability of institutions -- for better or worse -- and why they rarely change without (1) outside pressure of bigger losses or (2) huge gains to compensate for losses. It also explains why it's hard for invaders to win.
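This asymmetry comes from prospect theory's value function, which is concave for gains, convex for losses, and steeper for losses. A sketch using the parameter estimates commonly cited from Tversky and Kahneman's 1992 paper (alpha = beta = 0.88, lambda = 2.25), for illustration only:

```python
# Prospect theory value function: outcomes are gains or losses relative
# to a reference point, and losses are weighted about 2.25x as heavily.
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    if x >= 0:
        return x ** alpha          # diminishing sensitivity to gains
    return -lam * (-x) ** beta     # losses loom larger

gain, loss = value(100), value(-100)
print(gain, loss)  # the loss is 2.25x the gain in magnitude
```

With these parameters, giving up 100 "costs" more than 2.25 times what gaining 100 "pays", which is why symmetric trades feel lopsided to both sides.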
Economists often fail to account for reference points, and they dislike them for "messing up" their models. Economists whose models ignore context may misunderstand behavior.
We give priority to bad news, which is why losing $100 does not compensate for winning $100. Hence, "long-term success in a relationship depends on avoiding the negative more than seeking the positive" [p302].
People think it's fairer to fire a $9/hr worker and hire a $7/hr worker than reduce the wages of the $9/hr worker. That may not be a good way to go.
"The sunk cost fallacy keeps people too long in poor jobs, unhappy marriages and unpromising research projects" [p345].
"The precautionary principle is costly, and when interpreted strictly it can be paralyzing." It would have prevented "airplanes, air conditioning, antibiotics, automobiles..."
Framing and anchoring affect our perspectives. The American preference for miles per gallon (instead of liters per 100km) means they cannot accurately compare fuel efficiency among cars. This is not an accident as far as US car companies are concerned. (Another non-accident is raising fuel economy standards instead of gas taxes.)
People may choose a vacation according to what they PLAN to remember rather than what they will experience. That may be because we remember high and low points but forget their duration.
"The easiest way to increase happiness is to control use of your time. Can you find more time to do the things you enjoy doing?" (I have the freedom to write this review, but it gets tedious after 3 hours...)
"Experienced happiness and life satisfaction are largely determined by the genetics of temperament," but "the importance that people attached to income at age 18 anticipated their satisfaction with their income as adults" [pp400-401]. I am fortunate, I think, to have started life with low expectations. That makes it easier for me to make 1/3 the money in Amsterdam that I would in Riyadh because it's definitely better to be "poor" in Amsterdam.
That said, "the goals people set for themselves are so important to what they do and how they feel that... we cannot hold a concept of well-being that ignores what people want" [p402].
"Adaptation to a situation means thinking less and less about it" [p405].
[Paraphrased from p412]: Our research has not shown that people are irrational. It has clarified the shape of their rationality, which creates a dilemma: should we protect people against their mistakes or limit their freedom to make them? Seen from the other side, we may think it easier to protect people from the quirks of "guts" and laziness of "brains." (Hence my support for a ban on advertising.)
"Brains" may help us rationalize "guts" but they can also stop foolish impulses -- when we acknowledge the limits to our reason and the information we rely on.
"Gut" feelings can guide us well if we can tell the difference between clear and complicated circumstances.
"An organization is a factory that manufactures judgements and decisions" [p417]. It's important, therefore, to balance between its "gut" and "brain" functions.
Bottom Line: I give this book FOUR STARS. Skip psychology and read it to understand yourself and others.

Instead, Prospect Theory shows that important choices are prone to the relativity of shifting reference points (context) and to the formulation of inconsequential features within a situation, such that human preferences struggle to become reality-bound. In particular, our decisions are susceptible to heuristic (short-cutting) or cognitive-illusory biases - an inconsistency built into the design of our minds; for example, the 'duration neglect' of time (the less-is-more effect) when a story is recounted by the Remembering Self, as opposed to sequentially by the Experiencing Self. Prospect Theory is based on the well-known dominance of threat/escape (negativity) over opportunity/approach (positivity) as a natural tendency or hard-wired response towards risk aversion that Kahneman's grandmother would have acknowledged. Today this bias is explored by behavioural economics (psychophysics) and the science of neuroeconomics - in trying to understand what a person's brain does while they make safe or risky decisions.
It would appear that there are two species of homo sapiens: those who think like "Econs" - who can compare broad principles and processes 'across subjects', like spread bettors (broad framing) in trades of exchange; and "Humans" who are swayed optimistically or pessimistically in terms of conviction and fairness by having attachments to material usage (narrow framing) and a whole host of cognitive illusions, e.g. to name but a very few: the endowment effect, sunk cost fallacy and entitlement. Kahneman argues that these two different ways of relating to the world are heavily predicated by a fundamental split in the brain's wet-ware architecture delineated by two complementary but opposing perspectives:
System 1 is described as the Inside View: "fast", HARE-like intuitive thought that jumps to best-case scenarios and plausible conclusions based on recent events and current context (priming), using automatic perceptual memory reactions or simple heuristic intuitions and substitutions. These are usually affect-based associations or prototypical intensity matches (comparing across categories, e.g. apples or steak?). System 1 is susceptible to emotional framing: it prefers the sure choice over the gamble (risk averse) when the outcomes are good, but tends to accept the gamble (risk seeking) when all outcomes are negative. System 1 is 'frame-bound' to descriptions of reality rather than reality itself and can reverse preferences based on how information is presented, i.e. it is open to persuasion. Therefore, instead of truly expert intuitions, System 1 thrives on correlations of coherence (elegance), certainty (conviction) and causality (fact) rather than evidential truth. It has a tendency to believe, to confirm (a well-known bias), and to infer or induce the general from the particular (causal stereotype). It does not compute base rates of probability, the influence of random luck, mere statistics such as correlation (decorrelation error), or regression to the mean (causality error). System 1's weakness is the brain's propensity to succumb to overconfidence and hindsight given only the resemblance, coherence and plausibility of the flimsy evidence of the moment, acronymically termed WYSIATI (What You See Is All There Is), at the expense of System 2 probability. To succumb is human, as is humbly shown throughout the book, with no bounds of profession, institution, MENSA level or social standing. Maybe Gurdjieff was right when he observed that the majority of humans are sheep-like.
System 2, on the other hand, is the Outside View, which attempts to factor in Rumsfeld's "unknown unknowns" by using realistic baselines from reference classes. It makes choices that are 'reality-bound' regardless of the presentation of facts or emotional framing, and can be regarded as "slow", RAT-like controlled focus and energy-sapping intention, of the kind used in effortful integral, statistical and complex reasoning with distributional information based on probability, uncertainty and doubt.
However, System 2 is also prone to error, especially in the service of System 1. Even though it has the capability, with application, not to confuse mere correlation with causation, and to deduce the particular from the general, it can be blocked when otherwise engaged, indolent or full of pride! As Kahneman puts it, "the ease with which we stop thinking is rather troubling", and what appears compelling is not always right, especially when the ego, the executive regulator of will power and concentration, is depleted of energy, or, conversely, when it is in a good mood of cognitive ease (not stress) deriving from situations of 'mere exposure' (repetition and familiarity). Experiments have repeatedly shown that cognitive aptitude and self-control are directly correlated, and that biases of intuition are in constant need of regulation, which can be hard work: for instance, uncovering one's outcome bias (part hindsight bias, part halo effect) based on the cognitive ease with which one lays claim to causal 'narrative fallacies' (Taleb), rather than adjusting to statistical random events born of luck!
So...
Do not expect a fun and "simples" read if you want clarity into how impulses become voluntary actions, and how impressions, feelings and inclinations so readily become beliefs, attitudes and intentions (when endorsed by System 2).
The solution...
Kahneman makes the special plea that our higher-minded intuitive statistician, System 2, take over the art of decision-making and wise judgement in "accurate choice diagnosis" to minimise the "errors in the design of the machinery of cognition." We should learn to recognise situations in which significant mistakes are likely, making the time and putting in the analytical effort to avoid them, especially when the stakes are high - usually when a situation is unfamiliar and there is no time to collect more information. 'Thinking, Fast and Slow' practically equips the reader with sufficient understanding to approach reasoning situations with a certain amount of logic, in order to balance and counter our intuitive illusions. For example, recognising the Texas sharpshooter fallacy (decorrelation error) or deconstructing a representativeness heuristic (stereotype) in one's day-to-day affairs should be regarded as a reasonable approach to life by any yardstick, scientific or not. In another example, the System 2 objectivity of a risk policy is one remedy against the System 1 biases inherent in the illusions of optimists who think they are prudent, and of pessimists who become overly cautious and miss out on positive opportunities, however marginal a proposition may appear at first.
One chapter, "Taming Intuitive Predictions", is particularly inspiring when it comes to correcting faulty thinking. It explores a reasonable procedure for countering systematic bias in significant decision-making situations where there is only modest validity (the validity illusion), especially between subjects. For example, when one has to decide between two candidates, be they job interviewees or start-up companies, as so often happens the evidence is weak but the emotional impression left by System 1 is strong. Kahneman recommends that when WYSIATI applies we be very wary of System 1's neglect of base rates and insensitivity to the quality of information. The law of small numbers states that extreme outcomes are more likely in small samples: the candidate who performs well at first, on the least evidence, has a tendency not to keep this up over the longer term (once employed), due to the vagaries of talent and luck, i.e. there is a regression towards the mean. The candidate with the greater referential proof but less persuasive power on the day is the surer bet in the long term. Yet how often in life is the short-term effect chosen over the long-term bet? Possibly a cheeky, pertinent example here is the choice of Moyes over Mourinho as the recently installed Man Utd manager! A good choice or a bad choice?
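The regression effect described here is easy to demonstrate numerically. Below is a minimal sketch (my own illustration, not from the book): two noisy observations of the same underlying "talent", in which the top performers of the first round score lower, on average, the second time around.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical model: observed score = stable talent + random luck.
talents = [random.gauss(0, 1) for _ in range(10_000)]
first = [t + random.gauss(0, 1) for t in talents]   # interview day
second = [t + random.gauss(0, 1) for t in talents]  # later performance

# Select the top 10% of first-round performers...
cutoff = sorted(first)[int(0.9 * len(first))]
top = [(f, s) for f, s in zip(first, second) if f >= cutoff]

avg_first = sum(f for f, _ in top) / len(top)
avg_second = sum(s for _, s in top) / len(top)

# ...and their second-round average regresses toward the mean,
# because part of their high first-round score was luck.
print(avg_second < avg_first)  # True
```

Nothing about the candidates changed between rounds; only the luck component averaged out, which is the whole point of the chapter.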
Many examples are shown of statistical algorithms (the Meehl pattern) outperforming real-world assumptions in low-validity environments, revealing in the process the illusion that skill and hunches can make long-term predictions. Many of these concern the clinical predictions of trained professionals, some of which serve important selection criteria in interviewing practices of great significance. Flawed stories from the past that shape our views of the present world and our expectations of the future are very seductive, especially when combined with the halo effect and with making global evaluations rather than specific ratings.
For example, belief in the latest blockbusting management tool adopted by a new CEO has been statistically shown to deliver at best a 10% improvement over random guesswork. In another example, a leadership group challenge used to select Israeli army leaders from cadets, in order to reveal their "true natures", produced inaccurate forecasts after observing only one hour of their behaviour in an artificial situation; this was put down to the illusion of validity, via the representativeness heuristic and non-regressive weak evidence. Slightly more worryingly, the same can be said for the illusory skills of buying and selling stock persistently over time. It has been shown that a narrative plays out within the minds of traders: they think they are making sensible, educated guesses, when the exposed truth is that their long-term predictive success is based on luck - a fact deeply ingrained in the culture of the industry, with false credit being "taken" in bonuses!! Kahneman pulls no punches about the masters of the universe, and I am inclined to believe in the pedigree of his analysis!!
According to Kahneman, so-called experts (and he is slightly derisive in his use of the term), in trying to justify their ability to assess masses of complexity as a host of mini-skills, can produce unreliable judgements, especially long-term forecasts (e.g. the planning fallacy), due to the inconsistency of extreme context (low- or zero-validity environments without regular practice) - a System 1 type error. Any final decision should be left to an independent person applying a simple, equally weighted formula, which is shown to be more accurate than letting the interviewer, who is susceptible to personal impression and "taste", also make the final decision (see wine vintage predictions). The best an expert can do is anticipate the near future using cues of recognition, and then know the limits of their validity, rather than make random hits based on subjectively compelling intuitions that are false. "Practice makes perfect", as the well-known saying goes, though the heuristics of judgement (coherence, cognitive ease and overconfidence) are invoked in low-validity environments by those who do not know what they are doing (the illusion of validity).
Looking at other similar books on sale, "You Are Not So Smart" by David McRaney, for example, is a more accessible introduction to the same subject, but it clearly rests on the giant shoulders of Kahneman, who with his erstwhile colleagues would appear to have informed the subject area in every conceivable direction. It is hard to do justice to such a brilliant book without a rather longish review. This is certainly one of the top ten books I have ever read for the benefits of rational perseverance and real-world knowledgeable insight, and it seems to be part of a trend, or rash, of human-friendly Econ (System 2) research emanating from the USA at the moment. For example, recent Nobel-winning (2013) economics research by R. Shiller demonstrates that there are predictable regularities in asset markets over longer time periods, while E. Fama makes the observation that there is no predictability in the short run.
In summary, "following our intuitions is more natural, and 'somehow' more pleasant than acting against them", and we usually end up with the products of our extreme predictions, i.e. overly optimistic or pessimistic ones, since they are statistically non-regressive, taking no account of a base rate (probability) or of regression towards the mean (the self-correcting fluctuation of scores over time). The slow, steady pace of the TORTOISE might be the right pace for our judgements, but we are prone not to give them the necessary time and perspective in a busy and obtuse world. The division of labour, and the conflict, between the two Systems of the mind can lead either to cognitive illusion (i.e. prejudice/bias) or, if we are lucky, to wise judgement in a synthesis of intuition and cognition (called TORTOISE thinking by Dobransky in his book Quitting the Rat Race).
Close your eyes and imagine the future ONLY after a disciplined collection of objective information, unless of course you happen to have expert recognition, which is referred to in Gladwell's book on the subject, Blink; but then your eyes are still open and liable to be deceived. Kahneman's way seems so much wiser, though harder nonetheless. The art and science of decision-making just got so much more interesting in the coming world of artificial intelligence!

Clearly the second category is a subset of the first, so anyone who takes this list at face value will recognize that the first is more likely than the second just by definition. However, a large majority of participants choose the second selector as the more likely from a probability point of view. The experimenters conclude that this is an example of "representativeness": ignoring basic principles of probability, or confusing probability with plausibility or coherence.
It's true that the result illustrates a fault in the logic of most participants, but there can be other explanations for why this fault occurs. In this example, it occurs to me, as one who answered wrongly and then thought about why I did so, that specifying two categories that are so similar could cause one to infer something about the less specific category that is not actually true. "Bank teller", in light of "bank teller who is a feminist", might automatically be read by many participants as "bank teller who is not a feminist", without the participant even realizing she has made this change to the selector. That's what I think I did, and, though an error, it is not one of representativeness. A very different mechanic was in place for me here. I would guess that this also applied to many other participants.
In other words, the selectors are presented as if they are on the same level - no hierarchy. Is she a school teacher, or a bookstore clerk, or an insurance salesperson, or a bank teller, or a bank teller who is a feminist? I think System 1 sees that most of these are either/or choices, but two seem to be this and also possibly that, which is hierarchical so doesn't really fit in with the rest of the list. System 1 might think of modifying the "bank teller" selector to "bank teller that is not a feminist", thus making the selector list much more uniform (and while it's making these changes, why not change "is active in the feminist movement" into "is active in the feminist movement and is not a bank teller"). And since this is automatic and System 1 never explains itself, most people will never be able to explain why they made such a seemingly obvious mistake. Maybe...if so, then this possible explanation also applies to all the subsequent refinements of the Linda experiment, in my uneducated and unqualified view.
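Whatever mechanism actually produces the error, the conjunction rule itself is mechanical: in any population, the people satisfying both conditions are a subset of those satisfying either one. A tiny simulation (my own, with made-up base rates, not figures from the book) makes the point:

```python
import random

random.seed(0)

# Illustrative, made-up base rates: nothing here comes from the book.
n = 100_000
teller = [random.random() < 0.02 for _ in range(n)]    # bank teller
feminist = [random.random() < 0.30 for _ in range(n)]  # feminist

count_teller = sum(teller)
count_both = sum(t and f for t, f in zip(teller, feminist))

# Every feminist bank teller is also a bank teller, so the conjunction
# can never be the more numerous (or more probable) category.
print(count_both <= count_teller)  # True
```

No matter what rates you plug in, the inequality holds; that is why the majority answer counts as a logical error rather than a matter of opinion.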
To go further, another experiment is presented in which a die has 4 green faces and 2 red faces, and the participant is given a list of three outcomes, all unlikely because they contain more reds than greens, and asked to choose which one she would bet on to win $25. Here are the choices:
1. RGRRR
2. GRGRRR
3. GRRRRR
The word "probability" is left out completely, as are any verbalized descriptions, to control for the possibility that participants misunderstand what is meant by probability. Even with this attempt at control, the result is that participants choose 2 more often than 1 or 3, even though clearly if one chooses 1 they will win the $25 on either RGRRR or GRGRRR (because the latter contains the former). Right. But what if System 1 sees that two of the three selectors have 6 results, and only one has 5, and so rules in its automated way that 1 is invalid? The rules seem to imply that there are six rolls of the die, and that if we bet on 1, we will lose (the probability that I will get only 5 results from 6 rolls of the die is exactly zero percent! - well, maybe more than 0% if there's a perfectly timed earthquake or other cataclysm); an unspoken rule is detected such that the bet is on the full 6 rolls. Given that the only two valid selectors are then 2 and 3, 2 is the more probable choice. Maybe... but this, of course, depends on System 1 also ignoring the detail that there would be 20 die rolls - and we've already learned that System 1 is wont to do this for the sake of expediency.
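For what it's worth, the exact probabilities of the three exact sequences can be computed directly. This is a quick sketch of mine; the experiment as described actually pays out if the chosen sequence appears anywhere in the rolls, but the ordering of the three options comes out the same either way:

```python
# 4 of 6 faces are green, 2 of 6 are red.
p = {"G": 4 / 6, "R": 2 / 6}

def seq_prob(seq):
    """Exact probability that consecutive rolls match seq exactly."""
    prob = 1.0
    for face in seq:
        prob *= p[face]
    return prob

for s in ("RGRRR", "GRGRRR", "GRRRRR"):
    print(s, round(seq_prob(s), 5))

# Option 1 dominates option 2: GRGRRR is just RGRRR with an extra
# G prepended, so its probability is RGRRR's multiplied by 2/3.
assert seq_prob("RGRRR") > seq_prob("GRGRRR") > seq_prob("GRRRRR")
```

The dominance is structural, not statistical: adding any extra requirement to a sequence can only lower its probability.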
In both cases above, System 1 might have chosen to modify the scenario - in other words, System 1 might have substituted the original question with one that is easier to answer, which is a heuristic technique of System 1 that is discussed at length earlier in the book.
The point is that either way, there has been an error, but the mechanics of that error may not be what they appear to be. That's my thought anyway. I think this also applies to many other cases. Or maybe not...
The heart attack and Borg experiments were very good ones as well, but my System 2 is now too depleted of energy to think of alternative conclusions, or maybe just too lazy. :-). These are strong enough that I may just eventually conclude that my second guessing above is merely a narrative fallacy on my part.
Anyway, very fascinating book, and very revealing.

“Thinking, Fast and Slow” is a fascinating look at how the mind works. Drawing on knowledge acquired from years of research in cognitive and social psychology, Nobel Prize winner Dr. Daniel Kahneman delivers his magnum opus on behavioral economics. This excellent book focuses on three key sets of distinctions: between the automatic System 1 and the effortful System 2, between the conception of agents in classical economics and in behavioral economics, and between the experiencing and the remembering selves. This enlightening 512-page book is composed of thirty-eight chapters organized into five parts: Part I. Two Systems, Part II. Heuristics and Biases, Part III. Overconfidence, Part IV. Choices, and Part V. Two Selves.
Positives:
1. Award-winning research. A masterpiece of behavioral economics knowledge. Overall accessible.
2. Fascinating topic in the hands of a master. How the mind works. The biases of intuition, judgment, and decision making.
3. Excellent format. Each chapter is well laid out and ends with a Speaking of section that summarizes the content via quotes.
4. A great job of defining and summarizing new terms. "In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word."
5. Supports findings with countless research. Provides many accessible and practical examples that help readers understand the insightful conclusions.
6. A great job of letting us know what we know and to what degree. "It is now a well-established proposition that both self-control and cognitive effort are forms of mental work."
7. You are guaranteed to learn something. Countless tidbits of knowledge throughout this insightful book, and how it applies to the real world. "The best possible account of the data provides bad news: tired and hungry judges tend to fall back on the easier default position of denying requests for parole. Both fatigue and hunger probably play a role."
8. The differences of Systems 1 and 2 and how they function with one another. "System 1 is impulsive and intuitive; System 2 is capable of reasoning, and it is cautious, but at least for some people it is also lazy." "System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy."
9. Important recurring concepts like WYSIATI (What You See Is All There Is). "You surely understand in principle that worthless information should not be treated differently from a complete lack of information, but WYSIATI makes it very difficult to apply that principle."
10. Understanding heuristics and biases. "The strong bias toward believing that small samples closely resemble the population from which they are drawn is also part of a larger story: we are prone to exaggerate the consistency and coherence of what we see. The exaggerated faith of researchers in what can be learned from a few observations is closely related to the halo effect, the sense we often get that we know and understand a person about whom we actually know very little. System 1 runs ahead of the facts in constructing a rich image on the basis of scraps of evidence. A machine for jumping to conclusions will act as if it believed in the law of small numbers. More generally, it will produce a representation of reality that makes too much sense."
11. Paradoxical results for your enjoyment. "People are less confident in a choice when they are asked to produce more arguments to support it."
12. Understanding how our brains work, "The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed."
13. Wisdom. "'Risk' does not exist 'out there,' independent of our minds and culture, waiting to be measured. Human beings have invented the concept of “risk” to help them understand and cope with the dangers and uncertainties of life. Although these dangers are real, there is no such thing as 'real risk' or 'objective risk.'" Bonus. "To be useful, your beliefs should be constrained by the logic of probability."
14. You will learn lessons that are practical. "Rewards for improved performance work better than punishment of mistakes."
15. An interesting look at overconfidence. "Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance." "Remember this rule: intuition cannot be trusted in the absence of stable regularities in the environment."
16. Have you ever had to plan anything in your life? Meet the planning fallacy. "This may be considered the single most important piece of advice regarding how to increase accuracy in forecasting through improved methods. Using such distributional information from other ventures similar to that being forecasted is called taking an “outside view” and is the cure to the planning fallacy."
17. A very interesting look at Econs and Humans. "Economists adopted expected utility theory in a dual role: as a logic that prescribes how decisions should be made, and as a description of how Econs make choices."
18. Prospect theory explained. "The pain of losing $900 is more than 90% of the pain of losing $1,000. These two insights are the essence of prospect theory."
19. Avoiding poor psychology. "The conclusion is straightforward: the decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle. Improbable outcomes are overweighted—this is the possibility effect. Outcomes that are almost certain are underweighted relative to actual certainty. The expectation principle, by which values are weighted by their probability, is poor psychology."
20. Great stuff on well being.
21. An excellent Conclusions chapter that ties the book up comprehensively.
Negatives:
1. Notes not linked up.
2. No formal separate bibliography.
3. Requires an investment of time. Thankfully, the book is worthy of your time.
4. The book overall is very well-written and accessible but some topics are challenging.
5. Wanted more clarification on how Bayes's rules work.
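On that last negative: Bayes' rule simply reweights a prior belief by the strength of the evidence. A minimal worked example with the classic base-rate setup (my own illustrative numbers, not taken from the book):

```python
# Hypothetical screening test: the condition affects 1% of people.
p_cond = 0.01            # prior P(condition)
p_pos_given_cond = 0.90  # sensitivity: P(positive | condition)
p_pos_given_no = 0.05    # false-positive rate: P(positive | no condition)

# Total probability of testing positive (law of total probability).
p_pos = p_pos_given_cond * p_cond + p_pos_given_no * (1 - p_cond)

# Bayes' rule: P(condition | positive) = P(pos | cond) * P(cond) / P(pos).
posterior = p_pos_given_cond * p_cond / p_pos
print(round(posterior, 3))  # 0.154
```

Despite the 90% sensitivity, a positive result only raises the probability to about 15%, because the 1% base rate dominates; this is exactly the base-rate neglect Kahneman accuses System 1 of.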
In summary, a masterpiece on behavioral economics. Dr. Kahneman shares his years of research and provides readers with an education on how the mind works. It requires an investment of your time, but it is well worth it. A tremendous Kindle value; don't hesitate to get this book. I highly recommend it!
Further suggestions: "Subliminal" by Leonard Mlodinow, "Incognito" by David Eagleman, "Switch" by Chip and Dan Heath, “Drive: The Surprising Truth about What Motivates Us” by Daniel H. Pink, “Blink” by Malcolm Gladwell, “The Power of Habit” by Charles Duhigg, “Quiet: The Power of Introverts in a World That Can't Stop Talking” by Susan Cain, "The Social Animal" by David Brooks, "Who's In Charge" by Michael S. Gazzaniga, "The Belief Instinct" by Jesse Bering, "50 Popular Beliefs that People Think Are True" by Guy P. Harrison, "The Believing Brain" by Michael Shermer, "Predictably Irrational" by Dan Ariely, "Are You Sure?" by Ginger Campbell, and "Mistakes Were Made But Not By Me" by Carol Tavris.

The first part of the book is worth it, however. It turns out that we possess two types of thinking, the default System 1 (fast) and System 2 (slow). Between them they determine how we react and make decisions. The book presents many behavioural studies concerning the relationship between psychology and economics, and the competition between these two disciplines to explain people’s actions and decisions, including a brief mention of the new discipline of neuroeconomics.
According to the author, we are primitive in the art of prediction. We lack methodology. We suffer from illusions of validity. Why does someone on one side decide to sell a stock, while on the other side someone else decides to buy it? The evidence shows that more active stock traders have worse results. Studies show that forecasts by doctors, investment advisors, sports analysts, politicians, economists and myriad other professionals don’t compare favourably with machine prediction. I would add the element of ‘destructive cleverness’, whereby people tend to apply their expertise from other, non-relevant areas to a problem that exists within a different set of givens, contributing factors and noise factors, all of which need to be properly appreciated. People perhaps tend to think out of the box too often for lack of familiarity with a subject, and are invariably inconsistent in doing so. The art of decision-making needs to be demystified; it needs to be transformed into a more scientific and factually based procedure. Might justice be better delivered by computer? Planning fallacies include over-optimism about costs and time, due to taking only the inside view and failing to refer to reference classes. Optimism is the life blood of entrepreneurs, yet only 35% of businesses in the USA survive 5 years. There is the notion of optimistic martyrs: firms that fail, market fodder if you like, yet signal new markets to more qualified competitors. Potentially dangerous groupthink can be moderated by carrying out a pre-mortem, asking participants to write down a reason why the project might have failed (the discipline of FMEA, in industrial language).
We learn of hedonometry, which quantifies pleasure and pain. We have a less unfavourable memory of pain if it tails off at the end. Our memory of an experience may not be the same as the experience itself (the example of a scratch at the end of a record). There is a difference between the experiencing and remembering selves. In summing up someone’s life we are over-influenced by how it ends. One 'Unpleasant Index' survey of women scored child-rearing at double the level of watching TV, which itself was at the same level as socialising. Being alone is more pleasurable than the presence of one's immediate boss. Increasing focus is being placed on the measurement of well-being. Above a certain salary (quoted as $75,000 in high-cost areas) affluence does not improve the feeling of well-being, possibly, it is argued, because richer people no longer have the opportunity to enjoy the small pleasures of life (bars of chocolate) in the same way.
The author focuses on System 1, for which the cardinal rule is WYSIATI – what you see is all there is. It is impulsive and intuitive, and our minds appear over-influenced by bias and spin. It is rarely indifferent to emotional words (a ‘survival chance’ of 90% is preferred to a ‘mortality rate’ of 10%). It jumps to conclusions, and can even govern important decisions depending upon how a problem is presented. Intuition requires training in skills (a top-class chess player requires about 10,000 hours of practice). Reminding people of their mortality increases the appeal of authoritarian ideas. There is the Lady Macbeth Effect, whereby people who feel their soul is stained have the desire to clean themselves. The Florida Effect was illustrated when students who had been encouraged to think of words related to old age walked more slowly down a corridor. When we place a pen crosswise in our mouth, thereby forcing a smile, we tend to think more favourably of things. The Halo Effect occurs, for example, when people like a president’s politics because they like his voice and appearance. In the Availability Cascade, biases, popular reactions and exaggerated fears (often fuelled by the media) influence policy. We are unduly worried about unlikely events, for example when a teenage daughter is late home at night. Terrorism speaks directly to System 1 even though, even in the worst cases, it may be responsible for nowhere near the number of deaths caused by car accidents. In decisions involving numbers, there is an Anchor Effect whereby a suggested value influences our decision. Hindsight bias causes us to blame the intelligence services for 9/11. System 1, in effect, tries to make sense of the world, to stereotype it, making it predictable and explicable while overestimating predictability. It even breeds overconfidence. It averages instead of adding.
People will assign less value to a larger set of dinner crockery that contains some broken items than to a smaller set of the same quality with no broken items. We tend to rationalise the past in order to predict the future. We suffer from theory-induced blindness. The Endowment Effect provokes an aversion to loss and determines economic behaviour. We may be prepared to sell, but only at a higher price than we would buy (the ratio is higher in the USA than the UK). People who are poor see a small amount of income as a reduced loss rather than a gain. The brain has a rapid mechanism to detect threats, but no such thing for good events. The negative trumps the positive. A single cockroach will destroy the appeal of a bowl of cherries, but a single cherry will have no effect on a bowl of cockroaches. A stable marital relationship has been found to require a ratio of at least 5:1 of good interactions to bad ones. A friendship that may take years to develop can be destroyed with one single action. Golfers try harder to avoid a bogey than to gain a birdie. In relation to rational probability, our decisions are skewed negatively near 100% and positively near 0%. People attach value to gains or losses rather than to wealth. System 1 makes us overweigh improbable outcomes unless we have prior experience. We tend to overestimate our chances and overweigh estimates. People tend to be risk-averse over potential gains and risk-seeking over potential losses. The Sunk Cost fallacy keeps people too long in poor jobs, unhappy marriages and unpromising research projects. We tend to reinvest in a project in which we are already implicated, even if the prospects have deteriorated, rather than divert our effort into a more promising venture, so as not to be part of a failure. We are reluctant to cut our losses. We tend to be risk-averse, and environmental and safety laws, for example, are set up to protect us; yet such laws might have prevented the development of the airplane, X-rays and open-heart surgery.
We prepare ourselves for the feeling of regret. We avoid being too hopeful about a potential football win. People more readily forgo a discount than pay a surcharge, even when the end result is identical.
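The loss-aversion and reference-point behaviour described above is often written as a value function. Here is a sketch using the parameter estimates from Tversky and Kahneman's 1992 follow-up paper (alpha ≈ 0.88, lambda ≈ 2.25); the book describes the curve's shape, not these exact numbers:

```python
ALPHA = 0.88   # diminishing sensitivity to larger amounts
LAMBDA = 2.25  # loss aversion: losses loom larger than gains

def value(x):
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# Losses loom larger: a $100 loss hurts more than a $100 gain pleases.
assert abs(value(-100)) > value(100)

# Diminishing sensitivity: the pain of losing $900 is more than
# 90% of the pain of losing $1,000.
assert abs(value(-900)) > 0.9 * abs(value(-1000))
```

The kink at zero (the reference point) is what produces the asymmetries listed above: risk aversion over gains, risk seeking over losses, and the reluctance to cut losses.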
System 2 includes rational thinking and reasoning, but it is inherently lazy. To counteract System 1's negative effects in determining a choice, we can ask ourselves to produce more arguments to support it. Disbelieving something is hard work. Governments should make decisions based solely on hard facts and statistics, as opposed to popular reactions.