Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts

by Carol Tavris and Elliot Aronson


“Entertaining, illuminating and—when you recognize yourself in the stories it tells—mortifying.” —Wall Street Journal


“Every page sparkles with sharp insight and keen observation. Mistakes were made—but not in this book!” —Daniel Gilbert, author of Stumbling on Happiness

 
Why is it so hard to say “I made a mistake”—and really believe it?
 
When we make mistakes, cling to outdated attitudes, or mistreat other people, we must calm the cognitive dissonance that jars our feelings of self-worth. And so, unconsciously, we create fictions that absolve us of responsibility, restoring our belief that we are smart, moral, and right—a belief that often keeps us on a course that is dumb, immoral, and wrong. Backed by years of research, Mistakes Were Made (But Not by Me) offers a fascinating explanation of self-justification—how it works, the damage it can cause, and how we can overcome it. This updated edition features new examples and concludes with an extended discussion of how we can live with dissonance, learn from it, and perhaps, eventually, forgive ourselves.
 
“A revelatory study of how lovers, lawyers, doctors, politicians—and all of us—pull the wool over our own eyes . . . Reading it, we recognize the behavior of our leaders, our loved ones, and—if we’re honest—ourselves, and some of the more perplexing mysteries of human nature begin to seem a little clearer.” —Francine Prose, O, The Oprah Magazine



Average Ratings and Reviews

4.21 average rating (21,694 ratings)




Reviews for Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts:

5 stars · Jun 05, 2009

I found this a remarkably challenging book to read. There was a time when I thought psychology was an odd sort of discipline. As someone who had studied physics for a while I couldn’t really bring myself to call it a science and as someone who studied philosophy I also felt it had failings on that score too. My understanding of psychology was fairly limited, but Freudian, Jungian, Behaviourist and god knows what other –isms all seemed to me to depend too much on a foundation that seemed much too arbitrary. The books I’ve been reading lately on psychology, however, are much less ‘ideological’ and much more scientific.

I’ve read this book in about three days – and that despite also having about four other books on the go at the same time. This one pushed all the others I’ve started to the bottom of the list. Like I said, a lot of this book I found very challenging, but all of it very compelling.

One of the psychological insights that has been messing around with my mind lately is the idea that if you ask someone who is studying to become a doctor why one of their fellow students is also becoming a doctor they are likely to say that it is obvious that that person is virtually made to be a doctor. In fact, they are likely to think that virtually everyone else in their course is there because they are almost constitutionally designed to become a doctor. But if you ask the person themselves why they are becoming a doctor they are likely to say that they are in the course more or less by accident. That there have been a network of lines that intersected and by a series of coincidences they have ended up here. And this is not just true of people’s understanding of those around them when it comes to career choices – but virtually everything else they do too. The tendency is for us to greatly over-rate what others do as being a manifestation of their ‘essential nature’ and what we do as being an unpredictable consequence of arbitrary and random forces.

But this has consequences that go far beyond a mere curiosity related to people’s chosen career paths. When we find that a friend has engaged in what we might consider to be an ‘act of betrayal’ against us, this same tendency kicks in again and we are likely to see this betrayal not as a momentary lapse on our friend’s part caused by them being carried away by circumstance, but as an indication of their essential nature. Our acts of betrayal against our friends, on the other hand, we tend to see as either momentary lapses or justified retaliation given their infinitely worse behaviour.

This book looks at the consequences of our tendencies to under-rate our own culpability for mistakes and misdemeanours and to over-rate the intention and severity of the actions of others when committed against us. The ‘us’ here is not just ourselves personally, but also the ‘us’ as a group or as a society as a whole. Some of the examples given in this book range from case studies of marriages falling apart (something that had cringe-making moments for me as I saw some of the very much less attractive parts of my own personality displayed before me in vivid Technicolor in relation to both my current relationship and my marriage breakdown) all the way up to the long-standing problems existing between Iran and the United States.

The book also looks at how people who were involved in what should really be referred to as the ‘recovered memory scandal’ have dealt with their role in this. The most generous answer is ‘not very well’. But this isn’t a book about pointing the finger and complaining about how pathetic some people are, you know, the sorts of people who make mistakes. Rather, it is a book that tries to show that we humans are all too prone to self-justification and this is a terrible danger particularly when we do things that are by any definition not things that we can be proud of. The book points out that despite our often simple-minded ideas that some people are just basically bad and that they do things just to be evil, in fact, most people who ‘do evil’ imagine they are doing good. The road to Hell is paved with good intentions and kept shining after being buffed clean by our rationalisations.

If you look up a dictionary definition of ‘evil politician’ it wouldn’t be too surprising if there was a picture of Hitler. But even if you had the chance to interview Hitler in the bunker just before he popped his pill, it is very unlikely that he would have admitted that he had made many (any) mistakes. It is also unlikely that he would think that anything he had done was either wrong or bad. No, he would have the (to us) remarkable perspective that not only he had done good (and probably not just ‘on balance’) and had acted in the best interests of the future of all humanity, but that one day people would even realise that he was as wonderful as he had always thought himself. I think we (or perhaps just I) find this hard to accept, because we like to believe that deep down the people we consider to be evil know they are bad. If only the world was so simple.

The image that stays with me from the last few years is of Lynndie England and her thumbs-up sign while she was standing beside a pyramid of naked Iraqi men. It is hard not to think that here is an instance of someone with some sort of moral deficiency, someone who clearly gains enjoyment out of the humiliation of others and therefore she must be someone devoid of some basic human quality – and that lacking is what separates her from us. Unfortunately, even that proves not to be the case. The most disturbing bit of research quoted in this book (and there are lots of disturbing bits of research discussed in this book) is that those most likely to become utter monsters are those who have high self-esteem and they are most likely to become monsters towards those who have virtually no power to retaliate. Why? Because we do not want to think of ourselves as bad people, particularly those of us with high self-esteem. But if we start to do horrible things to our enemies then we need to be able to justify those terrible acts – and we tend to do that by saying that they deserved it, that they are less than human, that they do worse to their enemies, that we are acting in a way that is pure and good and (dare we say it) humane, and in fact, that they are the ones (these powerless victims of ours) who are to blame.

The section of this book on police interrogation methods should be made compulsory reading. Years ago I read a book that talked about a psychological experiment that has stayed with me since. People were asked to come to a room in a university to do a memory test involving a series of nonsense syllables. When they got to the room they were told that the experiment was running a little late, so would they mind sitting in a chair for a few minutes. Directly in front of the chair was a poster – one of those graphic posters that show police at a car accident and warning about drink driving or something of the kind. The poster was both graphic and directly in front of the people – so not something they were likely to not notice. When they were finally let into the room to do the test half of them were actually given the syllables to learn for half an hour, the other half of them were asked if they had noticed the poster in the waiting room. These people were then quizzed for half an hour on as many details as they could remember from the poster. What colour was the car, how many policemen were there, was it the man’s right or left leg that had been cut off in the accident? You know the sort of thing. Lots and lots of detail.

Now for the interesting bit. At the end of the half hour both groups of people (the ones who did the syllables and the ones who did the ‘remembering’ of the poster) were shown another copy of the poster and asked if this was the poster they had seen in the waiting room. Virtually everyone who did the memorising of the nonsense syllables said it was – however, virtually no one who had spent half an hour ‘remembering’ the poster said it was. Why? Because those who had spent half an hour ‘remembering’ the poster had decided for sure there were three police officers, and the guy on the road was wearing a green shirt and there was a bicycle in the background and in the poster they were being shown none of those things were there.

When I first heard about this experiment (remember, we are talking about events that have all taken place in a span of slightly more than half an hour) I was shocked at what this experiment implied about our justice system. In short, we are very suggestible creatures and the legal system (particularly the police force) needs to be very careful not to pollute witnesses to crimes in ways that can destroy any hope of justice for the accused – something that should be of foremost concern. However, this book makes my concerns over the justice system seem terribly naïve. I’ve learnt that you also have to add to this mix humans who are convinced they are right, people who refuse to consider any evidence other than that which supports their conclusion after they have reached it, who take it as a professional slight if they are challenged to support or (god forbid) reconsider their favourite theory, people who won’t even change their view of the guilt of the accused after irrefutable evidence is presented to them. The need to rethink our justice system so as to take into consideration the latest findings psychology presents us with becomes all rather urgent.

This is, as I said, a deeply troubling book. Parts of this book felt like a mirror had been held up to me and I have to say that I really didn’t like what I saw. But this is a very important book and one that demands to be read. I recommend it without hesitation.
4 stars · Apr 21, 2014

"People will do anything, no matter how absurd, to avoid facing their own souls." - C.G. Jung

"Memory is a complicated thing, a relative to truth, but not its twin." - Barbara Kingsolver

Neither of the quotes above was included in this book, but they speak to some of the ideas at its core. Anyone who has taken a social psychology or experimental methods course, and/or paid cursory attention to the bevy of material out there about how the human mind and we, as people, work, will find a lot of familiar concepts in Mistakes Were Made. That is not to say, however, that it's not worth reading.

The overarching principles being examined are those of cognitive dissonance and self-justification. And, before you get all defensive (get it?), these are normal and necessary facets of a human mind-brain (as Krieger might call it). I was going to go into this elaborate robot "does not compute" comparison to illustrate the nature of cognitive dissonance, but then I figured that I'd leave it to Lucille Bluth.



Basically the reasoning parts of our brain shut down when confronted with "dissonant" information, and the emotion circuits light up. "These mechanisms provide a neurological basis for the observation that once our minds are made up, it is hard to change them."

As Lucille points out, much of this occurs with respect to our sense of self as well as our need to find explanations for current problems or situations. Confirmation bias and confabulation are just two of the means by which we find evidence for what we're looking for, and causes that aren't there, and there is plenty of great research and many case studies (some of which are in this book) that illustrate these ideas.

These ubiquitous feats of mental gymnastics give rise to various appalling truths, one of which is best described by research psychologist John Kihlstrom. "The weakness of the relationship between accuracy and confidence is one of the best-documented phenomena in the 100-year history of eyewitness memory research."
So, basically, the least accurate (in this case witnesses) tend to have the most confidence in their accuracy. And the implications of this aren't restricted to the courtroom. I'm not sure I love the choice of case studies of this phenomenon among professionals in this book (the recovered memory movement in therapy, and gross miscarriages of justice in, well, the justice system), as they undermine quotidian examples (we literally do this all day every day). However, the finding that was, to me, most chilling was that in these cases "training does not increase accuracy; it increases people's confidence in their accuracy."

So, in keeping with the spirit of the book, I have to acknowledge my own sequential bias: I've read a lot of other books that covered this material, and because it was new to me then, I'm prejudiced to think it was more interesting in those books...so do with that what you will.

Recommended reading:
- Moral Tribes: Emotion, Reason, and the Gap Between Us and Them
- Predictably Irrational: The Hidden Forces That Shape Our Decisions
- How We Decide
- Sway: The Irresistible Pull of Irrational Behavior
5 stars · Mar 23, 2014

OMFG. This book is relentless. Reading it is an ordeal. A wonderful, fruitful ordeal. But an ordeal none the less. Every page and chapter has been an opportunity for self examination and (I hope) enhanced self honesty, insight and personal growth.

And just in case that sounds too woo-woo for you, it should be noted that the assertions made in the book are backed by decades' worth of hard, experimentally derived evidence.

It doesn't get much better than that.

Both authors are respected researchers in the field of social psychology, a field that is no stranger to dramatic overstatement (to say the least), but also one that produces some of the most denuding, insight-producing, and frankly disturbing findings of all the subfields of psychology.

The central construct explored in the book is Cognitive Dissonance: Leon Festinger's venerable finding that individuals who hold two or more contradictory beliefs, ideas, or values at the same time, or who behave in ways that contradict their values and beliefs, will experience excessive mental stress and discomfort. Furthermore, individuals suffering from said mental stress and discomfort will be motivated to reduce the crappy feeling by lying to themselves and others, and even bending and recasting memories of events, in order to justify their hypocritical positions and actions.

In case you're wondering who those blind, tortured souls are who think and behave in such insane, self-delusional and amoral ways: it's you, me and everyone else we know. In other words, everyone.

This thing (Cognitive Dissonance) is in fact a feature (not a flaw) of the human mind/brain. But as is the case with so many human psychological features, it can get us in a FUCK TON of trouble if allowed to operate unchecked.

Imagine (just imagine) living in a 360 degree wrap around lie in which we falsely perceive ourselves as heroic victims to our own needless and profound detriment (let the finger pointing begin). Sounds pretty bad right. That is what's at stake here. But fear not, there is a pathway out of the matrix*.

As I mentioned, encountering the material in this book is very growth-engendering. The book is literally a partial antidote to the poison it describes. But be warned: the antidote burns as it goes down.

The cost that the reader pays for the aforementioned rewards (enhanced self-honesty, growth, insight, etc.) is the very experience of painful dissonance the book so expertly describes.

The cost incurred to exit the matrix is minor in hindsight. But paying that cost is aversive enough to drive all of us, at one time or another, to run and hide from the truth. The cost that I'm referring to is the naked experience of the pain of realizing that we are in fact human after all.

Ironically, the unwillingness to face and experience these feelings is the active ingredient. And the crazy webs we weave in order to maintain said experiential avoidance is the aforementioned poison.

This is a circuitous way of saying that you can't help but recognize and feel the pain of the human condition when you read this fantastically well executed, educational and therapeutic book.

For an equally eye opening, and decidedly more fun exploration of analogous territory, read Robert Kurzban's Why Everyone (else) Is a Hypocrite.

Someone in my FB feed shared that their therapist recommended that she not read Nietzsche or Camus while she was experiencing a bout of depression (see if you can count the pretentious statements masquerading as self deprecating humor in that statement).

I'd have to include this little gem in that list. You definitely want to be on stable footing when you read this thing. If not, then hide the sharp objects and designate a trusted friend to be at the ready to talk you down when it hits you how hopelessly self-delusional all of us humans actually are.

That being said, I call this absolutely essential reading. It's as necessary as Kahneman's Thinking, Fast and Slow: a dreadfully boring but crucial read for anyone who hasn't had the privilege/curse of studying cognitive psychology, or even if you have. Identifying that, and how, the human mind/brain is biased (or rather, evolutionarily conditioned) to interpret information and instruct behavior in programmatic ways is as close as we currently get to taking the red pill*.

* Note: The red pill and its opposite, the blue pill, are pop culture symbols (originating in the comic and film The Matrix) representing the choice between embracing the sometimes painful truth of reality (red pill) and the blissful ignorance of illusion (blue pill).

So go ahead. Read the book, eat the red pill, embrace the painful glare of the light of day, and live a life of more freedom. This is psychology at its most potent best.
5 stars · Apr 15, 2008

Sometimes, I think that the world is full of hypocrites. The news is full of politicians who preach family values and then are caught in an affair. Everyday we see religious advocates who call for peace and in the same breath state that their God is the only true God. Then, there's the business world where lying and cheating seem to be part of the game.

Sometimes, I wonder how these people live with themselves.

Mistakes Were Made (but not by me) addresses that exact question. It would seem that the human mind is designed to selectively remember and process information. Thus, the politician, religious leader, business person, or even ourselves often don't realize that we are being hypocritical. Moreover, as our actions and logic become further and further separated, we tend to hold tighter onto our original notions. Instead of admitting that we were wrong, we justify our actions even more strongly.

Mistakes Were Made (but not by me) was a huge eye-opener. People don't justify stupid decisions because they are bad people. On the contrary, no one wants to admit they are a fool. Look within: what beliefs do you fight most adamantly about?
2 stars · Feb 01, 2012

Ultimately, I think that Tavris's conclusions about self-justification are probably correct, but her argument was flawed. There were a number of things that put me off from this book. Here's my list of gripes:

1) The book relied much too heavily on anecdotal evidence to prove its points. Tavris did back up her claims about self-justification with some psychological research (that sounded like it was peer-reviewed, I guess), but it was pretty sparse (like 1 study per chapter if that---as opposed to anecdote after anecdote after anecdote). Plus, she never really discussed the full context of the studies she cited, nor did she ever give any qualifications for the research or her own conclusions.

2) The overly sanctimonious, self-righteous tone of the book was a total turn-off. For the most part, I felt that it really condemned the people in the examples of self-justification that Tavris wrote about. Even though she had a good point, I feel that most of the situations are more complex than she made them out to be.

3) She used a lot of logical fallacies. Her pet metaphor of the "pyramid" is just another version of the slippery slope fallacy. And she heavily relied on either-or logic to support her claims.

4) Throughout the whole book she speaks about self-justification as though it were a fundamental flaw in human psychology. I think that is far from the truth. My contention is that evolution created the human brain the way it is for a reason. If it didn't serve a purpose, self-justification would have been discarded long, long ago because it would have caused humans to make disastrously bad decisions. But the truth is, self-justification and other illusions we create about ourselves and our world are extremely important to our ability to function in the world. I don't have time to go through all of the useful purposes that these cognitive processes serve, but here's a few: seeing ourselves as good people allows us to achieve more than people who are depressed and who have a more realistic perception of themselves---and applying patterns from old situations to new ones helps us to adapt to novelty and change more effectively. In short, a more nuanced acknowledgement of the complexity of the human brain would have been better---and more informative.

I'd read Invisible Gorilla instead. It says the same basic thing as this book, but in a much more compelling and informative way.
0 stars · Feb 06, 2017

This was really a GREAT book, but a lot has changed since i picked it up *but i promise that i'll get back to you and read THE WHOLE book this time not just few pages. it's not you sweetie it's me....sometimes i talk to books like they're real people, not creepy at all*! i got really distracted by shiny new books that were delivered to my house and i just couldn't focus on this one.

Sorry..
3 stars · Apr 17, 2008

This is yet another wonderful book written by social psychologists, although it is probably unlikely to make the New York Times best seller list for a couple of reasons. First, this book ranks right up there with Jimmy Carter's famed "Great Malaise" speech that pointed an accusing finger at the American people for all of their problems. No one wants to know that WE are the cause of the problem, just like no one really wants to know that I made a mistake, not someone else. This book is about cognitive dissonance and the power of rationalization in many domains of life. The problem with this topic (as I have found after many quarters of teaching it to college students) is that even after learning the concept, literally no one likes to think that they actually engage in these mental gymnastics. Biases in perception, even the automatic activation of stereotypes, are easier to get people to believe than trying to show them how every decision or experience we have is colored by the process of making ourselves appear consistent. In reality, we are all highly hypocritical in countless ways, but as the authors show over and over again, this is much easier to detect in others than in ourselves. The only suggestion I would make is to try to use more examples from across the political spectrum to head off any rationalization ammo for critics of the book. Would recommend to everyone (along with Aronson's other books).
5 stars · May 25, 2010

As someone interested in the psychology of religion, it's always interesting to me how cognitive weaknesses play a role in establishing and maintaining religious beliefs. Some atheists are wont to believe that religion is a kind of mental illness, but this book (and others) make it clear that's really not so. The vast majority of religious people are cognitively normal. It's just that normal human cognition is very prone to making certain kinds of errors, and religious memes propagate very easily on this substrate. As an example, for a religious person to admit that there are no gods, they have to confront the enormous cognitive dissonance that they think of themselves as smart, well-educated, pragmatic - but have, for many years, been putting vast amounts of effort, emotion, thought, and perhaps money into something that hasn't the slightest basis in reality. For someone who was devoutly religious, this is the granddaddy of all cognitive dissonance. That so many people manage to confront this and deal with it is quite impressive.

One of the things I like about this book is that for every section on various instantiations of cognitive dissonance and self-justification, they close by talking about someone who has overcome this natural propensity, and done right. The therapist who confronts the fact that she helped people "recover" false memories of abuse, and meets with the affected families to try to set things right. The prosecutor who accepts that he had an innocent man incarcerated for years, and comes back to the case. One of the best examples in my opinion is Edzard Ernst, who is not in this book, as they don't discuss "alternative" medicine. He confronted the fact that he had been giving people useless medical treatments for years, as a homeopathic doctor, and has since become a crusader for science-based medicine.

One of the most disappointing realizations for me, as an educator, is that clear explanation with ample evidence generally will not change people's minds. Many people who think they've been abducted by aliens are well aware of the phenomenon of sleep paralysis, but - for no justifiable reason - reject it as an explanation of their experiences. I've spoken with a climate change denialist who swore up and down that they understood the greenhouse effect just fine - and then immediately turned around and said something clearly contradicting this theory.

Education isn't entirely futile, though. First, if we can educate people before they've formed their opinions on the subject, that will have a dramatic difference. Second, a large-scale, concerted education effort can change some minds. This can lead to changes of the intellectual environment that can persuade others via non-rational means. Smokers in the 1940s didn't understand the link between smoking and lung cancer. Almost every smoker today does understand this link (although they smoke anyway, exercising ample self-justification). But we've managed to convince enough people that the society in the US has changed, and smoking is much less accepted (and as a result much less common).

Science was developed to counteract all the problems mentioned in this book. Nobody likes to be wrong, and scientists are no exception, but they are professionally forced to be. To be sure, being too wrong can cost them prestige, money, or jobs, but they're expected to be wrong fairly frequently. And the whole endeavor of science is set up to make it clear when someone is wrong. Scientists aren't allowed to conduct their arguments in a vague or metaphorical manner, and must be vulnerable to proof that they are wrong (mathematical or empirical). And while individual scientists can be recalcitrant, the discipline as a whole is self-correcting, and moves on. The world would be a vastly better place if everyone aspired to the scientific ideal.
5 stars · Nov 30, 2018

In this book we see the trail of self-justification through the territories of family, memory, therapy, law, prejudice, conflict, and war.
How we self-justify both before and after a decision is taken.
Example: the U.S.-Iraq war.
It shows how self-justification worked in the Iraq war. It started with the claim that Iraq had a pile of weapons of mass destruction, but when, in the end, no such thing was found, we still tried to justify the war on other grounds, such as stability in the Middle East, democracy, terrorism, etc.
People even see a lack of evidence as evidence.
Consequences of self-justification: how it exacerbates prejudice and corruption, distorts memory, turns professional confidence into arrogance, creates and perpetuates injustice, warps love, and generates feuds and rifts.
How does cognitive dissonance work?
Cognitive dissonance is the hardwired psychological mechanism that creates self-justification and protects our certainties, self-esteem, and affiliations.
Then the book talks about the biases of memory.
Memory is reconstructive and subject to source confusion.
Parent blaming is a popular and convenient form of self-justification.
Memories create our stories, but our stories also create our memories. Once we have a narrative, we shape our memories to fit into it and assemble them like a mosaic.
Imagination inflation.
MORAL OF THE BOOK
The moral of the book is easy to say, and difficult to execute. When you screw up, try saying this: "I made a mistake. I need to understand what went wrong. I don't want to make the same mistake again."
An appreciation of how dissonance works, in ourselves and others, gives us some ways to override our wiring, and protects us from those who can't.
5 stars · May 06, 2015

Extremely interesting social psychology book on the reasons why people do the things they do. The author presents compelling arguments (supported by the evidence of many studies and experiments) for some puzzling human behaviours, such as why people insist on justifying indefensible positions long after they are proven wrong. She explains, among other things, the power of gifts (even low-value ones) in swaying decision making, the reasoning behind stereotypes and strongly denied biases (and why no one is immune to such behaviour), and the fallacy of memory (distorted or confabulated memories leading to the extremes of people believing themselves victims of sexual abuse or alien abduction).

Other areas for which dissonance and obstinate self-justification are problematic include law enforcement (where they can result in false confessions and wrongful prosecutions), relationships (leading to nasty quarrels and divorce) and conflicts (to extremes of torture and war crimes).
Highly recommended. 4 stars rounded up because I just loved the last chapter on the attitudes to learning and the importance of encouraging children to accept their mistakes.

As per usual, a selection of my favourite quotes (and there are many, many more):

Between the conscious lie to fool others and unconscious self-justification to fool ourselves lies a fascinating gray area, patrolled by that unreliable, self-serving historian—memory. Memories are often pruned and shaped by an ego-enhancing bias that blurs the edges of past events, softens culpability, and distorts what really happened.

Prejudices emerge from the disposition of the human mind to perceive and process information in categories. "Categories" is a nicer, more neutral word than "stereotypes," but it's the same thing.

The brain is designed with blind spots, optical and psychological, and one of its cleverest tricks is to confer on us the comforting delusion that we, personally, do not have any. In a sense, dissonance theory is a theory of blind spots—of how and why people unintentionally blind themselves so that they fail to notice vital events and information that might make them question their behavior or their convictions.

Just as we can identify hypocrisy in everyone but ourselves, just as it's obvious that others can be influenced by money but not ourselves, so we can see prejudices in everyone else but ourselves. Thanks to our ego-preserving blind spots, we cannot possibly have a prejudice, which is an irrational or mean-spirited feeling about all members of another group. Because we are not irrational or mean spirited, any negative feelings we have about another group are justified; our dislikes are rational and well founded. It's theirs we need to suppress.

Understanding how the mind yearns for consonance, and rejects information that questions our beliefs, decisions, or preferences, teaches us to be open to the possibility of error. It also helps us let go of the need to be right. … When confidence and convictions are unleavened by humility, by an acceptance of fallibility, people can easily cross the line from healthy self-assurance to arrogance.

At all ages, people can learn to see mistakes not as terrible personal failings to be denied or justified, but as inevitable aspects of life that help us grow, and grow up.
4 stars · Dec 17, 2011

Four words:

Cognitive dissonance
Confirmation bias

According to the authors, therein lies the explanation for people's unwillingness to admit mistakes, even to themselves, in a variety of realms. This far-reaching book tackles irrational prejudices, false memories, misjudgement as a psychotherapist, prosecuting the wrong individual, blaming one's spouse for marital problems, etc. And it offers a basic explanation: we have a difficult time integrating two conflicting beliefs, such as "I'm a great person" and "I messed up" (cognitive dissonance). We will therefore respond by coming up with all kinds of creative ways to challenge the less desirable belief (usually "I messed up") in favor of clinging to the more desirable belief ("I'm a great person"). In an effort to convince ourselves that the more desirable belief is the correct one, we will selectively focus on evidence supporting the more desirable belief and deny, ignore, or minimize evidence supporting the less desirable belief (confirmation bias).

The authors' examples are fascinating and it's a great topic. Their explanations are arguably a little facile. Can we really know what's going on inside someone's head? Are all self-justifications a matter of cognitive dissonance? Are people ever correct for clinging to a belief or course of action even in the face of conflicting evidence? The fact that the points feel belabored at times suggests that the thesis may be too simple and one-dimensional to explain all the various anecdotes.

Criticism notwithstanding, this is a great topic. We could all do with a little more self-reflection when it comes to stubbornly clinging to beliefs or actions that may be detrimental. And although the actual explanations for this phenomenon are probably more complex and varied, the authors offer a good start at facing this problem and attempting to understand and hopefully challenge people's unwillingness to admit mistakes.
4 stars · Dec 07, 2013

The title of the book gives the impression that it's a self-help book. It's more of a psychology book explaining how people can make mistakes, think they are right, and honestly believe that. A good example is false memories. How often have you said, "I could have sworn I did that." You see the event in your head, yet evidence shows it didn't happen. You rationalize it ("someone must have moved it") instead of accepting the most obvious answer ("I was mistaken in thinking that I did it").


The book goes even further into big mistakes that people make and refuse to admit, such as in the criminal system where suspects are locked away for years ("I know he's the rapist so I'll interrogate him for hours until he finally confesses") until DNA finally proves their innocence. Fortunately for most people, they are not making mistakes that mean life or death. The book contains many extreme examples. Still, this is a great book to read to understand and recognize your own mistakes. For example, maybe a friend asked for a favor and you said no. Initially, you felt a little guilty for saying no. Then you start justifying the answer ("She wouldn't have helped me if I had asked for a favor. She's always looking for someone to do her work.") so that your guilty feeling goes away. It's a rude awakening to realize how your feelings have completely changed -- from feeling guilty to thinking your friend is selfish and lazy.
5 stars · Jun 22, 2012

This is my favorite book, period! Carol Tavris and Elliot Aronson demonstrate how cognitive dissonance accounts for our inability to see our faults, from our personal lives all the way to the highest levels of government. This will change the way you view your own thoughts and actions, and make you a better person as a result.
4 stars · Dec 24, 2011

The authors describe a "dissonance theory" of self-justification. We don't like thinking of ourselves as ignorant or ill-intentioned, so to avoid this dissonance, we try to convince ourselves and others that we are doing the right thing. We may justify to protect our high self-esteem or even our low self-esteem, if that is our default state that we are reluctant to leave.

Justification of incorrect beliefs or forbidden actions is easy when it is done incrementally, what we often call a "slippery slope". (The famous Milgram experiment in which college students were willing to electrocute other research subjects was an example of such incremental self-justification, because if the student can justify 50 volts then he can eventually justify 450 volts.) Depending on which way we first lean from the top of the pyramid, we can land at different sides of the pyramid, because once we start on a course of action we tend to continue justifying our actions in the same direction. As we self-justify and confabulate, we may develop false memories of things like having been abducted by aliens, molested as a child, imprisoned in a concentration camp or kept in an orphanage. We may unfairly persecute or wrongfully convict others.

Children under five have trouble differentiating between things they have heard and things they have actually experienced; in adulthood, we tend to forget details as years go by, so we wind up with a related problem of being unable to distinguish reality from our fantasized or chosen narratives. This is most apparent when comparing relationship narratives between happy couples and divorcing couples.

Under other circumstances, in a compressed time frame of interrogation, but according to a similar mental process, some people confess to crimes they did not commit.

Introspection, rather than fixing the problem, unfortunately often triggers even more self-justification. We all have blind spots, prejudices, and a tendency to prefer "us" over "them," but it's difficult for us to see our own limitations. Assuming we are reasonable by nature, we sometimes forgo the scientific method or engage in a biased version of it and assume that our thoughts must be reasonable because we, not someone else, thought them. We say "that's the way I am" to excuse our own behavior and we say "that's the way they are" to condemn others' behavior.
5 stars · Dec 24, 2008

This was by far the best book I have read in quite a few years. Highly recommended. It was so informative and engaging that I think I wore out my welcome reading it out loud to anyone who was nearby.

Written by two social psychologists and based on years of research, it provides a fascinating overview of cognitive dissonance, and how it applies to prejudice, memory, law, marriage, and war. The most chilling aspect of the book is that it points out that we all are subject to dealing with dissonance (usually in self-justifying ways), that what we think we know or remember is probably not the case, regardless of which side we're on, and that most of our leaders and public figures shirk responsibility for mistakes.

Some highlights:
- Reasoning areas of the brain "virtually shut down" when we are confronted with dissonant information, and emotion circuits light up when consonance is restored. Basically, this shows that there is a neurological basis for the fact that once we make up our minds, it is pretty hard to change them.
- "Naïve realism" - the "inescapable conviction" that we all have, that we see things as they really are. If someone has a different opinion they obviously aren't seeing things clearly.
- Being "absolutely, positively sure" a memory is correct doesn't mean it is. We can even have vivid false memories full of emotion and detail. For example, people can recover memories of abuse, which is shown to be dubious (notably the authors mention how Martha Nibley Beck, Hugh Nibley's daughter, created memories of abuse by her father that she was convinced of). Some people even experience alien abduction without it actually happening. Basically, we can have experiences that we think are real, especially in the past, yet they never happened. Without some outside confirming source, we cannot trust our memories too much.
- "Parent blaming" - a convenient form of self-justification; it allows people to live with regrets or mistakes because all the mistakes were made "by them."
- Both Bill Clinton and George W. Bush have been guilty of self-justification and failure to admit their mistakes. In fact, the last president to clearly admit to a major mistake was John F. Kennedy. Really, are we convinced that no president since then has messed up? What was really interesting is that the two presidents to use the phrase "mistakes were made" the most were none other than Richard Nixon (of course) and, wait for it, the beloved Ronald Reagan. What is so insidious about the phrase (which Clinton even joked about using so much) is that it is a complete avoidance of responsibility.
- Finally, resolving dissonance is not completely bad, and does serve to preserve our beliefs, confidence, and self-esteem. However, it also gets us into trouble. Hence, the authors suggest that it is possible to remain committed to a religion, political party, or partner, yet understand that "it is not disloyal to disagree with actions or policies" that one believes are inappropriate, misguided, or immoral.
5 stars · Jun 23, 2007

I've been a longtime fan of both authors (especially Tavris), so my expectations were pleasantly met. Most of it, of course, is hammering away at how the fundamental attribution error influences relationships between couples, coworkers, or nations. They reframe the psychobabble, with "self-justification" as the root of these conflicts and ongoing interpersonal difficulties. Their citations of clinical works also bring up the interesting possibility that mindfulness-based interventions may be most effective when they undermine self-justification, inserting some space between events and affect, but that's fodder for a much longer post.
4 stars · Nov 29, 2016

Rather than actually write a review three years on, I will refer you to my colleague Susan Stepney's first-rate review: https://www-users.cs.york.ac.uk/susan...
"No-one is a monster in their own view, yet people do monstrous things. At a less extreme level, people do petty and mean things too. Why?

The thesis of this book is that we rewrite our memories to overcome cognitive dissonance. How can we have done a bad thing, if we are good people?"

The best review I saw here is by Trevor: https://www.goodreads.com/review/show...
"... a deeply troubling book. Parts of this book felt like a mirror had been held up to me and I have to say that I really didn’t like what I saw. But this is a very important book and one that demands to be read. I recommend it without hesitation." ...more
4 stars · May 23, 2008

Mistakes Were Made is a tour through the different ways in which cognitive dissonance motivates otherwise normal, good people to do wretched things. Making such stops as the tragedies of recovering so-called repressed memories, the unfortunate bias of the parts of the legal system which are immune to criticism, and growing disparities of perception between perpetrators and victims, Mistakes Were Made also highlights many other scientific and psychological tidbits. Carol Tavris and Elliot Aronson weave a slowly accelerating narrative of the power of cognitive dissonance in our lives, one that grows ever closer to home. At the end of each chapter and at the end of the text, the authors provide examples of people who chose the better path. As for an explanation of how to do so, it is lacking, but in my own reading I oft thought about my own dissonances and my own mistakes. Mistakes Were Made highlights the biases we have toward creating delusional or unproductive stories to guide our lives. It is up to us to create the more accurate stories.

"When a friend makes a mistake, the friend remains a friend, and the mistake remains a mistake." — Israeli Prime Minister Shimon Peres ...more
4 stars · Apr 07, 2008

A highly engaging discussion on how people use self-justification to avoid admitting they've made a mistake or hurt someone or otherwise deal with the "cognitive dissonance" we encounter when one of our cherished beliefs runs aground on the rock of cold, hard reality. The one quibble I would have is the division the authors make of the world into "perpetrators" and "victims" -- a language that masks the real complexity of certain relationships and interactions in which both parties are one and the same and at the same time neither -- which is when the conflicts become really intractable and the self-justifications that much harder to see through and to walk away from. Nonetheless, it is eye-opening and provides a useful tool for self-diagnosis, even if the suggestions for how to deal with someone else suffering from "cognitive dissonance" and "confirmation bias" are a little light (not that I wouldn't have minded having read that last chapter about a week earlier, considering some recent goings-on here and there).
3 stars · Jan 02, 2012

A bit uneven and towards the end a bit too Oprah-centric. Felt like the book drifted from a scientific/psychological work to a clinical/self-help piece (a rational, scientifically grounded self-help book, but still one regardless). It was interesting, but sadly disappointing too.
5 stars · Jan 06, 2014

This book attempts to explain and provide an answer to the question, "how do you sleep at night?" Despite everything we do—even when it is at odds with our beliefs—cognitive dissonance allows us to say, "very well, thank you." As the authors write, "without self-justification, we might be left standing emotionally naked, unprotected, in a pool of regrets and losses." Although cognitive dissonance allows us to tell ourselves that we're decent human beings, it can lead to great contempt and embarrassment that prevents the "truth" from ever setting us free.

The book does a great job of explaining cognitive dissonance—the tension between the beliefs one holds and the actions one takes—and how we self-justify to vindicate our behaviour. The authors refer to a "pyramid of choice" to explain how dissonance comes not from abrupt decisions, but after many choices, in which our actions slowly shift away from our beliefs. Unlike a lie, which is something we (1) know is wrong and (2) tell to others, self-justification is something we tell ourselves, and we may not even know it is taking place. To an outsider, our actions may appear absurd, but to us they might seem completely rational—the "right" choice.

The book goes through myriad examples to explain how and why self-justification and cognitive dissonance exist and persist. From relationships to interrogations, the examples in this book can be directly related to your life experiences (though hopefully you haven't experienced too many interrogations). There's even some humour thrown in: the authors tell the story of a man (Criner) who was wrongly convicted of rape, and of the prosecutor (McDougal) who could not admit the error of the wrongful conviction. They write, "technically, of course, McDougal is right; Criner could have raped the woman in Texas and ejaculated somewhere else—Arkansas, perhaps."

I have read of cognitive dissonance in previous studies of social psychology, but never in as fulfilling a book as Tavris and Aronson's. There was an excellent balance of theory and example, which made it a book I didn't want to put down. It seemed like every five pages had an "Aha!" moment that I could apply to my own life: the closed loop of confirmation bias; the pain of being the victim but the indifference of being the perpetrator; and the increased need to justify our actions to ourselves as we inflict more pain on others (or as we invest more in a project that is criticized by others).

As with (hopefully) any book, there were a few questions I had along the way as I read. First, what good can be served by cognitive dissonance? As I mentioned at the outset, it allows us to sleep at night. I'm sure all of us have done things in life we're not proud of; can you imagine if you reminded yourself every day how terrible you are? Sometimes, we need to lie to ourselves, but when is it easier/better to admit we're wrong than to take the emotional toll of lying to ourselves and others?

Second, if we tend to support our previous decisions by telling ourselves they were the right ones, why do we so frequently experience what I can only call "the grass is greener on the other side" thoughts? I wish the book had addressed that idea.

Third, how can this book be applied to matters that are seemingly intractable, such as abortion? When the two sides don't even agree over what the issue is (time of conception for pro-life; women's rights for pro-choice), it becomes very difficult, if not impossible, to compromise. And how do you "compromise" over such an important issue? This, along with the existence of God, is something I cannot imagine will ever be solved, but cognitive dissonance and confirmation bias make it that much more difficult.

Near the end of the book, I started thinking of the concept of karma. Here is an idea based in Eastern religion that I can only imagine focuses on the self—"if I do this, I will somehow be affected in the future." Yet somehow, karma has become something we assign to others—"He'll get what's coming to him." In a way, we seem to justify our responses to others by what they've done: if he wronged me, it's simply karma that I'm doing harm back to him. The authors covered this very well in their discussion of the conflict between Muslims and Christians. Who started it, and when? Each side has drifted so far down (opposing sides of) the pyramid that they aren't even using the same rule book anymore.

At one point, the authors discussed how people sometimes become so entrenched in their position that the position itself becomes secondary. Consider the Israeli demands for peace in the Middle East. When presented to Israelis, they are largely supported. However, when those same demands are presented to Israelis but tabled as if they were Palestinian demands, they are largely rejected by Israelis. This reminds me a lot of politics in North America: it doesn't seem to matter what is said anymore (the ideas of most politicians all tend to converge toward the center anyway), but who said it. An idea is heralded by Republicans when it comes from Bush; the same idea is denounced as near-treason when it comes from Obama. Liberals fully support an idea from Trudeau but reject it from Harper. We need to recognize our cognitive dissonance and listen to the message.

If there's only one thing I'll take from this book, it's that we need to be aware of dissonance, expect it, and try to understand it—especially within ourselves—if we have any chance of overcoming it. It will destroy relationships, companies, and lives, and although it certainly has its place, it does more harm than good.
5

Feb 20, 2018

I was (uncomfortably) shocked to the core reading this; others attest to the same. Particularly disturbing are the passages on the judiciary. None of it sits comfortably on any level, and I found myself 'guilty' on most charges. I'm currently working on ways to mitigate the self-justification effect in myself, but it's so hard. Especially when you don't realise you're doing it.
3

Oct 19, 2017

"A tree Is Known By its Fruit " , so as this book
The title of the book was precisely picked .It was such a great experience and trip for me to read such a wonderful and well written socio psychological book. The Author really did a great job to the extent that every page was a new experience for me .

-Through this book You will understand "why some people try hard to justify their own mistakes and not to admit them"
-as the author wrote :
“In the horrifying calculus of self-deception, the "A tree Is Known By its Fruit " , so as this book
The title of the book was precisely picked .It was such a great experience and trip for me to read such a wonderful and well written socio psychological book. The Author really did a great job to the extent that every page was a new experience for me .

-Through this book You will understand "why some people try hard to justify their own mistakes and not to admit them"
-as the author wrote :
“In the horrifying calculus of self-deception, the greater the pain we inflict on others, the greater the need to justify it to maintain our feelings of decency and self-worth.”

-And one of the best parts I admired in this book is the following passage:
“This habit starts awfully early. Social psychologist Marilynn Brewer, who has been studying the nature of stereotypes for many years, once reported that her daughter returned from kindergarten complaining that “boys are crybabies.” The child’s evidence was that she had seen two boys crying on their first day away from home. Brewer, ever the scientist, asked whether there hadn’t also been little girls who cried. “Oh yes,” said her daughter. “But only some girls cry. I didn’t cry.” Brewer’s little girl was already dividing the world, as everyone does, into us and them. Us is the most fundamental social category in the brain’s organizing system, and it’s hardwired.”
5

Jun 15, 2015

I have dubbed this book, 'The Analytical Sledgehammer.'

Mistakes Were Made has become one of my favorite books of all time. It should be required reading for every human being. At its heart, this book examines everything humans believe about their own selves and the world at large. How have we come to believe what we do about ourselves, the people we love, & those we punish? Where did our ideas of fairness come from? Why is it so hard to admit fault? What does it all mean on a personal and societal level?

This book will appeal to anyone with a human brain. The studies used in the book are sound and the authors take a wonderfully critical approach to everything they present. If you are capable of even the tiniest bit of self-reflection, this book will delight you in ways you never imagined. Each page will force you to ask if you really know yourself at all.

You might have read books about heuristics, but this book is more accessible than Kahneman's (though Kahneman will give you a more thorough education about various heuristics) and is more entertaining and empirically sound than You Are Not So Smart by McRaney.

A+
5

Oct 09, 2011

Cognitive dissonance is a topic everyone should look into, but people placed in positions of leadership or responsibility would really benefit from a study on the matter. I enjoyed the examples presented in this book and related to a few of them as well, which helped me really understand the concept better. The main issue I had with this book was its diminishing marginal return: the more I read, the less I got out of it. Once the concept of cognitive dissonance is explained (very well, too), the additional prose is just more in-depth examples and case studies. These examples are actually pretty interesting in their own right and kept me engaged in the book, but did not introduce any new concepts. Highly recommended for people interested in human psychology, people in positions of power, and anyone who's ever made a mistake. I listened to this book on audiobook.
