
9 Basic Ways We Fool Ourselves Into Believing Things That Aren’t True

Posted by Jerry De Luca on Tuesday, April 17, 2018



A cognitive bias is a fallacious way of thinking – a deviation from what is rational and coherent. Since critical thinking is rarely taught in schools, almost everyone is prone to these biases to some degree. Falling for an unintentional or a well-planned lie is human nature and often leads to regrettable decisions. People create their own reality and disregard contrary information, sometimes without even realizing it. The following are just nine of the more prevalent cognitive biases – ways we fool ourselves.

Anchoring Bias – Believing and adopting as one’s own the first opinion heard on any particular issue. This happens more often today because of people’s social media “echo chamber” – only consuming information from the media outlets, bloggers and social media accounts that agree with their views. One example would be a person who knows nothing about GMOs, randomly picks up a magazine in the doctor’s office, reads an alarmist, unsubstantiated article claiming GMO food is harmful, and adopts that view without evaluating the overwhelming contrary evidence that GMOs are safe and, in many cases, necessary.

Availability Heuristic Bias – Overestimating the significance of the information one happens to have. When a person does not think critically and doesn’t take the time to become properly informed on an issue, he or she falls for this bias and often ends up with a warped and unsubstantiated view of reality. A few examples:

---After seeing news reports about people losing their jobs, you might start to believe that you are in danger of being laid off. You start lying awake in bed each night worrying that you are about to be fired.
---After seeing several television programs on shark attacks, you start to think that such incidents are relatively common. When you go on vacation, you refuse to swim in the ocean because you believe the probability of a shark attack is high.
---After reading an article about lottery winners, you start to overestimate your own likelihood of winning the jackpot (the quick arithmetic sketch after this list shows just how long those odds really are). You start spending more money than you should each week on lottery tickets.
---After seeing news stories about high-profile child abductions, you begin to believe that such tragedies are quite common. You refuse to let your child play outside by herself and never let her leave your sight.
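
To see how far off that intuition can be, here is a minimal arithmetic sketch in Python (assuming, purely for illustration, a standard “pick 6 of 49” lottery format – not a detail taken from the article):

```python
from math import comb

# Hypothetical illustration: a "pick 6 of 49" lottery.
# Matching all 6 numbers wins the jackpot, so the odds are 1 in C(49, 6).
tickets = comb(49, 6)  # 13,983,816 possible combinations

print(f"Odds of hitting the jackpot: 1 in {tickets:,}")
print(f"Probability per ticket: {1 / tickets:.8%}")  # about 0.0000072%
```

Roughly one chance in fourteen million – yet one vivid story about a winner makes the jackpot feel within reach.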


Bandwagon Effect – Choosing to believe something not because deep down you genuinely believe it, but because of social pressure from other people or groups. Standing out like a sore thumb, or standing up for one’s principles, takes courage. It is much easier to get along and follow the crowd. One example would be a person who has 4 or 5 good friends and all of them have bought into the snake-oil of naturopathic “medicine”. That person would be inclined to go along with their beliefs and adopt many of them as their own. A few other examples:

Fashions: Many people begin wearing a certain style of clothing as they see others adopt the same fashions.
Music: As more and more people begin listening to a particular song or musical group, it becomes more likely that other individuals will listen as well.
Social Networks: As increasing numbers of people start using certain online social networking websites, other individuals become more likely to begin using those sites as well. The bandwagon effect can also influence how posts are shared as well as interactions within online groups.
Diets: When it seems like everyone is adopting a certain fad diet, people become more likely to try the diet themselves.
Elections: People are more likely to vote for the candidate that they think is winning.


Availability Cascade – An alarming story hits the news on a subject most people have very little knowledge of or experience with. Because of perceived personal danger, anxiety levels rise and uninformed, irrational opinions take hold. Mass hysteria is often the result. Examples include the year 2000 millennium bug, vaccines and autism, mad cow disease, bird flu, swine flu, the Ebola virus, the Zika virus, gluten, and GMOs. A good summary is from Daniel Kahneman’s popular book Thinking, Fast and Slow:

“An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. On some occasions, a media story about a risk catches the attention of a segment of the public, which becomes aroused and worried. This emotional reaction becomes a story in itself, prompting additional coverage in the media, which in turn produces greater concern and involvement. The cycle is sometimes sped along deliberately by ‘availability entrepreneurs’, individuals or organizations who work to ensure a continuous flow of worrying news. The danger is increasingly exaggerated as the media compete for attention-grabbing headlines. Scientists and others who try to dampen the increasing fear and revulsion attract little attention, most of it hostile: anyone who claims that the danger is overstated is suspected of association with a ‘heinous cover-up’. The issue becomes politically important because it is on everyone's mind, and the response of the political system is guided by the intensity of public sentiment.”


Outcome Bias – Judging a choice or opinion solely by its apparent positive or favorable outcome. There is no attempt to honestly evaluate how one came to that conclusion. If it turned out right, then I must be right. The ends justify the means. In many cases things turned out right by pure luck. What will happen the next five times a similar decision needs to be made? Will I get lucky again? A perfect example is the placebo effect. A snake-oil peddler may sell you something, and it may actually seem to work because you believe in it strongly. The fact that you ingested a sugar pill is too uncomfortable to accept. The outcome is all that matters, even though the path taken to get there was a lie.

Overconfidence Bias – Being very smart and having a lot of success can have its downsides. A person can become so overconfident that they trust their gut instead of taking the time and effort to evaluate all the pros and cons, or to get advice from others. An example would be an entrepreneur who has so much initial success that he or she skips due diligence and soon falls flat on their face. A charismatic alternative medicine “doctor” can become overconfident due to successful results caused by the placebo effect. Soon, however, overconfidence leads to claims of healing for a wide range of health conditions for which there is no evidence. The cure for cancer has been found!

One example is charismatic health gurus like Joe Mercola and Mike Adams of Natural News. They are outspoken against vaccinations, in spite of the evidence. When they come out with an exciting new “discovery” about vaccines doing harm, their followers on social media share and repost it, without verification or corroboration. The fearless leader is blindly followed because he has achieved an almost god-like status in matters of health.

Backfire Effect – This takes place when a person is confronted with convincing evidence against a deeply cherished point of view. He has been forewarned to expect opposition, and when it comes, he can say “Aha! There it is!” His commitment to his beliefs only gets stronger, instead of a tinge of doubt being awakened. The deceiver has cleverly preprogrammed his followers to expect disagreement and discourages them from investigating and intelligently assessing the contrary claims. The deceiver also knows that everyone has a morsel or a mountain of ego, and that being wrong about something important is quite uncomfortable. This is relied on to facilitate the deception. A few examples:

“Antivaxxers distrust big pharma and think that money corrupts medicine, which leads them to believe that vaccines cause autism despite the inconvenient truth that the one and only study claiming such a link was retracted and its lead author accused of fraud. The 9/11 truthers focus on minutiae like the melting point of steel in the World Trade Center buildings that caused their collapse because they think the government lies and conducts “false flag” operations to create a New World Order. Climate deniers study tree rings, ice cores and the ppm of greenhouse gases because they are passionate about freedom, especially that of markets and industries to operate unencumbered by restrictive government regulations. Obama birthers desperately dissected the president's long-form birth certificate in search of fraud because they believe that the nation's first African-American president is a socialist bent on destroying the country.”


Empathy Gap – This is all too common and everyone is guilty of it. It’s difficult to fully understand how we would feel or act in another person’s shoes, especially if the circumstance is foreign to us. Breaking news of a monster tornado or hurricane with many casualties will elicit a different reaction from those who have been through one than from those who have never faced such overpowering fury from nature. Immigrants who fled war or poverty will, in most cases, have more empathy for the unimaginable suffering they see across the globe once they are comfortably settled in their new country. Another good example involves parenting:

“When Kristin Lagattuta and colleagues (2012) asked parents to gauge their children's emotional lives, they found evidence of a mismatch between what parents believed and what children reported about themselves.

“Parents who reported feeling lots of negative emotions were more likely to overestimate their children's distress. But most parents showed a positivity bias -- i.e., they underestimated their children's anxieties and worries -- and the effect was related to parental optimism. If parents felt good, they tended to assume that their kids felt good, too. A subsequent study replicated these findings (López-Pérez and Wilson 2015).”


The Barnum Effect – This is deviously employed by master manipulators to dupe people into believing their occult practices. General statements are made which could apply to almost anyone, but the targets are led to believe they are specifically about them and no one else. Cold reading encompasses most of this bias:

“Cold reading is a set of techniques used by mentalists, psychics, fortune-tellers, mediums, illusionists (readers), and scam artists to imply that the reader knows much more about the person than the reader actually does. Without prior knowledge, a practiced cold-reader can quickly obtain a great deal of information by analyzing the person’s body language, age, clothing or fashion, hairstyle, gender, sexual orientation, religion, race or ethnicity, level of education, manner of speech, place of origin, etc. Cold readings commonly employ high-probability guesses, quickly picking up on signals as to whether their guesses are in the right direction or not, then emphasizing and reinforcing chance connections and quickly moving on from missed guesses…

“Subtle cues such as changes in facial expression or body language can indicate whether a particular line of questioning is effective or not. Combining the techniques of cold reading with information obtained covertly (also called ‘hot reading’) can leave a strong impression that the reader knows or has access to a great deal of information about the subject. Because the majority of time during a reading is spent dwelling on the ‘hits’ the reader obtains, while the time spent recognizing ‘misses’ is minimized, the effect gives an impression that the cold reader knows far more about the subject than an ordinary stranger could.”





Related Posts

17 Simple Ways To Spot Fake News: Instructions For The Left and The Right   http://www.mybestbuddymedia.com/2018/04/17-simple-ways-to-spot-fake-news.html

Risk Perception: Six Fatal Flaws of the Anti-Vaccine Movement http://www.mybestbuddymedia.com/2018/03/risk-perception-six-fatal-flaws-of-anti.html 

Pseudo-Health: 6 More Ways Liars and Hustlers Use Confirmation Bias To Dupe The Public  http://www.mybestbuddymedia.com/2018/02/pseudo-health-6-more-ways-liars-and.html  

6 Ways Liars and Hustlers Use Confirmation Bias To Dupe The Public http://www.mybestbuddymedia.com/2018/02/6-ways-liars-and-hustlers-use.html

30 Prying and Probing Questions To Bolster Critical Thinking http://www.mybestbuddymedia.com/2016/10/30-prying-and-probing-questions-to.html

9 MORE Common Characteristics of People Who Get Duped http://www.mybestbuddymedia.com/2017/08/9-more-common-characteristics-of-people.html

9 Basic Ways Shameless Health Gurus Dupe Their Followers http://www.mybestbuddymedia.com/2018/02/9-basic-ways-shameless-health-gurus.html

Photo: http://anders.janmyr.com/2014/09/fallacies-and-biases-of-our-imperfect.html


Jerry De Luca is a Christian freelance writer who loves perusing dozens of interesting and informative publications. When he finds any useful info he summarizes it, taking the main points, and creates a (hopefully) helpful blog post.
