Thursday, December 22, 2016

Denialism and Cognitive Dissonance


There is nothing worse than being lied to while being unable to do anything about it. Watching a liar get away with lying and deception hurts so much more when so many believe the liar. Despite the facts proving the person is lying, the liar is not punished. All you can do is feel beaten, weak and alone in an upside-down world where the lines between reality and fantasy, fact and falsehood, truth and lies no longer exist or mean anything. What do you do now? Do you give up and give in to the non-reality? Or do you continue to fight falsehood with the truth?

All of my life I have noticed humans' tendency to lean on, and in many cases depend on, lies and fantasy. For example, the fact that many believe in ghosts and beings they have never observed is proof positive.
If I even bring up the fact that so many hold religious beliefs that would not survive being put to the test against historical facts, mathematical scrutiny and the laws of physics, I would cause many to question my sanity, some would want to nail me to a cross, and many of you might not read any further.

My point here is to show how many of us can be led to believe just about anything, regardless of whether it is a lie or not. In fact, many will not even check to see if much of what they have been told most of their lives is true. No research is done when a person's beliefs are challenged. They will argue and fight about a point they have no true understanding of, but they will do no research. Why? Simply put, FEAR!!!! Denialism!!! In the psychology of human behavior, denialism is a person's choice to deny reality as a way to avoid a psychologically uncomfortable truth. Denialism is an essentially irrational action that withholds the validation of a historical experience or event by refusing to accept an empirically verifiable reality.

Sometimes we make our beliefs a part of ourselves; we identify with our political party, our religion and a million other beliefs. People don't take well to what they perceive as a personal attack, and they consider an attack on their core beliefs to be the same as an attack on them. This causes people to ignore opposing evidence and search for facts confirming their beliefs. This is not to say that these are stupid people; it is simply human nature to confirm our beliefs and fool ourselves.

In psychology, cognitive dissonance is the mental stress or discomfort experienced by an individual who holds two or more contradictory beliefs, ideas, or values at the same time; performs an action that is contradictory to their beliefs, ideas, or values; or is confronted by new information that conflicts with existing beliefs, ideas or values.[Festinger, L. (1957). A Theory of Cognitive Dissonance. California: Stanford University Press.][Festinger, L. (1962). "Cognitive dissonance". Scientific American. 207 (4): 93–107. doi:10.1038/scientificamerican1062-93]


Leon Festinger's theory of cognitive dissonance focuses on how humans strive for internal consistency. An individual who experiences inconsistency tends to become psychologically uncomfortable, and is motivated to try to reduce this dissonance, as well as actively avoid situations and information likely to increase it.
Reducing dissonance
Cognitive dissonance theory is founded on the assumption that individuals seek consistency between their expectations and their reality. Because of this, people engage in a process called "dissonance reduction" to bring their cognitions and actions in line with one another. This creation of uniformity allows for a lessening of psychological tension and distress. According to Festinger, dissonance reduction can be achieved in four ways. In an example case where a person has adopted the attitude that they will no longer eat high fat food, but eats a high-fat doughnut, the four methods of reduction are:

1) Change behavior or cognition ("I will not eat any more of this doughnut")

2) Justify behavior or cognition by changing the conflicting cognition ("I'm allowed to cheat every once in a while")

3) Justify behavior or cognition by adding new cognitions ("I'll spend 30 extra minutes at the gym to work this off")

4) Ignore or deny any information that conflicts with existing beliefs ("This doughnut is not high in fat")


Categorization is used by humans to simplify the world around them. Categorization usually happens along the most noticeable or basic categories: race, gender, and age. Once these groups are identified, a set of schemas involving attitudes toward that group (stereotypes) comes to mind as well. These attitudes can also involve negative emotional feelings toward the labeled group (prejudices), or fixed, overgeneralized views held about it.

Jonathan Haidt is a social psychologist who wrote an interesting book, The Righteous Mind: Why Good People Are Divided by Politics and Religion. In it, he describes the things that people believe are their greatest moral priorities. The six categories are Care/Harm, Fairness/Cheating, Loyalty/Betrayal, Authority/Subversion, Sanctity/Degradation, and Liberty/Oppression. For example, liberals may think that Fairness is the most important thing to them morally, and conservatives may think that Loyalty is the most important thing to them morally. It is an interesting idea, and it comes into play with the Chris Kyle story. The one thing that does not appear on the Moral Foundations list is…The Truth (honesty).


If The Truth were an option for moral priorities, it would not come in first for either liberals or conservatives. Try having a discussion with a liberal about Obama or race, for instance, and you will quickly find out that The Truth comes in a very distant third to fairness and care. Conservatives, at least in my experience, put both authority and loyalty above The Truth. I spoke with a conservative friend of mine recently and he talked about wanting to talk in public about some semblance of The Truth, but in the next breath he said he could "never bad mouth his country". This sort of thinking and struggle is all too common: people have an interest in The Truth, just not when The Truth conflicts with another, more deeply held belief, and most certainly not when The Truth can make them either uncomfortable or unpopular, which it often can. People will do all sorts of logical and moral gymnastics to maintain their belief system and world view and to keep The Truth at arm's length.
~from the article "THE CURIOUS CASE OF CHRIS KYLE: American Hero or Liar?"

Theory and research

Most of the research on cognitive dissonance takes the form of one of four major paradigms. Important research generated by the theory has been concerned with the consequences of exposure to information inconsistent with a prior belief, what happens after individuals act in ways that are inconsistent with their prior attitudes, what happens after individuals make decisions, and the effects of effort expenditure. A key tenet of cognitive dissonance theory is that those who have heavily invested in a position may, when confronted with disconfirming evidence, go to greater lengths to justify their position.

Belief disconfirmation paradigm

Dissonance is felt when people are confronted with information that is inconsistent with their beliefs. If the dissonance is not reduced by changing one's belief, the dissonance can result in restoring consonance through misperception, rejection or refutation of the information, seeking support from others who share the beliefs, and attempting to persuade others.

An early version of cognitive dissonance theory appeared in Leon Festinger's 1956 book When Prophecy Fails. This book gives an account of the deepening of cult members' faith following the failure of a cult's prophecy that a UFO landing was imminent. The believers met at a predetermined place and time, believing they alone would survive the Earth's destruction. The appointed time came and passed without incident. They faced acute cognitive dissonance: had they been the victims of a hoax? Had they donated their worldly possessions in vain? Most members chose to believe something less dissonant to resolve reality not meeting their expectations: they believed that the aliens had given Earth a second chance, and the group was now empowered to spread the word that Earth-spoiling must stop. The group dramatically increased their proselytism despite, or rather because of, the failed prophecy.

Another example of the belief disconfirmation paradigm is an orthodox Jewish group which believed their Rebbe might be the Messiah. When the Rebbe died of a stroke in 1994, instead of accepting that he was not the Messiah, some of them concluded that he was still the Messiah but would soon be resurrected from the dead. Some have suggested the same process might explain the belief two thousand years ago that Jesus was resurrected from the dead.
Induced-compliance paradigm
See also: Forced compliance theory
In Festinger and Carlsmith's classic 1959 experiment, students were asked to spend an hour on boring and tedious tasks (e.g., turning pegs a quarter turn, over and over again). The tasks were designed to generate a strong, negative attitude. Once the subjects had done this, the experimenters asked some of them to do a simple favour. They were asked to talk to another subject (actually an actor) and persuade the impostor that the tasks were interesting and engaging. Some participants were paid $20 (equivalent to $163 in present-day terms) for this favour, another group was paid $1 (equivalent to $8 in present-day terms), and a control group was not asked to perform the favour.

When asked to rate the boring tasks at the conclusion of the study (not in the presence of the other "subject"), those in the $1 group rated them more positively than those in the $20 and control groups. This was explained by Festinger and Carlsmith as evidence for cognitive dissonance. The researchers theorized that people experienced dissonance between the conflicting cognitions, "I told someone that the task was interesting", and "I actually found it boring." When paid only $1, students were forced to internalize the attitude they were induced to express, because they had no other justification. Those in the $20 condition, however, had an obvious external justification for their behaviour, and thus experienced less dissonance.

In subsequent experiments, an alternative method of inducing dissonance has become common. In this research, experimenters use counter-attitudinal essay-writing, in which people are paid varying amounts of money (e.g., $1 or $10) for writing essays expressing opinions contrary to their own. People paid only a small amount of money have less external justification for their inconsistency, and must produce internal justification to reduce the high degree of dissonance they experience.

A variant of the induced-compliance paradigm is the forbidden toy paradigm. An experiment by Aronson and Carlsmith in 1963 examined self-justification in children. In this experiment, children were left in a room with a variety of toys, including a highly desirable toy steam-shovel (or other toy). Upon leaving the room, the experimenter told half the children that there would be a severe punishment if they played with that particular toy and told the other half that there would be a mild punishment. All of the children in the study refrained from playing with the toy. Later, when the children were told that they could freely play with whatever toy they wanted, the ones in the mild punishment condition were less likely to play with the toy, even though the threat had been removed. The children who were only mildly threatened had to justify to themselves why they did not play with the toy. The degree of punishment by itself was not strong enough—so, to resolve their dissonance, the children had to convince themselves that the toy was not worth playing with.

A 2012 study using a version of the forbidden toy paradigm showed that hearing music reduces the development of cognitive dissonance. With no music playing in the background, the control group of four-year-old children were told to avoid playing with a particular toy. After playing alone, the children later devalued the forbidden toy in their rankings, consistent with the findings of earlier studies. However, in the variable group, classical music was played in the background while the children played alone. In that group, the children did not later devalue the toy. The researchers concluded that music may inhibit cognitions that result in dissonance reduction.


Music is not the only example of an outside force lessening post-decisional dissonance; a 2010 study showed that hand-washing had a similar effect.

Free-choice paradigm
In a different type of experiment conducted by Jack Brehm, 225 female students rated a series of common appliances and were then allowed to choose one of two appliances to take home as a gift. A second round of ratings showed that the participants increased their ratings of the item they chose, and lowered their ratings of the rejected item.

This can be explained in terms of cognitive dissonance. When making a difficult decision, there are always aspects of the rejected choice that one finds appealing and these features are dissonant with choosing something else. In other words, the cognition, "I chose X" is dissonant with the cognition, "There are some things I like about Y." More recent research has found similar results in four-year-old children and capuchin monkeys.

In addition to internal deliberations, the structuring of decisions among other individuals may play a role in how an individual acts. Researchers in a 2013 study examined social preferences and norms as related, in a linear manner, to wage giving among three individuals. The first participant's actions influenced the second participant's wage giving. The researchers argue that inequity aversion is the paramount concern of the participants.

Effort justification paradigm
Further information: Effort justification
Dissonance is aroused whenever individuals voluntarily engage in an unpleasant activity to achieve some desired goal, and dissonance can be reduced by exaggerating the desirability of the goal. Aronson & Mills had individuals undergo an embarrassing "initiation" to join a discussion group. One group was asked to read twelve obscene words aloud; the other to read twelve words which were related to sex but not obscene. Both groups were then given headphones to listen in on a pre-recorded discussion "designed to be as dull and banal as possible" about the sexual behavior of animals. Subjects were told that the discussion was occurring in the next room. The individuals whose initiation required obscene words evaluated the group as more interesting than the individuals in the mild-initiation condition.

Effort justification is related to the idea of a sunk cost.


Washing one's hands has been shown to eliminate post-decisional dissonance, presumably because the dissonance is commonly caused by moral disgust (with oneself), which is related to disgust from unsanitary conditions.

Examples
"The Fox and the Grapes"

A classic illustration of cognitive dissonance is expressed in the fable "The Fox and the Grapes" by Aesop (ca. 620–564 BCE). In the story, a fox sees some high-hanging grapes and wishes to eat them. When the fox is unable to think of a way to reach them, he decides that the grapes are probably not worth eating, with the justification that the grapes probably are not ripe or that they are sour (hence the common phrase "sour grapes"). The moral that accompanies the story is "Any fool can despise what he cannot get". This example follows a pattern: one desires something, finds it unattainable, and reduces one's dissonance by criticizing it. Jon Elster calls this pattern "adaptive preference formation".
"The Fox and the Grapes" by Aesop. When the fox fails to reach the grapes, he decides he does not want them after all. Rationalization is often involved in reducing anxiety about conflicting cognitions, according to cognitive dissonance theory.
Other related phenomena
Cognitive dissonance has also been demonstrated to occur when people seek to:

1) Explain inexplicable feelings: When a disaster occurs in a community, irrationally fearful rumors spread in nearby communities not involved in the disaster because of the need of those who are not threatened to justify their anxieties.

2) Minimize regret of irrevocable choices: Bettors at a racetrack are more confident in their chosen horse just after placing the bet because they cannot change it (the bettors felt "post-decision dissonance").

3) Justify behavior that opposed their views: Students judge cheating less harshly after being induced to cheat on a test.

4) Align one's perceptions of a person with one's behaviour toward that person: the Ben Franklin effect refers to that statesman's observation that the act of performing a favour for a rival leads to increased positive feelings toward that individual.

5) Reaffirm already held beliefs: Congeniality bias (also referred to as confirmation bias) refers to how people read or access information that affirms their already established opinions, rather than referencing material that contradicts them. For example, a person who is politically right-leaning might only watch news commentary from conservative news sources, just as left-leaning individuals might only watch news commentary from liberal news sources. This bias is particularly apparent when someone is faced with deeply held beliefs, i.e., when a person has 'high commitment' to their attitudes.


Balance theory suggests people have a general tendency to seek consonance between their views, and the views or characteristics of others (e.g., a religious believer may feel dissonance because their partner does not have the same beliefs as he or she does, thus motivating the believer to justify or rationalize this incongruence). People may self-handicap so that any failures during an important task are easier to justify (e.g., the student who drinks the night before an important exam in response to his fear of performing poorly).

Social engineering, as applied to security, is the exploitation of social and psychological weaknesses in individuals and business structures, sometimes for penetration testing but more often for nefarious purposes such as espionage against businesses, agencies, and individuals. The typical goal is some illegal gain: useful but restricted or private information, monetary gain through methods such as phishing for bank account access, identity theft, blackmail, and so forth. Inducing cognitive dissonance in targets to create exploitable weaknesses is one of the techniques perpetrators use.

