Self-deception
From Wikipedia, the free encyclopedia
Definitional problems

A consensus on the identification of self-deception remains elusive among contemporary philosophers, a result of the term's paradoxical elements and ambiguous paradigmatic cases. Self-deception also incorporates numerous dimensions, such as epistemology, psychological and intellectual processes, social contexts, and morality. As a result, the term is highly debated and occasionally argued to be an impossible phenomenon.

Theorization

Analysis

The traditional paradigm of self-deception is modeled on interpersonal deception, as described by the Stanford Encyclopedia of Philosophy: A intentionally gets B to believe some proposition p, all the while knowing or truly believing ~p. Such deception is intentional and requires the deceiver to know or believe ~p and the deceived to believe p. On this traditional model, self-deceivers must (1) hold contradictory beliefs and (2) intentionally get themselves to hold a belief they know or truly believe to be false.

The process of rationalization, however, can obscure the intent of self-deception. Brian McLaughlin argues that such rationalization in certain circumstances permits the phenomenon: when a person who disbelieves p intentionally tries to make himself believe, or continue believing, p, and as a result unintentionally misleads himself into believing or continuing to believe p via biased thinking, he deceives himself in a way appropriate for self-deception. No deceitful intention is required for this.

Psychology

Self-deception calls into question the nature of the individual, specifically in a psychological context and with respect to the nature of the "self". Irrationality is the foundation on which the argued paradoxes of self-deception rest, and it has been argued that not everyone has the "special talents" and capacities for self-deception. Rationalization is influenced by a myriad of factors, including socialization, personal biases, fear, and cognitive repression, and it can be manipulated in both positive and negative fashions: convincing a person to perceive a negative situation optimistically, and vice versa. Rationalization alone, however, cannot fully account for the dynamics of self-deception, since reason is only one adaptive form that mental processes can take.

Paradoxes of self-deception

The work of philosopher Alfred R. Mele has provided insight into some of the more prominent paradoxes regarding self-deception. Two of these concern the self-deceiver's state of mind and the dynamics of self-deception, termed the "static" paradox and the "dynamic/strategic" paradox, respectively.

Mele formulates the "static" paradox as follows: If ever a person A deceives a person B into believing that something, p, is true, A knows or truly believes that p is false while causing B to believe that p is true. So when A deceives A (i.e., himself) into believing that p is true, he knows or truly believes that p is false while causing himself to believe that p is true. Thus, A must simultaneously believe that p is false and believe that p is true. But how is this possible?

Mele then describes the "dynamic/strategic" paradox: In general, A cannot successfully employ a deceptive strategy against B if B knows A's intention and plan. This seems plausible as well when A and B are the same person. A potential self-deceiver's knowledge of his intention and strategy would seem typically to render them ineffective.
On the other hand, the suggestion that self-deceivers typically execute their self-deceptive strategies successfully without knowing what they are up to may seem absurd, for an agent's effective execution of his plans seems generally to depend on his cognizance of them and their goals. So how, in general, can an agent deceive himself by employing a self-deceptive strategy?

These models call into question how one can simultaneously hold contradictory beliefs (the "static" paradox) and deceive oneself without rendering one's own intentions ineffective (the "dynamic/strategic" paradox). Attempts at resolving these paradoxes have produced two schools of thought: one that maintains that paradigmatic cases of self-deception are intentional, and one that denies this, the Intentionalists and the Non-Intentionalists, respectively.

Intentionalists tend to agree that self-deception is intentional but divide over whether it requires the holding of contradictory beliefs. This school of thought incorporates elements of temporal partitioning (the deception is extended over time to benefit the self-deceiver, increasing the chance of forgetting the deception altogether) and psychological partitioning (incorporating various aspects of the "self").

Non-Intentionalists, in contrast, tend to believe that cases of self-deception are not necessarily accidental, but are motivated by desire, anxiety, or some other emotion regarding p or related to p. This notion distinguishes self-deception from misunderstanding. Furthermore, "wishful thinking" is distinguished from self-deception in that self-deceivers recognize evidence against their self-deceptive belief, or possess, without recognizing it, greater counterevidence than wishful thinkers do.

Numerous questions and debates continue regarding the paradoxes of self-deception, and a consensual paradigm remains out of reach.

Trivers' theory of self-deception

It has been theorized that humans are susceptible to self-deception because most people have emotional attachments to beliefs, which in some cases may be irrational. Some evolutionary biologists, such as Robert Trivers, have suggested that deception plays a significant part in human behavior, and in animal behavior more generally. One deceives oneself into trusting something that is not true so as to better convince others of that truth: a person who has convinced himself or herself of the untrue thing better masks the signs of deception.

This notion rests on the following logic. Deception is a fundamental aspect of communication in nature, both between and within species, and it has evolved so that one organism can gain an advantage over another. From alarm calls to mimicry, animals use deception to further their survival, and those better able to perceive deception are more likely to survive. Self-deception, on this view, evolved to better mask deception from those who perceive it well, as Trivers puts it: "Hiding the truth from yourself to hide it more deeply from others."

In humans, awareness that one is acting deceptively often produces tell-tale signs of deception, such as flaring nostrils, clammy skin, changes in the quality and tone of voice, eye movement, or excessive blinking. Therefore, if self-deception enables someone to believe his or her own distortions, these signs do not appear, and the person appears to be telling the truth.

Self-deception can be used to appear either greater or lesser than one actually is; for example, one can act overconfident to attract a mate or underconfident to avoid a predator or threat.
If a person can conceal his or her true feelings and intentions well, he or she is more likely to deceive others successfully. It may also be argued that the ability to deceive, or to self-deceive, is not the selected trait itself but a by-product of a more primary trait: abstract thinking. Abstract thinking confers many evolutionary advantages, such as more flexible, adaptive behavior and innovation. Since a lie is an abstraction, the mental process of creating a lie can only occur in animals with enough brain complexity to permit abstract thinking. Self-deception also lowers cognitive cost: it is less demanding to behave or think in a manner that implies something is true if one has convinced oneself that that very thing is indeed true. The mind need not constantly track both the true thing and the false thing; it simply convinces itself that the false thing is true.

Evolutionary implications of Trivers' theory of self-deception

Because there is deceit, there is strong selection to recognize when deception occurs. As a result, an innate ability to self-deceive is argued to have evolved so as to better hide the signs of deception from others: humans deceive themselves in order to better deceive others and thus gain an advantage over them. In the three decades since Trivers introduced his adaptive theory of self-deception, there has been considerable controversy over whether such behavior has a genetic basis, and the debate continues.

The explanation of deception and self-deception as innate characteristics may be true, but there are many other possible explanations for this pattern of behavior. It is possible that the ability to self-deceive is not innate but a learned trait, acquired through experience. For example, a person could have been caught being deceitful by revealing knowledge of information he or she was trying to hide: flared nostrils betrayed the lie, and the person did not get what he or she wanted. The next time, to better achieve success, the person more actively deceives himself or herself about possessing that knowledge, the better to hide the signs of deception. People could therefore have the capacity to learn self-deception.
Though the term is difficult to define, examples of self-deception are abundant in varying degrees. Simple instances include common occurrences such as the alcoholic who is self-deceived in believing that his drinking is under control, the husband who is self-deceived in believing that his wife is not having an affair, the jealous colleague who is self-deceived in believing that her colleague's greater professional success is due to ruthless ambition, and the AIDS victim who is self-deceived in believing that he has a fifty-fifty chance of surviving the disease.
The implications of self-deception increase in gravity when conducted on a political scale. Both policymakers and heads of state are capable of self-deception, which can affect both internal and foreign affairs. The 2003 US invasion of Iraq, based, in part, on the mistaken belief that Saddam Hussein harbored weapons of mass destruction, serves as a prominent—and controversial—case of self-deception that is still being examined.
In the same invasion, former Iraqi Information Minister Mohammed Saeed al-Sahhaf provided another well-known instance of self-deception. On April 7, 2003, al-Sahhaf claimed that there were no American troops in Baghdad and that the Americans were committing suicide by the hundreds at the city's gates. At that time, American tanks were patrolling the streets only a few hundred meters from the location where the press conference was held. Despite empirical evidence contradicting these claims, al-Sahhaf insisted that his reports came from "reliable sources".
Further examples can be found in various despotic countries around the world. Self-deception is typically evident in such environments, nourished by a reluctance among both citizens and members of the regime to criticize the leadership. This notion is personified by Kim Jong-il, de facto leader of the Democratic People's Republic of Korea (North Korea), through self-proclamations (such as his messianic title "Dear Father") and his entourage of primarily Korean War veterans. With heavy restrictions placed on free speech, and Kim Jong-il's noted intolerance for criticism, self-deception about the state of internal and external affairs can grow unchecked by advisory counterevidence.
Books
- Leadership and Self-Deception, by Arbinger Institute. Discusses self-deception and its implications for leaders in personal and public life. ISBN 978-1576759776
- The Anatomy of Peace: Resolving the Heart of Conflict, by Arbinger Institute. ISBN 978-1576753347
- McLaughlin, Brian P. & Amélie Oksenberg Rorty (eds.) (1988). Perspectives on Self-Deception. Berkeley: University of California Press.
Journals
- Teorema, Vol. XXVI/3, monographic issue on Self-Deception: Conceptual Issues, Autumn 2007.
- Behavioral and Brain Sciences, Vol. 20 (1), 1997.
- Philosophical Psychology, Vol. 20 (3), 2007.