Which Is The Bigger Issue: Willful Ignorance or Self-Deception, and Why?

Let's Take A Look At This From Different Perspectives To Find Out.

The WHY Never Changes, And It Is Always, Without Exception, Because We Can't Handle The Truth At The Present Moment.

The First Theory Is The Most Widely Held, And It Is Described In This 2017 Article From https://www.psychologytoday.com/us.

Many of the things we believe about ourselves and our experiences turn out to be false. Sometimes this is due to innocent memory failures or to the lack of needed information. Suppose that Charles believes that he failed his biology test because the professor asked obscure and ambiguous questions. Charles believes this because he doesn’t realize that he got the lowest score out of the 100 students who took the test, and that most people did quite well. If Charles had this information, he would realize that he failed the test because he didn’t study hard enough, or because he’s not very good at biology.

On the other hand, if Charles continues to believe that the test was unfair after seeing the grade distribution, he is either severely challenged in his capacity for rational calculation or he is the perpetrator of willful ignorance. Willful ignorance occurs when individuals realize at some level of consciousness that their beliefs are probably false, or when they refuse to attend to information that would establish their falsity.

People engage in willful ignorance because it is useful. Growing up, my best friend’s parents always had classical music playing in their house. My friend’s father, who was a bit unyielding in his pronouncements and views, would say something about the music such as “that’s Mozart at his best.” His mother, who had a much better musical ear, would wait until he left the room and giggle: “It’s Brahms.” If you asked her, she would claim that her husband knew more about music than her. Deep down, though, I suspect that she knew this wasn’t true, but convincing herself to believe it served to maintain marital harmony as well as her desire to see her husband as a connoisseur of music, wine, poetry, the Yankees—you name it.

People can sometimes be pulled out of their willful ignorance with a modicum of probing, or with contradictory data. It might not be too difficult to convince the couch potato, who claims that exercise is a waste of time, that some form of activity is better than sitting in a recliner, watching TV, and eating copious amounts of junk food chased by Bud Light.

In contrast to this sort of willful ignorance, self-deception occurs when individuals believe false things with complete conviction. Consider a mother who truly believes that her sour-pussed baby is adorable. If you placed her visage-challenged baby in a lineup with cute babies, she would think that hers was the cutest. For real. As another example, consider the sports fan who “sees” the opponent commit a foul in a crucial moment of a basketball game, when in reality, it was the player on his own team, not the opponent, who committed the foul. Because this type of self-deception occurs at the perceptual level, the sports fan actually “sees” the opponent commit the foul, and no amount of replay or argumentation will convince him otherwise.

Often, it is difficult to judge whether somebody is willfully ignorant or truly deluded. A student once dragged her roommate up with her after my class and complained that there was something seriously wrong with my test. The problem was that she had studied for 10 hours and her roommate had hardly studied at all and her roommate did much better. Now, a professor in these circumstances is prohibited from stating the obvious (i.e., “your roommate is smarter than you”), and the roommate, who is probably quite aware of this, is unlikely to say it. In fact, one reason why people maintain erroneous beliefs about themselves is that their friends and relatives enable it. Brutal honesty is socially unacceptable in our culture. If the woman with the sour-pussed baby is a friend, and approaches you to ask whether her newborn baby is adorable, you will probably inhibit your first response, especially if your response is: “So, you’re telling me that the thing in that baby carriage is human?”

The difference between willful ignorance and true self-deception is subtle, but important. Willful ignorance tends to be more adaptive than self-deception. Willful ignorance is a cognitive strategy that people adopt to promote their emotional well-being, whereas self-deception is less controllable and more likely to be detrimental. Although willful ignorance and self-deception sometimes help individuals to avoid unpleasant facts, in the long run, it is usually better to confront reality than to avoid or deny it. Because the self-deceived person fully believes things that are untrue, she has fewer resources for correcting her course when her erroneous beliefs lead her astray.

Consider a highly intelligent young woman who falls in love with a man who is attractive, charming, and according to virtually everyone who knows him, a colossal jerk. The willfully ignorant woman tries to overlook the fact that the man has the intelligence of a tomato, a bad temper, and few ambitions other than to get drunk with his friends. At vulnerable moments, though, perhaps in her dreams, she suspects the truth. By contrast, the self-deceived woman truly believes that the man is intelligent despite appearances to the contrary, that he is actually quite lovable and displays a bad temper only when pushed to the limit, and that he just likes to have fun and will eventually grow out of his adolescent proclivities. Since the self-deluded woman truly believes these things, it will probably take her longer to read the warning signs and terminate the relationship.

The distinction between willful ignorance and self-deception has interesting implications for moral judgment. In general, we probably blame willfully ignorant people more for their actions and attitudes than those we suspect of self-delusion. Suppose that you are annoyed by a friend’s enthusiastic support of a political candidate who you believe is racist, sexist, and lacks any sense of human decency. If you believe that your friend is self-deceived—that he actually believes that the candidate is well-intentioned, has a master plan for the country, and harbors no racial or gender bias—then you may be willing to tell yourself a story such as that your friend is generally a good person who has fallen under some bad influences. But if you believe that your friend actually does know better—that he willfully ignores the candidate’s flaws because he tacitly approves of what you view as the candidate’s bigotry and unsavory character—then you are less likely to excuse your friend’s advocacy.

Of course, if the distinction between willful ignorance and self-deception were this tidy, philosophers would have nothing to write about and psychologists would have less to research. In closing, I will consider one qualification of the distinction I have drawn between willful ignorance and self-deception.

I have suggested that willful ignorance is under the individual’s control to a greater extent than is self-deception. In other words, people could, if sufficiently motivated, change behaviors and attitudes that stem from willful ignorance more easily than they could alter their self-deceived actions. If it is true that self-deception is outside people’s control, then it seems like we should show them some leniency in our moral evaluations.

But what if we blame them for the motives that led them to be self-deceived in the first place? What if we think that they became self-deceived because it served their interests? Consider an example of a father who hits his children because he can’t control his temper, but who truly believes (i.e., is self-deceived) that he does it to teach them a lesson. If the father’s inability to control his temper is obvious and well-known, then his belief that he is just trying to instill discipline is probably not going to garner much sympathy, whether he actually believes it or not. In this case, the father’s belief that he is teaching discipline is just a little too convenient. What this example highlights is that in some instances of moral evaluation, we pay more attention to the motives that we believe drive people’s actions than we do to their awareness of those motives.

Credit to the Author: https://www.psychologytoday.com/us/contributors/mark-alicke-phd
Source: https://www.psychologytoday.com/us/blog/why-we-blame/201709/willful-ignorance-and-self-deception

Willful Ignorance (A Thought Experiment)

Source: https://www.untrammeledmind.com/2020/05/willful-ignorance-a-thought-experiment/

From UNTRAMMELED MIND: music & musings of dan jacob wallace

Willful Ignorance vis-à-vis Belief and Identity

Imagine someone convinces you that they are you from the future. I don’t know how. Maybe they time travelled or are a computer-generated projection (even if only into your mind). The main thing is that you believe you’re talking to Future You.

Future You hands you a small piece of paper. Says, “You will one day come to believe B. And it’ll make you happier. I wish I’d believed B sooner. Here’s a list of readings that I’m sure will convince you of B. You’d have come upon and read them eventually. But now you can admire them much sooner.”

It’s impossible for you to imagine yourself believing B. You take for granted not only that B is false, but that the world, and history itself, would have to be very different than you thought in order for B to be true. You might have even long taken for granted that anyone who believes B is probably a bad, maybe even an evil, person. At any rate, you don’t see yourself as a B-believer sort of person.

You’re sure something must have gone wrong for any version of yourself to believe B. You think Future You must be mistaken about being happier in that condition. But you do take very seriously, and maybe even fear, the power of the reading list to convince you of B. Or at least to set you in the direction of believing it—of becoming Future You.

Do you seek out the sources to get it over with? Or to prove that you're not as gullible as Future You thinks you are, or not the person Future You thinks you are (or remembers themselves being)? If you seek out the sources, do you prepare yourself to withstand and refute them? Do you not seek them out and let time run its course, doubling down on readings and behaviors that confirm your current, not-B picture of the world?

(Did Future You also go through this? If not, how can this really be your future you?)

Or do you make a strong effort to avoid the readings and any talk of B altogether? In other words, do you strive to remain willfully ignorant?

Is there a general principle for what to do? Would you need to know what B is before advising someone else on what to do in this situation?

You have the reading list. What’ll you do with it?

Willful Ignorance: Background Points to Ponder

(1) Are we all disposed to believe anything if the right series of buttons is pushed? Does this mean we in fact do have something like the above reading list?

(2) Much belief is passive. Look at your surroundings and then try to believe you’re not in those surroundings. We can’t turn all of our beliefs off and on at will. (Which is why I take the height of fascism to entail the establishment and enforcement of social norms that demand we literally believe a certain set of propositions, rather than merely behave as if we believe them.) But we might have some control over what beliefs are formed and sustained, over which buttons are pushed and in what order: e.g., if you never consume the readings on the list, they won’t push whatever buttons they pushed in Future You.

(There’s no “fate” here. If you lock yourself into a sensory deprivation tank for the rest of your life, you won’t spontaneously form the belief that B is true.)

That said, some people do seem to think people are responsible—morally responsible—for their most important beliefs. I disagree but won't push it here. I imagine one's opinion on this will influence one's notions about willful ignorance.

(3) Willful ignorance is usually flung as an insult or moral failing. But what are we really accusing someone of when accusing them of willful ignorance?

(4) There’s certainly nothing wrong about willful ignorance in general. I sustain a willful ignorance of lots of things—like most of the books in every bookstore and library I’ve browsed, the details of most conspiracy theories, lots of areas of academic or intellectual study I don’t find particularly interesting, even more areas I consider to be pseudoscience or worse, and thousands of languages because I don’t have time to study them all.

I’m willfully ignorant of the utterances of many political and social provocateurs because I assume those utterances amount to mental pollution, or are at least a waste of time. I could be wrong about some of them, but it’s not feasible—not worth the time or mental health risk—for me to figure out which.

I’m willfully ignorant about anything I know I’ve forgotten but do not make an effort to remember.

(5) Willful ignorance is sometimes virtuous.

A friend once accidentally texted me a long message meant for their romantic partner. I willfully ignored its content by deleting it as soon as I noticed the mistake.

Examples are easily produced. Here are two more. Avoiding leaked sex tapes. Stopping your friends when they’re about to tell you someone else’s personal business.

(6) Wicked willful ignorance seems to involve a lot of non-ignorance. As when someone believes or suspects a proposition to be true, but would rather it not be true, or would rather be able to maintain plausible deniability, and so avoids learning facts about the proposition.

But this requires knowledge of some of the facts about the world that make the proposition true. If there’s any willful ignorance here, it’s only of those facts’ finer or explicit details.

For example, when there is explicit evidence that will settle a disagreement between two people, but one of them refuses to look at it because they believe it will prove them wrong.

Such (non-ignorant) willful ignorance tends to infuriate me, particularly when the facts are cut and dried. So whenever confronted with it, I reflect on Ted Chiang’s fantastic short story, “The Truth of Fact, the Truth of Feeling” (from his 2019 collection, Exhalation: Stories), and am reminded that such maneuvers are often beside the point; digging too hard into them can go terribly wrong.

(7) I think what we often really mean by “willful ignorance” is pretend ignorance. In other words, lying. The “willful” part, then, refers to the behavior of appearing ignorant, as well as, perhaps, to the behavior of ignoring or dismissing or avoiding facts surrounding a proposition.

In other terms, and perhaps more subtly, it seems that to accuse someone of willful ignorance amounts to a claim that they harbor a low-level, nagging voice in their head telling them what the truth is, but that voice is being disregarded on some rationale or another (or maybe on no explicit rationale). This implies, however, that the person is, in a literal sense, not ignorant.

(8) “Willfully ignoring” and “willfully ignorant” are not identical. That is, I can willfully ignore someone who’s trying to get my attention—i.e., I can pretend like I don’t hear them. Actually, this does not need the word “willful” in front of it (you don’t “accidentally” ignore someone; rather, you don’t notice them, which is not willful). So I take it that “willfully” ignoring involves something stronger, more pervasive—and more vague.

For example, in his book The Signal and the Noise: Why So Many Predictions Fail—But Some Don’t (2012), Nate Silver writes:

We forget—or willfully ignore—that our models are simplifications of the world. We figure that if we make a mistake, it will be at the margin. (p. 45)

It strikes me that Silver refers to a deep, systematic ignoring that you can do—in fact can do best—without being ignorant of the thing ignored. We ignore things all the time with models, which in fact is the point of a model. But what, to Silver’s point, we shouldn’t ignore is that a model is the map—or one of many possible maps!—and not the territory.

Seems to me such willful ignorance doesn’t just apply to forecasting or mathematical models. We see this in political ideology, in science, in the scripts that guide the handling of disputed sports calls, in how we categorize individuals and social groups, and on and on.

This is the stuff of everyday living, often as a kind of much-needed lubricant. Often as a regrettable toxin.

(9) Catholic theology makes an interesting distinction between vincible and invincible ignorance.

From the Wikipedia entry on “Vincible Ignorance”:

…ignorance that a person could remove by applying reasonable diligence in the given set of circumstances. It contrasts with invincible ignorance, which a person is either entirely incapable of removing, or could only do so by supererogatory efforts (i.e., efforts above and beyond normal duty).

It’s worth also citing the Wikipedia entry on “Invincible Ignorance”:

Invincible ignorance is used in Catholic moral theology to refer to the state of persons (such as pagans and infants) who are ignorant of the Christian message because they have not yet had an opportunity to hear it.

This points to a long history of questions about moral culpability or blameworthiness in light of ignorance. That’s a complicated topic, with some ties, I think, to the difficult topic of moral luck.

I won’t get into that here except to say that it seems to me that, sometimes, by calling someone “willfully ignorant,” we are looking for a way to assign blame despite a person genuinely being ignorant. This is different than believing someone to not really be ignorant, only pretending to be. I think both things occur.

In other words, being willfully ignorant implies (moral) responsibility. But how?

(10) The above Catholic designation is the source of the so-called “Invincible Ignorance Fallacy,” which also has a Wikipedia entry:

…a deductive fallacy of circularity where the person in question simply refuses to believe the argument, ignoring any evidence given. It is not so much a fallacious tactic in argument as it is a refusal to argue in the proper sense of the word…

This is a broader, and in some cases less cut-and-dried, take on my earlier point about someone refusing to look at obviously dispositive evidence.

The broader Invincible Ignorance Fallacy, however, seems to be what we accuse someone of when they review the evidence that we are sure will sway them to our belief—i.e., to the “correct” belief—yet they claim not to be swayed. We must then wonder if they are being sincere or if they are lying, or if it’s something in between. Worst of all, and I bet the option of last resort, is the possibility that our own beliefs aren’t as rock solidly grounded as we originally thought.

Here’s a disgusting example. I’m certain that human excrement tastes awful. If someone swore to me that I’m in error, but that the question can be settled once and for all if I’d taste some, I would happily remain ignorant. Though what if I try it and the person claims I’m lying when I maintain my disgust at this experiment?

And, of course, what if it weren’t excrement? What if it were some cheese that smells like death?

There’s an excellent short story for this one as well. “How Monkey Got Married, Bought a House, and Found Happiness in Orlando” by Chuck Palahniuk, in his 2015 collection Make Something Up: Stories You Can’t Unread.

(11) There is also much to say about the legal dimensions of (willful) ignorance. Wikipedia has a brief entry “with multiple issues” (as of today, 5/6/20) called “Willful Blindness” that touches on this question in the context of drugs and file-swapping:

Willful blindness (sometimes called ignorance of law, willful ignorance or contrived ignorance or Nelsonian knowledge) is a term used in law to describe a situation in which a person seeks to avoid civil or criminal liability for a wrongful act by intentionally keeping himself or herself unaware of facts that would render him or her liable or implicated. In United States v. Jewell, the court held that proof of willful ignorance satisfied the requirement of knowledge as to criminal possession and importation of drugs.

I’ll look at two more sources before wrapping up.

(12) From the RationalWiki entry, “Willful Ignorance”:

Willful ignorance is the state and practice of ignoring any sensory input that appears to contradict one’s inner model of reality. At heart, it is almost certainly driven by confirmation bias.

This implies “ignore” in the sense of being aware of something but behaving as though you are not aware of it. The reference to confirmation bias implies that this can occur without explicitly noticing. But it does require consciously processed sensory input. Which is to say, some significant degree of awareness (i.e., non-ignorance) of the thing ignored.

Under this definition, you could read every book on a subject, and then be willfully ignorant of it.

Unless by “ignore” this definition really does mean to imply that I can, for example, take in the sensory input of the banana sitting in front of me, but somehow not allow it to go any further than my iconic memory, so that it is not incorporated in my model(s) of the world. I doubt this is the idea here.

(13) From a Psychology Today article by Mark Alicke Ph.D. called “Willful Ignorance and Self-Deception” (9/10/17):

Willful ignorance occurs when individuals realize at some level of consciousness that their beliefs are probably false, or when they refuse to attend to information that would establish their falsity.

People engage in willful ignorance because it is useful. …

People can sometimes be pulled out of their willful ignorance with a modicum of probing, or with contradictory data.

This definition strikes me as fairly in line with what I take casual usage of the term to often mean (see my aforementioned notion of harboring a nagging voice). I’m not sure what to make of the explanation for why we engage in it or several of the article’s examples, but fair enough.

The portrayal here of willful ignorance is a negative (and, I’m happy to say, compassionate) one. Is this to suggest that the other uses of “willful ignorance” I’ve discussed here—whether wicked or virtuous—describe phenomena that should be called something else? Is “willful ignorance” a term of art psychologists use consistently—or a concept they operationalize consistently—with respect to a specific behavior or cognitive activity/strategy? Or is the article just short?

Intentionally not reading a text someone accidentally sent me certainly counts as a literal instance of willful ignorance. But nothing like that is discussed in the article. Though it does make a distinction that I hadn’t yet considered (notice the phrase “sort of” in the first sentence; what is its function there?):

In contrast to this sort of willful ignorance, self-deception occurs when individuals believe false things with complete conviction. … consider the sports fan who “sees” the opponent commit a foul in a crucial moment of a basketball game, when in reality, it was the player on his own team, not the opponent, who committed the foul. Because this type of self-deception occurs at the perceptual level, the sports fan actually “sees” the opponent commit the foul, and no amount of replay or argumentation will convince him otherwise.

I’ve long been fascinated by the phenomenon of disputed fouls, which I alluded to above as a kind of willful ignorance, partially because I believe we demand this sort of mass delusion: imagine the violence a player would incur were they to concede a championship game by saying, “you know what, it was really me who committed the foul!” I believe we do the same politically, and there is research to back this up, but I’ll save that for another day.

Back to the article. Alicke elaborates:

The difference between willful ignorance and true self-deception is subtle, but important. Willful ignorance tends to be more adaptive than self-deception. Willful ignorance is a cognitive strategy that people adopt to promote their emotional well-being, whereas self-deception is less controllable and more likely to be detrimental.

…The distinction between willful ignorance and self-deception has interesting implications for moral judgment. In general, we probably blame willfully ignorant people more for their actions and attitudes than those we suspect of self-delusion.

All right. Willful ignorance is not as bad as self-deception. But self-deception is not generally volitional or, at least, is less willful.* And so we’re likely to blame people more for willful ignorance than for self-deception; or, perhaps, accuse them of willful ignorance in order to justify blaming them for something more like self-deception (this is along the lines of what I suggested in (9) above).

[*Which raises interesting questions for what is meant by “self” here, compared to, say, in the term “self-flagellation.”]

This distinction comes down to whether we think a person “actually does know better,” as Alicke puts it. To go further into this thought would take us deeper into questions about how much we blame people for their beliefs—something I explored in detail in a post called “Four Dimensions of X-ism (and ‘Seminal’ Is Sexist).”

I won’t venture into that here. But I will note that Alicke duly acknowledges that “if the distinction between willful ignorance and self-deception were this tidy, philosophers would have nothing to write about and psychologists would have less to research,” as well as the fact that we will sometimes blame people for self-deception, given that “in some instances of moral evaluation, we pay more attention to the motives that we believe drive people’s actions than we do to their awareness of those motives.”

This suggests to me that we can be willfully ignorant or self-deceived about the willful ignorance or self-deception of others. Rinse and repeat.

On further reflection, Alicke packs a good number of thought-provoking ideas into this short piece. Such as his claim that “it is usually better to confront reality than to avoid or deny it.”

I don’t know about that, but it recalls to mind the thought experiment.

(14) So. B is a proposition that you think only an evil person, or at least a highly misguided person, could believe. You are also convinced that, while you genuinely reject B, and are maybe even horrified by the thought of B, you are nonetheless disposed to believe it. What do you do with Future You’s reading list?

(I know how I’d answer.)

The Stanford Encyclopedia of Philosophy Describes Self-Deception As Follows

First published Tue Oct 17, 2006; substantive revision Mon Nov 7, 2016
Virtually every aspect of self-deception, including its definition and paradigmatic cases, is a matter of controversy among philosophers. Minimally, self-deception involves a person who seems to acquire and maintain some false belief in the teeth of evidence to the contrary as a consequence of some motivation, and who may display behavior suggesting some awareness of the truth. Beyond this, philosophers divide over whether self-deception is intentional, involves belief or some other sub-or-non-doxastic attitude, whether self-deceivers are morally responsible for their self-deception, and whether self-deception is morally problematic (and if it is in what ways and under what circumstances), whether self-deception is beneficial or harmful, whether and in what sense collectives can be self-deceived, how this might affect individuals within such collectives, whether our penchant for self-deception was selected for or merely an accidental byproduct of our evolutionary history, and if it was selected, why?

The discussion of self-deception and its associated puzzles sheds light on the ways motivation affects belief acquisition and retention and other belief-like cognitive attitudes; it also prompts us to scrutinize the notion of belief and the limits of such folk psychological concepts to adequately explain phenomena of this sort. And yet insofar as self-deception represents an obstacle to self-knowledge, both individually and collectively, it is more than just another interesting philosophical puzzle. It is a problem of existential concern, since it suggests that there is a distinct possibility that we live with distorted views of our selves, others and the world that may make us strangers to ourselves and blind to the nature of our significant moral engagements.

1. Definitional Issues
What is self-deception? Traditionally, self-deception has been modeled on interpersonal deception, where A intentionally gets B to believe some proposition p, all the while knowing or believing truly that ~p. Such deception is intentional and requires the deceiver to know or believe that ~p and the deceived to believe that p. One reason for thinking self-deception is analogous to interpersonal deception of this sort is that it helps us to distinguish self-deception from mere error, since the acquisition and maintenance of the false belief is intentional not accidental. If self-deception is properly modeled on such interpersonal deception, self-deceivers intentionally get themselves to believe that p, all the while knowing or believing truly that ~p. On this traditional model, then, self-deceivers apparently must (1) hold contradictory beliefs, and (2) intentionally get themselves to hold a belief they know or believe truly to be false.

The traditional model of self-deception, however, has been thought to raise two paradoxes: One concerns the self-deceiver’s state of mind—the so-called ‘static’ paradox. How can a person simultaneously hold contradictory beliefs? The other concerns the process or dynamics of self-deception—the so-called ‘dynamic’ or ‘strategic’ paradox. How can a person intend to deceive herself without rendering her intentions ineffective? (Mele 1987a; 2001)

The requirement that self-deceivers hold contradictory beliefs raises the ‘static’ paradox, since it seems to pose an impossible state of mind, namely, consciously believing that p and ~p at the same time. As deceiver, she must believe that ~p, and, as deceived, she must believe that p. Accordingly, the self-deceiver consciously believes that p and ~p. But if believing both a proposition and its negation in full awareness is an impossible state of mind to be in, then self-deception as it has traditionally been understood seems to be impossible as well.

The requirement that the self-deceiver intentionally get herself to hold a belief she knows to be false raises the ‘dynamic’ or ‘strategic’ paradox, since it seems to involve the self-deceiver in an impossible project, namely, both deploying and being duped by some deceitful strategy. As deceiver, she must be aware she’s deploying a deceitful strategy; but, as the deceived, she must be unaware of this strategy for it to be effective. And yet it is difficult to see how the self-deceiver could fail to be aware of her intention to deceive. A strategy known to be deceitful, however, seems bound to fail. How could I be taken in by your efforts to get me to believe something false, if I know what you’re up to? But if it’s impossible to be taken in by a strategy one knows is deceitful, then, again, self-deception as it has traditionally been understood seems to be impossible as well.

These paradoxes have led a minority of philosophers to be skeptical that self-deception is conceptually possible or even coherent (Paluch 1967; Haight 1980; Kipp 1980). Borge (2003) contends that accounts of self-deception inevitably give up central elements of our folk-psychological notions of “self” or “deception” to avoid paradox, leaving us to wonder whether this framework itself is what gets in the way of explaining the phenomenon. Such skepticism toward the concept may seem warranted, given the obvious paradoxes involved. Most philosophers, however, have sought some resolution to these paradoxes, instead of giving up on the notion itself, not only because empirical evidence suggests that self-deception is not only possible, but pervasive (Sahdra & Thagard 2003), but also because the concept does seem to pick out a distinct kind of motivated irrationality. Philosophical accounts of self-deception can be organized into two main groups: those that maintain that the paradigmatic cases of self-deception are intentional, and those that deny this. Call these approaches intentionalist and revisionist respectively. Intentionalists find the model of intentional interpersonal deception apt, since it helps to explain the selectivity of self-deception and the apparent responsibility of self-deceivers, as well as providing a clear way of distinguishing self-deception from other sorts of motivated belief such as wishful thinking. Revisionists are impressed by the static and dynamic paradoxes allegedly involved in modeling self-deception on intentional interpersonal deception and, in their view, the equally puzzling psychological models used by intentionalists to avoid these paradoxes, such as semi-autonomous subsystems, unconscious beliefs and intentions and the like. To avoid paradox and psychological exotica, revisionist approaches reformulate the intention requirement, the belief requirement or both.

Read The Rest At: https://plato.stanford.edu/entries/self-deception/
