Monday, November 30, 2009

Some time ago, Psyblog posted a series called "What Everyone Should Know About Their Own Minds". Appropriately enough, it covers the ways in which humans typically misunderstand (or misjudge) components of their own behavior, such as their motivations, their reasons, and their predictions about their own responses. Sometimes this involves resolving cognitive dissonance, where a subject changes some of her beliefs to better match other beliefs she had been opposing; other times the explanation is less clear.
The whole subject is a curious one, since, subjectively, most of us feel like we have pretty transparent access to the inside of our own heads. For example, if asked to give reasons why we find one face more attractive than another, we usually think we can do so; moreover, we feel that the reasons we come up with will be true to whatever processes actually go on inside our heads. Not so, as one of the Psyblog posts reports -- at least not with anything like the accuracy we'd expect of ourselves.
My impression is that we are sometimes alien to our own minds, insofar as we don't understand most of our inner machinations. Yet, strangely, we have this need to "tell stories", even to ourselves, about why we've done things a particular way: a need to explain or justify our inner landscape, much as we need to explain (and understand) the outer world. This isn't such a bad thing, since seeking explanations is the heart of science (and rational inquiry). But it does emphasize the necessity of being relentlessly critical and skeptical if your aim is truth -- skeptical even about your own thought processes -- lest you settle too firmly on the first story that seems plausible to you.
A downside of the above skeptical strategy, in my own experience, is that it tends to drive you a little bit crazy. Constantly doubting your own motivations, values, and judgments is not a very enjoyable way to go about your day. Past a certain point, if you lose too much faith in your understanding of the contents of your own head, you may end up impeding your own progress, forever looking for solid ground that isn't there. From a practical "getting things done" standpoint, it may be better to be wrong about a few little details here and there if you're still able to function well in the main.
We may suppose that this is why we did not evolve to be more naturally self-critical and self-reflective creatures. Don't get me wrong: compared to any other form of intelligence we know about, we go pretty far in that direction -- even the least reflective of individuals tries to purge logical inconsistencies from her thoughts, though tolerance for inconsistency obviously varies from person to person. But, judging by experiments like those linked above, plausible-seeming beliefs about your own motivations apparently mattered much more to evolutionary fitness than any safeguard mechanisms ensuring internal accuracy.