Jeffrey Mishlove, in The Roots of Consciousness, provides an overview of cognitive science in psychology and outlines a framework for understanding how the intervention of cognitive restructuring works.

Psychological research, he tells us, has identified numerous risks of assessing evidence by subjective judgement. These risks include information processing or cognitive biases, emotional self-protective mechanisms, and social bias.

The investigation of biases in judgement has followed from the study of perceptual illusions. Our understanding of the human visual system, for example, comes in part from the study of situations in which our eye and brain are 'fooled' into seeing something that is not there or not seeing what is there. In the Müller-Lyer visual illusion, for example, the presence of opposite-facing arrowheads on two lines of the same length makes one look longer than the other. We generally do not realize how subjective this construction is. Instead we feel as if we are seeing a copy of the world as it truly exists.

Cognitive judgements, such as thoughts about self, others and the future, have a similar feeling of 'truth'; it is difficult to believe that our personal experience does not perfectly capture the objective world. We can check with a ruler that the two lines are the same length, and we believe the formal evidence rather than that of our fallible visual system. With cognitive biases, the analogue of the ruler is not clear. Against what would we validate our judgmental system? This is an important issue, since a restructuring of thoughts is contingent upon a belief in 'formal evidence'. Yet without a ruler that is not 'bent' by judgmental errors, how can we be sure?

One of the basic errors typical of intuitive judgment is called the confirmation bias. If you hold a theory strongly and confidently, then your search for evidence will be dominated by events that confirm your theory; such events are more attractive and attention-getting. People trying to solve logical puzzles set out to prove their hypotheses by searching out confirming examples, when they would be more efficient if they searched for disconfirming examples.
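The inefficiency of confirmation-seeking can be illustrated with a small sketch. The scenario below is modeled on Wason's classic 2-4-6 task, which the text alludes to but does not name; the hidden rule and the test triples are illustrative assumptions, not from the source:

```python
# Sketch of a confirmation-bias demonstration (Wason-style 2-4-6 task).
# The experimenter's hidden rule is simply "any strictly increasing triple".
# A tester who believes the rule is "numbers increasing by 2" and tries
# only confirming examples never discovers the true, broader rule.

def hidden_rule(triple):
    a, b, c = triple
    return a < b < c  # the experimenter's actual rule

# Triples that fit the tester's narrower hypothesis ("increase by 2"):
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
# Triples chosen to probe the boundaries of the hypothesis:
disconfirming_tests = [(1, 2, 3), (5, 4, 3), (2, 4, 100)]

# Every confirming test succeeds, so the wrong hypothesis survives:
print([hidden_rule(t) for t in confirming_tests])     # [True, True, True]
# Disconfirming tests expose the real rule:
print([hidden_rule(t) for t in disconfirming_tests])  # [True, False, True]
```

Because (1, 2, 3) and (2, 4, 100) also pass, the disconfirming strategy reveals that "increase by 2" was never the rule, something no amount of confirming evidence could show.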

It seems more natural to search for examples that 'fit' with the theory being tested than to search for items that would disprove the theory. Recall Hofstadter's suggestion regarding the way we develop our 'theories' of the world: "The way this works is to let perceptual glue of various sorts bubble up in parallel in different regions of the experience, with a tendency but not a rule for sameness glue to emerge the fastest … each dab of glue then acts as a small local pressure towards building a particular type of island of order in a particular location. This way, natural perceptual biases can be respected but not slavishly so, and diverse ideas – 'hunches' … – can arise independently and be explored simultaneously in different regions of the experience" (italics added).

Yet psychologists, as Mishlove reveals, are very clear that people become 'slavish' in their confirmation bias. In fact, it is this rigidity which, in its extreme, is characteristic of mental disorder. Perhaps flexibility is lost as the person matures if there is no experience of 'dispute' that requires the development and exploration of alternative explanations. Buddhist monks, for example, are very oriented to the 'real world' and seem not to have the same need for confirmation. Their whole culture, however, is a process of dispute and conflicting concepts of truth, as reflected in the following Zen story.

The emperor, who was a devout Buddhist, invited a great Zen master to the Palace in order to ask him questions about Buddhism. ‘What is the highest truth of the holy Buddhist doctrine?’ the emperor inquired.

‘Vast emptiness … And not a trace of holiness.’ the master replied.

‘If there is no holiness,’ the emperor said, ‘then who or what are you?’

‘I do not know,’ the master replied.

Certainly a contrarian's answer, and unexpected. Yet Zen stories continually emphasize two concepts: first, that all things are one, and second, that you must keep a 'beginner's mind'. If all things are one, death and life are the same; yet how can this be? Is this not a dispute of the ordinary set of beliefs in most of us? And if we keep a 'beginner's mind', aren't we required to put aside the preconceived notions which result from our mental contexts and listen to the reality of the event?

The story is told not to make us all into Buddhists, but to indicate that some people do not fall into the trap of a confirmation bias, and that this can be learned. Yet, despite this potential, Mishlove reports that people in virtually all professions [except horse-racing handicappers and weather forecasters, who receive repeated objective feedback] are much more confident in their judgements and predictions than their performance would justify. The good omen is that when people are required to receive objective feedback and to consider that their thoughts may be inaccurate, the bias can be minimized.

While it is difficult to get one who is rigid in their interpretation of events to accept feedback as at least somewhat objective, one of the few ways to temper this overconfidence is to explicitly ask them to list the ways that they might be wrong; for, unless prodded, each of us will consider only the confirmatory evidence. The explicit prodding and the 'psychological mirror' provided by a trusted other not only require that the person look at the 'truth' of their thoughts, but also at the 'fitness' of those thoughts in helping them to predict and control events in a manner which enables them to reach their goals. This is a part of the cognitive restructuring process. We should note that the process is a self-reconstruction process, as the helper can only dispute and offer alternative explanations; the person makes their own decisions.

A dramatic example of the confirmation bias is the problem of the 'self-fulfilling prophecy'. In 1978, Rosenthal and Rubin reported the results of a meta-analysis of 345 studies of expectancy effects and demonstrated that our biased beliefs cause us to work towards the creation of that reality. Again, in the Buddhist tradition, one 'creates the future' through thoughts about that future. One might also suggest that Jesus implied the same thing when he stated that 'the Kingdom of God is here' in each person, as belief in the Kingdom is cognitively equivalent to the creation of it.

Mishlove rightly points out that experimental expectancy effects are a potential source of problems in any research area, but they may be especially influential in … areas lacking well-established findings; one might add, most particularly in research regarding human beings. This is because the first studies on a given technique are typically carried out by its creators or proponents, who tend to hold very positive expectations for the efficacy of the technique. The human services research literature abounds in studies which support a methodology from which the creator makes money based upon his/her proof of the effectiveness of the technique.

Another typical error of intuition, pointed out by Mishlove, is our tendency to overgeneralize and stereotype. In one particularly unsettling set of experiments, people were willing to make strong inferences about prison guards and about welfare recipients on the basis of one case study, even when they were explicitly forewarned that the case study was atypical. This suggests that people find it relatively easy to 'think the worst of others' once they have had a bad experience or two. Once the stereotype is formed, the confirmation bias highlights all evidence which might support the stereotype and ignores all other evidence. Since the mental schema regarding others is one of the three major ones [the others being self and prospects], it follows that this is of particular concern in cognitive restructuring.

Mishlove suggests that the most serious, common error in our understanding of probability is the overinterpretation of coincidence. Our sense of probability is quite skewed, and we tend to see two co-occurrences as much more important than they might be. Thus, if we happen to have two interpersonal experiences in close proximity that go awry, we are unlikely to consider them as separate occurrences, but rather as a pattern of rejection. From this we may overgeneralize and decide that 'people can't be trusted'. Such a 'truth' is then easily supported by the confirmation-biased evidence, and by the real supporting evidence which our attitude towards others is likely to engender.

The building of mental contexts which are unhelpful in interpersonal relations becomes easier and easier to understand. One source of overconfidence in our own judgments is the belief that we can always search our minds for the evidence and reasoning upon which these judgements are based. We sometimes mistakenly believe that we know whether we are biased and emotionally involved or evaluating objectively. Thus, we are assured by careful scrutiny of our own thinking that 1) 'it can't be coincidence' that we were treated badly, 2) 'people can't be trusted', and 3) all of the subsequent evidence supports that theory. We therefore behave in a manner which supports the theory, distrusting others and treating them in ways that create 4) a self-fulfilling prophecy, thus enhancing the cognitive bias and the evidence.

Psychological studies indicate that this compelling feeling of self-awareness regarding our decision process is exaggerated. Unless a specific method is used to force us into quantification and dispute of our estimates, we tend not to do well. It is not surprising, therefore, that decision-making training emphasizes making percentage estimates of the likelihood of one thing or another happening.

In another area of cognitive research, psychologists consider it normal behavior to distort our images of reality in order to enhance our own self-image. In fact, the inability to create such protective distortions may lead to depression. Depressives seem willing to accept that they do not have control over random events. More typical people, on the other hand, maintain an illusion of control over chance events that have personal relevance. While depressives appear to be less vulnerable to this illusion, normal people will see themselves 'in control' whenever they can find some reason to.

This, of course, invites comparison with the Buddhist monk: why do both see reality so clearly, yet the monk seems not to be affected in the same way as the depressive personality? We would suggest that it is again connected with the degree of rigidity or flexibility. People with depressive personalities not only see the lack of control over the present reality, they envision it as permanent and pervasive; the Buddhist sees it as part of life, control and noncontrol being the same. It hardly needs to be stated that the Buddhist tradition is based on cognitive dissonance.

Yet Mishlove points out that one of the most powerful forces maintaining our beliefs, in spite of others' attacks, our own questioning, and the challenge of new evidence, is the need to maintain cognitive consistency and avoid cognitive dissonance. Leon Festinger's theory of cognitive dissonance explains apparently irrational acts in terms of a general human need for consistency. Because of this, insight, untested and unsupported, is an insufficient guarantee of truth, in spite of the fact that much of the more important truth is first suggested by its means. These are the problems that arise when intuition replaces logic as the arbiter of truth.

Intuitive processes based on personal experience sometimes seem designed as much for protecting our sense of self-esteem and our prior opinions as for generating accurate predictions and assessments. The maintenance of the personality is bound up with the degree of 'rightness' in our beliefs and attitudes. To change a major belief after the age of seven or eight requires that we change who we are, and this is a very threatening place to be. Uncertainty is not something most of us are comfortable with. We often cling to the 'devil that is known' rather than test the uncertainty of change.

The power of our beliefs is profound, as indicated by the placebo effect. Irving Kirsch reports that such effects are automatic consequences of the person's beliefs: a response expectancy. In a meta-analysis of the literature on the ability of such response expectancies to elicit automatic responses in the form of self-fulfilling prophecies, Kirsch found that depression is among the conditions in which the effect is particularly pronounced. Kirsch and Sapirstein reported that the effect size for pretreatment to posttreatment changes in depression in patients given antidepressant drugs was 1.55. This is a very large effect and indicates substantial improvement. However, they also report that the effect size for placebo was 1.16.

This indicates that 75% of the effect of antidepressant drugs may have come from the person's own expectancy response. More remarkably, the proportion of the effect size duplicated by placebo was virtually identical across medication types [range 74% to 76%]. They conclude that 'it is possible that all of these drugs function as active placebos'. An active placebo is an active medication that does not have specific activity for the condition being treated. Thus, the apparent drug effect of antidepressants may in fact be a placebo effect, magnified by differences in experienced side effects. Intuitive self-knowledge of the type required for a wide variety of higher mental functions requires a healthy respect for our natural human biases in attention and memory.
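The 75% figure follows directly from the two effect sizes quoted above. A minimal sketch of the arithmetic, using only the numbers reported in the text:

```python
# Effect sizes quoted from Kirsch and Sapirstein's meta-analysis.
drug_effect = 1.55     # pre-to-post change, antidepressant groups
placebo_effect = 1.16  # pre-to-post change, placebo groups

# Share of the drug response duplicated by placebo alone.
proportion = placebo_effect / drug_effect
print(f"{proportion:.0%}")  # prints "75%"
```

The ratio 1.16 / 1.55 ≈ 0.75, which is why the authors can say that roughly three quarters of the measured drug response is reproduced by placebo.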

One must wonder, of course, whether Kirsch et al. and Mishlove are in conflict. But they are really talking about different cognitive errors. Depressives are obviously prone to self-fulfilling prophecies even though they may not be as affected by protective distortions. In fact, the self-fulfilling prophecy may be the place where Buddhists and depressives separate. While both see reality a little more clearly, depressives turn it into a 'self-fulfilling prophecy', while the Buddhist keeps a 'beginner's mind'.

A key difference between intuitive and scientific methods is that the measurements and analyses of scientific investigations are publicly available for criticism and review, while intuitive hypothesis-testing takes place inside one person's mind. In the process of cognitive restructuring, the personal thoughts become public to only a limited degree. The addition of a 'cognitive mirror' (the helper) not only alerts the person with problems in living to become aware of and attend to their thoughts; s/he asks that those thoughts be made public and shared. Both the individual with problems in living and the helper become aware of and attend to the automatic thoughts that occur around events and experiences. Instead of simply categorizing the experience as pro or con, the person seeking to restructure these thoughts is required to attend to and analyze them in the presence of the helper and to make determinations about truth and fitness within a context of potential dispute.

The analogue of the ruler, then, is the public debate of intuitions about self, others and the future. While there is no guarantee that the individual with problems in living will agree, the conscious mind that is used for 'debugging' the nonconscious processes we have called mental contexts is attracted to innovative and novel information. In many ways, the public debate of the thoughts proceeds from the perspective of two new minds: the helper is looking at the thoughts through his/her own cognitive contexts, and the person with problems in living is looking at the thoughts with a new mind, since they have probably not been aware of these thoughts in this way for a long time. Further, as the person with problems in living is taught to be aware of the errors described, and learns methods of analysis, creation of alternative solutions, and problem solving, they become much more able to examine the thoughts for fitness. The helper also suggests that the thoughts be compared to the personal goals, and their 'fitness' in helping to reach those goals. Since personal goals are also often nonconscious, this is another process of focus. The real analogue is focus, combined with new language and constructs and, perhaps, some additional skill in the scientific methods of analysis.