Anchoring Biases of Mental Health Professionals

Mental health professionals typically reach conclusions very early in their interviews and then cling to those impressions even when confronted with contrary evidence. At PsychLaw.net we consider, for example, a 1964 experiment conducted by Jerome Bruner at Harvard University[1]. In this experiment, one group of participants (the control group) viewed slightly blurred slides and could identify the objects on the slides with a relatively high degree of accuracy. For the other group of participants (the experimental group), the slides were initially so blurry that subjects could not accurately identify the objects. The experimental group then saw the slides again at a level of clarity equal to what the control group saw. At that equal level of clarity, the experimental participants committed significantly more identification errors than the control group. Why did the experimental group make so many more errors?

Bruner explained that when initially presented with the blurred slides, participants in the experimental group developed preliminary, but typically inaccurate, impressions. These preliminary impressions interfered with the subjects’ ability to accurately identify the slides even at a level of clarity equal to that seen by the control group. In other words, their thinking was “anchored” to their preliminary impressions, and as a result, they continued to make identification errors.

Like the experimental group in this 1964 study, mental health professionals frequently succumb to anchoring biases in their clinical judgments. Anchoring biases lead them to attribute excessive significance to information obtained in the earlier stages of an interview. Consequently, these professionals encounter considerable difficulty adjusting their opinions in response to subsequent information that is inconsistent with their earlier impressions. At PsychLaw.net we note that mental health professionals who are “anchored” by their preconceived notions, or by earlier information, present the examiner with a Rule 703 dilemma.[2] “Anchored” mental health professionals too often rely on information that is inherently unreliable.[3]

Assume, for example, that Professional A interviews a patient who seems to express himself with considerable verbal fluency. As a result, Professional A concludes that this patient possesses an above-average vocabulary. This preliminary conclusion will make it very difficult for Professional A to revise her opinion when revision is warranted. Even in the face of contrary evidence encountered later in the interview, Professional A’s initial impression will continue to “anchor” her subjective view of the patient.

A 1983 experiment using a sample of 46 mental health professionals (psychologists, psychiatrists, and social workers) dramatically demonstrated the effects of anchoring bias[4]. These professionals read summaries of five therapy sessions for two different clients. For each client, information regarding the client’s suicidal thinking or anorexia was reported in either the first or the fourth therapy session. All of the other information was exactly the same for each client.

When the professionals reviewing these treatment notes encountered the information regarding suicidal thinking or anorexia in the first session rather than the fourth, they rated the clients as more disturbed and as having a worse prognosis. In other words, encountering information indicative of serious psychopathology in the first therapy session, compared to the fourth session, anchored how these professionals responded to subsequent information. Interestingly, a subsequent study demonstrated that, compared to mental health professionals, college students do not exhibit this kind of anchoring bias[5].

At PsychLaw.net we understand that, as a result of their anchoring biases, mental health professionals commonly overestimate the amount of information they process during their evaluations.[6] They assume they weigh multiple factors in making their judgments, but the research evidence demonstrates that they rely on minimal data[7]. In fact, research indicates that mental health professionals frequently arrive at their diagnostic impressions within the first two to three minutes of an interview, and sometimes in as little as 30 seconds[8].

Footnotes

[1]. Bruner, J.S. & Potter, M.C. (1964). Interference in visual recognition. Science, 144, 424-425.

[2]. See, e.g., United States v. Tran Trong Cuong, 18 F.3d 1132, 1143-44 (4th Cir. 1994) (although experts may consider hearsay, including reports of other experts, in reaching their opinions, the reports must qualify as data “of a type reasonably relied upon by experts in the particular field.” An example: a physician in the field of family medicine would not usually rely upon forensic medical opinions “specifically prepared for purposes of litigation”). See also Marsee v. United States Tobacco Co., 866 F.2d 319, 323 (10th Cir. 1989) (excluding expert’s testimony regarding conversations with other experts about cases that supported his opinion).

[3]. The attack on an expert who is “anchored” should not go to “weight” alone. The essential attack should be on the issue of admissibility. This distinction was illustrated in Viterbo v. Dow Chemical Co., 826 F.2d 420 (5th Cir. 1987). On that subject, the court said: “As a general rule, questions relating to the bases and sources of an expert’s opinion affect the weight to be assigned that opinion, rather than its admissibility, and should be left for the jury’s consideration… In some cases, however, the source upon which an expert’s opinion relies is of such little weight that the jury should not be permitted to receive that opinion. Expert opinion testimony falls into this category when that testimony would not actually assist the jury in arriving at an intelligent and sound verdict.” J. Weinstein & M. Berger, Weinstein’s Evidence ¶ 702[1] (1985) (assistance of trier of fact is central concern of federal rules of evidence regarding opinion witnesses). If an opinion is fundamentally unsupported, then it offers no expert assistance to the jury. See also Hyatt v. Sierra Boat Co., 145 Cal. Rptr. 47 (Ct. App. 1978) (an expert’s assumption of facts contrary to the proof destroys the opinion; likewise, if the opinion is not based on facts otherwise proved, it cannot rise to the stature of substantial evidence). Id. at 55. And see Pacific Gas & Electric Co. v. Zuckerman, 234 Cal. Rptr. 630 (Ct. App. 1987) (an expert’s conclusions based upon assumptions not supported by the record, upon matters not reasonably relied on by other experts, or upon factors so speculative, remote, or conjectural that the conclusion has no evidentiary value are prohibited because such evidence is not substantial). Id. at 643.

[4]. Friedlander, M.L. & Stockman, S.J. (1983). Anchoring and publicity effects in clinical judgment. Journal of Clinical Psychology, 39, 637-643.

[5]. Friedlander, M.L. & Phillips, S.D. (1984). Preventing anchoring errors in clinical judgment. Journal of Consulting and Clinical Psychology, 52, 366-371.

[6]. There are many examples of “experts” unreasonably relying on data that cannot support their inferences. Here are some examples of unsubstantiated facts as an improper source. 1st Circuit: See, e.g., Ricciardi v. Children’s Hosp. Medical Ctr., 811 F.2d 18, 24-25 (1st Cir. 1987) (district court properly excluded medical expert’s opinion based on handwritten treatment note of doctor who had no personal knowledge of event described and who could not recall source of information; note was not the type of data that an expert would rely on to form an opinion). 3rd Circuit: Shaw by Strain v. Stackhouse, 920 F.2d 1135, 1142 (3d Cir. 1990) (trial court properly disregarded plaintiff’s expert’s conclusion that rested on assumption shown contrary to fact by record). 10th Circuit: See, e.g., Lima v. United States, 708 F.2d 502, 508 (10th Cir. 1983) (trial court properly excluded testimony of specialist who relied on data that was not of a type reasonably relied upon by experts in the fields of epidemiology and neurology). D.C. Circuit: See Ealy v. Richardson-Merrell, Inc., 897 F.2d 1159, 1161-1162 (D.C. Cir.), cert. denied, 498 U.S. 950 (1990) (court excluded testimony as without scientific foundation in the face of a wealth of published contrary data).

[7]. Fisch, H.U., Hammond, K.R., & Joyce, C.R. (1982). On evaluating the severity of depression: An experimental study of psychiatrists. British Journal of Psychiatry, 140, 378-383; and

Gillis, J.S. & Moran, T.J. (1981). An analysis of drug decisions in a state psychiatric hospital. Journal of Clinical Psychology, 37, 32-42.

[8]. Yager, J. (1977). Psychiatric eclecticism: A cognitive view. American Journal of Psychiatry, 134, 736-741.
