Cognitive Assessment

Advice for psychological evaluation reports: Make every sentence worth reading

I have made this [letter] longer, because I have not had the time to make it shorter.

– Blaise Pascal, “Lettres provinciales”, letter 16, 1657

The secret of being a bore is to tell everything.

 – Voltaire, “Sixième discours: sur la nature de l’homme,” Sept Discours en Vers sur l’Homme (1738)

A little inaccuracy sometimes saves tons of explanation.

– Saki, The Square Egg, 1924

When we get together, we psychologists often lament that we spend a lot of time writing psychological evaluation reports that no one reads, at least not in full. I have come to believe that this is mostly our fault. Much of what we write in our reports is boring (e.g., describing each test), canned (e.g., describing each test), confusing (e.g., describing each test), and irrelevant (e.g., describing each test). It would be an understatement to say that I am not the first to voice such opinions.

If we want people to read our reports carefully, we must write reports that are worth reading all the way through. If you insist on including boring, canned, confusing, and irrelevant content, consider tucking it away in an appendix.

Explain what you know, not how you know

As students we are rewarded for “showing our work.” We are encouraged to state a position and then provide data and arguments that justify our claims. The resulting literary form (the student position paper) aligns well with the objectives of the course, but it rarely aligns with the purpose of psychological evaluation reports. Reports should focus on communicating to the reader something that is useful and true about an individual. Presenting observations and data and then walking the reader through the steps in our diagnostic reasoning is rarely helpful to non-specialists. Most readers need the results of our assessment (our interpretations and suggestions), not an account of our process.

My old reports are embarrassing

My earliest reports contained mini-tutorials on operant conditioning, attachment theory, psychometrics, and specific details about the tests I administered (e.g., the structure and format of WISC subtests). I naively thought that this information would be interesting and helpful to people. In retrospect, I think that writing these explanations may have helped me more than the reader. Bits and pieces of my newly acquired expertise were not fully integrated in my mind, and writing everything out probably consolidated my understanding. Whatever the benefit for me, I cannot remember a time in which the inclusion of such details proved crucial to selecting the right interventions, and I can remember times in which they were confusing or alienating to parents.

Bad habits I let go

Over the years, I began a long, slow process of letting go of the report templates I was given in graduate school and unlearning bad habits of my own invention.

  • I stopped talking about the names, content, and structure of tests and measures and focused on the constructs they measured. I stopped organizing my reports by test batteries and instead used a theoretical organization. If I learn something important about the evaluee’s personality during the academic achievement testing, I weave that information into the personality section (and I rarely explain how such information was obtained).
  • I stopped talking about numbers (e.g., standard scores and percentiles). Instead I describe what a person can or cannot do and why it matters. I still make extensive use of numbers in the initial stages of case conceptualization but at some point they fade into the background of the overall narrative.
  • I stopped talking about the details of my observations and instead simply state the overall conclusions drawn from my observations (combined with other data).
  • I stopped including information that was true but uninformative (e.g., the teen is left-handed but plays guitar right-handed). My “Background Information” section became the “Relevant Background Information” section. I often re-read reports after I am finished and try to remove details that clutter the overall message of the report. Often this means bucking tradition. For example, I was trained to ask about a great many details, including allergies. If a child’s allergies are so severe that they interfere with the ability to concentrate in school, they are worth reporting. However, in most cases a person’s mild allergies are not worth reporting.
  • I stopped merely reporting information (e.g., the scores may be underestimates of ability because the evaluee sometimes appeared to give up when frustrated by the hard items on a few tests) and instead focused on contextualizing and interpreting the information so that the implications are clear (e.g., outside of testing, in which situations and on which tasks is the evaluee likely to underperform, and by how much?).
  • I stopped explaining why certain scores might be misleading. For example, if the WAIS-IV Arithmetic was high but other measures of working memory capacity were low (after repeated follow-up testing), I no longer explain that follow-up testing was needed, nor that at some point in the assessment process I was unsure about the person’s working memory abilities. I just explain what working memory is and why the person’s weakness matters. I do not feel the need, in most cases, to explain that the WAIS-IV Working Memory Index is inflated because of a high score on Arithmetic.
  • I stopped explaining why scores that measure the same thing are inconsistent. Non-professionals won’t understand the explanation and professionals don’t need it. If the inconsistency reveals something important (e.g., fluctuating attention), I just state what that something is and why it matters.
  • I stopped treating questionnaire data as more important and precise than interview data. I came to treat all questionnaires, no matter how long, as screeners. In most cases, I do not treat questionnaire data as a “test” that provides information that is independent of what the person said in the interview. Interview data and questionnaire data come from the same source. If the questionnaire data and the interview data are inconsistent, I interview the person until the inconsistency is resolved.
  • I stopped sourcing my data every time I made a statement. For example, I stopped writing, “On the MMPI-2 and in the interview, X reported high levels of depression. In an interview, X’s husband also reported that X had high levels of depression.” It does not usually matter where or how I obtained the information about the depression. What matters is whether the information is accurate and useful. In the narrative, I only report my final opinion of what is going on based on the totality of evidence, not the bits and pieces of information I collected along the way.
  • I stopped sourcing interview data when I was quite sure that it was correct. For example, I no longer write: “Susie’s mother reported that Susie’s reading difficulties were first noticed when she was in the first grade.” If I have every reason to believe that this is true, I simply say, “Susie’s reading difficulties were first noticed when she was in the first grade.” However, if I am uncertain that something Susie’s mother said is true or if I am reporting Susie’s mother’s opinion, I attribute the statement to her.

19 thoughts on “Advice for psychological evaluation reports: Make every sentence worth reading”

  1. FCF4 says:

    Thanks for this blog. Very edifying, useful, and enjoyable for a clinical psychologist with an assessment-heavy practice.

  2. Ruben Lopez says:

    I almost feel like Moses climbing off of Mount Sinai with tablets in hand–of course only almost. Psych reports are not read because they’re unreadable; they need to be de-templatized. If for most things MDs find the prescription pad adequate, why shouldn’t psychs strive for such brevity?

  3. Leonard Harris says:

    I really appreciate your post, Joel, and found the template excellent. I am sharing it with my students. Thanks.

  4. Pingback: What makes a psych report readable and useful | Canadian School Psychology Blog

  5. Phil Young says:

    Love your thoughts. I am curious to know how others are conveying assessment results in multidisciplinary eligibility reports where their information is only a portion of the information presented.

  6. Holly says:

    Wow! I feel like I’ve been given a permission slip to “cheat” on my reports! And the Blaise Pascal quote dead on reflects my current report-writing life. After 10 years, it’s time to go with what my heart has been telling me. I need to simply and succinctly answer the question that teachers and parents are asking, which is, “So what?”

  7. Brant says:

    I think the questions that are being asked are “Why is my child struggling, and what can we do about it?” I remember letting myself off the leash and allowing my reports to answer those questions, rather than the grad school way. I have a section titled “Factors that are affecting learning”, which itemizes and explains how each factor affects learning (working memory, phonological processing, attention, executive function, practice effects, anxiety, etc.). What I would really like is if my report could basically be a transcript of the interpretation conference with the parents and teachers. When speaking in person, the psycho-babble gets dropped, and language that allows understanding abounds.

  8. Tom says:

    Wow! This is amazing work. When I did my first assessment class, I was told that they would not give us a template and that we had to discover our own way of writing. I wrote a good report, but I wrote it like a narrative, directly addressing the concerns brought up by the parents in the conclusion, more like an essay conclusion. This was seen as completely wrong, and they eventually made us all change it to be exactly the “bad way” you mention. I’m going to be working on changing these habits over the next few years. When you write it the way they taught us, it’s just not really readable unless you are a researcher or an academic.

    • Thanks!

      I think that the highly technical writing style makes it easy for the instructor to verify that the student has considered all the right information and has interpreted it correctly. With a narrative report, it can be hard to tell whether certain details were omitted carelessly or judiciously. To solve this problem, my students will be writing a narrative report, but each section will be annotated with a description of their interpretive process (which would be visible to me but not on the final version of the report). That way I can tell that they have mastered the technical side of things without them having to ruin their reports with irrelevant details.

      • Marc says:

        Great idea! Thank you for a fantastic resource. I’m going to make it required reading for the interns I supervise.
