In the previous post, I advocated marrying our clinical intuition with the use of outcome-informed data.
Do not let me convince you. Let your experience convince you. Put it to the test. Try it on for size.
I call this the Rate & Predict exercise. There are two parts:
A. Using an outcome measure (e.g., Outcome Rating Scale, ORS; Clinical Outcomes in Routine Evaluation, CORE):
1. Rate: After the first session, ask your client to RATE themselves on the outcome measure in each subsequent session;
2. Predict: Before you see your client’s score, PREDICT what they will score. It is important to write down your predicted scores for each of the sub-scales, if any (for the ORS: individual well-being, close relationships, social functioning, and overall). This prevents us from falling into the “I knew it all along” hindsight bias.
3. Evaluate: Compare and contrast the scores. See what stands out. Talk about it with your client.
B. Using an alliance measure (e.g., Session Rating Scale, SRS):
1. Rate: At the end of the first session, ask your client to RATE how they felt about the level of engagement in the session;
2. Predict: Before you see your client’s score, PREDICT what they will score. It is important to write down your predicted scores for each of the sub-scales (for the SRS: emotional connection, goals, approach/method, and overall).
3. Evaluate: Compare and contrast the scores. See what surprised you. Form your feedback questions from there.
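For those who like to keep their predictions honest in a spreadsheet or script, the Predict-then-Evaluate comparison above can be sketched in a few lines of Python. This is only an illustrative sketch: the sub-scale names follow the ORS convention, but the scores and the helper function `compare_scores` are invented for the example.

```python
# Hypothetical sketch of the Rate & Predict comparison for ORS sub-scales.
# Sub-scale names follow the ORS convention; the scores below are invented.

ORS_SUBSCALES = ["individual", "close relationships", "social", "overall"]

def compare_scores(predicted, actual):
    """Return per-subscale gaps (actual - predicted), biggest surprises first."""
    gaps = {s: actual[s] - predicted[s] for s in ORS_SUBSCALES}
    return dict(sorted(gaps.items(), key=lambda kv: abs(kv[1]), reverse=True))

# Therapist's written-down predictions vs. the client's actual ratings:
predicted = {"individual": 4.0, "close relationships": 6.5, "social": 5.0, "overall": 5.5}
actual    = {"individual": 6.0, "close relationships": 3.0, "social": 5.5, "overall": 4.5}

for subscale, gap in compare_scores(predicted, actual).items():
    print(f"{subscale}: prediction off by {gap:+.1f}")
```

Sorting by the size of the gap simply surfaces where you were most surprised, which is exactly where the feedback conversation with your client should start.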
What’s interesting from our study of highly effective therapists is that the top performers were more likely than their counterparts to report being surprised by their clients’ feedback. This suggests that highly effective therapists are more willing to be corrected. They have an openness to receive and consider the client’s viewpoint, even when it contradicts their own existing expectations.
Janet Metcalfe and colleagues suggest that individuals are more likely to correct errors made with high initial confidence than those made with low confidence, so long as corrective feedback is given (Barbie & Metcalfe, 2012; Butterfield & Metcalfe, 2001, 2006; Metcalfe & Finn, 2011). Although it may seem intuitive that deeply held beliefs are the most entrenched and the hardest to change, experimental studies have shown that individuals are more likely to overwrite high-confidence errors and correct their beliefs (Butterfield & Metcalfe, 2001, 2006), and are more likely to retain a corrected answer than one they knew at the outset (Barbie & Metcalfe, 2012).
In other words, the Rate & Predict exercise is intentionally set up to create a context for hypercorrection, not hyper-confirmation.
When we experience hypercorrection, we learn more deeply, because the surprise enhances memory encoding. We learn nothing as long as we only seek to confirm what we already know.
Eugene Gendlin, pioneer of the focusing approach in therapy, said a handful of years ago that when he asks questions in therapy, he no longer seeks to be confirmed, as he used to in the past. Rather, he now intentionally seeks to be disconfirmed by his client.
Try this out. There’s a lot to gain from a simple exercise like this, with little trade-off. If you are bold enough to give it a go, I would love to hear how it turned out for you. Two or three sentences about your experience in the comments below will do.
–Daryl Chow, Ph.D.
(Note: The ORS/SRS and CORE are free to use. If you are keen to use them on a web-based platform, with much more flexible features customised to your agency’s needs, check out Pragmatic Tracker. Drop them an email to find out more. Take the free trial for a test drive. You can even add other measures to your outcome tracking system.
Full disclosure: I am an affiliate with them. The only reason I have linked up with them is that I truly like their product, after mucking around with my own home-brewed methods and other outcome monitoring products. My team and I in Australia use Pragmatic Tracker in our current clinical practice, and we are getting underway with my colleagues back in Singapore. Each has its separate needs and design. The key feature is that the system is flexible and customisable.)
More Related Articles:
1. Superforecasting: The Art and Science of Prediction by Philip Tetlock and Dan Gardner. What an experiment! When put to the test, the pundits we deem top forecasters aren’t as good as they are made out to be. Instead, ordinary “non-experts” rise to the occasion to become “superforecasters”.
2. The Signal and the Noise: Why So Many Predictions Fail – But Some Don’t by Nate Silver. Nate is a maverick at prediction. He aced baseball forecasts, and even made a near-perfect prediction of the 2012 US election. In this book, he goes to great lengths to explain why most predictions fail, and how we can parse significant signals from the noise.
3. The Art of Thinking Clearly by Rolf Dobelli. If you liked Daniel Kahneman’s classic Thinking, Fast and Slow, you are going to appreciate this book. Every chapter offers a bite-sized approach to counteracting our inherent biases. It’s chock-full of good advice.
If this post piques your interest, don’t forget to sign up for not-so-frequent updates to get the latest professional development tips in your inbox.