Throughout this two-course sequence we have explored evidence and its relationship to strong social work practice. You have learned how to locate, analyze, compare, and generate evidence to inform and evaluate practice using a host of critical thinking skills. These processes are important aspects of thoughtful, reflective practice. This point is emphasized in the observation that, in the arenas where social workers often practice, a wide gulf exists between evidence and practice:
“with the majority of services delivered in usual care settings having little or no relation to practice supported by research” (Chorpita, Bernstein, & Daleiden, 2012, p. 470).
Our clients deserve better than this; they deserve access to the best possible practices and services. Helping clients select from among options is informed by evidence, as well as the social worker’s experience and the client’s values and preferences. These three pillars of evidence-based practice place responsibility on social work professionals to present evidence in a manner that facilitates client understanding.
In this chapter you:
- Learn a format for coordinating evidence from the literature about intervention options;
- Review a format for presenting evidence from practice or program evaluation;
- Recall what it means to identify as an evidence-informed social worker;
- Review major topics learned throughout this and the previous course;
- Consider the future in terms of professional development.
Coordinating Intervention Evidence from Literature
Presented here is a 7-part format for coordinating intervention evidence located in the literature. Note that this format is not necessarily appropriate for presenting complex information to clients, program leaders, policy decision makers, or potential funders; the actual presentation needs to be tailored to the audience (as learned in Module 5 of our first course).
- Specify the intervention question: Using COPES or a similar approach, clearly identify the practice question about which intervention evidence is sought.
- Locate, review, and summarize available evidence: Applying skills learned throughout these two courses, identify literature/evidence sources relevant to the specified intervention question. For each source, collect and record the following information.
  - Type of evidence: Identify the intervention(s) the evidence concerns, the intervention questions the evidence addresses, and the approach(es) used to develop the evidence.
  - Generalizability: Specify the population to which the evidence applies (i.e., the “sample,” “subjects,” or “participants” involved) and where limits to generalizability might exist.
  - Intervention elements: Identify the theory or logic model underlying the intervention, critical elements of the intervention, and requirements for implementing the intervention with fidelity (the who, how, when, where, and what aspects).
  - Strength of evidence: Specify whether the evidence is represented in single studies, studies with comparable or competing outcomes, systematic reviews, meta-analyses, and/or scoping reviews. Make a determination about the strength of the evidence based on the design and analysis methods used to develop it.
  - Conclusions: Identify conclusions that are appropriately drawn from the evidence (and which conclusions are not supported by the evidence).
This information can be organized in a table such as the following, adding rows as needed for additional evidence sources and for additional intervention options.
| | Type of evidence | Generalizability | Intervention elements | Strength of evidence | Conclusions |
|---|---|---|---|---|---|
| Intervention Option 1: (specify) | | | | | |
| Intervention Option 2: (specify) | | | | | |
| Intervention Option 3: (specify) | | | | | |
Analysis Report for Audiences
The evidence table generated through the previously described activities may or may not be appropriate for presenting to the intended audiences. The following 5-part outline describes what might be included in a report tailored to specific audiences (clients, colleagues, program administrators, policy decision makers, or funders).
This type of 5-part summary helps the social worker organize a complex body of information. In this way, social workers support informed choices.
- Part 1. Clearly state the intervention question being addressed.
- Part 2. Summarize the evidence reviewed (as in the previous section).
- Part 3. Present a relevance analysis. This analysis is about assessing the available evidence in terms of how well it applies to the specific client, client system, or population identified in the first step. This goes back to the generalizability issue and comparability of the research participants or samples to the clients for whom intervention questions are being asked. This might include assessing demographic and situational characteristics, such as:
“presenting problem(s), age, gender, ethnicity, or clinical service setting” (Chorpita, Bernstein, & Daleiden, 2012, p. 472).
- Part 4. Summarize implementation details/elements, costs, benefits, and feasibility of each analyzed option. Remember that cost/benefit analysis is not simply about financial costs and savings; it also concerns quality of life, time and effort expended or saved, and goodness-of-fit dimensions (including “cultural” relevance, with culture broadly defined). Feasibility involves the professional competencies and training needed to provide the intervention with sufficient fidelity, as well as other required resources (time, space, tools). Feasibility also addresses fit with professional ethics, regulations, policies, and any billing/funding criteria that might be involved.
- Part 5. Identify remaining, uncertain, or unanswered questions about the intervention evidence gathered.
The format for such a report should be clearly structured, following a logical outline. The 5-part list above could serve the purpose of structuring the outline.
Stop and Think
Take a moment to complete the following activity.
Visit the website for PracticeWise.com, an interactive site synthesizing a vast amount of evidence concerning mental health treatment options for children, adolescents, and their families. Many millions of dollars in funding supported the development of the PracticeWise contents and tools—the result far exceeds what any one practitioner could be expected to generate. PracticeWise is a fee-for-use service for practitioners in different disciplines who provide mental health services to children, adolescents, and their families. Since we are not PracticeWise members, we cannot use the services, but there are important lessons to be learned from reviewing the PracticeWise service offerings.
- View the PWEBS Database overview video under the “Our Services” menu. This 4-minute recording describes how the literature is summarized for practitioner use in treatment planning and decision-making with clients. What does this recording tell you about the kinds of information you want to elicit from the literature and present to your audiences?
- Select the PracticeWise Practitioner Guidelines next for the demonstration of practice guides and process guides. Try entering a search command like “Anxiety” and see what comes up as Practice Guide options. In the Process Guide menu, try entering the search command for “Diversity” and see what comes up. (You will not be able to actually open the .pdf files, unfortunately, since you are not a PracticeWise member. They include detailed, step-by-step guidelines for practitioners to follow.)
- Returning to the main menu, select MATCH to see what it is about. What does the decision-tree/flowchart approach suggest to you about organizing treatment options?
Presenting Evidence from Practice Evaluation
In addition to learning how to identify and critically review existing evidence to inform social work practice, you also developed a set of skills related to evaluating practice. In this section we look at different ideas for presenting the evidence that you have generated through your evaluation efforts.
The format of your feedback to clients depends, to a great extent, on the clients themselves—what they already believe and understand about the practice question, their expressed preferences, aspects of their specific circumstances, and the circumstances of the feedback situation. The information needs to be tailored to clients’ cognitive abilities for processing it (considering, for example, age, cognitive impairment, and emotional state).
A generic framework for presenting/discussing evaluation results with clients might include the following:
- Specification about the variable(s) measured in the evaluation effort.
- Specification about the strengths and limitations of the measurement tool(s) used in the evaluation effort (reliability and validity, as well as sensitivity to change).
- Presentation of the evidence/data and how the results might be interpreted.
- Specification about the strengths and limitations of the evaluation design—how this might influence the conclusions drawn from the evaluation effort.
- Conclusions/recommendations developed together with the client(s) based on the evaluation results, your practice expertise, and their preferences.
In your evaluation work, you may be called on to present results to professional audiences—colleagues, agency or program administrators, community leaders, policy decision makers, or funders. In our first course you learned about the structure of written and presented research reports:
- Abstract or Initial Summary
- Introduction or Background
- Method or Approach
- Results or Findings
- Discussion or Recommendations
In presenting evidence from your own intervention or evaluation efforts, these elements remain an excellent outline. What you have learned throughout this and the prior course has provided you with the knowledge and skills needed to create such a report or presentation. Review the topics presented in Module 5 of our first course related to making strong presentations, including how to create graphs, charts, figures, tables, and infographics.
As a reminder, the Social Work Code of Ethics emphasizes that you need to ensure that individual clients/participants are not identifiable in any data files or summary reports that you share with others:
5.02.n Social workers who report evaluation and research results should protect participants’ confidentiality by omitting identifying information unless proper consent has been obtained authorizing disclosure.
Not only are we concerned about the obvious identifiers (name, address, phone numbers), we are concerned about the ways that individuals’ demographic data could be assembled to make an individual identifiable. For example, the combination of information about ethnicity, age, and gender might make an individual stand out and become identifiable to an audience, particularly in a small population or sample.