Community News
Post-9/11 MH Intervention Evaluates Its Performance
Psychiatric News
Volume 41, Number 17, page 18

In the early days and weeks following any disaster, according to researcher Susan Essock, Ph.D., "there has always been a notion that conducting research involving disaster response and following victims' responses to trauma along with long-term outcomes are just unconscionable."

In the early days and weeks following September 11, 2001, this dictum went largely unchallenged, with one notable exception: Project Liberty, which became the largest federal disaster mental health program in history. It was funded through grants totaling $155 million from the Federal Emergency Management Agency (FEMA) and led by the New York State Office of Mental Health (OMH) in collaboration with nearly 200 local agencies. To Essock's way of thinking, a good chunk of scarce public funds was being fed into the effort, so "it seemed to me to be unconscionable to not learn something from the effort."

Essock is a professor of psychiatry and director of the Division of Health Services Research at Mount Sinai School of Medicine in New York City.

OMH officials in New York successfully requested that FEMA designate part of the total Project Liberty funding for quality assurance/quality improvement activities. As a result, Project Liberty not only tracked numerous metrics tied to recipients of the program's services, such as demographic variables, symptom clusters, variables predictive of further need versus recovery, and long-term outcome measures, but also included monitoring mechanisms to gauge how well the services rendered adhered to treatment manuals and guidelines.

In a report in the September Psychiatric Services, Essock and her Project Liberty colleagues highlighted one of the monitoring mechanisms by describing outcomes associated with how faithfully clinicians adhered to the key elements of the cognitive-behavioral treatment intervention developed for Project Liberty's enhanced services counseling program (see story above). All recipients of enhanced services were invited to participate in a telephone interview involving only six questions. Five questions rated how often their clinician (counseling was largely provided by nondoctoral licensed counselors and social workers) provided each of five components of the intervention, using Likert scales ranging from 0 (not at all) to 3 (a lot). A sixth question asked how often the clinician gave homework, also a required component of the intervention.

In an effort to tie performance directly to training on the Project Liberty model of CBT for posttraumatic stress reactions, the researchers looked at responses for those clinicians at sites where all clinicians received training, compared with those clinicians at sites where only some clinicians received training. Essock and her colleagues were not surprised to find that interviewees who received services at the partial-training sites were less likely to report that their clinician adhered to all five techniques considered central to the intervention. Similarly, homework was given less frequently by counselors at sites where only some clinicians were trained.

"In five short questions, we were able—with good confidence—to identify the people who got their intervention at a site where all clinicians were trained versus a site where only some clinicians were trained," Essock told Psychiatric News. "So, through an easy, inexpensive means, we were able to be quite confident that our training program was effective. OMH contracted with these providers to do a specific intervention. This measure asked, 'Did they do it?' Purchasers of health care services all over the country are interested in answering that question." ▪
