Feasibility of Screening Adolescents for Suicide Risk in "Real-World" High School Settings
PIRE Chapel Hill Center
Abstract available at www.ajph.org/cgi/content/abstract/96/2/282
In September 2005, the U.S. Substance Abuse and Mental Health Services Administration awarded the first-ever federal grants under a three-year, $82 million program created by the Garrett Lee Smith Memorial Act of 2004 to fund suicide prevention and screening interventions for youth.
On the heels of those awards comes a warning from the research field: Some screening tools designed to identify youth within target populations (such as schools) who are at risk of committing suicide may be overly sensitive, resulting in high numbers of “false positive” identifications that overwhelm – and even shut down – the systems charged with responding to the needs of truly troubled youth.
In this study, published in the February issue of The American Journal of Public Health, Denise Hallfors, a senior research scientist at PIRE Chapel Hill Center, and her colleagues administered the Suicide Risk Screen (SRS), a component of the High School Questionnaire developed by Leona Eggert and Elaine Thompson at the University of Washington.
Of 1,323 youth in 10 urban high schools, the SRS identified a whopping 29 percent as being at risk for suicide.
“There were just too many of them,” Hallfors said. “This was a ‘real world’ study.” In order to conduct the follow-up interviews with youth identified as at risk, “we had to rely, just as school districts would, on the nurse or the counselor or the social worker who was available and could actually do them.”
Overwhelmed by the idea of having to conduct 10 to 25 face-to-face suicide assessments each week, the schools failed to stick with the protocol. The counselors stopped doing follow-ups, and the schools withdrew from the study after two semesters.
“The counselors said, ‘I’ve got too many; I just can’t do this,’ ” Hallfors said. “It’s just not feasible to have a broad screen in schools.”
The problem, according to Hallfors, is that some screening tools are designed to be so highly sensitive that they won’t miss anyone who might be contemplating suicide. Such tools, including the SRS, often contain questions concerning substance use, feelings of anxiety and other risky behaviors that trigger red flags when answered a certain way – falsely identifying many youth as being at least “moderately at risk” of suicide.
But the vast majority of youth – even those with several red flags – would not meet the highest criteria, Hallfors said. Those students at highest risk for suicide, according to the study, have attempted suicide in the past year and have high levels of suicidal ideation and/or serious depression.
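The false-positive dynamic Hallfors describes follows directly from base rates: when a condition is rare, even a highly sensitive screen flags mostly non-cases. A minimal sketch, using purely hypothetical prevalence, sensitivity and specificity values (not the SRS’s actual measured operating characteristics):

```python
# Hypothetical inputs chosen only to illustrate the base-rate effect;
# these are NOT the measured properties of the Suicide Risk Screen.
prevalence = 0.03    # assume 3% of students are truly at high risk
sensitivity = 0.95   # assume the screen catches 95% of true cases
specificity = 0.75   # assume 25% of non-cases still trigger a red flag

# Share of all students the screen flags as "at risk"
flag_rate = prevalence * sensitivity + (1 - prevalence) * (1 - specificity)

# Of those flagged, the fraction who are truly at high risk
ppv = (prevalence * sensitivity) / flag_rate

print(f"flagged: {flag_rate:.1%}, truly at risk among flagged: {ppv:.1%}")
```

Under these assumed inputs, roughly 27 percent of students are flagged even though only about one in ten of them is a true case – the same order of magnitude as the 29 percent flag rate the study observed.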
When Hallfors’ team statistically “tightened” the SRS criteria to flag only those at highest risk of suicide, just 11 percent of the students originally identified as “at risk” were deemed in critical need of a follow-up assessment.
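In raw headcounts, the article’s percentages imply the following (assuming, as the wording suggests, that the 11 percent applies to the group the broad screen flagged):

```python
total_students = 1323
broad_flagged = round(0.29 * total_students)  # students flagged by the original SRS
critical = round(0.11 * broad_flagged)        # still flagged under tightened criteria

print(broad_flagged, critical)  # roughly 384 flagged, of whom about 42 are critical
```

Spread across 10 schools, that is the difference between dozens of weekly face-to-face assessments per counselor and a caseload a single staff member could plausibly handle.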
“This doesn’t say to me that we shouldn’t be doing this kind of screening,” Hallfors said. “But it does say to me that we have to tighten up the screen.”
She warns programs considering the purchase of adolescent suicide screening tools “not to pick one just because it’s out there … and you can find and pay for it. You want to make sure that there has been some sort of an independent evaluation. I really stress that.”
Contact: Denise Hallfors, (919) 265-2612.
National Job Corps Study: Findings Using Administrative Earnings Records Data
Mathematica Policy Research Inc.
Available at www.mathematica-mpr.com/publications/pdfs/jobcorpsadmin.pdf
In a stunning reversal of previous findings, a newly released study says the costs of putting youths through Job Corps exceed any benefit to society.
Although Mathematica Policy Research reported in 2001 that the net benefit to society for each youth enrolled in Job Corps (after program costs) was $16,800, it now says the cost of each youth’s Job Corps participation exceeds the benefits by $10,200.
“Contrary to what we had thought and expected … the impact [of Job Corps on earnings] did not persist, but declined quite precipitously,” said John Burghardt, one of the lead researchers.
Job Corps provides residential employment training to more than 60,000 disadvantaged 16- to 24-year-olds each year. Mathematica conducted the National Job Corps Study from 1994 to 2005 under contract with the U.S. Department of Labor (DOL). The new report was completed in October 2003, but – for reasons that remain unclear – was not released to the public until January 2006.
Asked about the delay, a spokesperson for Mathematica deferred to the DOL. The DOL said no one was available to comment.
Mathematica based its first cost-benefit analysis exclusively on earnings and employment data collected through surveys of Job Corps participants and a control group of nonparticipants. In 2003, it also examined administrative data covering 1993 to 2001, supplied by the U.S. Social Security Administration (called SER data) and 22 state unemployment insurance agencies (UI data).
That uncovered a few discrepancies, to say the least.
In the surveys, Job Corps participants reported average calendar year earnings of approximately $10,300 in 1998 – $970 more than nonparticipants. But SER data for that year show annual earnings of only $5,500 for Job Corps members – a mere $220 more than the control group. And SER and UI data for 1999 to 2001 show average earnings differences ranging from $33 more to $194 less in annual income for Job Corps members than for nonparticipants.
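The scale of the survey-versus-administrative gap is easier to see side by side. A quick check using only the 1998 figures quoted above:

```python
# 1998 calendar-year earnings, as reported in the article
survey_jc, survey_gap = 10_300, 970   # survey: Job Corps earnings and advantage
ser_jc, ser_gap = 5_500, 220          # SER administrative data: same quantities

survey_ctrl = survey_jc - survey_gap  # implied control-group earnings (survey)
ser_ctrl = ser_jc - ser_gap           # implied control-group earnings (SER)

print(f"survey earnings ~{survey_jc / ser_jc:.1f}x the SER figure; "
      f"earnings advantage ~{survey_gap / ser_gap:.1f}x larger in the survey")
```

Both groups’ survey earnings come out at roughly double the administrative figures, but the advantage attributed to Job Corps shrinks more than fourfold – which is why the choice of data source drives the cost-benefit verdict.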
Why such differences? The study says the main factors are over-reporting of working hours by Job Corps participants, and a bias that exaggerates the earnings of participants who responded to the survey versus the lower estimated earnings of those who didn’t. Other possible factors, it says, include errors in reporting Social Security numbers, a failure by state unemployment insurance agencies to track some formal jobs, and casual or cash-only employment that doesn’t get counted.
The two main factors, and the earnings trends documented in the administrative data, heavily influenced the new cost-benefit analysis of Job Corps. Although still based on the earnings data reported in the original surveys, the new analysis adjusts for the assumption that participants over-reported their working hours and builds in a gradual “decay” of the program’s earnings impact in the years after participants leave.
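To show how a decay assumption alone can flip a cost-benefit verdict, here is a deliberately simplified sketch. The decay rate, horizon and discount rate below are hypothetical illustrations, not the parameters Mathematica actually used:

```python
def lifetime_benefit(first_year_gain, decay, discount_rate, years):
    """Present value of an earnings impact that shrinks by `decay` each year."""
    return sum(
        first_year_gain * (1 - decay) ** t / (1 + discount_rate) ** t
        for t in range(years)
    )

gain = 970  # first-year survey-based earnings advantage cited in the article
no_decay = lifetime_benefit(gain, decay=0.0, discount_rate=0.04, years=40)
fast_decay = lifetime_benefit(gain, decay=0.5, discount_rate=0.04, years=40)

print(f"no decay: ${no_decay:,.0f}  fast decay: ${fast_decay:,.0f}")
```

Under these made-up parameters, the same $970 first-year gain is worth roughly $20,000 over a working lifetime if it persists, but under $2,000 if it decays rapidly – which is why the persistence question dominates the revised analysis.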
The researchers found that “the benefits to society of Job Corps are [$10,200] smaller than the substantial program costs.” There was an exception: for those who entered Job Corps as 20- to 24-year-olds, the benefits of participation exceeded the costs by $500.
“The overall study findings suggest that there is promise in the Job Corps model,” Peter Schochet, the principal researcher and a senior economist at Mathematica, said in a statement issued in January. “The challenge is to … make the program cost-effective for a population that has been extremely hard to serve.”
Contact: Peter Schochet, (609) 936-2783.