Audit Released on OJJDP’s Recovery Act Mentoring Program

The U.S. Department of Justice Office of the Inspector General (IG) just released its audit of some Recovery Act spending by the Bureau of Justice Assistance (BJA).

Specifically, it assessed awards made under the Byrne Competitive Grants program. In a normal year, sadly, Byrne grants are eaten up by earmarks. But the Recovery Act grants offered a rare open competition for the dollars.

Why is this relevant to the JJ world? Because the money for Recovery Act mentoring grants in the Office of Juvenile Justice and Delinquency Prevention (OJJDP), both national and local, was technically part of that Byrne allocation. Both BJA and OJJDP are divisions of the Office of Justice Programs, led by Assistant Attorney General Laurie Robinson.

The auditors liked OJJDP’s handling of the mentoring grants more than BJA’s handling of the rest of the Byrne money.

There were 158 applicants for the national mentoring funds (four of whom won) and a whopping 1,459 submissions for local grants (26 of which got funded). There was a lot of money to give away, a lot of groups that wanted it and a short time frame in which to give it.

OJJDP used a “normalized” method of scoring, which relies on a mathematical equation that “reduces the impact of outlying scores.” In plain English, it discounts scores from peer reviewers who routinely score far lower or higher than the rest of their colleagues. BJA did not normalize its scores.
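The audit doesn’t spell out the equation itself, but the general idea is easy to sketch. Below is a hypothetical illustration in Python (not OJP’s actual formula) in which each reviewer’s raw scores are rescaled against that reviewer’s own average and spread, so a habitually harsh or generous reviewer no longer drags an applicant’s overall average up or down.

    import statistics

    def normalize_scores(raw_scores):
        """Hypothetical peer-review normalization: rescale each reviewer's
        scores to that reviewer's own mean and standard deviation, so
        habitually harsh or generous reviewers carry the same weight.
        raw_scores maps reviewer -> {applicant: score}."""
        normalized = {}
        for reviewer, scores in raw_scores.items():
            mean = statistics.mean(scores.values())
            spread = statistics.pstdev(scores.values()) or 1.0  # avoid divide-by-zero
            normalized[reviewer] = {app: (s - mean) / spread for app, s in scores.items()}
        return normalized

    def applicant_averages(normalized):
        """Average each applicant's normalized scores across reviewers."""
        totals = {}
        for scores in normalized.values():
            for app, s in scores.items():
                totals.setdefault(app, []).append(s)
        return {app: statistics.mean(vals) for app, vals in totals.items()}

    # Reviewer B scores everyone low; after normalization the ranking reflects
    # relative judgments rather than each reviewer's baseline.
    raw = {
        "Reviewer A": {"App 1": 92, "App 2": 85, "App 3": 78},
        "Reviewer B": {"App 1": 60, "App 2": 70, "App 3": 55},
    }
    print(applicant_averages(normalize_scores(raw)))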

The IG report said OJJDP might have flubbed the calculations a little, but not so much that it would have changed the grant winners. But when the IG went back and normalized the scores on the BJA applicants, the result was staggering: Fewer than half (66 out of 145) of the top-ranked programs across the eight Byrne grant categories would have stayed near the top of their respective lists. (Check page 91 of the audit for a breakdown of this calculation.) In one category, only three of 20 would have made the list.

That led the IG to recommend in the report that “the Office of Justice Programs consider standardizing the circumstances under which normalization of peer review scores should be used for all bureaus and program offices.”

OJP responded supportively: “The Office of Justice Programs agrees with this recommendation. By December 31, 2010, the OJP will internally discuss and consider the circumstances under which normalization of peer review scores should be used for all bureaus and program offices.”

If that happens, this audit may one day be responsible for a significant procedural change in how business is done at the Office of Justice Programs. OJJDP and the Sex Offender Sentencing, Monitoring, Apprehending, Registering and Tracking (SMART) office were the only ones that normalized scores in the Recovery Act cycle, so the majority of OJP divisions would have a new protocol on their hands.

It’s worth noting that not everyone is big on normalizing scores. The counter-argument is that while normalization accounts for aberrant scores, those scores may be aberrant for a reason: If you have a good, top-to-bottom peer review team, normalizing the scores can dilute the value of their collective wisdom.

A few other critiques from the IG:

1)  This will be interesting for those who followed the 2007 OJJDP grant scandal: “We recommend that the Office of Justice Programs establish a requirement that future funding recommendation memoranda include explanations for all applications not recommended for funding that received an equal or higher score than the lowest scoring application recommended for funding.”

Translation: If you skip over an applicant with a higher score than someone you fund, you better make it crystal clear why.

2) OJJDP allowed some applicants who failed parts of the “basic minimum requirements review” – missing documents such as personnel resumes or a program budget – to advance to peer review, while others were screened out. “To ensure fairness,” the report said, “the OJJDP needs to clearly define in its solicitation what requirements are significant enough to result in the rejection of applicants that fail to provide those requirements.”

3) OJJDP is not providing explanations to programs that apply for funding and are screened out before peer review. The IG office looked at a random sample of 30 applicants who did not make it to review; all received notice that they were denied, but none received an explanation as to why.

OJJDP staff told the IG that “it was their practice to not inform applicants of the reason for denial if the applications are denied during the basic minimum requirements review or internal review processes.”
