from the $5-billion-can’t-buy-you-a-usable-database-these-days dept
Sept. 20, 2013
Of all the unsurprising news of inflated numbers and double-counting contained in the Office of Inspector General’s (OIG) audit of the DOJ’s reported terrorism statistics, this sentence from the introduction of the report is the least surprising — and by extension, the most damning.
Department resources devoted to preventing terrorism and promoting the nation’s security have increased from approximately $737 million in fiscal year (FY) 2001 to approximately $5.26 billion in FY 2012, an increase of 614 percent.
Despite the vastly increased budget and the fact that the Executive Office for United States Attorneys (EOUSA), which provides terrorism numbers to the DOJ, has had more than six years to get its case tracking system under control, the audit finds there has been little to no improvement. In fact, there’s evidence the EOUSA may be getting worse.
We found that although EOUSA revised its procedures for gathering, classifying, and reporting terrorism-related statistics based on the recommendations from our 2007 audit, EOUSA’s implementation of the revised procedures was not effective to ensure that terrorism-related statistics were reported accurately. Specifically, we found that EOUSA inaccurately reported all 11 statistics we reviewed during this follow-up audit. Most of these 11 statistics were inaccurately reported by significant margins…
The continued inaccurate reporting by significant margins indicates that EOUSA needs to strengthen its implementation of controls for gathering, verifying, and reporting terrorism-related statistics.
Great, except that it’s had since 2007 to improve and simply hasn’t. The OIG’s report shows that the EOUSA has trouble performing even the most basic of tasks, like updating numbers annually or attaching supporting documentation for the numbers in its reporting.
Because the log and corresponding support were not previously maintained as required, EOUSA had difficulty providing us accurate lists for 5 of the 11 statistics selected for testing. An official of the Data Analysis Staff told us that, as the result of an oversight, the FY 2010 U.S. Attorneys’ Annual Statistical Report was not recorded in the FY 2011 log. The official subsequently showed us that this oversight was corrected by updating the FY 2011 log to reflect the FY 2010 U.S. Attorneys’ Annual Statistical Report.
The official believed that the schedules listed in the annual report were the support for the reported statistics. However, the schedules in the annual reports only show the numbers reported for each statistic, but do not show necessary supporting details such as case numbers, defendant names, and disposition dates.
Further complicating the matter is the fact that the EOUSA entered case info into NLIONS (the National Legal Information Office Network System) in a completely arbitrary manner, which resulted in inaccurate case counts.
[A]n EOUSA official told us that while the defendant filed an appeal on June 23, 2007, the appeal was not entered into LIONS by the applicable USAO until January 22, 2009, and therefore the appeal was counted as filed in FY 2009. For the other eight cases, the official explained that these were the first appeals filed in FY 2009 for these cases.
When asked how this explanation was consistent with the NLIONS business rules, which state that an appeal should only be counted if it is the first appeal filed in the case, the official told us that a new methodology was used to count the appeals by which EOUSA had selected the defendant with the best disposition, first filing date, latest close date, and the highest participant identification or defendant number.
However, we noted that in a case with multiple defendants there is no guarantee that these criteria will be sufficient for EOUSA to identify a single defendant – for example, the defendant with the “best disposition” may not have the “first filing date.” Therefore, this methodology appears likely to produce arbitrary results and therefore does not appear sufficient to us. Moreover, the official had no documented procedures to show the methodology had been changed from that described in the NLIONS business rules.
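The OIG’s objection is easy to demonstrate: the four tie-breaking criteria can point at different defendants, so which defendant “represents” the case depends on which criterion you privilege. A minimal sketch (the case data and the disposition ranking are hypothetical, invented purely to illustrate the conflict — they are not from the report):

```python
# Hypothetical multi-defendant case showing why EOUSA's tie-breaking criteria
# (best disposition, first filing date, latest close date, highest defendant
# number) need not converge on a single defendant.
defendants = [
    # lower disposition_rank = "better" disposition (an assumption of this sketch)
    {"name": "A", "disposition_rank": 1, "filed": "2007-06-23",
     "closed": "2009-01-22", "number": 1},
    {"name": "B", "disposition_rank": 2, "filed": "2007-01-05",
     "closed": "2010-03-10", "number": 2},
]

# Apply each criterion independently (ISO date strings sort chronologically).
best_disposition = min(defendants, key=lambda d: d["disposition_rank"])
first_filed = min(defendants, key=lambda d: d["filed"])
latest_closed = max(defendants, key=lambda d: d["closed"])

# Defendant A has the best disposition, but B filed first and closed last:
# the criteria disagree, so the "selected" defendant is arbitrary.
print(best_disposition["name"], first_filed["name"], latest_closed["name"])
```

With no documented rule for ordering the criteria, two people applying the same “methodology” to this case could count two different appeals — exactly the arbitrariness the auditors flagged.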
This isn’t the only way the EOUSA screwed with/screwed up the NLIONS database. It also decided that the “smartest” way to track closed cases was by the date they were entered into the system.
We discussed these discrepancies with an EOUSA official who said that EOUSA’s terminated statistics are based on the date that USAO personnel enter the disposition into LIONS instead of the date when the case was actually terminated. The official told us that the system disposition date is used for reporting purposes because this date cannot be changed, whereas the actual termination date can be changed in the system, and therefore using the system disposition date improves the accuracy of the reported statistic.
While relying on a static, unchangeable date might protect the integrity of the data, the delays between case closure and entry in the system completely undermine the data’s accuracy, especially in terms of determining annual budget outlays.
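The trade-off is easy to make concrete: counting by the immutable entry date keeps the field stable, but any disposition entered after the fiscal-year boundary gets credited to the wrong year. A rough sketch with hypothetical dates (only the 314-day median lag is taken from the report):

```python
from datetime import date, timedelta

def fiscal_year(d: date) -> int:
    # The US federal fiscal year starts October 1 of the prior calendar year,
    # so October-December dates belong to the following fiscal year.
    return d.year + 1 if d.month >= 10 else d.year

# Hypothetical case: actually terminated in FY 2009, but the disposition was
# entered 314 days later (the median lag the OIG found in its sample).
terminated = date(2009, 6, 1)
entered = terminated + timedelta(days=314)

print(fiscal_year(terminated))  # fiscal year the case really closed
print(fiscal_year(entered))     # fiscal year the statistic lands in

# Counting by entry date credits this termination to FY 2010, inflating that
# year's total even though the case closed in FY 2009.
```

This is the mechanism behind the 32 overstated FY 2010 terminations: cases closed in earlier years drifted into the FY 2010 count simply because that’s when someone finally typed them in.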
[W]e found that the time lag between the date the 13 cases were actually terminated and the date the terminations were entered into the system by the USAOs ranged from 10 to 483 days, and averaged 266 days with a median of 314 days…
Because of this discrepancy (which is putting it nicely), the OIG expanded its spot check to cover all 258 cases listed as terminated in 2010.
As a result, we concluded that EOUSA overstated by 32, or 14 percent, the reported number of cases actually terminated during FY 2010. We consider this amount of deviation to be significant. The time lag between the date these 32 cases were actually terminated and the date the cases were entered into the system by the USAOs ranged from 1 to 706 days, and averaged 201 days with a median of 176 days.
That’s just this one instance dealing with these specific cases. More examples are scattered throughout the report. Looking at pending cases, the OIG cross-referenced NLIONS with PACER and uncovered some truly lengthy delays.
PACER showed that one case was disposed of on June 25, 2008, but the USAO did not enter the disposition into LIONS until May 2, 2011, which was 1,041 days after the district court terminated the case.
PACER showed that one case was disposed of on December 19, 2007, but the USAO did not enter the disposition into LIONS until February 11, 2011, which was 1,150 days after the district court terminated the case.
PACER showed one case involving six defendants was disposed of on August 6, 2007, but as of February 1, 2013, the USAO had not entered a disposition for any of the six defendants into LIONS, and therefore the case was still shown as pending in LIONS.
When auditing the numbers provided for suspects charged under the National Security Infrastructure statute, the system lag ran right off the charts.
Three defendants’ cases were filed in FY 2008, 1 case was filed in FY 2001, and 2 cases were filed in FY 2000. For each of these cases, USAO personnel entered case data into LIONS during FY 2009. Data entry delays ranged from 323 to 3,391 days and averaged 1,848 days with a median of 1,743 days.
This lack of timely data entry not only skewed annual numbers but has also resulted in double counting. In addition, a dozen or so cases were found to have been miscoded as “terrorist-related,” including those covering such non-terrorist activities as animal fighting, narcotics possession and bank robbery.
As the OIG states early in the report, the accuracy of these numbers is crucial to making “informed operational and budgetary decisions.” Unfortunately, it appears a truly informed decision hasn’t been made for nearly the entire lifetime of the DOJ’s War on Terror. If a report in 2007 found similar problems and nothing’s changed over the past six years, it’s safe to assume the problem runs all the way back to the initial post-9/11 response.
Thirteen straight years of running a crooked game using ever-increasing amounts of taxpayer funds. In the private sector, this sort of thing would put someone out of business, if not at the receiving end of a class action suit or fraud charges. But here in our government, it’s just one of those things — the endless cycle of audits and corrective actions, neither of which has propelled the EOUSA to anything approaching excellence, much less mediocrity.