Showing posts with label Public reporting. Show all posts

Sunday, March 6, 2011

Bundle the baby

There's a new paper in Pediatrics that evaluates implementation of central line insertion and maintenance bundles across all referral NICUs (n=18) in New York state. Surveillance for infections followed NHSN methodology and compared a 12-month period before bundle implementation with a 10-month period after. Overall, there was a 40% reduction in CLABSI. Higher-volume NICUs demonstrated lower infection rates and less variation in performance. For each standard-deviation increase in maintenance checklist usage, there was a 16.5% decrease in the CLABSI rate. However, the authors point out that "in light of some agencies' considering CLABSIs to be 'never events,' it is important to note that no NICU achieved an overall CLABSI rate of 0."
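The NHSN-style rate metric and the percent reduction the paper reports can be sketched in a few lines. The numbers below are purely illustrative and are not taken from the study:

```python
def clabsi_rate(infections, line_days):
    """NHSN-style CLABSI rate: infections per 1,000 central line days."""
    return 1000.0 * infections / line_days

def percent_reduction(rate_before, rate_after):
    """Relative reduction in rate between two surveillance periods."""
    return 100.0 * (rate_before - rate_after) / rate_before

# Hypothetical numbers chosen to reproduce a 40% reduction:
before = clabsi_rate(30, 10_000)   # 3.0 per 1,000 line days
after = clabsi_rate(18, 10_000)    # 1.8 per 1,000 line days
print(percent_reduction(before, after))  # 40.0
```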

Friday, December 31, 2010

California's Healthcare Associated Infections Report: For what it's worth...

California has just released its first statewide report on healthcare-associated infections (you can view it here). The metrics reported are healthcare-associated VRE bloodstream infections per 1,000 inpatient days, healthcare-associated MRSA bloodstream infections per 1,000 inpatient days, and CLABSI in ICUs per 1,000 central line days. The report has major problems, as evidenced by the disclaimer on every table of rates stating that the data should not be compared between hospitals, even though such comparison is generally the whole purpose of public reporting. However, since the reporting period for this report ended, the state has mandated that all hospitals join NHSN, which it anticipates will improve the quality of the data reported.

Wednesday, November 10, 2010

Public Reporting of CLABSI: Is it a valid measure for hospital comparison?

Overview of the computerized CLABSI algorithm
In today's JAMA, Michael Lin and coauthors from four CDC Epicenter academic hospitals (two in Chicago, one in Columbus, OH, and one in St. Louis) compared annual IP-determined CLABSI rates in 20 ICUs during 2004-2007 with a computer-generated reference standard. The median CLABSI rate was 3.3/1000 central line days. Overall correlation with the computer algorithm was poor at 0.34, ranging from 0.83 at one center down to 0.10 at another. Interestingly, the center with the lowest IP-reported CLABSI rate had the highest computer-generated rate (2.4 vs. 12.6 per 1,000 central line days).

I have posted the schematic of the computer algorithm and also the link to the code (below). My only methods question (at this moment) is why they limited the analysis to yearly comparisons rather than quarterly or monthly ones. I would have liked to see that level of data analyzed, even though it would be noisier. It was interesting how the IP-reported rates were narrowly clustered around each other while the computer-generated rates were widely distributed. The findings should give us pause when we consider public reporting of these rates. If so much emphasis is being placed on CLABSI rates at the state and national level for comparison and reimbursement, there should be funded validation of the reported rates and also consideration of other measures (outcome or process) that might be more reliable.
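The study's headline number is a Pearson correlation between IP-reported and computer-generated unit-level rates. A minimal sketch of that calculation, using hypothetical rates (the first pair loosely mimics the low-IP/high-computer outlier described above; none of these values come from the paper):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical unit-level rates per 1,000 central line days:
ip_rates = [2.4, 3.1, 3.3, 3.6, 4.0]
computer_rates = [12.6, 4.0, 3.5, 5.2, 6.1]
print(round(pearson_r(ip_rates, computer_rates), 2))
```

A discordant outlier like the first pair can pull the correlation down sharply, which is exactly why a single summary coefficient can hide very different per-center behavior.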

Lin et al JAMA November 2010
Link to computer algorithm code (looks like you might need to apply for a password)

Tuesday, September 7, 2010

The “vicious cycle of pseudoimprovement”

What am I talking about? Read this JAMA commentary from Drs. Muller and Detsky to find out.

Done reading? Now ask yourself these questions: is my hospital following indicator-based or evidence-based strategies to improve patient outcomes? What happens when publicly-reported indicators improve dramatically, but patient outcomes do not?

Thursday, May 27, 2010

17 states report CLABSI rates. Why only 17?

The CDC just released the First State-Specific Healthcare-Associated Infections Summary Data Report, which focuses on CLABSI. HHS press release is available here. Needless to say, we are all disappointed with the rates here in Maryland given how hard the State and hospitals have worked at preventing these infections. No one seems more disappointed than Peter Pronovost. No excuses.

Monday, May 17, 2010

Did someone get fired for using NHSN or NNIS definitions to count CLABSIs?

Wow. I just read the second article that Mike posted Saturday, from the Chicago Tribune, and I kind of wish I hadn't. I'll paste the section that scared me a bit below, but it appears that someone might have been fired for "over-reporting HAIs," which is a bit scary to me. There is tremendous pressure not to call a BSI a CLABSI, which I think is now part of the getting-to-zero culture. I wonder what percent of the way toward zero will be paved with these sorts of statements:

"Thorek's infection rate was the highest of all medical centers in Illinois. Frank Solare, Thorek's president and chief executive officer, said hospital officials have collected medical charts for the 22 infected patients and have "started an independent review … to try and understand this."

Asked why the Lakeview hospital didn't take action last year, Thorek's compliance officer Morgan Murphy said a former employee didn't alert senior management to the problem. "It wasn't making its way up the chain, unfortunately," he said.

Senior management also suspects that the employee may have counted central line infections incorrectly, inflating the hospital's numbers. "There may have been over-reporting," Murphy said."

Tuesday, March 30, 2010

VAP: Do you know it when you see it? (Again!)

There's a new paper on the utility (or lack thereof) of CDC's definition of ventilator-associated pneumonia. In this study, four reviewers evaluated 50 cases of ventilated patients with respiratory deterioration >48 hours after intubation. Two reviewers were experienced infection preventionists who applied the CDC definition. A third IP used a modification of the CDC definition that was more quantitative, and the fourth reviewer was a physician board-certified in infectious diseases and critical care who used clinical judgment to define VAP. Using the standard definition, one IP assigned the VAP diagnosis to 11 patients and the other to 20 patients. The IP using the modified definition assigned 15 cases as VAP. The physician diagnosed VAP in 7 patients. The IPs agreed on 62% of cases (kappa=0.40). All 4 reviewers agreed on the VAP diagnosis in only 4 cases. This is not the first study to show how complex assigning the diagnosis of VAP can be.
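The gap between "62% agreement" and "kappa=0.40" is worth unpacking: kappa discounts the agreement two raters would reach by chance alone. A minimal sketch of Cohen's kappa, with the chance-expected agreement back-calculated from the two figures the study reports:

```python
def cohens_kappa(p_observed, p_expected):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    return (p_observed - p_expected) / (1 - p_expected)

# The study reports 62% observed agreement and kappa = 0.40, which
# implies a chance-expected agreement of roughly 0.37:
p_e = (0.62 - 0.40) / (1 - 0.40)
print(round(p_e, 2))                       # 0.37
print(round(cohens_kappa(0.62, p_e), 2))   # 0.4
```

In other words, the two IPs agreed only modestly more often than chance would predict, which is why a kappa of 0.40 is conventionally read as no better than "fair to moderate" agreement.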

Given that public reporting has raised the stakes to high levels, the CDC can no longer ignore this issue. Consumers cannot make choices on where to receive care if inter-hospital comparisons of infection rates are not valid. There needs to be a convening of IPs and hospital epidemiologists who use the definitions on a daily basis to thoroughly assess each of the HAI case definitions and begin to work on the development of new ones that will be fair to hospitals and helpful to consumers. Otherwise, it's garbage in, garbage out, and the entire concept of public reporting is undermined.