Harnessing Data for Real Improvements
A lot of smart people do a lot of innovative things to advance healthcare and technology these days. Too often their contributions remain unrecognized or underappreciated.
That’s why ImageTrend, a Minnesota-based developer of EMS data management and related software, created its Hooley Awards. First given in 2015, the honors recognize noteworthy users of ImageTrend products in three categories:
- Innovation Awards, for use of products to meet the needs of a service, department, hospital or state in a new or innovative way;
- Service Awards, for using data to improve community safety or accomplish other goals; and
- New Frontier Awards, for services, departments, hospitals or states that break new ground or go above and beyond in their humanitarian efforts.
EMS World helped judge last year’s awards, which were respectively won by the Missouri Time Critical Diagnosis Team; Emory University EMS; and the East Baton Rouge (LA) EMS Community Integrated Health Program. All the finalists submitted worthy efforts, and for this year’s technology issue, we profile three additional outstanding ones.
DO THE RIGHT THING
The problem: In New Hampshire, state EMS leaders were dismayed to discover, in reviewing providers’ run data, that fewer than half of the reports they examined documented administration of aspirin to patients with cardiac chest pain. The rate was better than it had been a few years earlier, when the state established that as one of five key quality benchmarks for EMS systems, but it wasn’t high enough to leave anyone satisfied. The same was true of another key metric: performance of a stroke scale on potential CVA patients.
Were state providers really complying that poorly with their protocols? Officials with the state EMS division’s Trauma and EMS Information System (TEMSIS) doubted it. Instead, they surmised, crews probably just weren’t documenting fully. They needed a way to help them do it better.
The solution: One fast and inexpensive answer was a series of reports, assembled from state data, to illustrate the problems and how to correct them.
“What we did was, using the plan/do/check/act cycle of CQI, put out fact sheets for the entire state—all the EMS providers and service leaders—and showed them exactly how to document stroke scales and how to document aspirin appropriately in their incident reports,” says Todd Donovan, NRP, who led the project.
Some earlier work, including a video course, had preceded this and roughly doubled rates that had once been under 20% on both interventions. But there they plateaued—until Donovan’s reports.
“Todd really brought in this CQI process, and we started communicating as closely as we could with the services and evaluating all those runs,” says Chip Cooper, NRP, head of research and quality management for the division. “By communicating what we found to the services, we really saw an improvement. We were able to put out a clear document: ‘Hey, we’re not doing this right. We know you’re treating the patients well; you’re just not putting it down, so it’s not obvious.’”
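In code terms, the “check” step Donovan and Cooper describe boils down to a per-service rate calculation. Here’s a minimal sketch, assuming run reports with hypothetical field names (these are not actual TEMSIS or NEMSIS element names):

```python
# A minimal sketch of the "check" step: per-service documentation rates
# computed from run data. Field names ("service", "impression",
# "aspirin_documented") are hypothetical, not actual TEMSIS/NEMSIS elements.
from collections import defaultdict

def documentation_rates(runs):
    """Share of cardiac chest pain runs, per service, that document
    aspirin administration."""
    eligible = defaultdict(int)
    documented = defaultdict(int)
    for run in runs:
        if run["impression"] == "cardiac chest pain":
            eligible[run["service"]] += 1
            if run["aspirin_documented"]:
                documented[run["service"]] += 1
    return {svc: documented[svc] / n for svc, n in eligible.items()}

runs = [
    {"service": "Anytown EMS", "impression": "cardiac chest pain", "aspirin_documented": True},
    {"service": "Anytown EMS", "impression": "cardiac chest pain", "aspirin_documented": False},
]
print(documentation_rates(runs))  # {'Anytown EMS': 0.5}
```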
How it was implemented: Reports were distributed to every agency and service in New Hampshire. An accompanying video offered additional help with documenting aspirin, and information was included with rollout videos when protocols were updated, but it didn’t take much more than that.
An additional benefit was gaining some insight into why crews might not always document things like giving aspirin or performing a stroke scale.
With stroke, one problem was the design of the run form: the field for documenting the stroke scale sat in an odd, nonintuitive place, and providers seemed to be simply missing it.
“We moved it to a really obvious place that was always available, and that also helped a lot,” says Cooper. “We changed the run form, and we did the video education. There were a number of efforts, but probably the biggest one was just simply making people aware of how they were doing. Across the board, with any kind of QA efforts we’ve done in the state, that usually is the most powerful way we’ve found.”
With aspirin, an issue was discovered at dispatch: New Hampshire has a statewide 9-1-1 PSAP, and as part of their EMD efforts, dispatchers there advised callers with cardiac chest pain to take aspirin. So some patients were taking it on their own before EMS arrived, and hence EMS wasn’t giving it.
Run forms also lacked any easy way to document a patient not getting aspirin for reasons beyond allergy—because they were unconscious, for instance, or had an oral issue that prevented it. The third version of the National EMS Information System (NEMSIS), which went live in the state June 1, includes pertinent negatives to help clarify such issues.
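To see why that matters for a measure like aspirin administration, consider how a QA check might treat a pertinent negative. This is illustrative only; the value strings below are simplified stand-ins, not actual NEMSIS v3 codes:

```python
# Illustrative only: how a pertinent negative changes the quality picture.
# The value strings are simplified stand-ins, not actual NEMSIS v3 codes.
def aspirin_compliant(run):
    """Count a run as compliant if aspirin was given OR a documented
    pertinent negative explains why it wasn't."""
    med = run.get("aspirin")
    if med == "administered":
        return True
    # v2-era forms had no easy home for these reasons; v3-style
    # pertinent negatives make the non-administration explicit.
    return med in {"medication already taken", "contraindication noted", "refused"}

print(aspirin_compliant({"aspirin": "medication already taken"}))  # True
print(aspirin_compliant({}))                                       # False: undocumented
```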
There was also a challenge in tracking back data through third-party software, which accounts for about 30% of New Hampshire’s EMS data. Says Cooper: “It’s hard to track back through the source software, and to see why someone didn’t document, when we can only see what came into our system.” That ultimately required individual phone calls to service leaders, hospital coordinators and medical directors to ensure the picture was complete.
How it’s worked: “It’s been unbelievable,” says Donovan. “Within 20 days we saw our aspirin administration rates go from the mid-40s to the 70s and finally to the 80s. And the same thing happened with stroke. Now we’re documenting aspirin administration in the 84%–88% range. With the stroke scale we had similar results—documentation increased twofold.”
Bigger picture, it’s represented a step forward for EMS education throughout the state.
“It’s become a much more focused education process,” says Cooper, “where we’re really getting right down to it and getting things back to the services. I think what it comes down to is that providers, for the most part, want to do the right thing. They just don’t always know what that is.”
QUANTIFYING STEMI DELAYS
The problem: “National progress has been achieved in the timeliness of treatment of patients with ST-segment–elevation myocardial infarction who undergo primary percutaneous coronary intervention.”
That was the happy conclusion of a 2011 review in Circulation.1 It cited a decrease in median door-to-balloon times from 96 minutes in the year ending December 31, 2005, to 64 minutes in the three quarters ending September 30, 2010. The percentages of patients with D2B times less than 75 minutes and less than 90 minutes both soared during the interval. And data within the piece showed the improvements occurring in rural systems as well as urban.
That didn’t sit right with Don Rice, MD, then EMS medical director for the state of Nebraska. “I’ve worked in rural areas,” Rice says. “I know many patients there are not seen in a timely fashion—there’s no way.”
Still, it was a claim he heard often—particularly from cardiologists and hospital executives who opposed a statewide STEMI system Rice was tasked with developing in the early 2000s.
Some deep diving into the AHA’s data and Nebraska’s own helped Rice rebut some of those claims of improvement and get the state’s system up and running.
The solution: Back in the days before direct transport to STEMI centers, patients could get delayed in small critical-access hospitals and not get the fast treatment they needed.
The AHA’s data told of overall D2B improvements, but if you looked closely, there were some catches. First, the way it counted the D2B interval started with arrival at the receiving hospital—it carved out all prehospital time.2 Second, in documenting those rural improvements, it defined small hospitals as having up to 300 beds.1
“In Lincoln,” notes Rice, “all three hospitals have fewer than 300 beds. So in this urban environment, all of those would be considered small hospitals.” Lumping very small hospitals in with hospitals like those, he contended, skews the results and masks poor performance at those very small ones.
Third, rural hospitals accounted for just 6.8% of the almost 900 hospitals the study looked at, but in Nebraska they account for roughly a third of all hospital beds. And fourth, supplemental material for the study, released later, revealed that more than 50,000 STEMI cases were excluded from analysis for reasons like missing initial EKG interpretations or other key data elements.
“If you have a heart attack patient and limited staff, your time is best spent to get the patient out as quickly as possible,” Rice says. “So a lot of times rural docs will send an EKG. But maybe they didn’t have a chance at the copy machine to quickly sign the EKG. And if they didn’t do that, it didn’t get included in this study.
“My point is that many of the things that happen in a rural environment are also the very things that make you get excluded from the American Heart Association data. If you look at the reasons why they kicked people out, it would disproportionately affect rural hospitals. I used to work in a small hospital, and this is the type of mistake they make! In larger hospitals that have better QA systems in place, these mistakes would not happen. So what I’m seeing is a large skewing of data.”
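The interval question alone can change the picture dramatically. A small sketch with hypothetical timestamps shows how a transferred rural patient can post an excellent D2B time while still waiting hours for reperfusion:

```python
# Hypothetical timestamps showing how the D2B clock, which starts at
# arrival to the PCI hospital, can exclude hours of rural transfer delay.
from datetime import datetime

def minutes(start, end):
    return (end - start).total_seconds() / 60

cah_arrival = datetime(2010, 6, 1, 8, 0)    # reaches critical-access hospital
pci_arrival = datetime(2010, 6, 1, 12, 10)  # arrives at PCI center after transfer
balloon     = datetime(2010, 6, 1, 13, 5)   # reperfusion

print(minutes(pci_arrival, balloon))  # 55.0 -- a "good" D2B time on paper
print(minutes(cah_arrival, balloon))  # 305.0 -- the delay the patient experienced
```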
To convince the doubters, Rice conducted a grassroots survey of Nebraska’s 65 critical-access hospitals, which was eventually published as the RAMIS study.3 AHA reps maintained those hospitals all had thrombolytic policies, an accepted alternative at the time for STEMI patients seen at facilities not capable of PCI, and standing transfer agreements with PCI-capable hospitals. The RAMIS survey found:
- While 98% of critical-access hospitals had thrombolytic policies, 23% said they had providers who “trend(ed) toward not administering thrombolytics.” Just 60% could say definitively that they had no such providers.
- Just 45% said they had standing transfer agreements with regional STEMI centers.
Says Rice: “Basically, if you had a heart attack in rural Nebraska and went to a critical-access hospital, you had a 23% chance the doctor there was too afraid to use thrombolytics on you, and a 45% chance they didn’t even have a preexisting transfer agreement. That meant you’d be subject to the standard EMTALA pathway for transfer. And that can take several hours.”
Paramedics in the state echoed the claim that getting STEMI patients to cath labs could take hours, so Rice also crunched some state data. The math was pretty straightforward: He took one well-reputed hospital, then looked at all the patients taken there for cardiac issues. Subtracting their arrival time from their departure time revealed how long they were staying.
“The average for a heart attack patient,” Rice says, “was 4½ hours! Now you understand why this was so important.”
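Rice’s report amounts to a few lines of arithmetic over the state database. A minimal sketch, with hypothetical records and field names:

```python
# A sketch of Rice's report: departure minus arrival, averaged over
# cardiac patients at one hospital. Records and field names are hypothetical.
from datetime import datetime

records = [
    {"arrival": datetime(2005, 3, 1, 8, 0),  "departure": datetime(2005, 3, 1, 12, 45)},
    {"arrival": datetime(2005, 3, 2, 14, 0), "departure": datetime(2005, 3, 2, 18, 15)},
]

stays = [(r["departure"] - r["arrival"]).total_seconds() / 3600 for r in records]
print(sum(stays) / len(stays))  # 4.5 -- average hours spent in the ED
```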
How it was implemented: At the hospital, leaders were shocked and abashed. They hadn’t known the delays were happening, or that standing transfer arrangements were even possible.
Within a matter of weeks, work began on the statewide STEMI system.
“I was able to use the database system to prove patients were lingering in ERs and not getting treatment,” Rice concludes. “I’d been fighting politicians and cardiologists for 10 years to create a statewide STEMI system. I ran one report, and in two weeks they came to me and said, ‘What can we do to fix this?’ I said, ‘Let’s develop some protocols.’”
How it’s worked: Those were drafted in two weeks and trialed for 18 months, and though EMS lacks the hospital outcome data to prove their benefit, anecdotal evidence told of success: MI patients were coming home faster, with stents instead of damaged hearts, and getting back to work. Instead of losing revenue, as they’d feared, the critical-access hospitals benefited from a renewed faith in the system that helped business grow. The protocols subsequently went statewide.
“What we needed was the data, and we didn’t have the data in the early 2000s,” Rice says. “Once we got the system and started collecting data, it was easy to prove the point. So the importance of the data system was, it gave us the ability to quantify what was actually going on, and prove to people that what they thought was happening was not happening.”
PUT THE DATA IN, GET THE DATA OUT
The problem: When value-based reimbursement finally comes to EMS, we’ll need to be ready to demonstrate the good job we do. How? By developing quality metrics that define good patient care and then meeting and exceeding them.
Lots of people have developed key performance indicators for EMS, including NHTSA/NASEMSO in 2009, the Metropolitan Medical Directors in 2007, and professional associations in areas like trauma, cardiac and stroke.
Today’s EMS Compass Initiative is a federally funded effort to develop performance measures that can help systems gauge and improve the quality of their care. Its measures will be based on NEMSIS and help local and state systems utilize their own data to get better.
In Washington, leaders had a similar idea several years ago. The work they have done to develop KPIs and facilitate reporting and benchmarking will put the state’s services ahead of the game once the transition to NEMSIS version 3 is complete.
The solution: The Washington State Prehospital Technical Advisory Committee began work on EMS KPIs as part of a three-year strategic plan in 2011. Work groups toiled for three years before the state’s EMS and Trauma Steering Committee okayed them for use in 2014.
“The importance of having the clinical measures is really why we did it,” says Melissa Belgau, administrator for the Washington EMS Information System (WEMSIS). “We think EMS will be wanting to do pay-for-performance types of things, and we’ll need proof we’re providing quality care. The idea is that we’re going to decide what to measure before someone tells us what to measure.”
The 27 KPIs that emerged meet EMS Compass performance measure criteria, are supported by substantial evidence in medical literature and can be measured and reported from WEMSIS data. A range of stakeholders from across the state helped craft them, in particular medical program directors (MPDs), physicians for each county who oversee the clinical performance of EMS personnel. The desire was a strong clinical focus that was “relevant, useful and achievable.” WEMSIS would be the primary source for data collection and analysis, and hopefully the result could serve as a model for other states.
The KPIs were drafted relatively broadly and can be adopted in part or whole or modified for local use. The resulting reports can be configured by local queries so agencies can compare their own data to others similar or nearby. What they’ll ensure, however, is that services measure the same things the same way, allowing effective benchmarking.
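The mechanics of that benchmarking are straightforward once everyone computes the measures identically. Here’s a hedged sketch of the idea, with a made-up KPI and data:

```python
# A sketch of the benchmarking idea: every agency computes the same KPI
# the same way. The KPI (stroke scale performed) and data are made up.
def kpi_rate(runs, eligible, met):
    """Share of eligible runs meeting the performance criterion."""
    pool = [r for r in runs if eligible(r)]
    return sum(met(r) for r in pool) / len(pool) if pool else None

agencies = {
    "Agency A": [{"stroke": True, "scale_done": True},
                 {"stroke": True, "scale_done": False}],
    "Agency B": [{"stroke": True, "scale_done": True}],
}
rates = {name: kpi_rate(runs, lambda r: r["stroke"], lambda r: r["scale_done"])
         for name, runs in agencies.items()}
benchmark = sum(rates.values()) / len(rates)  # simple statewide average
for name, rate in rates.items():
    print(f"{name}: {rate:.0%} vs. statewide {benchmark:.0%}")
```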
State leaders enlisted ImageTrend to add that benchmarking functionality to its Report Writer program, which helps administrators and data managers gauge KPIs and track QA/QI elements. They also built a dashboard for easy bundling of clinical impressions.
How it was implemented: Well, about that: Because not all systems in Washington are compliant with NEMSIS v3, it hasn’t been—yet.
That is, the KPIs are available for use, and some jurisdictions have embraced them, but the full reporting/benchmarking capability is still waiting to be tapped.
“If you’re the only agency in your county using the new system, there’s nothing to benchmark against,” notes Belgau. “So while the KPIs won’t be used on a state level until NEMSIS v3, counties and some EMS agencies have been using them at a local level with their v2 data.”
Agencies face a January 2017 deadline to upgrade to v3. But the state may experiment with some early reporting later this year; its biggest ePCR vendor converts to v3 in September and can start populating data then. “Those initial reports,” Belgau says, “will get other people excited about it.”
Seventeen hundred miles to the east, they’re excited already. When EMS leaders from Nebraska learned of the Washington project at ImageTrend’s 2015 Connect Conference, they eagerly adopted the KPIs. As a condition of the NHTSA funding that enabled the project (through the Washington Traffic Safety Commission), anything that results from the grants must be made available to anyone in the country.
How it’s worked: Ultimately this should all help measure, compare and improve EMS performance in key areas. But on a more basic level, it will provide a fuller, more thorough picture of what EMS in the state is doing and how well.
“WEMSIS was always kind of incomplete because we don’t have a mandate to collect this data,” says Belgau. “So we were always strategizing on how to incentivize people to give us data. And one of the things we hear is, ‘Well, you’re not doing anything with the data.’ But it’s kind of a chicken-and-egg problem: You’re not giving us data that’s meaningful! We’re trying to get the data in there!”
Indeed, per numbers presented at last year’s Connect Conference, just 45% of all the agencies in Washington had ever reported data to WEMSIS, and just 23% had in the preceding six months. Most systems in the state are using ePCRs; they’re just not taking the extra step of sending the data to the state. And with a dozen different vendors sharing the market, that’s a lot of siloed data that can only connect in WEMSIS.
Right now it generally falls to motivated MPDs to drive their systems’ participation, and some are more motivated than others. Building the KPIs into WEMSIS lets agencies take their own lead and do their own reports and comparisons. “They can put the data in and get the data out,” says Belgau.
The Hooley Awards
The Hooley Awards are given in conjunction with ImageTrend’s Connect Conference. This year’s events, with another set of winners, will be held July 20–22 at the RiverCentre in St. Paul. For more information on the Connect Conference, see www.imagetrend.com/news-and-events-connect-conference. For more information on the Hooley Awards, see www.imagetrend.com/news-and-events-connect-conference/hooley-awards.
References
1. Krumholz HM, Herrin J, Miller LE, et al. Improvements in Door-to-Balloon Time in the United States, 2005 to 2010. Circulation, 2011; 124: 1,038–45.
2. Diercks DB. American Heart Association Mission Lifeline: Developing a STEMI Regional Care System, www.emcreg.org/publications/monographs/acep/2009/acep2009_dbd.pdf.
3. Nudell N, Rice D, Gale JA, Wingrove G, Bouthillet T. Rural acute myocardial infarction survey (RAMIS). International Paramedic Practice, 2013 Jan; 2(1): 3–10.