
Original Contribution

How To Measure and Improve EMS Systems

January 2016

When staff in the National Highway Traffic Safety Administration (NHTSA) Office of EMS decided to support EMS Compass, it made perfect sense. For decades, NHTSA has funded projects that have helped make the nation’s EMS systems what they are today. Twenty years ago, the EMS Agenda for the Future presented a vision of a data-driven EMS system; more recently, the National EMS Information System (NEMSIS) established data standards and the Prehospital Evidence-Based Guidelines (EBG) project established treatment guidelines based on research findings. Each has built on those that came before it, and EMS Compass is no different.

It’s the logical next step—applying those standard data elements and medical research to create performance measures that will help EMS agencies meaningfully use their data to improve patient care.

In December 2014 the National Association of State EMS Officials (NASEMSO), through a cooperative agreement with NHTSA, publicly launched EMS Compass. Since then, the initiative has tackled difficult issues, from how to decide which measures to use to how to be inclusive of the entire EMS community. But the focus of EMS Compass has been creating a replicable process for identifying, designing and testing performance measures—a process that can be used well into the future by organizations hoping to develop measures that support improving prehospital medical care.

“In order to take the next step as a profession, EMS must embrace a culture of improvement—one where we measure the things that truly matter to patients and we work to improve the processes that result in the best possible patient care,” says Bob Bass, MD, former director of the Maryland Institute for EMS Systems who is serving as the EMS Compass Steering Committee Chair.

On January 13 the EMS Compass Steering Committee meets in person for the third time to discuss the progress of the measure development process. The committee, composed of several experts in performance measurement and improvement from both inside and outside the EMS community, is also expected to review and prioritize the first set of performance measures.

Building Measures

But the bulk of the work for EMS Compass has occurred between those meetings, when dozens of volunteer members of the initiative’s working groups have spent countless hours designing, refining and testing the measures.

When the EMS Compass Measurement Design Group first gathered in Washington nearly a year ago, its members knew they had a challenge before them. Since then, they have helped create a measurement design process that is transparent and evidence-based, with opportunities for members of the public to participate.

In addition, they have sifted through hundreds of potential measures submitted by the EMS community and present in the literature, choosing the ones that EMS agencies can use to help them in their pursuit of providing high-quality care to patients and making a difference in their communities.

“Choosing the ‘vital few’ measures and making sure we specify clear operational definitions to know the data to include, what to exclude and how to calculate the measures are essential to ensuring EMS agencies can use the measures to support improvement,” says David Williams, PhD, executive director at the Institute for Healthcare Improvement and chair of the EMS Compass Measurement Design Group. “Measurement is vital to knowing how an agency is reliably delivering results and whether efforts to improve processes are moving their dots on their charts, and EMS Compass will help provide a good place for agencies to start to look at meaningful process and outcome measures that can enable them to focus on improvement and reliability.”

From the start, EMS Compass has focused on using NEMSIS data points in its measures when possible. Because NEMSIS creates a standard for collecting EMS patient data, the vast majority of U.S. EMS agencies are gathering the same information on each patient. NEMSIS-compliant electronic patient care reports (ePCRs) also ensure that the data are stored in the same way, so they can be sent to state and national databases and used for research and analysis.

But for EMS Compass, the NEMSIS standard also means that clinical performance measures can be designed so that any agency using NEMSIS-compliant ePCRs can use the measures in the same way. They can even be built into software that automatically pulls the information from individual ePCRs and calculates how an agency is performing on a specific measure.
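
As a rough illustration of how that automation might work, consider a measure like aspirin administration for suspected cardiac chest pain: once the relevant records are identified, the calculation is simply a numerator over a denominator. This is only a sketch; the field names and inclusion logic below are hypothetical, not actual NEMSIS element IDs or EMS Compass specifications.

```python
# Minimal sketch of an automated measure calculation from ePCR exports.
# Field names and the inclusion/exclusion logic are illustrative assumptions,
# not official NEMSIS element IDs or EMS Compass specifications.

def aspirin_measure(records):
    """Percent of suspected cardiac chest pain patients who received aspirin."""
    denominator = [
        r for r in records
        if r.get("primary_impression") == "chest pain - cardiac"
        and not r.get("aspirin_contraindicated", False)  # exclusion criterion
    ]
    numerator = [
        r for r in denominator
        if "aspirin" in r.get("medications_administered", [])
    ]
    if not denominator:
        return None  # no eligible patients in this reporting period
    return 100.0 * len(numerator) / len(denominator)

# Example: two eligible patients, one received aspirin -> 50.0
calls = [
    {"primary_impression": "chest pain - cardiac", "medications_administered": ["aspirin"]},
    {"primary_impression": "chest pain - cardiac", "medications_administered": []},
    {"primary_impression": "abdominal pain", "medications_administered": []},
]
print(aspirin_measure(calls))
```

The same pattern applies to most clinical process measures: define who belongs in the denominator, define what counts toward the numerator, and let the software do the counting.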

To support EMS Compass’s efforts to design measures that can be automated when using NEMSIS data, the project has enlisted a group of volunteers from several leading EMS technology and software companies. Chaired by FirstWatch Product Strategist Debbie Gilligan, the EMS Compass Technology Developers Group has focused on matching the proposed performance measures to NEMSIS data points, and testing them to make sure EMS agencies could implement the measures with the EMS clinical data they already have.

“The possibilities created by having electronic records, a uniform data standard, and performance measures using that data are pretty exciting,” Gilligan says. “It’s been fun to get in a room with people from all these innovative companies who are in many ways competitors, but who all have one goal in mind—finding ways to make it easier for people to get more out of their data.”

Building consensus around performance measures is not easy, as many other areas of healthcare have already discovered. During the EMS Compass process, the committee members and project leaders have navigated some complicated questions, from what level of evidence review was needed to which sources of data could be considered. Early on, it became clear that many members of the initiative recognized some key measures—clinical outcomes—would rely on EMS agencies being able to access data from outside sources, such as hospitals.

“Information should flow seamlessly from prehospital care to hospital care and back,” Bass says. “To me I see that as a really important part of what we’re doing—to say here’s the science, here are the measures, here’s the way it needs to be done.”

At the same time, he acknowledged, most EMS agencies continue to struggle to get access to outcome information from hospitals. Recognizing that, many of the measures were designed as surrogate measures that rely on data providers already collect when completing patient care reports, to be used only until agencies can link with hospital data and calculate the complete measures.

“Maybe you have to do that initially just so you get some baseline on the EMS processes and start measuring,” Bass says. “But we have to be willing to say that the right way to measure performance often includes patient outcomes, such as survival-to-discharge for cardiac arrest. And maybe having a measure that says that will help improve the state of data-sharing between hospitals and EMS systems.”

Establishing a Foundation Based on Evidence

Equally important to using available data is creating measures that are based on the latest evidence and best practices in prehospital care. The ultimate purpose of performance measures is to improve patient care and EMS practice—and therefore the patient experience and outcomes. Performance measures often drive programmatic changes, so it was essential to the EMS Compass leaders to only measure processes that have a demonstrated positive impact.

For example, while measuring IV success rates has long been a standard for many EMS performance management programs, there has never been a proven association between IV success rates and patient outcome. While it is clearly important that paramedics can competently perform skills they are expected to perform, EMS Compass leaders chose to focus on measures linked to patient outcomes, such as the ability to accurately identify stroke patients, or administration of aspirin for heart attack victims. Members of the initiative looked to sources such as American Heart Association guidelines and articles published in the peer-reviewed medical literature to ensure the EMS Compass measures would be assessing evidence-based practices.

“A key opportunity in improvement is to use measurement to support EMS systems to reliably deliver evidence-based care,” Williams says. “Not measuring the care processes that matter will not improve outcomes.”

Focus on Local Improvement

EMS Compass leaders say the initiative is focused solely on creating measures that support systems to improve care, and the project has no ability or authority to require agencies to use or report the measures. But with changes to healthcare funding occurring across the industry, many members of the EMS community and the EMS Compass team have acknowledged that in the future, healthcare payers, local governments and other entities may look for measures to tie to EMS payments or to use to hold systems accountable. In a sense, this is already happening in cities across the country that tie contract payments for EMS services to response time requirements. If insurance companies and the U.S. Centers for Medicare & Medicaid Services (CMS) later link payments to performance, some EMS leaders argue, it would be better for them to use measures developed by the EMS community and based on solid medical research.

“Our goal is to create evidence-based measures that support the improvement of the quality of care at the local level, period,” says Dia Gainor, executive director of NASEMSO. “If agencies, regulators or communities choose to use the EMS Compass measures, then they will be using measures that are good for patients and good for EMS.”

Involving the entire EMS community and building consensus around measures was a priority for the initiative, regardless of other potential uses of the measures. From the start, EMS Compass has involved dozens of EMTs, paramedics, educators, administrators, medical directors and other experts on its various committees. The steering committee also included several experts from outside EMS, including Kedar Mate, MD, a physician and improvement expert with the Institute for Healthcare Improvement (IHI); Patria de Lancer Julnes, PhD, a performance measurement expert and researcher at Penn State Harrisburg; Todd Olmsted, PhD, a health economist at the University of Texas; and Martha Hayward, a patient advocate with IHI.

A Community Effort

Beyond the impressive roster directly associated with the project are the dozens of individuals and agencies who submitted measures during the public “Call for Measures.” In order to be as open and inclusive as possible, EMS Compass began its measurement design process by asking members of the public to submit ideas for measures. The response exceeded even the expectations of the initiative’s leaders when they received more than 400 submissions.

“From the beginning, I’ve been receiving so many emails, seeing great turn-out at meetings, hearing from so many different members of the EMS community,” says Nick Nudell, the EMS Compass project manager. “It’s exciting to see so many people engaged and interested in contributing.”

Since then, EMS Compass has hosted several webinars and conference sessions to share information and receive feedback, and the proposed measures are all available for public comment on the initiative’s website, emscompass.org, prior to review by the steering committee. EMS Compass has also encouraged agencies to test the specific measures to ensure that agencies are able to access the data and calculate the measures as intended.

Even with a process so focused on inclusiveness, evidence and testing, it is clear that EMS performance measures created today cannot be thought of as permanent. As research reveals new findings and different data becomes available, the EMS community must be willing to adapt and re-evaluate performance measures. What EMS Compass has focused on, rather than specific measures, is a process for developing and revisiting measures. The process is based largely on the model used by the National Quality Forum and the larger healthcare community, and the members of the EMS Compass team hope that what they’ve created and fine-tuned will be used in the future by the EMS profession to develop new measures, reassess old ones and ensure that EMS agencies continue to have the tools they need to support improvement.

“The real legacy of EMS Compass, and our main measure for knowing it is a success, will be a culture of performance improvement across EMS, from volunteers in the smallest rural agencies to chiefs in the busiest urban departments,” Gainor says. “Everyone in EMS shares the same goal—providing the best care to our patients—and EMS Compass will help us do just that.”

In fact, creating a sustainable process for designing measures is only part of the EMS Compass initiative; another result of the project will be a guide to using performance measures for improvement. In the past, efforts have frequently focused solely on the measures themselves: while agencies have started collecting data and even calculating measures, many struggle to implement change based on what they learn.

A critically important part of that process will be a shift away from thinking of measurement as a tool for compliance, accountability or judgment. Measurement for improvement focuses less on people and errors and more on understanding the process reliability that enables good outcomes. This is a major cultural shift for EMS, which has not had widespread experience with using measurement for improvement.

“Figuring out how to calculate the measures might seem like the difficult part, but it’s just the first step,” Bass says. “What’s usually the real challenge is knowing what those numbers mean, recognizing when improvements can be made and figuring out the best way to drive that change.”

The EMS Compass Legacy

Over the next several months, the EMS Compass team will also be leading discussions about how to ensure the increased focus on measurement for improvement doesn’t end when the funding for the current initiative runs out. Some leaders in the EMS community hope that federal funding will be used to extend the EMS Compass process. Others have suggested that EMS stakeholder associations could work together to keep EMS Compass alive. Another possibility is that the EMS Compass process of designing performance measures could be adopted by other organizations looking to create measures.

While the exact future of EMS Compass beyond its initial funding is not decided, what is clear is that the EMS profession is ready to move beyond simply measuring IV success rates and response times. By using evidence-based and thoughtfully designed performance measures, EMS agencies have the opportunity to improve the quality of patient care and enhance the services they provide to their communities. And that’s a goal every EMS provider can agree on.

How EMS Leaders Use Data to Improve EMS Systems

“If we have measures that are truly patient-centered…I believe we’ll make much better decisions and make much more significant improvement,” said Mike Taigman, General Manager of AMR in Ventura County, CA, during one of a series of 10 webinars held in June 2015 by EMS Compass as part of the initiative’s effort to engage and receive input from the EMS community. Each webinar tackled one of the nine domains of measures EMS Compass is addressing.

Taigman admitted that measuring patient outcomes is a struggle in EMS and medicine generally. But that doesn’t mean EMS shouldn’t strive to have performance measures that are as patient-centered as possible.

During the session on population and public health, Taigman described the process measures that he believes EMS can use when true patient outcome measures are difficult to assess. A process measure, EMS Compass Project Manager Nick Nudell explained, is one that evaluates a step in the process that is linked to outcomes but is not the outcome itself.

Taigman’s examples included measuring the time from the first 9-1-1 call to certain evidence-based procedures for time-sensitive conditions, such as the following (a simple interval calculation is sketched after the list):

  • First chest compression for cardiac arrest;
  • Restoration of blood flow for STEMI;
  • First CT scan interpretation for stroke;
  • Infusion of two liters of fluid for sepsis.
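
As a minimal sketch, assuming each cardiac arrest record carries timestamps for the initial 9-1-1 call and the first chest compression (hypothetical field names, not part of any standard), an interval measure of this kind is just a difference of timestamps summarized across calls:

```python
# Rough sketch of a call-to-intervention process measure. The timestamp
# field names below are assumptions for illustration, not a standard schema.
from datetime import datetime
from statistics import median

FMT = "%Y-%m-%dT%H:%M:%S"

def minutes_between(start, end):
    """Elapsed minutes between two ISO 8601 timestamps."""
    return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60

# Example: time from first 9-1-1 call to first chest compression for cardiac arrests
arrests = [
    {"psap_call_time": "2016-01-13T08:00:05", "first_compression_time": "2016-01-13T08:06:35"},
    {"psap_call_time": "2016-01-13T09:15:00", "first_compression_time": "2016-01-13T09:27:10"},
]
intervals = [minutes_between(c["psap_call_time"], c["first_compression_time"]) for c in arrests]
print(f"Median call-to-compression interval: {median(intervals):.1f} minutes")
```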

There are six domains of measures that correlate to the priorities in the U.S. Department of Health and Human Services National Quality Strategy. Population and public health is one; the other five are patient and family engagement, patient safety, care coordination, efficient use of healthcare resources, and clinical process and effectiveness. In keeping with the project charter to address all aspects of an EMS system, the initiative added three other domains: workforce, fleet and data.

In the EMS workforce webinar, Daniel Patterson, PhD, a senior scientist with the Carolinas Healthcare System in Charlotte, NC, discussed ways to measure fatigue and the safety culture within an organization—all of which have been linked to safety outcomes by research, he said.

While in the past some of these measures have used questionnaires that can be time-consuming to administer to staff, Patterson and other researchers have been working to refine those surveys to make them shorter but still valid. Other ways of measuring these factors, such as using text messaging to assess fatigue, are currently being investigated.

Mike Ragone, director of system design for AMR, spoke about the difference between measuring “on scene” times and “at patient” times during the session on efficient use of healthcare resources. “On scene time versus at patient side, as we all know, can be a huge difference,” Ragone said, citing the example of responding to a casino where it may take 15 minutes to reach the patient after arriving. “If we do not separate them, we won’t be able to draw conclusions” about the clinical relevance of response times, he added.

Ragone discussed some of the different ways systems are trying to capture accurate at-patient times, from communicating via radio so dispatchers can mark the time to using handheld devices, such as smartphones, that allow practitioners in the field to record the time accurately.
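
A small worked example, using hypothetical timestamps for a single response, shows how far apart the two intervals can be:

```python
# Illustrative comparison of "call to on scene" vs. "call to at patient side"
# intervals for a single response; field names and times are hypothetical.
from datetime import datetime

FMT = "%Y-%m-%dT%H:%M:%S"

def minutes_between(start, end):
    return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60

response = {
    "call_received":    "2016-01-13T21:00:00",
    "arrived_on_scene": "2016-01-13T21:07:30",  # unit pulls up to the casino entrance
    "at_patient_side":  "2016-01-13T21:22:30",  # crew reaches the patient inside
}

print(f"Call to scene:   {minutes_between(response['call_received'], response['arrived_on_scene']):.1f} min")
print(f"Call to patient: {minutes_between(response['call_received'], response['at_patient_side']):.1f} min")
```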

Although EMS Compass aims to create performance measures around the data standards established by NEMSIS, it was clear that this effort to use data to assess system performance may also drive changes to how data is collected, as EMS systems determine which elements are most critical to analyze.

Editor’s note: The EMS Compass Steering Committee is scheduled to meet on Jan. 13 to review and prioritize several performance measures. For the latest update, visit emscompass.org.