For supply chain executives, one equation can help them navigate many of their challenges: Data (generated through evaluations, unbiased studies, etc.), plus Patience (so that well-executed studies and thoughtful discussion among various stakeholders can take place without undue haste), equals Confidence that the provider has selected the right product or device. D+P=C, for short. It’s an equation that value analysis professionals hold dear.
Wanda Lane, RN, MaED, value analysis manager, Regional One Health, Memphis, Tenn., explained the equation and its significance in a recent webinar, “Using Quantifiable Data to Validate Product Claims.” The webinar was sponsored by the Association of Healthcare Value Analysis Professionals, or AHVAP.
The value analysis professional faces a number of challenges today, said Lane. Costs to operate healthcare facilities keep rising, but reimbursement does not. For that reason, the executive team expects value analysis to help hold the line on expenditures. Meanwhile, hospitals and health systems are increasingly held accountable for favorable patient outcomes and patient satisfaction. For their part, clinicians continue to crave high-quality products and equipment that can increase their efficiency.
The equation D+P=C can help value analysis professionals deal with these sometimes-colliding pressures, said Lane. To illustrate her point, she offered three case studies: 1) bed frames and surfaces; 2) alcohol end caps; and 3) a mechanical device to measure and record urine output.
Bed frames and surfaces
Two years ago, Regional One Health was looking at extensive repair or replacement of many of its bed frames (including those in trauma ICU) as well as surfaces. In its frames, the IDN sought durability and versatility; among surfaces, it sought durability, pressure redistribution and infection prevention. All in all, the replacement project would call for significant dollars being spent.
Beginning with a list of five potential frame vendors, the value analysis team narrowed it to two (primarily because of the traction requirements), said Lane. The team and vendors agreed to a three-week, side-by-side evaluation. “I pulled in the two vendors, and we all sat at the same table and discussed the processes we would go through in the evaluation.” One must-have: Complete transparency. For example, the Regional One Health team told the vendors that if either issued an e-mail to hospital staff about its competitor, the IDN would share that e-mail with the competitor. “It keeps everyone honest,” she said.
The IDN devised a blind surface study, in which volunteers from non-clinical areas agreed to lie on the competing surfaces/frames (with vendor names hidden) for 30 minutes in four different positions to allow for mattress pressure redistribution and skeletal settling. Pressure-mapping photos were taken upon placement, at 15 minutes, and at 30 minutes. The results were tallied by an unbiased third party and delivered, without divulging the manufacturer names, to the chief nursing officer and wound care director. Only after they delivered their choice to materials management were the vendors’ names, with corresponding test results, divulged. It turned out that the winner was not only the older technology but also the cheaper one, allowing the IDN to replace more of its surfaces than originally planned.
“We considered [the process] a success all the way around,” said Lane. “If there was a question of bias, or a question from one of the vendors, we had the photographic evidence of pressure mapping to show everyone. The data spoke for itself.”
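The logic of the blinded protocol can be sketched in a few lines of code. The vendor names, volunteer IDs, and pressure readings below are purely illustrative, not Regional One Health’s data; the point is only that the tally is computed against anonymous labels and the blind is broken only after the choice is made.

```python
# Minimal sketch of a blinded surface comparison. All names and numbers
# are hypothetical; readings stand in for the pressure-mapping photos
# taken at placement, 15 minutes, and 30 minutes.
import random

vendors = ["Vendor X", "Vendor Y"]  # hypothetical vendor names
random.shuffle(vendors)
blind_labels = {"Surface A": vendors[0], "Surface B": vendors[1]}

# Peak interface pressure (mmHg) per volunteer at 0, 15, and 30 minutes;
# lower sustained pressure suggests better redistribution.
readings = {
    "Surface A": {"volunteer 1": [42, 38, 35], "volunteer 2": [45, 40, 37]},
    "Surface B": {"volunteer 1": [48, 47, 46], "volunteer 2": [50, 49, 48]},
}

def mean_final_pressure(surface: str) -> float:
    """Average 30-minute reading across volunteers for one blinded surface."""
    finals = [series[-1] for series in readings[surface].values()]
    return sum(finals) / len(finals)

# The decision is made on the blinded labels alone...
winner = min(readings, key=mean_final_pressure)
print(f"Blinded choice: {winner}")

# ...and only then is the vendor name revealed.
print(f"Selected vendor: {blind_labels[winner]}")
```

Because the decision-makers see only “Surface A” and “Surface B,” any later question of bias can be answered by re-running the tally against the recorded data.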
Alcohol end caps
Typically supported by nursing and physicians, alcohol end caps are designed to assure the caregiver that the vascular access point has been sanitized, said Lane. The IDN was very interested in reducing central-line-associated bloodstream infections, or CLABSIs, but infection prevention was skeptical that alcohol end caps would help them do so. “We didn’t have evidence to support their efficacy,” said Lane. Further, infection prevention raised a concern that with the end caps in place, nursing might discontinue “scrubbing the hub” between cap uses.
The IDN chose a long evaluation period – 120 days – in order to get an accurate picture of the effect the alcohol end caps would have on infection prevention, and compared the results with a similar 120-day period from the prior year. At the end of the evaluation, the Regional One Health team found the device had minimal impact and declined to implement it, avoiding about $250,000 in expenses.
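The before/after comparison amounts to simple arithmetic: an infection rate for the prior-year baseline period versus the same rate for the evaluation period. The sketch below uses a standard CLABSI measure (infections per 1,000 central-line days) with figures that are illustrative only, not the IDN’s actual numbers.

```python
# Hypothetical before/after comparison for the end-cap evaluation.
# All counts are illustrative, not Regional One Health's data.

def clabsi_rate(infections: int, line_days: int) -> float:
    """CLABSIs per 1,000 central-line days."""
    return infections / line_days * 1000

# 120-day prior-year baseline vs. 120-day evaluation period
baseline_rate = clabsi_rate(infections=6, line_days=4800)
evaluation_rate = clabsi_rate(infections=5, line_days=4700)

change = evaluation_rate - baseline_rate
print(f"Baseline:   {baseline_rate:.2f} per 1,000 line-days")
print(f"Evaluation: {evaluation_rate:.2f} per 1,000 line-days")
print(f"Change:     {change:+.2f}")
```

A small change like this, well within normal period-to-period variation, is the kind of result that led the team to conclude the device had minimal impact.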
“We had an evidence-based decision-making tool in the room, and we could show the data,” said Lane. “It set up less of an ‘us-vs.-them’ argument, with less conversation [to the effect that] ‘It’s about the money.’ Instead, it was about the data, and the data didn’t support the project.”
Urine output measurement
Implementing a methodical evaluation process, with data collection, doesn’t always lead to a clean, uncontested outcome, said Lane. She shared the IDN’s experience evaluating the efficacy of a mechanical reading device for urine output (as well as a urine temperature sensor).
Many nurses advocate for the device, because it saves them the time of physically dumping and measuring urine on an hourly basis. Physicians believe it leads to more accurate measurements, particularly among trauma and burn patients. But others believe that because the device gives retrospective measures, it can lead to patients retaining Foley catheters longer than necessary. What’s more, infection prevention raised questions about whether the device could be adequately cleaned. On top of these concerns were the rental and repair costs associated with the device.
The decision was made to compare the performance of the device vs. the practice of manually measuring fluid intake and urine output (I&O) over a 180-day period. The comparison showed that the manual method led to a reduction in time Foleys were in place, and that caregivers had no difficulty getting accurate I&O measurements using the manual method. Nor were there any challenges regarding temperature findings.
Nevertheless, despite the recommendation to discontinue use of the mechanical reading device, that has not occurred, primarily because of strong misgivings by clinicians. “We are still fighting this battle,” Lane said.
Value analysis steps for success
Wanda Lane, RN, MaED, value analysis manager, Regional One Health, Memphis, Tenn., offers the following suggestions to value analysis colleagues about how to make the equation – Data + Patience = Confidence – work in practice:
- Take the time to build the correct team to evaluate new technologies. Don’t be afraid to include the devil’s advocate, that is, the person who is averse to change. “Many times, the devil’s advocate helps us avoid buyer’s remorse,” she said.
- Prior to beginning an evaluation, identify the provider’s goals. If a vendor makes a product claim, the team should create an evaluation process that will measure it. Establishing quantifiable goals may take a little more time, effort and accountability, but the outcomes are worth the effort.
- Collect adequate baseline data, to ensure a thorough comparison between the existing technology and the new one under evaluation. Compare data from the evaluation period with data from a similar, earlier period, and enlist the help of IT (to analyze financial implications) as well as infection prevention.
- Partner with the vendor. Evaluations can be lengthy, Lane pointed out. “Rather than say, ‘We do not pay for evaluations,’ we sit down with our vendors on the front end to discuss exactly how we are going to approach the evaluation, including the length of time it might be expected to take. Then we look at shared responsibility [for the cost of the evaluation].”
- Determine the quasi-experimental design (essentially, the scientific method). Collect data, then use an unbiased third party to analyze the results. This could be someone within the IDN who will never use the product.
- Make recommendations.
- Track results.
- Above all, resist the time pressures often foisted upon value analysis teams. Confidence occurs when the value analysis professional can walk into a meeting with the executive staff; show data from the evaluation compared with retrospective data; and then allow the executive team to make a clear choice whether or not to support the professional’s recommendations.
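The core of the steps above – tie each vendor claim to a quantifiable goal, compare baseline and evaluation data, then make a recommendation – can be sketched as a simple data structure. The claim, metric, and numbers below are illustrative assumptions, not figures from the webinar.

```python
# Hedged sketch of the evaluation workflow: each vendor claim gets a
# measurable metric, baseline and evaluation figures, and a threshold
# for calling the claim validated. All values are illustrative.
from dataclasses import dataclass

@dataclass
class ClaimEvaluation:
    claim: str                 # the vendor's product claim
    metric: str                # how the team will measure it
    baseline: float            # figure from a comparable earlier period
    evaluated: float           # figure from the evaluation period
    improvement_needed: float  # minimum reduction to validate the claim

    def validated(self) -> bool:
        """True if the observed improvement meets the stated goal."""
        return (self.baseline - self.evaluated) >= self.improvement_needed

plan = [
    ClaimEvaluation(
        claim="End caps reduce CLABSIs",
        metric="CLABSIs per 1,000 central-line days",
        baseline=1.25,
        evaluated=1.06,
        improvement_needed=0.5,
    ),
]

for item in plan:
    verdict = "supported" if item.validated() else "not supported"
    print(f"{item.claim}: {verdict} by the data")
```

Writing the goal down before the evaluation starts – the `improvement_needed` field here – is what lets the team walk into the executive meeting with a clear, pre-agreed pass/fail answer rather than a debate about money.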