Tag Archives: patient safety

It’s Faster to Send a Rescue Mission to the International Space Station Than to the South Pole

By ThinkReliability Staff

Yes, you read that correctly. Says Ron Shemenski, a former physician for the station, “We were stuck in a place that’s harder to get to than the International Space Station. We know we’re on our own.” A sick astronaut on the International Space Station can jump in the return vehicle permanently parked at the station and make it back to Earth in about 3.5 hours. In contrast, just getting a plane to the Amundsen-Scott South Pole research station takes 5 days – in good weather, which is not at all the situation right now: at the South Pole, it’s the very middle of winter.

This makes for an incredibly risky evacuation. It’s so risky that the scientists at the station expect to stay there from February to October, no matter what. In 1999, the on-site physician performed a biopsy on herself and administered her own chemotherapy. A scientist who suffered a stroke in 2011 had to wait until the next scheduled flight. However, winter medical evacuations have been performed twice before in the history of the station (since 1957), in April 2001 and September 2003. Those two evacuations were performed by the same company that will perform this rescue. On June 14, the National Science Foundation (which runs the station) approved the medical evacuation of a scientist there. Two flights left Calgary, Canada that same day.

What makes the evacuation so risky that there is a debate over whether or not to rescue an ailing scientist? Multiple factors are considered in the decision. These issues can be developed within a cause-and-effect diagram, presented as a Cause Map. The first step in the process is to determine the impacts to the goals that result from a problem. In this case, we will look at the problem of a scientist at the South Pole becoming ill and requiring evacuation. There is an impact to the patient safety goal due to the delay of medical treatment, as well as an impact to the safety of the aircrews on the flights used to rescue the scientist. There is also an impact to property/ equipment and labor/ time due to the risky, complex evacuation process.

In the analysis (the second step of the process), the impacted goals become the effects in the first cause-and-effect relationships. The delay in medical treatment for the patient (the ailing scientist) results because the required treatment is not available at the station, although a physician and a physician’s assistant staff the clinic throughout the winter. There is also a delay while the decision to send an evacuation plane is made; in this case, a day and a half of deliberation was required. As previously discussed, planes normally do not arrive at the station during the winter – it has happened only twice in nearly 60 years. To reduce the chance of an illness requiring evacuation as much as possible, the crew at the station undergoes a rigorous medical screening.

Medical treatment is also delayed by the time required for the plane to arrive at the South Pole, and then for the plane to return the patient to a medical treatment center. (Which center is determined by the nature of the medical issue, which has not been disclosed, but the nearest centers are thousands of miles away.) The trip to the South Pole takes at least 5 days because of the complexity of the process. It also poses a risk to the air crews making the trip. (Two planes are sent in: one for the evacuation and one to remain nearby in a search-and-rescue capacity.)

The conditions in Antarctica are the cause of many of the difficulties. The sun set at the station in March and will not rise again until September, so the plane must land without any daylight. It also has to land on packed snow/ ice, which requires skis, as there are no paved runways, and the average winter temperature is -76°F (with wind chill it feels like -114°F). At those temperatures, most jet fuel freezes, so only certain planes can make the trip. (This is why they’re coming from Canada.) The planes can only hold 12-13 hours of fuel, and the last leg of the trip (across Antarctica) takes 10 hours (again, in good weather), so a few hours into the flight the crew must either turn back or commit to landing at the South Pole, regardless of conditions. Due to the desolation of the area, there’s nowhere else to land or refuel.
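A rough back-of-the-envelope sketch of that turn-back decision, using only the figures quoted above and assuming still air and a symmetric return leg (the real decision point depends on winds, weather and fuel reserves):

```python
# Back-of-the-envelope sketch only -- not the crews' actual flight planning.
# Assumes still air and a symmetric return leg.

fuel_endurance_hr = 12.0      # plane holds roughly 12-13 hours of fuel
leg_time_hr = 10.0            # last leg across Antarctica in good weather

# The crew can turn back only while the fuel needed to fly home is still
# on board: time out + time back <= endurance.
last_turn_back_hr = fuel_endurance_hr / 2.0   # ~6 hours into the flight

print(f"Turn-back is possible only within the first {last_turn_back_hr:.0f} hours;")
print(f"after that, the crew is committed to the remaining "
      f"{leg_time_hr - last_turn_back_hr:.0f}+ hours to the Pole, whatever the weather, "
      f"because there is nowhere else to land or refuel.")
```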

Currently one plane has made it to the South Pole, where it will wait for at least ten hours to allow the flight crew to rest and monitor the weather. The second plane remains at the Rothera Research Station, on Adelaide Island on the edge of Antarctica. Check for updates by clicking here. View the one-page downloadable Cause Map by clicking “Download PDF” above.

 

Particulate Matter Closes Operating Rooms at VA Hospital

By ThinkReliability Staff

On February 17, 2016, the 5 operating rooms at a New York Veterans Affairs (VA) hospital were closed due to particulates falling from the air ducts. An internal email from the engineer & safety officer to administrators at the hospital described the problem: “The dust is depositing on HVAC registers, ceilings, walls, and on medical equipment. Maintenance continues to clean the surfaces but, as the staff has observed, the dust reappears within a short time. At least three staff members have indicated their concern that this environment has affected them. They have been sent to employee health and to their individual physicians.”

The information related to this issue determined as part of the incident investigation can be captured within a Cause Map, a visual form of root cause analysis. The first step of the process is to determine the impacts to the goals. In this case, both patient and employee safety are impacted due to the risk of illness from exposure to the particulates. The environmental goal is impacted because of the release of the particulates into the facility. Patient services are impacted because patients are being sent to other facilities, and sterile procedures are not being performed (an impact to the production/ schedule goal). The labor and time required for an investigation are an impact to the labor/ time goal.

The second step of the process is the analysis: determining why these goals were impacted. The particulates were released into the facility because particulates were present within the air ducts, and the air ducts open into the facility to provide heating, ventilation and air conditioning. To determine where the particulates came from, it first had to be determined what they were composed of. An environmental analysis determined that the particulates were rust, crumbling concrete, fiberglass fibers, and cladosporium (a common mold).

The analysis also identified that rust in air systems typically results from aged equipment exposed to moisture. Cladosporium also results from exposure to moisture. The air duct system pulls in outside air, including humidity, resulting in the system being exposed to moisture. The VA hospital is 45 years old, which actually makes it one of the “newer” VA facilities. (According to the VA, about 60% of its facilities are more than 60 years old.) While it’s unclear what maintenance or replacements have been performed on these components over the life of the facility, deferred maintenance is a general problem at VA facilities. According to the VA inspector general, there is a $10-12 billion maintenance backlog at the department.

Once the causes of the problems (or impacted goals) have been determined, the last step is to implement action items to reduce the risk of the problem recurring. There are two parts to this step: brainstorming possible solutions, and determining which will be most effective in meeting the organization’s needs. The hospital considered bringing in mobile surgical units and installing high efficiency particulate air filters in the vents in the operating rooms. The cost of the mobile surgical units (over $70,000 per month) led the hospital to select only the air filter solution. At least one operating room is expected to be ready to return to service by June 1.

To view a one-page downloadable PDF of the incident investigation, including the impacted goals, analysis with evidence, and possible solutions, please click on “Download PDF” above.

Regulators ask hard questions about blood testing startup Theranos

By Kim Smiley

The biotech startup Theranos has been all over headlines in recent years.  At first the company made news for its ambitious goal of running comprehensive laboratory testing on just a few drops of blood.  The company has claimed to have created a handheld medical device (nicknamed Edison) that uses only a finger prick of blood and makes blood testing less painful, faster and cheaper.  Theranos’ young and compelling founder Elizabeth Holmes has been featured in multiple magazines, given a popular TED talk and even been compared to Steve Jobs and Bill Gates. In 2014, the company was valued at $9 billion.

Lately, the headlines have changed as the company has become embroiled in controversy.  The multiple concerns about Theranos can be visually represented in a Cause Map, a visual format for performing root cause analysis.  A Cause Map intuitively breaks a problem down to the basic cause-and-effect relationships and visually lays them out.  (Click on “Download PDF” to view an intermediate Cause Map of these issues.)  Many of the issues raised haven’t been proven yet and require more evidence, so a question mark is used to note these open questions within the cause boxes.

The problems for the company started coming to a head in the latter half of 2015. A December 2015 report by The Wall Street Journal, At Theranos, Many Strategies and Snags, raised concerns about the accuracy of the company’s proprietary handheld blood testing device.  Studies showed that the results of the Edison device differed from testing done by traditional blood testing methods. Additionally, inspections over a three-week period in August and September 2015 at two Theranos facilities found multiple issues.  Specifics on the exact problems found during the inspections have not been released, but they have been described generically as problems with record keeping, quality audits, and handling of consumer complaints. The FDA has also raised concerns about the approval of a medical device called a nanotainer that is used by Theranos. The nanotainer was classified as a Class I exempt device during the approval process, when it should have been classified as a higher-risk Class II device, which would have received greater scrutiny during the approval process.

A federal criminal investigation into Theranos is now underway looking into claims the company made about its technology.  A separate probe by the Securities and Exchange Commission is working to determine whether the company misrepresented its new blood testing technology and its claim that it could run a full range of laboratory tests from just a prick of blood from a finger.

As of right now, Theranos has taken a beating in the court of public opinion, but the company has not been convicted of anything and is still selling blood tests at 40 Walgreens locations in Arizona.  Only time will tell the fate of the company, but the issues it has faced can be seen as a cautionary tale for other biotech startups.  Even if the company is cleared of all wrongdoing, there are lessons to be learned about ensuring laboratories meet all appropriate standards and ensuring proper approvals of all medical devices.

Deadly medication error illustrates danger of discharge period

By ThinkReliability Staff

Medical errors can happen anywhere and at any time. However, these errors may be most likely to occur at transitions, especially the transition from the hospital to home when follow-up care is still required. Says Alicia Arbaje, an assistant professor at the Johns Hopkins School of Medicine, “Poor transitional care is a huge, huge issue for everybody, but especially for older people with complex needs. The most risky transition is from hospital to home with the additional need for home care services, and that’s the one we know the least about.”

The case of a woman’s death from medication errors during this transition illustrates multiple errors that can occur during this period. We will capture the known details of this issue in a Cause Map, a visual root cause analysis. The first step in the Cause Mapping process is to capture the what, when, where and impacted goals in a Problem Outline. In this case, the patient passed away October 30, 2013, after discharge from a regional medical center in Missouri, where she had been treated for congestive heart failure. Organizational goals that were impacted include the patient death (an impact to the patient safety goal), settlements with both the hospital and the pharmacy (impacts to both organizations’ legal/ financial goals), and the patient being administered a high dose of the wrong medication (an impact to the patient safety goal).

These impacts to the goals become the “effects” of cause-and-effect relationships. The Cause Map contains all the cause-and-effect relationships that led to these impacted goals. Causes included in the map are verified with evidence, which can be provided by a variety of sources. Causes can be determined by asking “Why” questions, but more than one cause may be required to produce an effect. In that case, all necessary causes are included and joined with an “AND”.

The patient safety goal was impacted because of the patient’s death due to multiple organ failure, which occurred when her bone marrow became unable to create blood cells as a result of an overdose of methotrexate. Methotrexate, which is primarily used to treat cancer and severe arthritis, can suppress blood cell counts. The patient was administered a high dose (for methotrexate) of a drug that was not prescribed for her. When the patient left the hospital, the hospital phoned in an order for a daily dose of the diuretic metolazone. However, according to court evidence, the order was written down by a pharmacy technician as a daily dose of methotrexate.

Because of its side effects, methotrexate is included in a list of eight “high-alert” medications that warrant special safeguards to prevent incorrect dispensing. The typical dose of methotrexate is also much lower, and it is usually taken only once or twice a week, not daily. Despite this, the pharmacist missed the error. In testimony, he was unable to identify a specific reason for this oversight. The pharmacy manager said “there was a breakdown in the system.”

There were more opportunities for this error to be caught before the drug was dispensed to the patient. The patient herself could have noticed the incorrect medication based on the name or the information on the enclosed information sheet. However, the patient likely did not fully understand the discharge instructions; federal data shows that less than half of patients say they’re confident they understand discharge instructions. This patient was also receiving home health care, but neither of the two nurses who saw the patient identified the medication mix-up. Even though a primary purpose of home health care is to develop and follow up on a plan of patient care, a 2013 government report found that more than a third of facilities did not do this properly. Medicare requires that home health agencies verify patients’ medications and check for possible interactions, but inspectors found that nearly a quarter of home health agencies inadequately reviewed or tracked medications for new patients. One of the challenges is that the typical providers of post-discharge patient care (nursing homes, rehabilitation facilities and home health care providers) did not receive any of the funding provided by Congress to upgrade to electronic medical records.

Several systemic issues were identified in this case, and actions meant to address them are still ongoing. One reason for increased use of electronic medical records is to avoid delivering prescriptions over the phone, which can result in transcription errors. Ensuring patients better understand their discharge instructions is another goal that could improve patient safety. Lastly, improvements to home health care agencies to ensure their required tasks are completed effectively are clearly needed, but it has been difficult to determine the most effective way to achieve this.

To view the Cause Map of this incident, click on “Download PDF” above. Or, click here to read more.

NIH suspends work at two facilities

By Kim Smiley

Research has been suspended at two National Institutes of Health (NIH) facilities – a National Cancer Institute laboratory working on cell therapy production and a National Institute of Mental Health facility that makes positron emission tomography materials – over concerns about patient safety. A panel of experts determined that these facilities were not in compliance with quality and safety standards and they are shut down pending a review and any necessary upgrades.

A Cause Map, a visual format for root cause analysis, can be built to help understand this issue.  The first step in the Cause Mapping process is to fill in an Outline with the basic background information for an issue, along with how the issue impacts the overall goals. Thankfully, no patient harm has been identified as a result of issues at the facilities, but the potential for patient harm existed and potential impacts should be included on the Outline. No new patients will be enrolled in the affected trials until the issues are resolved, which is an impact to the schedule/operations goal. Once the Outline is complete, the Cause Map is built by asking “why” questions, and the answers are laid out to visually show the cause-and-effect relationships. (Click on “Download PDF” to see a completed Outline and high level Cause Map of this issue.)

So why was work at two NIH facilities shut down? A little background is needed to understand this issue. In April 2015, fungal contamination was found in products that were supposed to be sterile, prepared at a different NIH facility, the Clinical Center’s Pharmaceutical Development Service. The investigation into the contaminated product found multiple deficiencies, both in the facility itself and in work practices. The deficiencies included a missing filter in an air handling system and insects found in two light bays in clean rooms. (Read our previous blog to learn more.) Following this issue, the director of NIH appointed a panel of experts to review safety compliance at all other NIH facilities that produce sterile or infused products for administration to research participants.

The panel’s evaluation is still underway, but preliminary findings determined that the two facilities in question are not in compliance with quality and safety standards and production has been suspended as a result.  The panel found that NIH has many outdated or inadequate facilities and that personnel lack expertise on applicable regulations, but no specific details about the deficiencies found have been released. NIH plans to do a rigorous review to identify and correct issues found before these facilities resume manufacturing sterile products. No timeline has been given at this point.

The final step in the Cause Mapping process is to identify and implement solutions to reduce the risk of similar errors recurring in the future. In addition to correcting the deficiencies found at these facilities, NIH is working on creating more oversight to help ensure manufacturing facilities are in compliance with safety regulations. The panel recommended the creation of both an outside hospital board to oversee the clinical center and a new central office to coordinate research quality and safety oversight.

Only time will tell how effective these solutions prove to be, but I find it promising that NIH proactively reviewed all of the facilities that produce sterile or infused products for administration to research participants following the fungal contamination issues last year.  It may be painful and embarrassing to suspend work at facilities, but the process is at least moving in the right direction if problems can be corrected before patients are harmed.

Programming Errors Can Impact Patient Safety

By ThinkReliability Staff

Clinical decision support systems (CDSS) aim to improve health care quality, safety and effectiveness by providing alerts to providers based on criteria (such as identifying drug interactions). However, a malfunctioning CDSS can actually reduce patient safety when physicians rely on these alerts.

According to “Analysis of clinical decision support system malfunctions: a case series and survey” by Adam Wright, et al, published March 28, 2016, “CDSS malfunctions are widespread and often persist for long periods. The failure of alerts to fire is particularly difficult to detect. A range of causes, including changes in codes and fields, software upgrades, inadvertent disabling or editing of rules, and malfunctions of external systems commonly contribute to CDSS malfunctions, and current approaches for preventing and detecting such malfunctions are inadequate.”

A survey that was part of the analysis found that 93% of the Chief Medical Information Officers who responded had experienced at least one CDSS malfunction, and two-thirds experienced CDSS malfunctions at least annually. Four such malfunctions were found within the CDSS at Brigham and Women’s Hospital and were presented as case studies. We will examine one of these case studies within a Cause Map, or visual form of root cause analysis.

The first step in any root cause analysis method is to identify the problem. The CDSS malfunction in this case study involved an alert for annual thyroid testing in patients prescribed amiodarone that had stopped firing. When the issue was noticed and resolved in February 2013, it was determined that the alert had not fired since November 2009, when the internal code for the drug amiodarone was changed.

An important step in describing the problem is to determine the organizational goals that were impacted. In this case, patient safety is impacted because of the potential for untreated thyroid issues and patient services are impacted because of the potential of missed testing.

The second step is to perform the analysis by developing the cause-and-effect relationships that led to the impacted goals. In this case, patient safety is impacted because of the potential for untreated thyroid issues. Patients taking amiodarone to treat arrhythmia may develop thyroid issues, a known side effect of the drug. If staff is unaware of a patient’s thyroid issues, that patient won’t be treated, and staff would be unaware of thyroid issues if testing is not performed.

The goal of clinical decision support systems is to identify interventions based on patient needs – in this case, the hospital created an alert to suggest thyroid testing for patients who had been taking amiodarone and had not had a thyroid test in at least a year. Based on typical alert values from the years prior to 2009, the analysis determined that more than 9,000 alerts suggesting thyroid testing were missed.

Thyroid tests were missed because the CDSS did not identify the need for thyroid testing, and because physicians may rely on the CDSS to recommend a test like this one. The alert was originally set up to identify patients taking amiodarone (then code 40) with a start date at least 365 days ago, and no thyroid test values from within the last 365 days. In November 2009, the internal code for amiodarone changed to 7099, but the logic for the alert was not changed. (The reason for the code change is unclear.) As patient records were updated with the new code for amiodarone, the alert failed to identify them for thyroid testing.
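To make the failure mode concrete, here is a minimal sketch of alert logic like that described above. It is not the hospital’s actual CDSS rule; the record structure, field names and the “TSH” test name are assumptions made for illustration.

```python
# Minimal sketch of a rule like the one described above -- not the hospital's
# actual CDSS logic. Record structure, field names and "TSH" are assumptions.

from dataclasses import dataclass, field
from datetime import date, timedelta

AMIODARONE_CODE = 40   # internal drug code hardcoded in the rule;
                       # the code was changed to 7099 in November 2009

@dataclass
class Medication:
    code: int
    start_date: date

@dataclass
class LabResult:
    name: str
    result_date: date

@dataclass
class Patient:
    medications: list[Medication] = field(default_factory=list)
    lab_results: list[LabResult] = field(default_factory=list)

def thyroid_alert_due(patient: Patient, today: date) -> bool:
    """True if the patient should get a 'thyroid test due' alert."""
    year_ago = today - timedelta(days=365)
    # Active amiodarone order started at least 365 days ago...
    on_amiodarone = any(
        m.code == AMIODARONE_CODE and m.start_date <= year_ago
        for m in patient.medications
    )
    # ...and no thyroid test result within the last 365 days.
    recent_thyroid_test = any(
        lab.name == "TSH" and lab.result_date >= year_ago
        for lab in patient.lab_results
    )
    return on_amiodarone and not recent_thyroid_test

# A record updated to the new amiodarone code (7099) no longer matches
# AMIODARONE_CODE, so the rule silently stops firing: no error is raised,
# the alert simply never triggers for that patient.
updated_patient = Patient(medications=[Medication(code=7099, start_date=date(2008, 1, 1))])
print(thyroid_alert_due(updated_patient, today=date(2013, 2, 1)))   # False -- a missed alert
```

The sketch illustrates why this failure is hard to detect: a rule that stops matching produces no error message, only an absence of alerts.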

The issue was identified during a demonstration of this particular feature of the CDSS and fixed the next day. While the details aren’t known, this issue points to an ineffective change management program. When changes are made within systems, change management processes are necessary to ensure there are no unintended consequences. While updating the amiodarone code in the alert logic fixed this particular problem, a robust change management program is necessary to ensure that there are no other unintended consequences that could affect patient safety.

To view a visual root cause analysis of this example, please click on “Download PDF” above.

16 patients infected with hepatitis C; thousands potentially exposed

By Kim Smiley

At least 16 patients were infected with hepatitis C after receiving treatment at two hospitals in Utah. Additionally, officials have stated that an estimated 7,200 patients may have been exposed to hepatitis C.  Investigators are working to determine exactly what happened and to test patients who were potentially exposed.

Hepatitis C is a blood-borne illness and cannot be spread by casual contact, including through saliva or sharing food and water. It is not an illness that would typically be at risk of transmission from a healthcare professional to a patient. A nurse who tested positive for a rare form of hepatitis C worked at both hospitals, and each hospital has had at least one patient test positive for the same rare form of hepatitis C. Officials have not released detailed information on how the hepatitis C outbreak occurred, but there are some suspicious circumstances.

The nurse in question was fired in November 2014 after a hospital found evidence that she had diverted medications, meaning she had tampered with syringes or other injectable equipment to steal medication.  The nurse pled guilty to the offense and her license was suspended in December 2015.

It can be very difficult to identify medication tampering by medical personnel, but one of the most alarming facets of this case is that the nurse had been reprimanded and fined by a previous employer for similar misbehavior.  It seems like it should be possible to identify during the hiring process whether a prospective employee has a history of issues with medication diversion. Investigators have not commented on what type of background checks were done prior to her employment at the second hospital, but it seems like an area where hard questions should be asked.

The immediate risk of this particular nurse exposing more patients has been addressed since she is no longer working at a healthcare facility.  The hospitals are offering free testing to anybody who was potentially exposed and are working on a case-by-case basis to determine how to pay for any necessary treatment of those who were infected.  No longer-term solutions have been identified yet, but the investigation is still underway, so it is not clear if any lessons learned will result in changes to overall work processes.

Click on “Download PDF” to view an initial Cause Map of this incident.  A Cause Map visually lays out cause-and-effect relationships and can help identify a wider range of causes that contributed to an issue.  Identifying more than a single root cause allows a wider range of solutions to be considered and can aid in reducing the risk that a problem will recur.

Death from Patient-Controlled Morphine Overdose

By ThinkReliability Staff

Could improving the reliability of the supply chain improve patient safety?

The unexpected death of a patient at a medical facility should always be investigated to determine if there are any lessons learned that could increase safety at that facility. A thorough analysis is important to determine all the lessons that can be learned. For example, the investigation into a case where a patient death was caused by a morphine overdose delivered by a patient-controlled analgesia (PCA) pump found that increasing the reliability of the supply chain, as well as other improvements, could increase patient safety.

The information related to this patient death was presented as a morbidity and mortality case study by the Agency for Healthcare Research and Quality. The impacts to goals, analysis, and lessons learned from the case study can be captured in a Cause Map, a visual form of root cause analysis that develops the cause-and-effect relationships in sufficient detail to be able to find solutions that will reduce the risk of similar incidents recurring.

Problem-solving methodologies such as Cause Mapping begin with defining the problem. In the Cause Mapping method, the what, when and where of the problem are captured, as well as the impacts to the goals, which define the problem. In this case, the patient safety goal is impacted due to the death of a patient. Because the death of a patient under medical care can cause healthcare providers to be second victims, this is an impact to the employee safety goal. A death associated with a medication error is a “Never Event”, which is an impact to the compliance goal. The morphine overdose is an impact to the patient services goal. In this case, the desired medication concentration (1 mg/mL morphine) was not available, which can be considered an impact to the property goal. Lastly, the response and investigation are an impact to the labor/time goal.

The analysis begins by taking one impacted goal and developing cause-and-effect relationships. One way to do this is by asking “Why” questions, but it’s also important to ensure that the cause listed is sufficient to have resulted in the effect. If it’s not, another cause is required, and the causes are joined with an “AND”. In this case, the patient death resulted from a morphine overdose AND a delayed response to the patient overdose. (If the response had come earlier, the patient might have survived.) It’s important to validate causes with evidence where possible. For example, the morphine overdose is a known cause because the autopsy found a toxic concentration of morphine. Each cause in the Cause Map then becomes an effect for which causes are captured, until the Cause Map is developed to the point where effective solutions can be found.
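As an illustration of the structure being described, here is a minimal sketch of how these cause-and-effect relationships could be represented. It is only an illustration of the idea, not ThinkReliability’s Cause Mapping software or template.

```python
# Minimal sketch of the cause-and-effect structure described above -- an
# illustration only, not ThinkReliability's actual software or template.

from dataclasses import dataclass, field

@dataclass
class Cause:
    description: str
    evidence: list[str] = field(default_factory=list)   # causes are validated with evidence
    # When more than one cause is required to produce this effect, all of
    # them are listed here and joined with an "AND".
    caused_by: list["Cause"] = field(default_factory=list)

patient_death = Cause(
    description="Impact to patient safety goal: patient death",
    caused_by=[   # both causes were required, joined with an "AND"
        Cause("Morphine overdose",
              evidence=["Autopsy found a toxic concentration of morphine"]),
        Cause("Delayed response to the overdose"),
    ],
)

# Each cause then becomes an effect to ask "Why?" about, adding to its own
# caused_by list, until the map is detailed enough to find effective solutions.
```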

The available information suggests that the patient was not monitored by any equipment, and that signs of deep sedation, which preceded respiratory depression, were missed during nurse checks. Related suggestions for promoting the safe use of PCA include the use of monitoring technology, such as capnography and oximetry, and assessing and recording vital signs, including depth of respiration, pain and sedation.

The patient in this case was given PCA morphine. However, too much morphine was administered. The pump settings were based on the concentration of morphine typically used (1 mg/mL). However, that concentration was not available, so a much higher concentration (5 mg/mL) was used instead. The settings on the pump were entered incorrectly for the concentration of morphine used, likely because of confirmation bias (effectively assuming that things are the way they always are – that the morphine on the shelf will be the one that’s usually there). There was no effective double check of the order, medication and pump settings.
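A rough worked example shows why the concentration mismatch matters. The dose and settings below are illustrative assumptions, not values from the case study; the point is only the five-fold factor between the two concentrations.

```python
# Illustrative arithmetic only -- the actual programmed settings and doses in
# this case were not published. A pump programmed for a 1 mg/mL concentration
# delivers five times the intended dose when the syringe contains 5 mg/mL.

ordered_dose_mg = 1.0           # assumed demand dose, for illustration
programmed_concentration = 1.0  # mg/mL -- what the pump settings were based on
actual_concentration = 5.0      # mg/mL -- what was actually loaded

# The pump calculates the volume to push from the programmed concentration...
volume_pushed_ml = ordered_dose_mg / programmed_concentration   # 1.0 mL

# ...but every mL it pushes actually contains 5 mg of morphine.
delivered_dose_mg = volume_pushed_ml * actual_concentration      # 5.0 mg

print(f"Intended {ordered_dose_mg:.0f} mg, delivered {delivered_dose_mg:.0f} mg "
      f"per demand -- a {delivered_dose_mg / ordered_dose_mg:.0f}x overdose.")
```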

Related suggestions for promoting the safe use of PCA include the use of “smart” pumps, which suspend infusion when physiological parameters are breached, the use of barcoding technology for medication administration (which would have flagged the use of a different concentration), performing an independent double check, storing only one concentration of medications in a dispensing cabinet (requiring other concentrations to be specially ordered from the pharmacy), standardizing and limiting concentrations used for PCA, and yes, improving the supply chain so that it’s more likely that the lower concentration of morphine will be available. Any of these suggestions would improve patient safety; implementation of more than one solution may be required to reach an acceptable level of risk. Imagine just improving the supply chain so that there would be very few (if any) circumstances where the 1 mg/mL concentration of morphine is unavailable. Clearly the risk of using the wrong concentration would be lessened (though not zero), which would reduce the potential for patient harm.

To view a one-page downloadable PDF with the outline, Cause Map, and action items, click “Download PDF” above. Click here to read the case study.

“Desensitization” Process Improves Compatibility of Donor Kidneys

By ThinkReliability Staff

Many patients with advanced and permanent kidney failure are recommended for kidney transplants, where a donor kidney is placed into their body. Because most of us have two kidneys, donor kidneys can come from either living or deceased donors. If a compatible living donor is not found, a patient is placed on the waiting list for a deceased donor organ. Unfortunately, there are about 100,000 people on that waiting list. While waiting for a new kidney, patients must undergo dialysis, which is not only time-consuming but also expensive.

Researchers estimate that about 50,000 people on the kidney transplant waiting list have antibodies that impact their ability to find a compatible donor kidney. Of those, 20,000 are so sensitive that finding a donor kidney is “all but impossible” . . . until now.

A study published March 9, 2016 in the New England Journal of Medicine provides promising results from a procedure that alters patients’ immune systems so they can accept previously “incompatible” donor kidneys. This procedure is called desensitization. First, antibodies are filtered out of a patient’s blood. Then the patient is given an infusion of other antibodies. The immune system then regenerates its own antibodies which are, for reasons as yet unknown, less likely to attack a donated organ. (If there’s still a concern about the remaining antibodies, the patient is treated with drugs to prevent them from making antibodies that may attack the new kidney.)

The study examined 1,025 patients with incompatible living donors at 22 medical centers and compared them to an equal number of patients on waiting lists or who received a compatible deceased donor kidney. After 8 years, 76.5% of the patients who were desensitized and received an “incompatible” living donor kidney were alive compared to only 43.9% of those who remained on the waiting list and did not receive a transplant.

The cost of desensitization is about $30,000, and a transplant costs about $100,000. However, this avoids the yearly life-long cost of $70,000 for dialysis. The procedure also takes about two weeks, so patients must have a living donor (a deceased-donor kidney cannot be kept viable that long). The key is that ANY living donor will work, because the desensitization makes just about any kidney suitable, even for those patients who previously would have had significant trouble finding a compatible organ. Says Dr. Krista L. Lentin, “Desensitization may be the only realistic option for receiving a transplant.”
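A rough break-even comparison using only the figures quoted above (it ignores follow-up care, medication costs and, of course, the survival benefit):

```python
# Rough break-even arithmetic using only the figures quoted above; it ignores
# follow-up care, medication costs and clinical outcomes.

desensitization = 30_000     # $ one-time
transplant = 100_000         # $ one-time
dialysis_per_year = 70_000   # $ recurring, for life

upfront = desensitization + transplant          # $130,000
breakeven_years = upfront / dialysis_per_year   # ~1.9 years

print(f"Up-front cost: ${upfront:,}")
print(f"Pays for itself after about {breakeven_years:.1f} years of avoided dialysis.")
```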

The study discusses only kidney transplants but there’s hope that the process will work for living-donor transplants of livers and lungs. Although the study has shown great success, the shortage of organ donations – of all kinds – is still a concern.

To view the process map for kidney failure without desensitization, and how the process map can be improved with desensitization, click on “Download PDF” above. To learn more about other methods to increase the availability of kidney donations, see our previous blog on a flushing process that can allow the use of kidneys previously considered too damaged for donation.

 

Hospital pays hackers ransom of 40 bitcoins to release medical records

By Kim Smiley

In February 2016, Hollywood Presbyterian Medical Center’s computer network was hit with a cyberattack.  The hackers took over the computer system, blocking access to medical records and email, and demanded ransom in return for restoring the system.  After days without access to their computer system, the hospital paid the hackers 40 bitcoins, worth about $17,000, in ransom and regained control of the network.

A Cause Map, an intuitive visual format for performing a root cause analysis, can be built to analyze this incident.  Not all of the information from the investigation has been released to the public, but an initial Cause Map can be created to capture what is known now.  As more information becomes available, the Cause Map can easily be expanded to incorporate it.

The first step in the Cause Mapping process is to fill in an Outline with the basic background information.  The bottom portion of the Outline has a place to list the impacts to the goals.  In this incident, as with most, more than one goal was impacted.  The patient safety goal was impacted because patient care was potentially disrupted when the hospital was unable to access medical records.  The economic goal was also impacted because the hospital paid about $17,000 to the hackers.  The fact that the hackers got away with the crime could be considered an impact to the compliance goal.  To view a filled-in Outline as well as a high level Cause Map, click on “Download PDF” above.

Once the Outline is completed, defining the problem, the next step is to build the Cause Map to analyze the issue. The Cause Map is built by asking “why” questions and laying out the answers to show all the cause-and-effect relationships that contributed to an issue.  In this example, the hospital paid the ransom because it was unable to access its medical records.  This occurred because the hospital used electronic medical records, hackers blocked access to them, and there was no back-up of the information.  (When more than one cause contributes to an effect, the causes are listed vertically on the Cause Map and separated with an “and”.)

How the hackers were able to gain access to the network hasn’t been released, but these types of ransomware attacks generally start with the hacker sending what seems to be a routine email with an attached file such as a Word document. If somebody enables content on the attachment, the malicious code can access the system. Once the system is infected, the data on it is encrypted and the user is told that they need to pay the hackers to gain access to the encryption key that will unlock the system. Once the system has been locked up by ransomware, it can be very difficult to regain access to the data unless the ransom is paid.  Unless a system is designed with robust back-ups, the only choices are likely to be to pay the ransom or lose the data.

The best way to deal with these types of attacks is to prevent them. Do not click on unknown links or attachments.  Good firewalls and anti-virus software may help if a person does click on something suspicious, but they can’t always prevent infection.  Many experts are concerned about the precedent set by businesses choosing to pay the ransom and fear these attacks may become increasingly common as they prove effective.