Tag Archives: Root Cause Analysis

What’s the best way to screen for breast cancer? Opinions differ.

By ThinkReliability Staff

In 2015, there were 40,000 deaths from breast cancer and 232,000 new cases of breast cancer in the United States, making it the second-leading cause of cancer death in women. At a very high level, the cause-and-effect relationship is that people (primarily women) die from breast cancer when treatment is ineffective. The later the cancer is detected, the later treatment begins, so early detection can help prevent breast cancer deaths. Currently, the best tool for detecting breast cancer is the mammogram, but the question of when mammograms should begin rests on a risk-benefit analysis.

There’s no question that mammograms save lives by detecting breast cancer. This is the benefit side of the analysis. Less well known are the risks of mammograms, which include false negatives, false positives, unnecessary biopsies, and unnecessary treatment. The radiation that may be used in treatment can itself be a cause of future breast (and other) cancers.

On January 11, 2016, the United States Preventive Services Task Force (USPSTF) issued updated guidelines on the ages at which mammogram screening should start and stop (as well as other related recommendations). To develop these recommendations, the task force attempted to quantify the risks and benefits of receiving mammograms at varying ages.

For women aged 40 to 49, the task force found that “there is at least moderate certainty that the net benefit is small.” The net benefit here reflects the benefits of screening (~0.4 cancer deaths prevented for every 1,000 women screened and an overall reduction in the risk of dying from breast cancer from ~2.7% to ~1.8%) compared to the risks of screening. Risks of mammograms every other year for women aged 40 to 49 include ~121 false positives, ~200 unnecessary biopsies, ~20 harmless cancers treated, and ~1 false negative for every 1,000 women screened. The task force determined that in this case, the benefits do not significantly outweigh the risks for the average woman. Thus, the recommendation was rated as a C, meaning “The USPSTF recommends selectively offering or providing this service to individual patients based on professional judgment and patient preferences.” (Women who are at high risk, or who feel that in their individual case the benefits outweigh the risks, may still want to get screened before age 50.)

For women aged 50 to 74, the task force found that “there is high certainty that the net benefit is moderate or there is moderate certainty that the net benefit is moderate to substantial.” The types of benefits and risks are the same as for screening women aged 40 to 49, but the benefits are greater and the risks are smaller. For women aged 50 to 74, there are ~4.2 cancer deaths prevented for every 1,000 women screened and an overall reduction in the risk of dying from breast cancer from ~2.7% to ~1.8%. Risks of mammograms every other year for women aged 50 to 74 include ~87 false positives, ~160 unnecessary biopsies, ~18 harmless cancers treated, and ~1.2 false negatives for every 1,000 women screened. The task force determined that for women aged 50 to 74, the benefits of mammograms every other year outweigh the risks. Thus, the recommendation was rated as a B (the USPSTF recommends the service).
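
For readers who like to see the numbers side by side, the figures quoted above can be tabulated in a few lines of code. The sketch below is purely illustrative: it simply restates the approximate per-1,000-women figures from the paragraphs above, and the variable names are ours, not the USPSTF’s.

```python
# Illustrative tabulation of the approximate per-1,000-women figures quoted above.
# All numbers come from the task force figures cited in the text ("~" values).

benefits_and_risks = {
    "ages 40-49": {
        "cancer deaths prevented": 0.4,
        "false positives": 121,
        "unnecessary biopsies": 200,
        "harmless cancers treated": 20,
        "false negatives": 1.0,
    },
    "ages 50-74": {
        "cancer deaths prevented": 4.2,
        "false positives": 87,
        "unnecessary biopsies": 160,
        "harmless cancers treated": 18,
        "false negatives": 1.2,
    },
}

for group, outcomes in benefits_and_risks.items():
    print(f"Per 1,000 women screened every other year, {group}:")
    for outcome, count in outcomes.items():
        print(f"  {outcome}: ~{count}")
```

Laying the two age groups next to each other makes the task force’s conclusion easier to see: the benefit (deaths prevented) is roughly ten times larger for the older group, while most of the listed risks are somewhat smaller.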

The task force determined it did not have enough evidence to provide a recommendation either way for screening women over age 74.

Comparing these risks to benefits is a subjective analysis, and some do not agree with the findings. Says Dr. Clifford A. Hudis, the chief of breast cancer medicine at Memorial Sloan Kettering Cancer Center, “The harm of a missed curable cancer is something profound. The harm of an unnecessary biopsy seems somewhat less to me.” To those who disagree, the task force reiterates that personal preference should determine the age at which screening begins. However, insurers may choose to base coverage on these recommendations. (Currently, private insurers are required to pay for mammograms for women 40 and over through 2017.)

Determining these recommendations – like performing any risk-benefit analysis – was no easy task, and it demonstrates how difficult it is to weigh risks against benefits. Because these analyses are subjective, conclusions may vary. To view the task force’s risk vs. benefit comparison overview, click on “Download PDF” above.

Shoveling snow really can trigger heart attacks

By Kim Smiley

You may have heard that shoveling snow can trigger a heart attack, and studies have found that there is truth behind that concern.  Before you pick up a shovel this winter, there are a few things you should know.

Shoveling can be much more strenuous than many people realize – even more strenuous than running at full speed on a treadmill.  Snow shoveling also tends to be a goal-oriented task: people want to finish clearing the driveway before they stop, so they may push their bodies harder than they would if they were exercising for fitness.

Cold temperatures also increase the risk of heart problems.  When the body gets cold, the arteries constrict and blood pressure can rise, which in turn increases the risk of heart issues.  High blood pressure and a sudden increase in physical activity can be a dangerous combination.  Additionally, snow and ice on the roadways may delay emergency help if it is needed, making the situation potentially even more dangerous.

If you are young and fit, snow shoveling can be a great workout (and maybe you could help out your elderly neighbors if possible…), but if you are at risk of heart problems, you may want to put some thought into how you attack the job of clearing your driveway and/or sidewalks.  First, know whether you are potentially at high risk.  Studies have found that people over 55 are four times more likely to experience heart-related issues while shoveling, and men are twice as likely as women.  People with known heart problems, diabetes, or high blood pressure are also potentially at high risk, and anybody who is sedentary is at higher risk of heart issues than somebody who exercises regularly.

So what should you do if you are concerned about the risk of heart problems while shoveling?  If possible, avoid shoveling if there is somebody else who can do it.  If you are determined to shovel yourself, drink plenty of water and dress warmly.  Push the snow rather than lifting it where possible, and shovel many lighter loads rather than a few heavy ones.  If possible, shovel several times throughout the storm to spread the work out over time.  Take frequent breaks and stop immediately if you feel tired, lightheaded, or short of breath, or if your chest hurts.  Stay safe this winter!

To see a Cause Map, a visual root cause analysis, of this issue, click on “Download PDF” above.  A Cause Map visually lays out all the causes that contribute to an issue so that it can be better understood.  This example Cause Map also includes evidence and potential solutions.

Chipotle Improves Food Safety Processes After Outbreaks

By ThinkReliability Staff

On February 8, all Chipotle stores will close so that employees can learn how to better safeguard against food safety issues.  This is just one of many steps being taken after a string of outbreaks affected Chipotle restaurants across the United States in 2015.  Three E. coli outbreaks (in Seattle in July, across nine states in October and November, and in Kansas and Oklahoma in December) sickened more than 50 customers.  There were also two (unrelated) norovirus outbreaks (in California in August and Boston in December) and a salmonella outbreak in Minnesota from August through September.

In addition to customers being sickened, the impacts to the company have been severe.  The outbreaks have resulted in significant negative publicity, reducing Chipotle’s share price by at least 40% and same-store sales by 30% in December.  Food from the restaurants impacted by the fall E. coli outbreak was disposed of during voluntary closings, and the company has invested in significant testing and food safety expertise.

E. coli typically sickens restaurant customers when they are served contaminated food. Ingredients can enter the supply chain already contaminated (as in the 2011 E. coli outbreak due to contaminated sprouts), or they can be contaminated during preparation, either from contact with a contaminated surface or from a person infected with E. coli. Testing has not found contamination on any surfaces in the affected restaurants or any employees infected with E. coli, but it has not been able to find any contaminated food products either. While this is not uncommon (the source of the listeria outbreak that resulted in the recall of ice cream products has not yet been definitively determined), it does require more extensive solutions to ensure that any potential source of contamination is eliminated.

Performing an investigation with potential, rather than known, causes can still lead to solutions that reduce the risk of a similar incident recurring.  Potential or known causes can be determined with the use of a Cause Map, a visual form of root cause analysis.  To create a Cause Map, begin with an impacted goal and ask “Why” questions to determine cause-and-effect relationships.  In this case, the safety goal was impacted because people got sick from an E. coli outbreak, which happened because a contaminated ingredient was served to customers.  This means the ingredient either entered the supply chain contaminated or was contaminated during preparation, as discussed above.  In order for a contaminated ingredient to enter the supply chain, it has to be contaminated with E. coli and not be tested for E. coli (testing all raw ingredients isn’t practical).
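
For those who think in code, the mechanics just described (start from an impacted goal, ask “why,” and record each answer as a cause) can be sketched as a simple tree of cause-and-effect boxes. This is only a rough illustration of the idea using the E. coli example above; the class and field names are invented for this sketch and do not represent ThinkReliability’s Cause Mapping software or file format.

```python
# Minimal, illustrative sketch of building cause-and-effect chains by asking "why".
# In a full Cause Map, causes under one effect can be required together ("AND")
# or listed as alternative potential causes, with supporting evidence attached;
# this sketch simply nests causes under the effects they explain.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Cause:
    description: str
    caused_by: List["Cause"] = field(default_factory=list)  # answers to "why did this happen?"


# The E. coli example from the paragraph above, starting from the impacted goal.
cause_map = Cause(
    "Safety goal impacted: customers sickened in E. coli outbreak",
    caused_by=[
        Cause(
            "Contaminated ingredient served to customers",
            caused_by=[
                Cause(
                    "Ingredient entered supply chain contaminated",
                    caused_by=[
                        Cause("Ingredient contaminated with E. coli"),
                        Cause("Ingredient not tested for E. coli (testing all raw ingredients isn't practical)"),
                    ],
                ),
                Cause("Ingredient contaminated during preparation (contaminated surface or infected person)"),
            ],
        )
    ],
)


def print_map(effect: Cause, indent: int = 0) -> None:
    """Print each effect followed by the causes beneath it, one level deeper per 'why'."""
    prefix = "Why? " if indent else ""
    print("  " * indent + prefix + effect.description)
    for cause in effect.caused_by:
        print_map(cause, indent + 1)


print_map(cause_map)
```

Running the sketch prints the chain from the impacted goal down through each “why,” including the two potential contamination paths discussed above.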

Chipotle is instituting solutions that will address all potential causes of the outbreak.  Weekly and quarterly audits, as well as external assessments, will increase oversight.  Cilantro will be added to hot rice to decrease the presence of microbes.  The all-employee meeting on February 8 will cover food safety, including new sanitation procedures that will be used going forward.  The supply chain department is working with suppliers to increase sampling and testing of ingredients.  Certain raw ingredients that are difficult to test individually (such as tomatoes) will be washed, diced, and tested in a centralized prep kitchen, then shipped to individual restaurants.  Other fresh produce items delivered to restaurants (like onions) will be blanched (submerged in boiling water for 3-5 seconds) for sanitation prior to being prepared.

Chipotle has released a statement describing their efforts: “In the wake of recent food safety-related incidents at a number of Chipotle restaurants, we have taken aggressive actions to implement pioneering food safety practices. We have carefully examined our operations—from the farms that produce our ingredients, to the partners that deliver them to our restaurants, to the cooking techniques used by our restaurant crews—and determined the steps necessary to make the food served at Chipotle as safe as possible.”  It is hoped that the actions being implemented will result in the delivery of safe food, with no outbreaks, in 2016.

To view the impacts to the goals, timeline of outbreaks, analysis, and solutions, please click on “Download PDF” above.  Or click here to learn more.

The water crisis in Flint, Michigan

By Kim Smiley

The quality of the tap water, or rather the lack thereof, in Flint, Michigan has been all over the headlines in recent weeks. But prior to a state of emergency being declared and the National Guard being called up, residents of the town reported strangely colored and foul-tasting water for months and were largely ignored. In fact, they were repeatedly assured that their water was safe.

Researchers have determined that lead levels in the tap water in Flint, Michigan are 10 times higher than previously measured. Forty-three people have been found to have elevated lead levels in their blood, and there are suspected to be more cases that have not been identified. Even at low levels, lead can be extremely damaging, especially to children under 6. Lead exposure can cause neurological damage, decreased IQ, learning disabilities, and behavior problems. The effects of lead exposure are irreversible.

The water woes in Flint, Michigan began when the city switched its water supply to the Flint River in April 2014. Previously, the city’s water came from Lake Huron (through the city of Detroit water system). The driving force behind the change was economics: using water from the Flint River was cheaper, and the struggling city needed to cut costs. Supplying water from the Flint River was meant to be a temporary move to hold the city over while a new connection to the Great Lakes was built within a few years.

The heart of the problem is that the water from the Flint River is more corrosive than the water previously used. The older piping infrastructure in the area includes lead pipes in some locations as well as lead solder in some joints. As the more corrosive water flowed through the piping, lead leached into the water.

A Cause Map, a visual root cause analysis, can be built to document what is known about this issue. A Cause Map intuitively lays out the cause-and-effect relationships that contributed to an issue. Understanding the many causes that contribute to an issue leads to better, more detailed solutions to address the problem and prevent it from recurring. The Flint water crisis Cause Map was built using publicly available information and is meant to provide an overview of the issue. At this point, most of the ‘whats’ are known, but some of the ‘whys’ haven’t been answered. It isn’t clear why the Flint River water wasn’t treated to make it less corrosive, or why it took so long for officials to do something about the unsafe water. Open questions are noted on the Cause Map by including a box with a question mark in it.

This issue is now getting heavy media coverage, and officials are working on implementing short-term solutions to ensure the safety of residents. The National Guard and other authorities are going door-to-door handing out bottled water, water filters, and testing kits. Michigan Governor Richard Snyder declared a state of emergency in Flint on January 5, 2016, which allows more resources to be used to address the issue. However, long-term solutions are going to be expensive and difficult.

The city’s water supply was switched back to Lake Huron in October 2015, but it will take more than that to “fix” the problem because there is still concern about lead leaching from corroded piping. Significant damage was done to the piping infrastructure, and the tap water in at least some Flint homes is still not safe. It is estimated that fixing the piping infrastructure could cost up to $1.5 billion. A significant amount of resources will be needed to undo the damage done to the city’s infrastructure, and there is no way to undo the damage lead poisoning has already done to the area’s residents, especially the children.

Equipment, procedural failures lead to resident scalding

By ThinkReliability Staff

While equipment and procedures were both in place to prevent residents from being scalded by too-hot baths, failures of both resulted in a resident receiving serious burns on August 13, 2013. The Health and Safety Executive (HSE) recently released its report on the incident, which resulted in the prosecution of both the care home and the employee responsible for the bath.

This incident illustrates the limitation of looking for the “one” root cause. There wasn’t just one thing that resulted in this incident; rather, multiple failures had to occur for the tragic scalding to happen. We can show these causes by performing a visual root cause analysis, known as a Cause Map. Note that the term “root cause” refers to a system of causes, much like the root of a plant is a system.

We begin the analysis by looking at the impacts to the goals. Resident safety was impacted due to the very serious burning of a resident; the burns were so severe that they resulted in the amputation of ten toes, and the resident will never walk again. Employee safety is impacted as well, both because of the emotional impact on the employee (known as the second victim) and because of the risk of burns to employees. The environmental goal is impacted due to the lack of temperature control, and the compliance goal is impacted due to the prosecution of both the employee and the care home. Resident services are impacted by a resident being placed in a scalding bath. The failure of a thermostat is an impact to the property goal, and the time required for response and investigation is an impact to the labor and time goal.

Beginning with one of the impacted goals (in this case, the resident safety goal) and asking “why” questions develops the cause-and-effect relationships that caused the incident. Here, the resident’s injuries resulted from being placed in a scalding bath and being unable to exit it due to physical and communication limitations. The resident was scalded because the water in the bath was too hot and because the caregiver placed the resident in the bath. Both of these things (the water temperature being too high, and the caregiver placing the resident in the bath) had to occur in order for the injury to occur.

The water temperature was too high because of the failure of the immersion heater thermostat. The reason for the failure, as well as how long the thermostat had not been working, is unknown. The caregiver placed the resident in the bath because she did not check the water temperature and did not realize it was too hot. The caregiver appears to have been unaware of the thermostat failure; had the failure been known, presumably other safeguards would have been put in place. Additionally, adequate thermometers were not provided to check the water temperature. (A manual check for comfort was still possible, though in this case it could have resulted in a burn to the employee.) Although it was “required” to test the water temperature and record that the check had been done, there were no written instructions to that effect.
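
The observation that several causes all had to occur is worth making concrete, because it explains why a solution on any branch of the analysis can prevent a recurrence. The short sketch below illustrates that “AND” logic; the cause descriptions paraphrase the analysis above, and the code is our illustration rather than anything taken from the HSE report.

```python
# Illustrative "AND" logic: the injury required every one of these causes to be present.
# Removing any single cause (i.e., implementing a solution on any branch) prevents the effect.

required_causes = {
    "water temperature too high (immersion heater thermostat failed)": True,
    "caregiver placed resident in bath without checking the temperature": True,
    "resident unable to exit the bath (physical and communication limitations)": True,
}


def effect_occurs(causes):
    """The effect (the scalding injury) occurs only if ALL required causes are present."""
    return all(causes.values())


print(effect_occurs(required_causes))   # True: all causes present, injury occurs

# A solution on any single branch removes one required cause and prevents the effect,
# e.g., a secondary thermostatic cut-out keeps the water temperature in a safe range
# even if the primary thermostat fails.
required_causes["water temperature too high (immersion heater thermostat failed)"] = False
print(effect_occurs(required_causes))   # False: injury prevented
```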

The care home has purchased portable thermometers for caregivers’ use, but the HSE also recommends the use of a secondary thermostatic cut-out, which would prevent the water in the tank from overheating even if the primary thermostat failed. The HSE has also provided a white paper, “Managing the risks from hot water and surfaces in health and social care”, which discusses appropriate risk assessments and control measures to prevent burns of vulnerable care home residents.

To view the Cause Map of this incident, click “Download PDF” above.

Or, click here to read the HSE report of the incident.

Lethal-Injection Drug Mix-up

By ThinkReliability Staff

On January 15, 2015, a prisoner was executed by lethal injection in Oklahoma. On October 8, the autopsy report showed that the prisoner had been injected with potassium acetate instead of the intended potassium chloride.

This was the first lethal injection to take place in the state since a prisoner took 43 minutes to die after the drugs were administered in April 2014 (see our previous blog about this execution).  After that, further executions were stayed.

Just hours prior to the first execution scheduled since January, Department of Corrections personnel realized they had been sent potassium acetate instead of potassium chloride, and that execution was called off.  Shortly afterwards, an Oklahoma court granted an indefinite stay for the prisoners who were scheduled for execution.

While there is ongoing debate about whether the change adversely impacted the speed or humaneness of the execution, it certainly caused great concern about the ability of the state of Oklahoma to correctly perform an execution.  Says an attorney, “The state’s disclosure that it used potassium acetate instead of potassium chloride during the execution of Charles Warner yet again raises serious questions about the ability of the Oklahoma Department of Corrections to carry out executions.”

Along with the concern about the ability to perform future executions, there is a potential safety impact regarding the prisoner’s suffering, as well as a production impact resulting from the delay of future executions.  The ongoing investigation will also impact goals because of the resources required.  This investigation will attempt to determine how the wrong drug was used in the execution.

In the case of the January execution, the wrong drug was placed in the syringe used to inject the prisoner, and verification of the drugs was ineffective.  It’s unclear whether there was any attempt to verify that the drugs being used were correct.  If there was such a check, verification may have been difficult because records show that the syringe was labeled potassium chloride (the desired drug).

Department of Corrections records also show that the state received potassium acetate instead of the desired potassium chloride.  It seems that the potassium acetate was delivered by the supplier in error (there doesn’t appear to have been any other need for potassium acetate).  According to the prisons director, the supplier believed that the drugs were interchangeable.  In general, oversight of suppliers who provide lethal injection drugs is limited – many states refuse to disclose their suppliers, and many suppliers are compounding pharmacies, which are subject to less regulation.

Oklahoma does allow several different combinations and substitutions of drugs for executions, but there is no approved substitute for potassium chloride.  This, along with recent changes in suppliers (because so many refuse to supply drugs for lethal injection), may have led to some confusion.

Solutions, or changes to the execution protocol, will likely not be discussed until after the investigation is complete.  A completely different type of execution may also be considered: in April 2014, Oklahoma approved nitrogen gas as the backup method for executions if lethal injection cannot be used.  Based on all the recent issues and concerns, that new method may be under consideration.

Why You Will Experience a Diagnostic Error

By ThinkReliability Staff

On September 22, 2015, the Institute of Medicine (IOM) released a report entitled “Improving Diagnosis in Health Care”. The report was the result of a 2013 request by the Society to Improve Diagnosis in Medicine for the IOM to undertake a study on diagnostic error. The tasking to the committee formed by the IOM matched the three-step problem-solving process: first, to define the problem by examining “the burden of harm and economic costs associated with diagnostic error”; second, to analyze the issue by evaluating diagnostic error; and third, to provide recommendations as “action items for key stakeholders”.

The burden of harm resulting from diagnostic errors is significant. Diagnostic errors are estimated to contribute to about 10% of hospital deaths and 6-17% of hospital adverse events, clearly impacting patient safety. Patient safety is not the only goal impacted, however. Diagnostic errors are the leading type of paid malpractice claims. They also impact patient services, leading to ineffective, delayed, or unnecessary treatment. This in turn impacts schedule and labor, as additional treatment is typically required. The report found that, in a “conservative” estimate, 5% of adults who seek outpatient care in the United States experience a diagnostic error each year, and it determined that nearly everyone in the US is likely to experience a meaningful diagnostic error in their lifetime.
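
The jump from a 5% annual estimate to “nearly everyone over a lifetime” is straightforward compounding. As a rough, back-of-the-envelope illustration only (it assumes the 5% rate applies independently every year, which is our simplification and not a claim from the report), the chance of experiencing at least one diagnostic error approaches certainty over decades of care:

```python
# Back-of-the-envelope illustration (NOT a figure from the IOM report):
# if an adult independently faces a ~5% chance of a diagnostic error each year,
# the chance of at least one error over many years of care compounds quickly.

annual_error_rate = 0.05  # the "conservative" outpatient estimate quoted above

for years in (10, 30, 60):
    p_at_least_one = 1 - (1 - annual_error_rate) ** years
    print(f"Over {years} years: ~{p_at_least_one:.0%} chance of at least one diagnostic error")

# Prints roughly 40% over 10 years, 79% over 30 years, and 95% over 60 years.
```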

The report also provided an analysis of issues within the diagnostic process (to learn more about the diagnostic process, see our previous blog) that can lead to diagnostic errors. Errors at any step of the diagnostic process can lead to an incorrect diagnosis. If a provider receives inaccurate or incomplete patient information (due to inadequate time or communication with a patient, compatibility issues with health information technology, or an ineffective physical exam), making a correct diagnosis will be difficult. Ineffective diagnostic testing or imaging, which can be caused by numerous errors during the process (detailed in the report), can also lead to a diagnostic error, as can diagnostic uncertainty or biases. However, not all errors are due to “human error”. The report asserts that diagnostic errors often occur because of problems in the health care system, including both systemic and communication failures.

When diagnostic errors do occur, they can be difficult to identify. Data on diagnostic errors is sparse, due both to liability concerns and to a historical lack of focus on diagnostic errors. In addition, there are few reliable measures of diagnostic error, and diagnostic errors can frequently only be definitively determined in retrospect.

The report identifies eight goals for improving diagnosis and reducing diagnostic errors that address these potential causes of diagnostic errors. These goals are presented as a call to action to health care professionals, organizations, patients and their families, as well as researchers and policy makers.

To view a high-level overview of the impacts to the goals, potential causes and recommendations related to diagnostic error presented in a Cause Map, or visual root cause analysis, click on “Download PDF” above. To learn more:

To read the report, click here.

For an overview of the diagnostic process, click here.

For an example of a diagnostic error with extensive public health impacts, click here.

Understanding the diagnostic process is the first step towards improving diagnosis in health care

By ThinkReliability Staff

On September 22, 2015, the Institute of Medicine (IOM) released a report entitled “Improving Diagnosis in Health Care”. To achieve the goal of improving diagnosis, the committee “developed a conceptual model to articulate the diagnostic process, describe work system factors that influence this process, and identify opportunities to improve the diagnostic process and outcomes.”

With a goal of improving a given process – in this case, the diagnostic process – it’s important to understand how the process should work in theory (which may be very different from how the process actually works in practice). The conceptual model outlined within the report provides an overview of the theoretical diagnostic process at several different levels of detail.

A Process Map is similar to a geographical map in that it can provide different levels of detail while remaining accurate. For example, a map of a country as a whole typically contains only the most major roads, a map of a city will contain far more roads, and an inset providing detail of a section of the city may contain all the roads. All these maps are accurate; but the city map contains more detail than the national map.

Similarly, an overview of the diagnostic process can be summarized in just four steps: patient reporting of a health problem, information gathering and analysis, diagnosis, and treatment. By adding more detail to this process, the responsive nature of the process is revealed – if sufficient information is not gathered to make a working diagnosis, the process returns to the information gathering step. A similar “decision point” is made after treatment – if treatment is found to be ineffective, the process again returns to the information gathering step for another look at the diagnosis.

Even more detail can be provided about the information gathering step. Information gathering typically involves a clinical history/interview, a physical exam, diagnostic testing and/or imaging, and referral or consultation with other health care professionals. Just as the information gathering step can be broken down into more detail, so can the diagnostic testing/imaging step. In more detail, the diagnostic testing/imaging step involves ordering diagnostic tests and/or imaging, preparation and collection of the specimen/image, examination of the specimen/image, result interpretation, follow-up, and incorporating the results into the patient’s medical record. (Because of the similarities at a high level between the diagnostic testing and diagnostic imaging processes, they have been combined in the Process Map on the PDF, but a more detailed process map would have separate steps for each.)
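
One way to see how the same process can be described accurately at different levels of detail is to write the steps as a nested outline, where the top level alone is the four-step summary and the indented items add detail. The sketch below is an illustrative paraphrase of the steps described above, not the report’s formal conceptual model.

```python
# The diagnostic process at two levels of detail, as described in the text above.
# The top-level entries alone are the four-step summary; nested lists add detail.
# Illustrative paraphrase only; not the IOM report's formal conceptual model.

diagnostic_process = [
    ("Patient reports a health problem", []),
    ("Information gathering and analysis", [
        "Clinical history / interview",
        "Physical exam",
        "Diagnostic testing and/or imaging",
        "Referral or consultation with other health care professionals",
    ]),
    ("Working diagnosis (return to information gathering if insufficient)", []),
    ("Treatment (return to information gathering if ineffective)", []),
]


def print_process(steps, indent=0):
    """Print the process map, showing more detail at deeper indentation."""
    for step in steps:
        if isinstance(step, tuple):
            name, substeps = step
            print("  " * indent + "- " + name)
            print_process(substeps, indent + 1)
        else:
            print("  " * indent + "- " + step)


print_process(diagnostic_process)
```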

When analyzing a complex process, such as the diagnostic process, breaking it down into steps allows for an analysis of the problems that can occur at each step. Next week, our blog will discuss in more detail the impacts of diagnostic error, its potential causes, and the recommendations from the IOM report to improve diagnosis and reduce diagnostic error.

To view the diagnostic process map at several levels of detail, click on “Download PDF” above. Click here to read the Institute of Medicine report “Improving Diagnosis in Health Care.”

Smoke from wildfires in West may impact public health across the US

By ThinkReliability Staff

A significant portion of the United States is currently being affected by wildfires. The Valley and Butte fires in California, two of the worst in that state’s history, have killed five people (all civilians found dead in their homes). The Tassajara Fire has resulted in another civilian fatality. The Rough Fire (also in California) has burned more than 141,000 acres. The US Wildfire Activity Public Information Map and the National Wildfire Coordinating Group Incident Information System show dozens more fires across the Western United States.

The impacts of these wildfires are wide-ranging. Public safety has been impacted by the deaths and risk of injury. Worker safety has been impacted as well; four firefighters were burned in the Valley fire. Even animal safety has been impacted: animals were left to fend for themselves in many areas that were evacuated rapidly due to changing conditions, leading to risk of injury or death. Tens of thousands of people have been evacuated. Hundreds of thousands of acres have been burned and thousands of buildings destroyed, causing a potential long-term impact on area businesses. More than 15,000 workers have been deployed to assist in fighting the fires.

The wildfires are also affecting air quality in areas not directly impacted by the fires. The smoke from these wildfires is causing environmental and health issues, including asthma, chronic lung disease, and even heart attacks. Janice Nolan, the assistant vice president for national policy at the American Lung Association, says of recent air quality, “It’s really bad. I hadn’t seen ‘code maroon’ days, which is the most hazardous air quality, in years.” (The Air Quality Index reports the quality of outdoor air in color categories. Maroon, or “hazardous,” represents a level of air pollution at which the entire population is likely to experience serious health effects. Lower categories indicate when members of more sensitive groups may experience health concerns.)
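
For context, the color categories mentioned above come from the standard US EPA Air Quality Index scale. The lookup below uses the commonly published AQI breakpoints; treat the exact numeric ranges as general reference values rather than figures quoted in this article.

```python
# Approximate US EPA Air Quality Index categories (commonly published breakpoints;
# not quoted from the article above). "Maroon"/"Hazardous" is the worst category.

AQI_CATEGORIES = [
    (50, "Good", "green"),
    (100, "Moderate", "yellow"),
    (150, "Unhealthy for Sensitive Groups", "orange"),
    (200, "Unhealthy", "red"),
    (300, "Very Unhealthy", "purple"),
    (500, "Hazardous", "maroon"),
]


def categorize(aqi: int) -> str:
    """Return the category name and color for an AQI reading."""
    for upper_bound, category, color in AQI_CATEGORIES:
        if aqi <= upper_bound:
            return f"AQI {aqi}: {category} ({color})"
    return f"AQI {aqi}: beyond the standard scale"


print(categorize(45))    # Good (green)
print(categorize(175))   # Unhealthy (red)
print(categorize(420))   # Hazardous (maroon), i.e., a "code maroon" day
```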

Health issues can occur when smoke is breathed in and enters the respiratory system. The organic particles that make up smoke can be so small they can bypass the body’s natural defenses (such as mucus and hair in the nose). The particles can even enter the bloodstream. This occurs any time a person is exposed to smoke. Says Sylvia Vanderspek, the chief air quality planner for the California Air Resources Board, “If you can smell smoke, then basically you’re breathing it.”

An average person can breathe in about 35 micrograms of particulate matter for only 24 hours before experiencing health problems. Unfortunately, the California air quality board has measured levels of particulate matter of up to 34 micrograms in a day, and the fires have been burning for weeks and may continue for weeks more. Weather conditions affect not only the wildfires themselves but also where the smoke from those fires goes; conditions this summer have meant that smoke issues have been seen as far away as the Midwest.

The only really effective protection against health impacts from smoke is to stay inside with the air conditioning on recirculate if you are in an affected area (based on the local air quality index). This has meant that schools are holding indoor recess and sports practices, and outdoor festivals have had to cancel performances. Idaho is considering establishing clean air shelters so the population can avoid breathing in smoke. Regrettably, most air masks won’t help, as they don’t protect against the tiny particles of concern. Instead, health officials reiterate that if the air quality in your area is poor, stay indoors to protect your health.

Child Paralyzed by Vaccine-Derived Polio

By Kim Smiley

There has been amazing progress in the effort to eradicate polio, but recent cases of the disease are a harsh reminder that the work isn’t complete and now isn’t the time to be complacent.  Public health officials are planning three mass vaccination rounds in less than 120 days after a child was recently paralyzed by polio in Mali.  In addition to this case, the World Health Organization (WHO) announced that two children in western Ukraine were also paralyzed by polio.

Before this case, the last case of polio detected in Mali was in 2011.  A Cause Map, a visual root cause analysis, can be used to analyze how the child contracted polio, as well as to help in understanding the overall impacts of this case.  The first step in building a Cause Map is to fill in an outline with the basic background information, including listing how the issue impacts the different overall goals.  This issue, like most, impacts more than a single goal.  For example, the child being paralyzed is an impact to the patient safety goal, while the potential for an outbreak of polio is an impact to the public safety goal.

Once the impacts to the goals are defined, the Cause Map itself is built by asking “why” questions and documenting the answers in cause boxes.  The Cause Map visually lays out all the cause-and-effect relationships that contributed to an issue.  So why was the child paralyzed?  The child was infected with vaccine-derived polio because he was exposed to the disease and wasn’t immune to it, likely because he didn’t receive all four of the required doses of vaccine.  Vaccination rates in Guinea, where the child was from, dropped during the Ebola outbreak.

In this region of the world, the oral polio vaccine is used, and it contains weakened, but live, strains of the polio virus.  After being given the oral polio vaccine, a child will excrete live virus for a period of time.  The live virus can replicate in the environment, and there is the potential for it to mutate into a more dangerous form of polio, which is what causes vaccine-derived polio.

Cases of vaccine-derived polio are very rare, but they are a known risk of using the oral polio vaccine.  The injectable vaccine uses killed polio virus that cannot mutate, but other important factors come into play.  The oral polio vaccine is cheaper and simpler to administer than the injectable vaccine, which requires medical professionals to give the injections.

The use of oral vaccines also eliminates the risk of spreading blood-borne illnesses: because there are no needles involved, there is no risk of needles being shared between patients.  The oral vaccine also provides greater protection for the community as a whole, especially in regions with poor sanitation.  When a child is fully immunized with the oral polio vaccine, immunity develops in the gut, so the polio virus is not excreted after exposure.  This is not true of the injectable polio vaccine; an immunized child exposed to “wild” polio would not be infected, but may still excrete polio virus after exposure and potentially spread it to others.  One negative of the oral polio vaccine is that in rare cases (estimated at about one in 2.7 million) the weakened polio virus can cause paralysis in a child receiving their first dose of the vaccine.  Concern over paralysis is one of the reasons that developed nations generally use the injectable polio vaccine.

Polio is highly contagious, and public health officials are planning an aggressive vaccination campaign to reduce the risk of an outbreak now that a case of polio has been verified in Mali. The plan is to hold three mass vaccination rounds in less than 120 days, a level of effort aided by the many World Health Organization and United Nations staff who are still in the area as part of the response to the Ebola outbreak.  Thankfully, Guinea has not reported any cases of Ebola for several months, so officials can devote significant resources to the mass polio vaccination effort.