2.29.2008

Florida Power Outage: Was a Lack of Lean the Problem?

When I read about the power outage that occurred in Florida earlier this week, I couldn’t help but wonder whether an application of lean tools might help prevent this kind of event in the future.

You probably heard about the rolling blackout that cut power to several million people in the Sunshine State. Early reports describe a kind of snowball effect triggered by a couple of small incidents. According to CNN.com:

Florida authorities are investigating how a small fire and a switch failure at an electrical substation outside Miami triggered a power failure that affected millions of people.
When a nuclear power plant sensed the disruption, it shut down. In turn, the state's power grid triggered rolling blackouts Tuesday across the state…


Florida Power & Light President Armando Olivera said a disconnect switch failed at 1:08 p.m. ET Tuesday at an automated substation west of Miami, and a piece of equipment that controls voltage caught fire about the same time. Neither failure by itself would have caused a widespread outage, he said.

Utility workers are trying to piece together what happened, but Olivera said the "initiating event" was the failure of the disconnect switch.

"These systems are all designed so that you can handle two contingencies," he said. "If you had a switch that failed, protective devices would have isolated the problem. That did not occur today. That's the part we don't have an answer for…"

The substation trouble set off a sequence of events that within two to three minutes had knocked numerous power plants off-line -- including the Turkey Point nuclear power plant south of Miami.

Olivera said Turkey Point's two nuclear reactors and a natural gas-powered generation unit automatically shut down when the plant's systems detected a fluctuation in the power grid.

"In a fraction of a second, the demand was far greater than the power plants that were online generating electricity could handle," he said. "When you have that kind of imbalance, we have a system that kicks in and it starts turning people's lights off, essentially balancing the demand with what's available."

We’re dealing here with automatic actions (or reactions) by technology, not processes conducted by people.
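In fact, the last automatic reaction Olivera describes is simple enough to sketch in code. Here is a minimal illustration of load shedding; the feeder names and megawatt figures are invented, and a real utility rotates outages under far more complex rules:

    # Minimal sketch of automatic load shedding: when demand exceeds
    # available generation, drop feeder circuits until the grid balances.
    # Feeder names and loads are invented for illustration.
    available_mw = 3200                      # generation still online
    feeders = {"Miami-West": 900, "Dade-North": 700, "Keys": 400,
               "Broward-South": 1100, "Gulf": 600}

    demand_mw = sum(feeders.values())        # 3700 MW, more than supply

    shed = []
    for name, load in sorted(feeders.items(), key=lambda f: -f[1]):
        if demand_mw <= available_mw:
            break
        demand_mw -= load                    # "turning people's lights off"
        shed.append(name)

    print(f"Shed {shed}: demand now {demand_mw} MW vs {available_mw} MW available")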

Nonetheless, I find myself gripped by a strong urge to reach into the lean toolbox. Ask “why” five times to figure out what happened. Create a process map (if not an actual value stream map) to identify process triggers and figure out whether something is set up improperly. And so on.

The plants’ automatic shutdown after detecting a fluctuation in the power grid brings to mind a worker stopping a line by pulling an andon cord, which is often the correct response to a problem. But was the shutdown supposed to happen in this case? I don’t know, and someone needs to find out.

What do you think? Can lean be applied here? Would it help prevent problems in the future? Post your comments below.

2.27.2008

Book on Lean and IT Wins Shingo Prize

At best, lean and IT tend to maintain a kind of peaceful coexistence, not a warm relationship.

Lean processes are typically supported by visual controls, not by a lot of computer technology, thank you very much. In fact, IT can sometimes get in the way of lean if not used properly.

But IT is also essential to business, so finding common ground between technology and process improvement is an important, if difficult, task.

That is why it is particularly gratifying that one of this year’s Shingo Research Prizes has been awarded to Easier, Simpler, Faster: Systems Strategy for Lean IT, by Jean Cunningham and Duane Jones.

The Shingo Prize for Operational Excellence is perhaps best known for its business prizes, given each year to factories that have used lean/world-class manufacturing practices to attain world-class status.

But there is also a category of research prizes, which recognizes and promotes research and writing that advance knowledge and understanding of manufacturing, consistent with the philosophy of the business prize.

Easier, Simpler, Faster certainly fits that definition. When the book was published a year ago, it was one of a very small number of publications that focus on the relationship between lean and IT, and perhaps the only one that laid out what Jamie Flinchbaugh, co-author of The Hitchhiker’s Guide to Lean, called “a pragmatic set of guidelines” for enabling IT support of lean.

The book features a case study of an actual lean implementation involving an IT system of a mid-sized manufacturer.


This book joins a long list of other Productivity Press books that, over the years, have won Shingo research prizes. We are delighted by this announcement, and we congratulate the authors.

2.25.2008

Will the Next Administration Be Lean?

It would be nice if our next president led a lean administration.

What do I mean by that? First, government tends to be bloated and inefficient, doing a poor job of carrying out its mission of delivering services. A president who commits the federal government to embracing lean principles could contribute to improving service and reducing costs.

There are some isolated cases of this happening in government already, particularly in certain segments of the military. But a great deal more could be done, and that requires top leadership championing the cause of lean.

Second, if the federal government were committed not only to implementing lean strategies but also to promoting them, that would be good for business in particular and the country as a whole. There is some government support of lean now through the network of Manufacturing Extension Partnership (MEP) organizations, but it is one of those useful yet little-known programs. Lean promotion ought to be a priority at the cabinet level – in Commerce, for business; in Health and Human Services, for healthcare; and so on.

I don’t expect this to become an issue in the presidential campaign. It is certainly not viewed as a high-priority item, it’s not a sexy issue, and there would probably be very little disagreement over it.

The more important question is whether it would become a priority for the next administration.

I’m not too hopeful. Our next president is likely to be Clinton, Obama or McCain, and they are all career politicians. That is not meant as a criticism, but simply a statement that none of the candidates has the kind of business experience that would have given them the exposure to, and understanding of, lean strategies and their value.

The best hope for having a lean administration lies in whether the next president will seek advice and counsel from all sectors of society, including business leaders, and whether he or she will listen to that advice. It probably wouldn’t hurt for good, eloquent people to begin offering that advice, even unsolicited, now.

2.22.2008

Improving Patient Flow in Emergency Rooms

Many hospitals are focusing their lean efforts on emergency rooms, trying to streamline the flow of patients through that increasingly busy department.

(I know I’ve been writing a lot about healthcare lately, but that’s where some of the most interesting lean developments are taking place.)

An interesting recent article in The New York Times describes some of the changes taking place in ERs. The article, written by Sarah Kershaw, doesn’t actually focus on process improvement. However, it offers some tantalizing hints that lean may be part of what is taking place.

The article focuses on urban hospitals in New York. However, Kershaw notes a variety of factors that are fueling ongoing growth in the numbers of people who come to emergency rooms throughout the nation. These include growing numbers of uninsured patients, rapid population growth in urban areas where hospitals are located, a shortage of primary care doctors, and the closing of bankrupt hospitals, which increases demand at those that survive.

One result is that many hospitals – if they can find the funding – are expanding their emergency rooms or adding new ones.

And part of the article focuses on efforts to make ER patients more welcome and comfortable, ranging from individual flat-screen televisions and telephones to child-care specialists who work with children.

However, any attempt to address ER overcrowding must include a focus on improving patient flow. The article describes some of these efforts:

At St. Vincent’s Hospital Manhattan, officials recently spent $7.6 million to create what they call a “fast-track” option to speed the treatment of patients with more minor injuries. St. Luke’s-Roosevelt Hospital Center recently embarked on a $15 million project to double its capacity at the Roosevelt campus…


Some hospitals now have “navigators,” staff members assigned solely to the uninsured to handle the cumbersome paperwork required for registering them. And an increasing number have also instituted the fast-track systems, which Beth Israel Medical Center in Manhattan — now constructing an emergency room that will be twice the size of its current one — is calling “fast-food McDonald’s-type in-and-out service.”

The fast-track systems divide emergency rooms into areas for patients with minor injuries and areas for those with more acute problems, so that someone with a sprained ankle is not lumped together with a patient who is bleeding profusely from the head.

Sorting patients by the severity of injury is a lean approach – think of it as dividing products into families, based on what is needed to complete their production.
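To see the product-family analogy in miniature, here is a hypothetical sketch of the routing step; the acuity scale and thresholds are invented, not any hospital’s actual triage protocol:

    # Hypothetical sketch of fast-track routing: sort patients into
    # "families" by acuity, so minor injuries follow a different value
    # stream than acute cases. Scale and thresholds are invented.
    FAST_TRACK_MAX_ACUITY = 2   # e.g., sprains and minor cuts

    def route(patient):
        """Return which ER value stream a patient should enter."""
        if patient["acuity"] >= 4:                   # e.g., profuse bleeding
            return "resuscitation"
        if patient["acuity"] <= FAST_TRACK_MAX_ACUITY:
            return "fast-track"
        return "main ER"

    arrivals = [{"name": "sprained ankle", "acuity": 1},
                {"name": "head wound", "acuity": 5},
                {"name": "abdominal pain", "acuity": 3}]

    for p in arrivals:
        print(p["name"], "->", route(p))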

Of course, once these categories are established, a hospital needs to go a step further. It needs to focus on the processes by which each group of patients is received and brought through the emergency room. The article does not indicate whether the hospitals mentioned above are taking that step.

However, I suspect and hope at least some of them are. That way, if you or I need to visit an emergency room, we will experience improved service.

2.20.2008

Don’t Automate Bad Processes

In my last post, I talked about how technology alone will not eliminate medication errors in hospitals caused by bad processes.

This week I found essentially the same thing being said by someone far more knowledgeable about these matters than I am, Dr. John Halamka, Chief Information Officer of the CareGroup Health System, and Chief Information Officer and Dean for Technology at Harvard Medical School.

He writes an excellent blog called Life as Healthcare CIO. His latest post is about his experience implementing computerized provider order entry (CPOE) systems. (I assume that’s the same thing as the “computerized PHYSICIAN order entry” system I wrote about.)

Dr. Halamka presented a list of 10 lessons he learned about implementing such systems, some of them technical, some focusing on cultural and management issues. However, one in particular jumped out at me.

Automating a bad process does not improve anything.


When I was a resident, I was told that heparin should be dosed as a 5000 unit bolus then an infusion of 1500 units per hour for every patient. I was not taught about relating heparin dosing to body mass index, creatinine clearance or the presence of other medications. Unfortunately, it often took days to get the heparin dosing right because 5000/1500 is definitely not a one size fits all rule. Creating an automated CPOE order for 5000/1500 is not going to improve the safety or efficacy of heparin dosing. Implementing a new protocol for dosing based on evidence that includes diagnosis, labs, and body mass index will improve care.

Our experience is that it is best to fix the process, then automate the fixed process. By doing this, no one can blame the software for the pain of adapting to the process change.

Our experience with CPOE over the past 7 years is that it has reduced medication error by 50%, it paid for itself within 2 years, and clinicians have embraced it. In 2008-2009 we're completing the bar coding for all our unit dose medications (including repackaging every dose of Tylenol in bar coded baggies) so we can scan the patient wrist band, scan the medication, and scan the nurse, achieving a completely automated medication administration record. Once this is complete, the last causes of medication error will be removed from our hospital and we hope to achieve a truly zero rate of adverse drug events.

I am skeptical of that last claim, that implementation of a bar coding system will remove the “last causes of medication error.”

However, Dr. Halamka clearly understands the broader issue, that it’s the process, stupid. Remember the old acronym about computer systems, GIGO? It stood for garbage in, garbage out. Automating a bad process – and the need to avoid doing that – is simply a different version of that old truism.
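Halamka’s heparin example translates almost directly into code, and it shows why fixing the process first matters: the order set encodes the protocol, so a better protocol improves every order. A minimal sketch, assuming a commonly cited weight-based nomogram (roughly 80 units/kg bolus, 18 units/kg/hour infusion); this is an illustration of the idea, not clinical guidance:

    # Sketch contrasting the fixed 5000/1500 order with a weight-based
    # protocol. The 80/18 figures reflect a commonly cited nomogram;
    # treat this as an illustration of "fix the process, then automate,"
    # not as clinical guidance.
    def heparin_order_old(patient):
        return {"bolus_units": 5000, "infusion_units_per_hr": 1500}

    def heparin_order_weight_based(patient):
        kg = patient["weight_kg"]
        return {"bolus_units": round(80 * kg),
                "infusion_units_per_hr": round(18 * kg)}

    patient = {"weight_kg": 62}
    print(heparin_order_old(patient))           # same for every patient
    print(heparin_order_weight_based(patient))  # scaled to the patient

Automating the first function just delivers the bad dose faster; automating the second bakes the improved protocol into every order.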

2.15.2008

Preventing Medication Errors Takes More Than Technology

I wrote recently about research by the Philadelphia Inquirer into how many hospital patients are actually harmed by serious, avoidable medical errors. Now new research looks at how many patients suffer from a particular type of error – a mistake in medication.

According to a report in the Boston Globe, a new study released by two non-profit groups found that one in every 10 patients admitted to six community hospitals in Massachusetts suffered “serious and avoidable” medication mistakes.

The researchers who prepared the report looked at 4,200 randomly selected patient medical charts at the six hospitals, covering stays from January 2005 to August 2006.

A mistake was defined as a patient being given a drug even though the medical records indicated it might trigger an allergic reaction or exacerbate a medical condition.

Medication errors were counted only when patients suffered serious reactions, including going into shock or suffering kidney failure. In nearly every instance, the patients remained in the hospital longer to recover from the mistake. Nobody died from any of the mistakes, researchers said.

This kind of information is valuable because it shows the problems that can result from flawed processes and the benefits that can be achieved by improving them.

However, I am concerned that everyone seems to think the answer to the problem is technology. The article says none of the community hospitals has a computerized physician order entry system,


which requires doctors to type into a central database every medical order, including prescriptions, diagnostic tests, and blood work. The doctors' orders are matched against the patient's medical history, triggering red flags to prevent problems related to drug allergies, overdoses, and dangerous interactions with other drugs…

After this system was put in place at Brigham and Women's Hospital in 1995, preventable medication errors declined by 55 percent over the next two years.

Don’t get me wrong, I have nothing against technology. And if this type of system seriously reduces medical errors, great.

What concerns me is whether anyone is really examining the causes of the medication errors, rather than just assuming that a computerized system will take care of everything.

Why were errors at Brigham and Women’s Hospital reduced by only 55 percent? Clearly, the computerized system didn’t eliminate all errors. There are probably process flaws still causing errors, and those flaws weren’t addressed by computerization. But they probably could be addressed by a systematic lean approach that maps processes, identifies sources of problems, then uses mistake-proofing and/or other tools to solve them.
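To be fair about what the technology does cover, the “red flag” matching the article describes is straightforward in principle. Here is a hypothetical sketch, with invented patient data and drug pairs; note that it can only catch problems the records actually encode, which is exactly why process work has to accompany it:

    # Hypothetical sketch of a CPOE-style red-flag check: a new order is
    # matched against recorded allergies and current medications.
    # Patient data and interaction pairs are invented for illustration.
    INTERACTIONS = {("warfarin", "aspirin"), ("lisinopril", "spironolactone")}

    def check_order(order, patient):
        flags = []
        if order in patient["allergies"]:
            flags.append(f"allergy alert: {order}")
        for current in patient["medications"]:
            if (order, current) in INTERACTIONS or (current, order) in INTERACTIONS:
                flags.append(f"interaction alert: {order} + {current}")
        return flags

    patient = {"allergies": {"penicillin"}, "medications": ["warfarin"]}
    print(check_order("aspirin", patient))     # interaction flagged
    print(check_order("penicillin", patient))  # allergy flagged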

More technology is needed in hospitals. But it’s not enough.

2.13.2008

NASA Needs to Learn Lean

I was reading an article on CNN.com about the successful launch of the space shuttle this past Thursday, and was taken aback by something in the story.

The article noted that NASA aborted plans to launch back in December when fuel gauges failed, which could have caused serious problems with fuel flow. The launch took place this month because the gauge problem was fixed:

The gauges failed back in December because of a faulty connector, and NASA redesigned the part to fix the problem, which had been plaguing the shuttles for three years.

Three years? Are you kidding me? In other words, for three years nobody made a serious effort to address something everyone knew was a problem, but when the situation finally became critical, they solved it in two months.

NASA is definitely not a lean organization. When an organization is committed to lean, problems are addressed as soon as they arise. The root cause is analyzed, and action is taken - quickly.

First, that kind of mindset clearly is not in place at NASA. But second, wasn’t NASA taking some serious risks here? There have been at least a few shuttle launches in the past three years. Maybe the fuel gauges seemed all right at the time of each launch. But if there was a history of problems, wasn’t it possible the gauges would fail, interrupting the flow of fuel at a critical moment when the shuttle was already in the air?

NASA may have been more lucky than safe in those earlier launches. You would think that with a history of two disasters, the agency would be more paranoid about problems and more aggressive in solving them.

About five years ago, I wrote about NASA in our newsletter Lean Manufacturing Advisor, describing how contractors working on possible successors to the space shuttle were using lean techniques in the design and development process. It’s a shame those lean efforts didn’t become more widespread.

2.08.2008

Saving Lives by Making Hospitals Lean

How many lives can be saved by using a lean approach to improve hospitals?

That’s hard to quantify, of course, but the Philadelphia Inquirer has come up with some interesting possible answers.

The newspaper notes that the Pennsylvania Department of Public Welfare, following the lead of Medicare, will stop paying for the costs of care resulting from serious, avoidable medical mistakes.

To enforce that policy, the state will scrutinize 419 billing codes used in cases of the most catastrophic problems. That will identify serious problems, and a review of the full medical records will then determine whether the problems resulted from medical errors.

The Inquirer came up with an interesting approach to determine what state officials might find. The newspaper looked at billing codes for more than 700,000 hospital admissions in the Philadelphia metropolitan area in 2006.

According to the article, written by Josh Goldstein, the research revealed nearly 1,500 deaths. These included 11 patients who died after receiving transfusions of the wrong blood type, 40 who died after medication errors, and four who were accidentally burned.

Probably only a minority of those deaths can clearly be attributed to medical errors.

"I would be surprised that we have more than 100 cases in any given year that we would actually reduce the funding," said David K. Kelley, chief medical officer for the state's Medicaid program.

The article also includes data from the Pennsylvania Patient Safety Authority, an independent state agency that already collects and analyzes information about events that harm or could have harmed hospital patients.

The safety authority received 1,214 reports of hospital errors statewide that correspond roughly to the "never events" targeted by Gov. Ed Rendell's initiative - preventable medical mistakes that cause serious injury or death - from July 2004, when the information was first collected, through December 2007.


In 2006, the authority got reports about 13 patients who died or were hurt by medication errors in hospitals. An additional 42 people had sponges, instruments or other medical equipment accidentally left in their bodies. And 42 others had operations performed on the wrong part of their bodies, according to the agency.

" 'Never events' make up a small portion of all of the different reports that we see," said Mike Doering, executive director of the patient safety authority.
"Still, when we look at something like a wrong-site surgery, clearly we can say something is not working here."


I’ve said before that the spread of these new reimbursement policies is a good thing because it removes a disincentive for hospitals to improve their processes. This new information drives home that point.

Also, we have a couple of new books that show what process improvement can accomplish in healthcare:


The Pittsburgh Way to Efficient Healthcare: Improving Patient Care Using Toyota Based Methods, by Naida Grunden

Paradox and Imperatives in Health Care: How Efficiency, Effectiveness, and E-Transformation Can Conquer Waste and Optimize Quality, by Dr. Jeffrey C. Bauer and Mark Hagland

2.06.2008

Value Stream Mapping for Sustainability

I believe the tools of lean manufacturing can be particularly valuable in helping us address energy and environmental concerns.

This thought occurred to me while attending a panel on these issues at the recent Automotive News World Congress. The well-attended panel took up an entire afternoon, and the speakers (from both inside and outside the industry) were generally in agreement on some key points:

Environmental problems and concerns are real, and are only going to increase.
More mandates are likely to come from government.
Dramatic action is needed – possibly through some kind of “Manhattan Project” – if serious progress is going to be made.
Environmental issues should be examined with a systemic approach.

It is that last point I’d like to discuss. This came up during discussion of the various technologies being developed for vehicles to improve gas mileage and reduce emissions – hybrids, biofuels, clean diesel, fuel cells.

On the surface, it might seem that a fuel cell is the best technology. It uses a widely available resource (hydrogen) and produces virtually no pollution.

But it takes a considerable amount of energy to process hydrogen so it can be used in a fuel cell. Further, widespread use of fuel cells would require construction of a vast new infrastructure of filling stations. And that construction would consume energy and raw materials.

(As an aside, that is why advances in hybrids are coming faster than in any of the other technologies: the supporting infrastructure already exists. Plug-in hybrids are likely to come next.)

If you think of the delivery of energy from its source to a vehicle as a kind of supply chain, then the best approach is to map that supply chain to understand its intricacies and inefficiencies. And we need to develop a clear definition of “waste” so we can identify when the supply chain is wasting resources or energy.

The result would be some kind of value stream map. It wouldn’t be easy to develop, and I’m not sure what it would look like. But it could be extremely valuable in helping us address environmental issues.
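A crude first pass at such a map is just arithmetic: chain the efficiencies of each stage from source to wheel and see where the energy goes. Here is a sketch with made-up, illustrative stage figures; pinning down the real numbers is precisely what the mapping exercise would have to do:

    # Sketch of a "well-to-wheel" view of an energy supply chain:
    # overall efficiency is the product of each stage's efficiency.
    # All stage values are illustrative placeholders, not measured data.
    pathways = {
        "hydrogen fuel cell": [("electrolysis", 0.70), ("compress/transport", 0.85),
                               ("fuel cell", 0.50), ("motor", 0.90)],
        "plug-in hybrid":     [("power plant", 0.45), ("grid", 0.92),
                               ("battery", 0.85), ("motor", 0.90)],
    }

    for name, stages in pathways.items():
        eff = 1.0
        for stage, e in stages:
            eff *= e
        print(f"{name}: {eff:.0%} of source energy reaches the wheels "
              f"({1 - eff:.0%} is waste to map and attack)")

Even with placeholder numbers, the structure makes the point: a technology that looks clean at the tailpipe can still hide enormous waste upstream.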

Do any of you have experience using a lean approach to sustainability? Tell us your stories below.

2.04.2008

Ford’s CEO Faces Reality

Alan Mulally, who became CEO of Ford after serving at Boeing, has demonstrated in several ways that he understands lean principles. Perhaps the most important demonstration is his willingness to face reality.

I wrote previously about Mulally’s efforts to copy Toyota in establishing a global product strategy. Another key aspect of his leadership is what I will call his attempt to “right-size” Ford.

At the Automotive News World Congress recently (and on other occasions), Mulally talked about his desire to match capacity to demand – in other words, to embrace the lean principle of building only what the customer wants.

Like the other American automakers, Ford has too much capacity. That’s what happens when your share of the U.S. market goes from 25 percent down to about 15 percent.

Ford can’t afford to maintain all that excess capacity, which means closing facilities and laying off people. Similarly, there is considerable excess capacity among automotive suppliers, where major consolidation is taking place. And there are too many dealers for American automobiles; consolidation is necessary there as well, particularly in urban areas.

Ford is making progress. The company recently reported earnings, and while Ford lost a lot of money in 2007, it lost far less than it had in 2006.

Closing operations to eliminate excess capacity is not what I would call a lean strategy. In this type of situation I would normally say the manufacturer should aggressively pursue lean initiatives, improve operations to gain a competitive edge and do everything possible to regain the lost business.

However, I believe Mulally recognizes the reality of the situation, even if he didn’t say so specifically at the recent conference. And the reality is that those 10 percentage points of lost market share are not coming back.

They were lost to companies like Toyota and Honda, and no matter how lean Ford might become in the future, it is not going to leapfrog ahead of Toyota.

Ford may return to profitability some day. But the growth that will enable it to do so will occur in China, India, Russia and other overseas locations, not in the United States.

Mulally has said publicly he is more concerned with profitability than with market share. That is a necessary and refreshing change for an American automaker.

2.01.2008

Virtual Reality and Lean

Most of you are probably aware of at least some of the virtual worlds that now exist where people can visit, play games, and/or take part in a whole range of other activities.

Perhaps the best known is Second Life, where you can create your own identity through an avatar, view entertainment, meet other visitors, “build” a house – in short, spend time in a different world.

While I’ve never done that myself, I’ve certainly read about it. And I know some businesses have set up virtual operations in Second Life, as part of their marketing strategy. (However, I am unaware of any such operations having a significant impact on business.)

For gamers, there are many online communities where you can play your favorite video games (with impressive, lifelike graphics), and do so against other players who may be located throughout the world.

I had always assumed these virtual worlds attracted primarily young players, or at least the hard-core geeks who love technology.

However, I read recently about a new game world that I believe is targeted to a slightly broader, less hard-core audience. It got me thinking about how these virtual communities may have business applications that go beyond having a presence in Second Life.

It is called World Golf Tour, an Internet sports league where you can play virtual golf on simulations of real golf courses. And you can do so as part of a foursome whose members may be thousands of miles apart.

It is still in demo form, but is scheduled to go live within six months or so with a virtual version of the Kiawah Island Golf Resort. A half-dozen courses will be online by year-end.

Josh Quittner of Fortune magazine predicts that once the site is live, “white-collar productivity will fall through the floor like a Looney Tunes anvil dropped from a skyscraper.”

That remains to be seen. But think about how this kind of technology could be used to create factory simulations for testing improvement efforts.

There is nothing new about using factory simulations. However, with a virtual world, all members of a cross-functional team could be on the “floor” of a virtual factory, each with the capability to take action in their particular area. A floor supervisor could make changes to production, and someone in distribution could experience and respond to the impact of those changes in the warehouse.

You could conduct a kaizen event without any disruption to actual operations because everything you do is virtual. When you finally come up with the best solution, actual implementation is faster and easier.
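Even without an immersive world, the underlying idea is ordinary simulation. Here is a toy sketch: model a two-station line, let a trial “kaizen” change one cycle time, and compare throughput before touching the real floor (all numbers are invented):

    # Toy "virtual kaizen": simulate a two-station line and test a
    # process change offline. Cycle times and shift length are invented.
    def simulate(cycle1, cycle2, minutes=480):
        """Pieces completed in one shift; the slower station paces the line."""
        done, t1_free, t2_free = 0, 0.0, 0.0
        while True:
            start2 = max(t1_free + cycle1, t2_free)  # wait for part and station
            if start2 + cycle2 > minutes:
                break
            t1_free += cycle1
            t2_free = start2 + cycle2
            done += 1
        return done

    print("before kaizen:", simulate(cycle1=4.0, cycle2=6.0))  # station 2 bottleneck
    print("after kaizen: ", simulate(cycle1=4.0, cycle2=4.5))  # trial improvement

A virtual world would wrap this kind of model in something an entire cross-functional team can stand inside, but the payoff is the same: experiments are free.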

Do you agree? Does anyone know of technology already being used in this way? I look forward to your comments.