Tuesday, July 30, 2013

Automation, Asiana Airlines and Addlebrains

My second job was at the largest and most automated cell culture plant in the world. This automation was heralded (at least internally) as the mechanism that would usher in a new age of fewer errors, lower costs, lower process variability and higher efficiency.

It was also there where I met Mary who relayed this paraphrased story:
My dad is an ophthalmologist and he says, "Mary... you can train a monkey to do my job 99% of the time. But you want me and not a monkey when something goes wrong that 1% of the time."
This is exactly what I learned from my days as a fermentation engineer at this uber-automated cell culture plant. Automation is perfect for handling routine jobs, tedious jobs, or even moderately complicated jobs... it is not so good at managing exceptions.

My 3rd week on the job, my boss and grand-boss were at the annual Process Development offsite in Lake Tahoe. Their parting words were, "Good luck, Oliver! If anything really bad happens, call Bob."

As it happened, there was a batch feed operation (where they prep salty media, sterilize a line to an already-running production culture and pump said salty media in) during which a valve was left open and only a fraction of this media was actually delivered.

What did I do? I called Bob. This was essentially Bob's response:

"Make up another batch feed and send in the right amount. What's the problem?"

The problem was that the recipes were coded with exactly one batch feed; there was no provision in the automation for a second one. We could have coded a loop to allow for a second batch feed, but since none was specified in the Biologics License Application (BLA), there was no justification to code the loop.

We ended up using another branch of the recipe logic to complete the batch feed, but it took three times longer and was twice as complicated as a fully-manual batch feed would have been.
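The rigidity at the heart of this story can be sketched in miniature. The following is a hypothetical Python illustration (the plant's actual control logic ran on industrial batch-recipe software, not Python; the class and numbers here are invented for illustration) of a recipe that hard-codes exactly one batch feed, so an exception like a partial delivery has no automated path:

```python
# Hypothetical sketch of a rigid batch-feed recipe. Illustrative only --
# real plants use validated industrial recipe managers, not ad hoc scripts.

class RecipeError(Exception):
    pass

class BatchFeedRecipe:
    """A recipe validated for exactly one batch feed per production run."""

    def __init__(self, target_volume_liters):
        self.target = target_volume_liters
        self.feeds_executed = 0
        self.delivered = 0.0

    def execute_feed(self, delivered_liters):
        # The license application specifies one feed, so the automation
        # allows exactly one feed -- no loop, no retry branch.
        if self.feeds_executed >= 1:
            raise RecipeError("Recipe permits exactly one batch feed")
        self.feeds_executed += 1
        self.delivered += delivered_liters

    def shortfall(self):
        """Volume still owed to the culture after executed feeds."""
        return max(self.target - self.delivered, 0.0)

recipe = BatchFeedRecipe(target_volume_liters=100.0)
recipe.execute_feed(delivered_liters=30.0)  # valve left open: partial delivery
print(recipe.shortfall())                   # 70.0 liters never reached the tank

try:
    recipe.execute_feed(delivered_liters=70.0)  # the obvious fix...
except RecipeError as err:
    print(err)  # ...is simply not a path the automation allows
```

The automation is flawless at the one scenario it was written for; the moment reality deviates, a human has to improvise around it.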

Automation is the perfect solution 99% of the time. But there's always that 1% of the time when something goes wrong and does so spectacularly.

Asiana Airlines

Look no further than the Asiana Airlines crash at San Francisco International Airport. It turns out that Asiana pilots were not pilots; they were airplane operators. These putative pilots were capable of using the software engineered into that Boeing 777; they were not actually capable of flying the plane when "everything goes to heck in a handbasket."

So what's the answer to too much automation and not enough skill?

Well, if you work for the government, the answer is: more automation. The US Federal Aviation Administration issued an edict that all non-US carriers must use GPS systems when landing at SFO.

This bandaid is just that: a bandaid. In the immediate term, there likely won't be any more deaths. Long-term, this addlebrained approach lets "airplane operators" masquerade as "pilots" while remaining unable to fly a plane when all goes to hell in a handbasket.

Automation in Biotech Context

Biotechnology in the US is very prone to these "if some is bad, more is better" scenarios because drugs - like airplanes - are regulated by the federal government. All too often, fixing the immediate pain comes at the expense of the long term; this is the nature of solving engineering problems in the most politically-expedient manner. Heads of technology departments and managers of automation departments need to be particularly vigilant and resistant to bandaid solutions that carry subtle long-term detriment.

Generally speaking, automation is good. But be certain that your long-term strategy of using automation does not undermine mission-critical skills. And tactically, be certain that the hows and the whys of the status-quo engineering design get explained, so that your staff can rise to the occasion and apply those mission-critical skills.