
Thursday, January 23, 2014

Multivariate Analysis: Pick Actionable Factors Redux

When performing multivariate analysis, say multiple linear regression, there's typically an objective (like "higher yields" or "troubleshoot campaign titers"). And there's typically a finite set of parameters that are within control of the production group (a.k.a. operators/supervisors/front-line managers).

This finite parameter set is what I call "actionable factors" or "process knobs." For biologics manufacturing, parameters like

  • Inoculation density
  • pH/temperature setpoint
  • Timing of shifts
  • Timing of feeds
  • Everything your process flow diagram says is important
are actionable factors.

Examples of non-actionable parameters include:
  • Peak cell density
  • Peak lactate concentration
  • Final ammonium
  • etc.
In essence, non-actionable parameters can only be measured; they cannot be changed during the course of the process.

Why does this matter to multivariate analysis? I pick on one study I saw where someone built a model against a commercial CHO process and showed that final NH4+ levels inversely correlate with final titer.



What are we to do now?  Reach into the bioreactor with our ammonium-sponge and sop up the extra NH4+ ion?

With the output of this model, I can do absolutely nothing to fix the lagging production campaign. Since NH4+ is evolved as a byproduct of glutamine metabolism, this curious finding may lead you down the path of further examining CHO metabolism and perhaps some media experiments, but there's no immediate nor medium-term action I can take.

On the other hand, had I discovered that initial cell density of the culture correlates with capacity-based volumetric productivity, I could radio into either the seed train group or scheduling and make higher inoc densities happen.
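To make the approach concrete, here is a minimal sketch of that kind of regression, restricted to actionable factors. The file and column names are hypothetical placeholders; in practice they would come from your batch records.

    # Minimal sketch: regress titer on actionable factors only.
    # "batch_data.csv" and its column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("batch_data.csv")  # one row per batch

    # Deliberately exclude non-actionable responses like peak VCD or
    # final NH4+ -- those are measurements, not process knobs.
    actionable = ["inoc_density", "ph_setpoint", "temp_shift_day", "feed_day"]

    model = smf.ols("titer ~ " + " + ".join(actionable), data=df).fit()
    print(model.summary())  # significant factors are knobs worth turning

A factor that is both statistically significant and actionable is one you can actually hand to the seed train group or to scheduling.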


Thursday, August 1, 2013

Every MSAT's Response to Process Development



Reducing variability is the only thing the Manufacturing team can control. Ways to do this include installing more accurate probes, improving control algorithms, upgrading procedures, etc.

But there are limits. Probes are only so precise. Transmitters may discretize the signal and add error to the measurement. The cell culture may have intrinsic variability.

What makes for releasable lots are cell cultures executed within process specifications. The SPC metric that measures a process parameter's variability in relation to the process specification is capability.

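The image that originally sat here showed a capability formula; assuming it was the standard process capability index, it would read:

    $$C_{pk} = \min\left(\frac{USL - \mu}{3\sigma},\ \frac{\mu - LSL}{3\sigma}\right)$$

where μ and σ are the parameter's mean and standard deviation, and USL/LSL are the upper and lower specification limits. A commonly cited benchmark is Cpk ≥ 1.33 for a capable process.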

Process specifications are created by Process Development (PD). And at the lab-scale, it's their job to run DOE and explore the process space and select process specifications narrow enough to produce the right product, but wide enough that any facility can manufacture it.

It's tempting to select the ranges that produce the highest culture volumetric productivity. But that would be a mistake if those specifications were too narrow relative to the process variability. You may get 100% more productivity, but at large-scale only be able to hit those specifications 50% of the time, resulting in a net 0% improvement.
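In expectation, the arithmetic is:

    $$\underbrace{2.0}_{\text{relative productivity}} \times \underbrace{0.5}_{\text{fraction of lots in spec}} = 1.0 \quad\text{(no net gain)}$$

Narrow specs buy productivity per good lot but pay for it in failed lots.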

The key is to pick specification limits (USL and LSL) wide enough that the large-scale process is easy to execute. Then, at large-scale, let the MSAT guys find the sweet spot.

Thursday, July 25, 2013

Fermentation Analysis Software

There's this neat question on the Mathematical Modeling of Fermentation LinkedIn group about software used in fermentation:
I would like to ask about the software you use for the analysis of your fermentation processes. Software for analysis, not for fermentation control. Although if you can say something about the control programs, that is welcome, too.

I suspect that the people in this group deal with small-scale or pilot plant-scale, but this question is actually worth answering for large-scale cell culture/fermentation.

1) What software is used on your fermentation equipment?

In 1999, fermentation control software was basically a free-for-all. No single company had a stranglehold on the market: Allen-Bradley PLCs were popular, Siemens was popular, Honeywell was a good option... But over the past decade, the company that has really taken over the control layer is Emerson, with its DeltaV system.

The reason this is worth talking about is that the data source is the instrument IO monitored by the control software. All analysis is preceded by data capture, archival, and retrieval. DeltaV is the software that does the capture.
Next up is the system that archives this instrument data for the long term. DeltaV has a historian, but the most popular data historian is OSIsoft's PI (a.k.a. OSI PI). The reason is that PI has stellar client tools and stellar support. PI client tools like DataLink and ProcessBook are good for generic process troubleshooting and support. More sophisticated analysis requires statistical programs.

Zymergi offers OSI PI consulting for biotech companies.

2) What software do you prefer for analyzing your fermentations and for planning your future fermentation processes?

This is where there's a lot of differentiation in fermentation analysis software. My personal fave is SAS Institute's JMP software. This is desktop stats software that lets users explore the data and tease signal from noise or truth from perception. I've solved a ton of problems and produced answers to 7-figure questions with this software.

Zymergi offers MSAT consulting helping customers set up MSAT groups and execute MSAT functions.

There are others operating in this space, but I have yet to see any vendor make headway beyond trial installation and cursory usage.
3) Do you agree that software for fermentation processes is not undergoing rapid development right now?
None of these tools is fermentation-specific. Each is superior in its respective category:

  • DeltaV is a superior control system
  • OSI PI is a superior data historian
  • JMP is a superior data analysis software
Where there is a gap is in fermentation-specific analysis: linking upstream factors to downstream responses. A sketch of what that linkage looks like is below.
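As a minimal sketch (file and column names are hypothetical; in practice the data would come from PI, LIMS, or batch records):

    # Join upstream batch parameters to downstream responses so they
    # can be analyzed together -- the "gap" none of the big three fills.
    import pandas as pd

    upstream = pd.read_csv("upstream_factors.csv")      # per batch: pH, temps, feed timing...
    downstream = pd.read_csv("downstream_results.csv")  # per batch: final titer, yield...

    df = upstream.merge(downstream, on="batch_id")

    # Quick scan: which upstream factors track the downstream response?
    print(df.corr(numeric_only=True)["final_titer"].sort_values())

From there, the interesting factors go into a proper model, in JMP or otherwise.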

Wednesday, April 24, 2013

Viral Inactivation of Cell Culture Media HTST

For all this talk of bioreactor sterility, the vast majority of the contamination discussion refers to bacterial contamination.

What about viral contamination?

Viral contamination is mitigated with HTST treatment of cell culture media. This is where you put the cell culture media in continuous flow while subjecting it to a high temperature for a short period of time.
  • High-temperature inactivates the virus.
  • Short-time ensures that cell culture media components do not denature.
The best time to put the media in continuous flow is when you pump the media from the media prep tank to the bioreactor, so viral inactivation often happens during this transfer. The HTST unit is essentially:
  • Heater
  • Hold-tube (insulated pipe; the hold time is sketched below this list)
  • Cooler
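The "short time" is set by the hold-tube geometry and the flow rate. As a back-of-the-envelope sketch (all numbers here are hypothetical, not from any particular unit):

    $$t_{hold} = \frac{V_{tube}}{Q} = \frac{\pi r^2 L}{Q}$$

For example, a 5 cm inner-diameter tube 20 m long holds about 39 L; at 200 L/min, that's a hold time of roughly 12 seconds.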
At large-scale, the first plug of media that goes through the HTST may not meet the specification, so it cannot be delivered to the bioreactor. This plug may be sent to drain or recycled through the HTST unit until the unit reaches steady-state.

Once at steady-state, the remainder of the media is pumped (through a sterile-filter and subsequent sterile pipes) into the bioreactor; when batch volume is reached, the remaining media is sent to drain.

Simple enough, right?

The hard part is when your HTST performance begins to degrade:
  • Perhaps your sterile filter starts clogging
  • Perhaps your heater controller output maxes out
  • Perhaps your media-prep tank's post-use inspection is showing problems
As recently as January of this year, Amgen's Drug Product Development published a paper titled, "Identification and root cause analysis of cell culture media precipitates in the viral deactivation treatment with high-temperature/short-time method."

I haven't read the paper, but my manufacturing sciences consulting experience leads me to predict it says the following:

The calcium phosphate in the cell culture media becomes insoluble at the high temperatures of the HTST. This calcium phosphate precipitate may collect on the surface of the hold-tubes, thereby decreasing the heat-transfer coefficient and sporadically causing the HTST to fail.
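A quick way to see why deposits degrade heating: fouling adds a resistance in series with the clean-wall heat transfer, per the standard fouling relation:

    $$\frac{1}{U_{fouled}} = \frac{1}{U_{clean}} + R_f$$

As the fouling resistance R_f grows with deposit build-up, the overall coefficient U drops, and at the same heater duty the hold-tube temperature drifts toward the bottom of (or below) its spec.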

This calcium phosphate (sandy white stuff) may also clog up the 0.1 micron sterile filter causing a high delta-pressure across the sterile filter, maybe even diminishing the media flow rate.

The problem is that calcium phosphate stays in solution except when both the temperature is high and the pH is high.


Unfortunately, high temperature is a requirement of HTST, which means the only way to prevent calcium phosphate precipitation (and the ensuing HTST performance degradation and filter clogging) is to run the media through at low pH.

See Our Fix

This is a classic multivariate problem where operating in a different region of the process space is the fix.

Thursday, December 20, 2012

Gun Violence Is Not a Univariate Problem

The public seems to have a hard time debating multivariate problems.

I remember the Ford Explorer/Firestone tires issue years back very distinctly. Was driving a Ford Explorer the cause of the SUV flipping over? Those who said Ford was culpable pointed to the fact that few other SUVs were flipping over. Ford pointed out that there were Explorers that weren't flipping over... just the ones with Firestone tires.

Firestone was saying that there were plenty of cars driving around on Firestone tires without issue and it was Ford's fault that their SUV sucked.

This debate went on and on. My boss at Genentech Vacaville, Jesse Bergevin, said to me at the time that this was a classic multivariate problem with one interaction.

Likewise, gun violence in America is a classic multivariate problem: there are not one, not two, but many variables that contribute to these horrific events. And like most complex systems, gun violence is many variables coming together (interacting) for a specific effect.

  • When it comes to gun violence, we know that guns are a factor... as in, were it not for guns, we wouldn't have gun violence. (Yes, we'd have some sort of other violence).
  • We also know that mental illness is a factor.  After all, not all gun owners are going around shooting up malls and elementary schools.
  • We also know gun-free zones are favorite targets for gunmen with bad intentions.
We know of these factors. And we know that they interact. Treating this issue as a univariate problem will not get us the response we want.

The right thing to do is to model the system and optimize for the fewest gun-related deaths.

In the meantime, I will be thinking often of the children who died at Sandy Hook Elementary.  When I think of them, there's this vacuous hole that fills my stomach and my skin feels numb.

We must solve the problem of violence in our society; but we can't afford to do it wrong and treat it as a univariate problem (i.e. ban guns and be done with it).



