
Wednesday, June 3, 2015

Controllable Parameters of Mammalian Cell Culture


Einstein famously said:
Insanity [is] doing the same thing over and over again and expecting different results.
Which got me thinking...

Question: Is there such a thing as doing the same thing over and over again and getting different results?

Answer: Biotech Manufacturing

Rob Caren once asked: "How hard can (large-scale) cell culture be? It's ONE unit operation."

He also observed that large-scale chromatography shouldn't be that hard either:
  • Send the pool through that fixed-bed reactor.
  • When the optical density is not zero, shut this valve and open that valve.
  • Try not to send product to drain.
On both counts, he's right, but I can only speak to the cell culture side.

When running bioreactors, there are a few parameters that are within your control. These parameters are sometimes called "knobs" because the production team can literally go to the control system and "turn a knob" (or click a few buttons on the SCADA) to change the parameter.

For the bioreactor or fermentor, those knobs are:

pH. The intracellular pH is known to change the activity of the enzymes that govern the rates of metabolic reactions. Since cells regulate intracellular pH, the best you can do is control the extracellular pH. In reality, the pH for the process is specified, and if specified well, the specification will come with a target range and a proven acceptable range. At commercial scale, you ought to be able to operate within the proven acceptable range. For more information on pH control in mammalian cultures, see this.

Dissolved oxygen. Maximizing cellular productivity means aerobic metabolism. From university biochemistry, we know that anaerobic metabolism produces far less energy (2 ATP) than aerobic metabolism (38 ATP). We also know from chemistry that the solubility of oxygen in water is quite low (<10 mg/L when temperature > 15 degC). Therefore, it is important that the bioreactor supplies oxygen. Some bioreactors supply air and others supplement with oxygen. While the dissolved oxygen range is typically specified by the process, the air supply, air/oxygen mix, flow rates and sparge-type can be determined by the facility. As previously discussed, dissolved oxygen depends on other parameters such as agitation and temperature and can be changed within the specified range.

Temperature. Temperature control happens through the bioreactor jacket, where water is circulated around the outside surface of the vessel. When the culture gets too warm, the control system sends cooler water; when it gets too cold, the control system sends warmer water. While temperature is typically specified, there are processes that will intentionally cool the culture to reduce the rate of metabolic reactions and extend culture viability. Also, since temperature is specified as a range, the setpoint is a turnable knob.

Agitation. Agitation is typically not specified by the process beyond the requirement that the cells stay suspended (i.e. not settled on the bottom of the bioreactor). In practice, the agitation rate is determined by power-to-volume calculations and stays constant for a given bioreactor; nonetheless, it is a manipulable parameter when running bioreactors.
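
To illustrate the kind of power-to-volume calculation that typically fixes the agitation rate, here is a minimal sketch in Python. The power number, impeller diameter, broth density and P/V target below are hypothetical placeholders, not values from any real process.

```python
def agitation_speed(p_per_v, power_number, density, impeller_dia, volume):
    """Back out impeller speed (rev/s) from a target power-per-volume.

    Ungassed turbulent power draw: P = Np * rho * N^3 * D^5,
    so N = (P / (Np * rho * D^5))^(1/3), with P = (P/V) * V.
    """
    power = p_per_v * volume  # W
    return (power / (power_number * density * impeller_dia ** 5)) ** (1.0 / 3.0)

# Hypothetical 12,000-L (12 m^3) bioreactor, Np ~ 5, water-like broth,
# 1-m impeller, 10 W/m^3 target -- about 0.29 rev/s, i.e. roughly 17 rpm.
n = agitation_speed(p_per_v=10.0, power_number=5.0, density=1000.0,
                    impeller_dia=1.0, volume=12.0)
print(round(n * 60))  # rpm
```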

Timing of Inoculation. Inoculation density is often specified by the process. But there's no way to "dial down" or "dial up" inoc density in a control system somewhere like you can with pH, dissolved oxygen, temperature and agitation. When cells grow, the cell density naturally increases, so the way to control inoculation density is to time it (i.e. wait vs. not wait).

Timing of feeds. In fed-batch cultures, additional nutrients are added to the culture. The additional nutrients tend to increase the osmolality, and the additional volume can help dilute cellular waste (like ammonium). Not in all processes, but in some, the timing of feeds has been shown to impact culture productivity.

Timing of shifts. Some processes are specified with changes in the setpoints of the aforementioned parameters (e.g. pH or temperature). The shift specifications come in the form of: "when the culture duration reaches X hours" or "when the cell density reaches Y x 10^6 cells/mL," change the setpoint up or down.
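
As a toy illustration of shift logic (not any particular control system's recipe; the thresholds and setpoints below are invented):

```python
def temperature_setpoint(duration_hr, viable_cells_per_ml, base_setpoint=37.0):
    """Apply a hypothetical shift rule: drop to 33.0 degC once the culture
    reaches 72 hours or 5 x 10^6 cells/mL, whichever comes first."""
    if duration_hr >= 72.0 or viable_cells_per_ml >= 5e6:
        return 33.0
    return base_setpoint

print(temperature_setpoint(48.0, 2.5e6))  # 37.0 -- no shift yet
print(temperature_setpoint(80.0, 4.0e6))  # 33.0 -- duration trigger fired
```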

When building multivariate models, it is crucial that controllable parameters are modeled as factors, and here's why:

When your model correlates significant main effects and interactions to some process output (e.g. titer or a quality attribute), you can actually step out of theory and prove it in practice: go turn the knob and see whether the response moves at scale.
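
As a minimal sketch of what that looks like (ordinary least squares with numpy; the run data and parameter names are invented for illustration), model the output against the knobs themselves so that a significant coefficient points at a setpoint you can actually move:

```python
import numpy as np

# Hypothetical batch records: one row per run's controllable settings.
# Columns: pH setpoint, dissolved oxygen (%), temperature (degC)
X = np.array([[7.00, 30.0, 37.0],
              [7.05, 40.0, 37.0],
              [6.95, 30.0, 35.5],
              [7.00, 50.0, 35.5],
              [7.05, 40.0, 37.0],
              [6.95, 50.0, 37.0]])
y = np.array([2.1, 2.4, 1.9, 2.6, 2.3, 2.5])  # titer in g/L (made up)

# Fit titer = b0 + b1*pH + b2*DO + b3*T by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(dict(zip(["intercept", "pH", "DO", "temp"], coef.round(3))))
```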

Tuesday, March 25, 2014

ZOOMS 2 - Fastest Way to View Trends

So in addition to biologics manufacturing commentary, it turns out that Zymergi is actually a for-profit business that provides software, consulting and technical support.

For 2014, we're pretty much booked on the technical support side, but I wanted to take some time to talk about our software products.

Our flagship product is ZOOMS, which is an acronym for Zymergi Object-Oriented Metadata Search. And in 2014 - thanks to Edward Snowden - more people know what metadata is than in 2008 when we started ZOOMS (v1.0).

So why are we interested in searching metadata? Well, let's take a step back. When working the front lines of campaign monitoring and process support, we noticed that viewing trend data (i.e. tag values plotted against time) was the principal component of process understanding. The more trends a person reviewed, the more process knowledge they gained and the more they understood normal from abnormal.

And in all that time, very few people actually learn to speak "Automationese."
"Hey, did you see that weird thing going on with T100.GLOBAL.CLX.AI05.PV?"
- No one ever
In the automation world, everything is a tag. In the Manufacturing Sciences world, everything is about a measurable parameter within the process. So when you listen to the process scientists and engineers talk, it's always about some parameter (e.g. "Optical Density") in some unit (e.g. "Tank 100"). That right there is the metadata of a tag.

The tag takes care of the Y-axis on the trend. What about the X-axis?

The X-axis deals with time-windows: start times and end times, and the metadata around the time-windows are called "batches." Specifically, using S88 terminology, people involved with campaign support are interested in Unit Procedures, a.k.a. "unitbatches."

I'll leave the formal definition of "unit procedure" up to the automation gurus, but to a Manufacturing Sciences data engineer, a unit procedure is a process that happens on a piece of process equipment.

So say you're making rituximab in production bioreactor T100 using the version 1.1 process, and batch R2D201 ran from 20-Dec-2013 to 28-Dec-2013... that right there is a unit procedure:

batchid | unit | product   | procedure                      | start time | end time
R2D201  | T100 | rituximab | production culture version 1.1 | 20-Dec-13  | 28-Dec-13

The metadata around this time-window (i.e. 12/20/2013 to 12/28/2013) are as follows:
  • R2D201
  • T100
  • rituximab
  • production culture version 1.1
So it stands to reason that if an internet user who knows nothing about a subject can type keywords into Google and get the most relevant results on that subject, then in 2014 a process citizen who doesn't know too much about the process ought to be able to type a few keywords into a webpage and get process trends.
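
Here is a toy sketch of that idea in Python (the data structures and matching below are invented for illustration and say nothing about how ZOOMS is actually implemented): keywords resolve to a tag via its metadata and to a time window via the batch metadata, which is everything a trend needs.

```python
# Hypothetical tag and batch metadata (not ZOOMS's actual data model).
tags = [
    {"tag": "T100.GLOBAL.CLX.AI05.PV", "unit": "T100", "parameter": "Optical Density"},
    {"tag": "T100.GLOBAL.CLX.AI02.PV", "unit": "T100", "parameter": "pH"},
]
batches = [
    {"batchid": "R2D201", "unit": "T100", "product": "rituximab",
     "start": "2013-12-20", "end": "2013-12-28"},
]

def search(keywords):
    """Return (tag, start, end) tuples whose metadata match every keyword."""
    words = [w.lower() for w in keywords.split()]
    hits = []
    for t in tags:
        for b in batches:
            if t["unit"] != b["unit"]:
                continue
            haystack = " ".join(
                [t["parameter"], t["unit"], b["batchid"], b["product"]]).lower()
            if all(w in haystack for w in words):
                hits.append((t["tag"], b["start"], b["end"]))
    return hits

print(search("R2D201 optical density"))
# [('T100.GLOBAL.CLX.AI05.PV', '2013-12-20', '2013-12-28')]
```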

And now they can: Introducing ZOOMS 2:

ZOOMS Search Engine Process Data

Learn More

Monday, November 4, 2013

Post-Licensure Cell Culture Process Improvements

There's a great article out in GEN on cell culture process improvement, in particular, the Dr. Yuval Shimoni segment on the "low hanging fruits" of post-licensure improvements.

From the article:
At the CHI conference, Dr. Shimoni demonstrated how changes to cell culture media can make a difference by increasing production capacity through greater cellular productivity.
- Genetic Engineering News article

As I didn't go to the conference, I am left thinking that his feat was pretty impressive. Changing media components post-licensure is quite daring.

The biologics license application (BLA) will call out the exact ingredients and +/- percentages for each media component. And changing a single component can (and has been shown to) alter product quality.

Changing several media components, if in fact that's what he did, is quite the feat and would take testicular fortitude of magnitude 10 on the Mohs scale: any adverse impact on product quality - no matter the cell productivity improvements - is unwelcome.

Pulling off a media-change post-licensure is not only a technical accomplishment, but a political one as well.  

Monday, October 7, 2013

Who Are You Guys, Anyway?

So, I asked for a report to study Zymergi blog readers, and here's where the biotech/pharma readers are coming from:

Abbott
Alexion
Allergan
Amgen
Astellas
Baxter
Bayer
Biogen Idec
BioMarin
Bristol-Myers Squibb
Boehringer Ingelheim
Dr. Reddy's Laboratories Biologics
Genentech
Genzyme
Gilead
GlaxoSmithKline
Ironwood Pharmaceuticals
Eli Lilly
Lonza
MedImmune
Merck
Novartis
Onyx (now Amgen)
Pfizer
Regeneron
Roche
Sanofi-Aventis
Teva

This is a veritable who's who of the biotech world.  Obviously, you aren't all customers, but when it comes to large-scale biologics support, cell culture and bioreactor contaminations, readers and customers find themselves in good company.

Thanks for reading.

Note: All logos/trademarks belong to the trademark holder and inclusion on this list is not an endorsement of Zymergi or vice versa.

Tuesday, October 1, 2013

MSAT to Automation, MSAT to Automation. Come in, Automation

When I was running cell culture campaign monitoring and we were using PI to review trends to understand physical phenomena, there were times the trends didn't make any sense.

After digging a little, we found out that the data was simply being recorded with too little resolution in either the Y-direction or the X-direction.

Here's a short blog post describing the words to say to Automation (as well as some perspective) to get some more resolution in your data.

Compression Settings

If the data seems sparse in the Y-direction (e.g., you expect to see oscillation but only see a straight line), it could be because the compression settings are such that too much data gets filtered out. For OSI PI, there are two types of filter settings: (1) exception and (2) compression.

Exception is responsible for filtering out repeat data between the data interface and PI.

Compression is responsible for filtering out repeat linear data within PI (between the snapshot and the archive).
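
For intuition, here is a toy deadband filter in Python showing why a too-wide deviation setting flattens an oscillating signal. This is a simplification: PI's actual exception and compression tests (the latter a swinging-door algorithm) are more involved.

```python
def deadband_filter(values, excdev):
    """Keep a value only if it differs from the last *kept* value by more than excdev.

    Roughly the effect of an exception deviation: if excdev is too wide,
    a small oscillation never clears the band and the stored trend looks flat.
    """
    kept = [values[0]]
    for v in values[1:]:
        if abs(v - kept[-1]) > excdev:
            kept.append(v)
    return kept

ph = [7.00, 7.01, 6.99, 7.02, 6.98, 7.01, 7.00]   # a real ~0.02 oscillation
print(deadband_filter(ph, excdev=0.05))    # [7.0] -- oscillation filtered out
print(deadband_filter(ph, excdev=0.005))   # most points survive
```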

Every point attribute can be viewed from within PI ProcessBook. And if you find that your exception or compression settings are too wide, view them within PI, make a note of what they ought to be, then go tell your Automation team.

In my experience, you'll find a reluctance within Automation for changing the individual settings on points. Generally, there is a standard or a rule that is applied uniformly to a set of points. For example, if you're using Broadley-James pH probes in both cell culture and purification and we (cell culture) ask for a compdev of 0.005 on bioreactor pH probes, shouldn't the buffer-prep pH probes also be set to a compdev of 0.005?

Automation has to balance the tension between customer (your) needs as well as defensible system configuration.

Generally speaking, you're going to be asking for changes to the compdev or excdev point attributes, and if you're asking for more data to be collected, you want these numbers to be smaller.

Scan Rate Settings

What if, after improving compression to filter out less data, you still find that there is not sufficient resolution in the data to observe the physical phenomena that you know are happening? Well, the only place left to check is the scan rate of the data... the sparseness of data along the X-axis.

A point's scan rate is set based on a list of pre-defined intervals in the data interface. The data interface is a piece of software that transfers data from the source (control system) to the destination (data historian). If the interface is configured well, it will have sensible scan rates:
  1. Every second
  2. Every 2 seconds
  3. Every 5 seconds
  4. Every 10 seconds
  5. Every 20 seconds
  6. Every 30 seconds
  7. Every minute
  8. Every 5 minutes
It isn't always like this, but very often you'll see these intervals. The scan rates are defined in the interface configuration file and, once set, they rarely change. The way it works is this: the first entry in the interval configuration gets assigned 1... the second entry, 2... the third entry, 3.

And whatever you set the point's location4 attribute to determines its scan rate.

So suppose 00:00:05 is the third entry. Then a point whose location4=3 has a scan rate of every-5-seconds.
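
In code terms, a toy illustration of the mapping (not actual interface behavior): location4 is simply a 1-based index into the interface's list of scan classes.

```python
# Hypothetical scan-class list as it might appear in an interface config,
# in the order the entries are defined (1-based when referenced by location4).
scan_classes = ["00:00:01", "00:00:02", "00:00:05", "00:00:10",
                "00:00:20", "00:00:30", "00:01:00", "00:05:00"]

def scan_rate(location4):
    """Return the scan interval assigned to a point with this location4 value."""
    return scan_classes[location4 - 1]   # location4 is 1-based

print(scan_rate(3))   # 00:00:05 -> scanned every 5 seconds
```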

In a lot of cases, you simply tell your PI administrator you want the scan rate to be "every second," after which he's on the hook for looking up that scan rate in the interface. But FYI, if they say they made the change to the point but the location4 attribute is the same before and after, they're just BSing you.

There are a lot of considerations that need to get balanced when figuring out this stuff.  What's working against you is the default settings that come out-of-the-box with PI... as well as a generation taking the path of least resistance.

Get FREE Paper on PI Compression

Friday, August 23, 2013

10 Ways to Tell If You Suck at Cell Culture Support

Here are 10 ways to tell if your support of large-scale cell culture, well, sucks:
  1. Key performance indicators.
    You don't know what the right KPIs are for cell culture, but you're 100% certain that it's titer.
  2. Proven Acceptable Limits.
    You don't have any defined for your critical process parameters and you failed to demand them of Process Development.
  3. Control charts. You're not using them or you don't know how to make them, and your bar graphs are just fine, thankyouverymuch. They're not just fine, and it's because you can't identify:
  4. Special cause vs. common cause variability.
    You investigate common cause variability because that titer seemed too low or too high.
  5. Cpk. You don't know what process capability is and you're not calculating it.
  6. Histograms. You aren't looking at the distribution of your KPIs.
  7. Bi-variate Analysis.
    Linear regressions, ANOVA, Tukey-Kramer. You have no idea what this stuff is; I might as well be writing in Chinese.
  8. Multivariate Analysis.
    You're not doing these, and when you do, Y-responses are treated as X-factors.
  9. Local Lab. You don't have a local MSAT lab to run satellite experiments to confirm the hypotheses generated from the plant.

    A lot of people assume that you can use the resources of a distant process development lab; but then again, a lot of people like blood sausage.
  10. Excel. You're still using Excel to store data. You're still using Excel to analyze data. If you're looking to play varsity cell culture support, you really need to be using a cell culture data management system.


Tuesday, August 20, 2013

Putting Contamination TimeWindow to Use

In a previous post, I introduced the calculation to estimate the earliest time of bioreactor contamination.

And the reason anyone'd ever bother running this calculation is to help direct the focus of contamination investigation.

Have a look at the example from the previous post. The sterility samples collected showed that the 12,000-liter bioreactor was "clean" all the way through 84 hours. By the time 108 hours of culture duration rolled around, the dO2 and pH had crashed, prompting us to send the bottles to QC Micro. QC Micro reported that the 96-hour sample was also "hot."

There are folks who'd look at this data and say,
If we were clean at 84-hours, but hot at 96-hours, then bioreactor manipulations in that time-frame (84 to 96) are culprits for contamination.
But what if there were no bioreactor manipulations in that time frame but a sterile envelope manipulation at 77-hours?

Saying the 84-hour sample was "clean" is actually a mistake. It is more accurate to say, "Bioburden levels of the 84-hour sample were less than detectable." And using a "clean" 84-hour sample to vindicate prior manipulations would be a mistake: it can disqualify true root causes.

On the other side of the spectrum are folks who say:
We need to look at every single sterile-envelope manipulation of the bioreactor starting from the time of inoculation at 0-hours.
This ocean-boiling approach is expensive and includes improbable root causes that ought to be disqualified.

The most effective approach lies somewhere in between and - we think - is to estimate the growth rate of the microbe by assuming the last "clean" sample was simply less-than-detectable.

Using this growth rate to estimate the earliest time 1 CFU could have penetrated the sterile barriers is one scientifically defensible way of balancing the last-clean vs. boil-the-ocean approaches.

As for the assumptions of this method, they are:

  • Constant growth rate of microbe. This method assumes that microbes entered the bioreactor in the growth phase and didn't stop growing. Since microbes (like spore-formers) can sit in a stationary phase, the constant-growth assumption tends not to include as much time as perhaps it should.
  • 1 CFU inoculated the bioreactor. While it is unlikely that a bioreactor breach let in a single CFU or that the SIP killed all organisms except one, assuming 1 CFU tends to include more time and helps counter the assumption of constant growth.
  • Once sample pulled, growth stopped. If the organism is an aerobe, this is a good assumption. If not, use the time of QC Micro count for (t).
Bioreactor contamination response is a lot like crime-scene response and investigation, and the contamination time-window calculation is a lot like estimating the time of death (of a murder victim). This information can be used to help rank probable cause and ultimately the most probable cause (i.e. identify the killer).

Get "Time of Contamination" Spreadsheet

Friday, August 16, 2013

Cell Culture Contamination Consultants

This is how we see customers who hire us to fix bioreactor contaminations:

No one likes getting contaminations.

No one wants to search the internet for "bioreactor contamination".

No one wants to pay consultants to help fix them.

But here you are...

...you found us because we write about bioreactor sterility concepts, principles and best-practices.

Sign a confidentiality agreement and give us a call.

Wednesday, August 14, 2013

Rebuttal to Atmospheric Breaks on Drains

Here's some feedback from an industry colleague regarding air-breaks on drains:
For one thing, BSL-2+ areas like [highly toxic] bacterial fermentation suites require the facility to have closed piping to avoid or minimize aerosol effects and biohazard contamination of people and the environment around the fermentor.  The BMBL 5th Edition (basically the biosafety bible) requires it for BSL-2 and above organisms.
Clearly, protecting workers is paramount. But we also know that not every process uses "BSL-2 and above organisms." A lot of facilities are designed "just in case" the expression system produces a biohazard.

Closed-pipe drain headers require a deep understanding of SIP cycle design and a robust recipe to get rid of vacuum.

So in closing:

Monday, August 12, 2013

When Was the Bioreactor Actually Contaminated?

In a previous post, I glossed over detection of microbial contamination. I'm certainly no QC Micro expert, but a former co-worker, Mary Jane McRoberts, once told me about the sensitivity of these QC Micro tests:

me: Hey MJ, what are the chances that there's a bug in the sample, but that your tests just happen to not catch it?

MJ: I tell you what.... if there's one CFU (colony forming unit) in there, my test is going to pick it up.

So suppose the final sample I hand over has exactly 1 CFU in the entire sample.

If you are using 40mL bottles to collect samples, that's a concentration of 1 CFU/40mL = 0.025 CFU/mL.

In a 12,000-liter bioreactor... a.k.a. a 12,000,000-mL bioreactor, you're looking at 300,000 colony-forming units floating around in your production culture before your QC methods are sensitive enough to pick it up.

Knowing this 0.025 CFU/mL is crucial in estimating the contamination time-window.

Contamination TimeWindow

Anytime you have a bioreactor contamination, one (good) question that gets asked is: "So when did the contamination happen?"

This is because the signs of bioreactor contamination show up long after the insult, as it takes time for the microbial contaminants to consume enough oxygen and nutrients to crash the dO2 and pH signals.

All you need to compute this time-window is a spreadsheet of your contamination timeline:
And the equation for exponential growth:
X = X0 e^( μ (t - t0) )
where:
  • X is the concentration at time t
  • X0 is the concentration at time t0
  • e is the natural log constant
  • μ is the growth rate

If we want to know the time of microbial contamination, we're interested in solving for t0.

X is given to us by QC Micro...in this example, QC Micro counted the last sample and found the concentration to be:
X = 2.2 x 10^5 CFU/mL
The time at which that last sample was pulled is known to us:
t = 4.5 days
And if we want to be uber-conservative, we assume that the initial insult was simply 1 CFU. So if our bioreactor is 12,000-liters, the initial concentration is 1 CFU/12,000,000mL or:
X0 = 8.3 x 10^-8 CFU/mL
We know X, we know t, we know e, but we don't know μ, so at this point we have 1 equation, but 2 unknowns (μ and t0).

One way to estimate μ is to assume that the "last clean sample" (pulled at 3.5 days, i.e. 84 hours) was just short of the detection limit: 0.025 CFU/mL (assuming a 40-mL sample bottle). Solving for the growth rate:
μ = ln ( X/X0 ) / ( t - t0 )
μ = ln ( 2.2 x 10^5 / 0.025 ) / ( 4.5 - 3.5 ) = 16 day^-1

Since we now know the growth rate (μ), we can flip the equation around and solve for the time of the initial insult (t0):
t0 = t - ln ( X/X0 ) / μ
t0 = 4.5 - ln ( 2.2 x 10^5 / 8.3 x 10^-8 ) / 16 ≈ 4.5 - 1.8 ≈ 2.7 days (culture duration)
Using a simple plug 'n chug of the exponential growth equation and plate counts from QC Micro, one can estimate the time at which the microbial contamination actually took place.
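
The same plug 'n chug in a few lines of Python, using the numbers from the example above (the function name is just for illustration):

```python
import math

def earliest_contamination(x, t, x0, mu):
    """Solve X = X0 * e^(mu * (t - t0)) for t0, the earliest insult time."""
    return t - math.log(x / x0) / mu

detection_limit = 1 / 40.0          # CFU/mL: 1 CFU in a 40-mL sample bottle
x_final, t_final = 2.2e5, 4.5       # CFU/mL and days, from QC Micro
t_last_clean = 3.5                  # days (the 84-hour "clean" sample)

# Growth rate, assuming the last clean sample sat just below detection:
mu = math.log(x_final / detection_limit) / (t_final - t_last_clean)  # ~16 /day

# Earliest time a single CFU (1 CFU in 12,000,000 mL) could have entered:
x0 = 1 / 12.0e6                     # CFU/mL
print(round(mu), round(earliest_contamination(x_final, t_final, x0, mu), 1))
# 16 2.7  -> roughly 2.7 days of culture duration
```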

Question: What are the implicit assumptions of this method?


Thursday, August 8, 2013

Drain Water vs. Clean Air - Drain Design for Bioreactor Contamination

UPDATE: The point isn't to install air-breaks at all costs.  The point is to use the correct BioSafety Level for your process, recognizing that a lot of facilities are overly-conservative for the processes they run.

On multiple consulting assignments, we are seeing an alarming trend where CIP manifolds and process piping are piped directly to drain. We have identified direct piping to floor drains as a contamination risk. And in our experience, mitigating floor-drain contamination risk means cutting the piping.

The main objection to this recommendation is that it would compromise the Class 100,000 clean room status of the process space.

With the cut in the piping, the worry is that contaminants from the drain are now able to enter the processing suite and will send your viable airborne particles beyond your environmental monitoring action limits.

But of the unfavorable options available, there's one that's obvious to us.

Your choices are as follows:
  • Keep the bioreactor sipping drain water, but hey, you've got a Class 100,000 processing suite.
  • Cut the pipes and get your bioreactor sipping fresh, 20 air-changes-per-hour filtered air.


It turns out that we aren't the only ones who think this is true. In a 2006 article on biocontamination control, @GENBio reported the "original views" of chemical engineer Jim Agalloco:
...Trying too hard to protect the bioreactor environment can adversely affect the ability to sterilize equipment. For example, a steam sterilizer normally requires an atmospheric break between its drain and the facility drain, but some biotech companies object to that layout because it compromises the controlled environment.
Somehow, the viable airborne particles of the environment matter more than the ability to sterilize equipment. The article further states:
Eliminating the atmospheric break introduces more piping and surfaces, which leads to more opportunities for microbes to grow. To protect the outside of the tank, they purposely risk contaminating the inside.
Which is exactly our position on the matter.

We're aware that managing the perception of action is as important as managing the action itself. But taking the action that keeps cell cultures from contamination is always defensible, even if it flies in the face of perception.

Zymergi Bioreactor Sterility Consulting

Thursday, August 1, 2013

Every MSAT's Response to Process Development



Reducing variability is the only thing the Manufacturing team can control.  Ways to do this involve getting more accurate probes, improving control algorithms, upgrading procedures, etc.

But there are limits. Probes are only so precise. Transmitters may discretize the signal and add error to the measurement. The cell culture may have intrinsic variability.

What makes for releasable lots are cell cultures executed within process specifications.  And measuring a process parameter's variability in relation to the process specification is the SPC metric: capability.
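
For reference, capability (often written Cpk) boils down to the distance from the process mean to the nearer spec limit, measured in units of three standard deviations. A minimal sketch, with invented data and invented spec limits:

```python
import statistics

def cpk(values, lsl, usl):
    """Process capability index: distance from the mean to the nearer spec
    limit, in units of 3 standard deviations."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical in-process pH measurements against a hypothetical 6.90-7.10 spec.
ph = [7.01, 6.98, 7.03, 7.00, 6.97, 7.02, 6.99, 7.01]
print(round(cpk(ph, lsl=6.90, usl=7.10), 2))  # ~1.6 -- comfortably capable
```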


Process specifications are created by Process Development (PD). At lab scale, it's their job to run DOEs, explore the process space, and select process specifications narrow enough to produce the right product but wide enough that any facility can manufacture it.

It's tempting to select the ranges that produce the highest culture volumetric productivity. But that would be a mistake if those specifications were too narrow relative to the process variability. You may get 100% more productivity, but at large scale only be able to hit those specifications 50% of the time, resulting in a net 0% improvement.

The key is to pick specification limits (USL and LSL) wide enough that the large-scale process is easy to execute. And at large scale, let the MSAT guys find the sweet spot.