A strong cell culture media development and optimization program is key to the successful manufacture of any cell-based biologic. Effective programs require substantial data on how cells respond to media composition, feed strategies, gas exchange, and several other factors. Strong analytics are the foundation for this information: cell culture media analysis provides key insights into the health and productivity of cells in culture, revealing which components cells need, which have negative impacts, and how those components are ultimately utilized by the cells.
A recent GEN webinar, “The Importance of Cell Media Analysis in Upstream Bioprocess Development,” covered current trends in upstream bioprocessing as well as the impact of real-time cell media analysis on process improvements and process analytical technologies. The esteemed panel of participants included:
- Richard Rogers, PhD, Director of Process Analytics, Umoja Biopharma
- Michael Betenbaugh, PhD, Director, AMBIC biomanufacturing center
- Christopher Brown, PhD, CTO & Co-Founder, 908 Devices
- Graziella Piras, PhD, Bioprocessing Segment Director, 908 Devices (Moderator)
This article summarizes key takeaways from the discussion and highlights areas that the panelists identified as challenges and opportunities for improvement.
Evolution of Bioprocess Development and Optimization
The impact of bioprocess development and optimization can perhaps most clearly be seen in the dramatic increase in monoclonal antibody titers over the past few decades, from ~1 g/L to greater than 10 g/L in some cases. Early low-yield processes necessitated very large bioreactors, and thus very large facilities, to manufacture the first monoclonal antibody therapies. As the industry gained better knowledge of the overall process, yields increased geometrically, with current yields approaching the maximum capacity of production cell lines.
Increases in yield were due to several factors. Protein engineering produced better parental cell lines, and improved analytics in cell line development permitted the selection of cell lines optimized for high-titer production. Process intensification and the introduction of perfusion technologies for semi-continuous and continuous production have enabled processes with very high cell densities, driving yields even higher and providing better control of protein quality attributes. Lastly, media development and optimization have fueled yield increases and the ability to sustain cells during process intensification.
Betenbaugh summed it up well: “All our tools and all our capacities have come together from cell line development to analytics to process optimization framework to give us the geometric increases over the past 30 years which are really exciting and a proud moment for us to appreciate.”
Once higher yields were achieved, the question became how to ensure consistent, reliable, and predictable manufacturing, all of which require a significant amount of process understanding. This understanding is derived from a class of analytics developed to inform and guide not just media optimization, but optimization of the entire process.
From here, the industry should strive toward total understanding of the entire process. To that end, Betenbaugh said that he would like to see complete characterization of the chemistry and the biochemistry of the cell and the culture environment. He added that advanced analytics and tools such as omics are moving the industry toward the type of total characterization that would permit bioprocesses to be defined in the same way you would look at a chemical process.
mAb Process Challenges and Opportunities for Improvement
mAb production has been moving toward platform processes that are applicable across multiple products. While several classes of molecule have platform processes in place, most products still require extensive process optimization to reach full production potential. For newer protein-based biologics, such as bispecific and trispecific antibodies as well as hybrid fusion proteins, these platform approaches either aren’t applicable or result in very low yields. Thus, there are protein expression challenges around these new modalities that need to be addressed.
There are also protein-based therapeutics, recently approved or in the pipeline, for which large demand is anticipated. For instance, Biogen’s Aduhelm for the treatment of Alzheimer’s disease is expected to reach high-volume demand. As a result, cost-effective manufacturing at large scale will be an important consideration, and cost of production in general continues to be a pressing issue as well.
Process intensification is a possible solution both to the need to manufacture at large scale and to the need to reduce cost. To implement an intensification strategy, media optimization must be prioritized. With perfusion processes, it is important to understand not only what the cells need, but also how many medium components can reasonably be dissolved in a given volume without falling out of solution. Because the number of components that can be added to a medium formulation is limited, process developers must understand exactly what the cells need. Betenbaugh stressed the need for a much more detailed understanding of media components and metabolites in order to make perfusion processes as effective as possible.
New Modalities Create New Opportunities and Challenges
One of the biggest challenges is applying what we have learned from working with mAbs and other protein biologics to new modalities, like next-generation protein biologics and cell and gene therapy applications.
Rogers stated that peptide mapping and intact analysis of monoclonal antibodies using mass spectrometry is not a simple task, and it becomes even more complex when those technologies are applied to cell and gene therapies. He added that fully understanding the complexity of a cell made up of thousands of proteins, and how those proteins interact with each other and within the microenvironments of a cell culture process, is a big challenge. It is also important to fully understand how those cells behave once they are administered to the patient. Therefore, to ensure the best therapeutic possible, more work should be done in this area.
Gene Therapy Manufacturing
Gene therapy manufacturing is in its infancy compared to the more established mAb processes. Betenbaugh pointed to exceedingly low yields in current processes, which keep the cost of goods and the price of the drugs uncomfortably high. Instead, he said, the industry needs to develop a process improvement approach like the one applied to monoclonal antibodies, looking for optimizations on all fronts: “From molecule design to vector design to cell line design and bioprocess design and all that goes with that, including media development, analytics and all those factors. We are likely five to ten years, maybe more, away from having the kind of success that we currently see with mAbs.”
Rogers added that “viral vector process development is a key component of our manufacturing process and making sure that we’ve created a viral vector that will give us the highest transduction efficiency possible for our drug product is essential. In doing that, we understand the protein profile that makes up that viral vector is extremely important as well as ensuring that it is functional and it’s delivering the right genetic material to our target cells. So, we see an opportunity there to improve our process, as well as lower cost of goods.”
Media optimization is also an important part of improving viral vector manufacturing. It is important to leverage analytics in the same way as was done for process intensification to understand which media components are essential and which media formulations provide the highest titers and full-to-empty capsid ratios.
Cell Therapy Manufacturing
Cell therapy manufacturing is also fairly new and there is still a lot that needs to be learned and optimized. One of the biggest challenges with cell therapy manufacturing is the high variability of the starting material.
Rogers shared that the biggest source of variability in creating an autologous therapy is the donor itself. “We actually have extremely robust manufacturing processes but understanding the input material that we are using to make that drug product still needs to be better characterized,” he said. “There are a lot of efforts going on at many companies to dissect what creates a good drug product and identify those levers that enable us to make that high quality drug product at the end.”
Impact of Real Time Cell Culture Media Analysis
One common denominator across mAbs, gene therapy, and cell therapy is cell culture media analysis. And when it comes to that type of analysis, 908 Devices CTO and co-founder Chris Brown said that time to data is key.
“In the early phases of process development data is power, but the challenge is being able to get that data in a way that’s not too disruptive either to the process itself or to the people doing the work,” Brown said. He added that the practicality of acquiring some of these data, given how complex these systems are, must be considered. There are likely many hundreds of independent variables in optimizing a process, so it is important to focus on key areas that need improvement.
He provided the following example: if a company has reasonable yield, it may want to focus on designing a process that provides better stability and reproducibility. That, in turn, requires high-dimensional design spaces and sufficient high-dimensional data to support them. Brown noted that there is a long history of using sensors for the basic metabolites in bioprocesses, e.g., glucose, lactate, oxygen, and ammonia. These are relatively simple measurements, but cumbersome to obtain. What is really needed is a broader panel of analytes to get a snapshot in time of the bioprocess.
Historically, these panels required pulling samples and sending them to external or core labs for a comprehensive evaluation. This is time consuming and quite costly, so researchers often must make tough choices about how often samples are pulled and which tests are run. Rogers added that if these data can’t be generated in a reasonable amount of time, they’re not useful to the process development, cell culture, or downstream purification groups. “We need to be able to generate these data in the shortest amount of time, but still developing high quality data that’s informative of our process and our product,” he said.
Rogers added that leveraging instruments that can take a less-processed sample is a huge win. “For instance, being able to take a media sample, add a diluent and then put it directly onto an instrument such as the REBEL is a huge advantage for generating process data points to be able to learn more about your system,” he said.
Inevitably, data management and integration need to be part of the discussion around process optimization, because the more data that is generated, the more data must be analyzed and evaluated. Brown said that current data systems are largely segregated, and it can be difficult to exchange information between them. Increasing integration means increased effort to harmonize protocols or, at a minimum, to generate data in open file formats that are immediately discoverable and ingestible by any system, rather than in proprietary formats. Ultimately, the biggest challenge is that data is not useful on its own; it must be converted into information, which must then be converted into action.
Brown added that artificial intelligence and process modeling may help convert data into actionable insights, but these bioprocesses have many more process variables in play. This puts stress on data modeling: with many variables, random correlations can look like valuable insights while having no underlying biological or physical basis. It is important to have enough data to support the specific conclusions generated by predictive modeling systems. The industry will likely face some challenges as these systems are implemented more broadly, but the tools have been used very successfully in other industries.
Rogers concluded that it is important to leverage multiple data types to fully create models that can be tested, and results reevaluated with the next set of data. Computational groups have been employed within many companies because of the broad array of expertise required to generate high quality data with these instruments, pull the data together, and model them in a way that can be predictive and useful. It requires a significant investment across the industry to do these types of analyses to gain product and process knowledge.
The panel agreed that having frank discussions within the industry around challenges and possible solutions is the only way the industry can move forward. It is key to have both product developers and suppliers on the same page when it comes to identifying pinch points in the process or gaps in the data. Fully understanding a biologics process can only happen through collaboration: identifying industry needs and then working with instrument suppliers or software vendors to make sure those needs are met.
Rogers closed by stating, “I really am excited for the future and I’ve been involved with a couple of different collaborations, specifically with 908 Devices, making sure that the analytics meet our needs, and I do think that the biopharma industry can make huge strides forward by identifying those needs.”