Thursday, November 21, 2013

Simplyfeye among the best product posters presented at IKMC 2013

The Simplyfeye team recently attended a poster conference (http://www.ikmc2013.com/ikmc/index.html) organized by IKP Knowledge Park in Hyderabad, India. This conference is an annual event organized by IKP to showcase innovative product development in the country and is attended by industry veterans, government representatives, and angel investors / venture capitalists. More than 100 innovators and start-up companies presented their ideas in the form of posters. Our poster titled "Bioprocess Monitoring", based on our innovative product ProcessPad, was very well appreciated. We are excited to share that we were adjudged the second-best poster in the conference. This recognition boosts our confidence and gives the Simplyfeye team encouragement to continue our journey of building quality products and solutions for the bio/pharmaceutical drug manufacturing industry. We thank the IKP team for inviting us and giving us the chance to present our ideas. Shared below is the poster that we presented.



Tuesday, November 12, 2013

Additional note on Bioreactor Contamination Control: Some Simple Strategies to Consider

Further to my earlier post on using process data for troubleshooting and control of bioreactor contaminations, I recently came across a wonderful article in Pharmaceutical Engineering on simple strategies to keep contamination events in check, published by Ryan Schad and his team (from Eli Lilly & Co.) in the June 2010 issue of the magazine.

I especially liked the following checklist provided in this article.


The article elegantly describes the following considerations in good detail:

  • handling of steam and condensate
  • air removal
  • cold spots
  • equipment drains
  • piping slopes
  • elastomers
  • methods of data collection from the reactor for monitoring
    • offline monitoring: sterile sampling techniques
    • online monitoring: reactor probes and their sterility
  • valves and dead legs
  • agitator shaft seals

In addition to these, I also recommend considering the following:
  • Incorporating a Helium Leak Test: Since helium can penetrate very small cracks and crevices, filling the bioreactor with helium prior to a batch and snooping for leaks is a very quick and easy way to identify potential leaks, especially in deteriorated elastomers. The leak-testing methods usually employed are the bubble test and pressure decay; helium can be a good alternative if you find that your current methods are not effective.
  • Using inexpensive Tempilstik temperature crayons: If your plant is old or there is no way to install additional temperature sensors on your valves to check the effectiveness of your SIPs, you may consider good old temperature sticks to confirm the effectiveness of each sterilization and record the checks in your SIP batch records. (Note: This process is laborious but can be an easy, cost-effective solution for contamination checks and for historizing SIP data for future troubleshooting.)
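Whether you use temperature crayons or installed sensors, the underlying check is simple: confirm that the valve or port held the sterilization temperature continuously for the required time. Below is a minimal sketch of that check on a hypothetical recorded temperature trace; the 121 °C / 15 min hold used here is illustrative only, not a validated set point:

```python
def sip_hold_ok(times_min, temps_c, setpoint_c=121.0, hold_min=15.0):
    """Return True if the trace stays at/above setpoint_c continuously
    for at least hold_min minutes; any dip below resets the clock."""
    start = None
    for t, temp in zip(times_min, temps_c):
        if temp >= setpoint_c:
            if start is None:
                start = t              # hold period begins
            if t - start >= hold_min:
                return True            # required hold achieved
        else:
            start = None               # dip below set point: reset
    return False

# Hypothetical 1-minute valve temperature readings during an SIP cycle
t = list(range(25))
temps = [25, 80, 110] + [122] * 20 + [100, 60]
ok = sip_hold_ok(t, temps)
```

The same check, run against each valve's recorded trace after every SIP, gives an objective pass/fail entry for the batch record.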
If you think you have already incorporated these checks and are still having problems controlling your contamination events, it is high time to take the help of external consultants, as they provide a third eye to the investigation. Most of the time, consultants who have already solved similar problems in various contexts can help you overcome your own organization's blind spots, internal politics, or lack of expertise (due to internal transfers, attrition, etc.).
The Simplyfeye team can be one such resource: you can rely on our years of experience in solving such problems. Our team will help you implement effective CAPAs (and data systems) for future checks on such events.

Sunday, November 3, 2013

Mixing and Blending Effects: Troubleshooting with process data

Those of us involved in biomanufacturing know the crucial role that buffer (or media) prep/hold tanks and product intermediate hold tanks play in any biopharmaceutical operation. We cannot imagine any biomanufacturing activity without these hold/blending tanks. In large-scale operations (anything above the 1000 L scale) most of these tanks are stainless steel. The figure below shows a schematic of such a tank. There can be large variation in the size and design of these tanks within a facility.


Lack of knowledge or consideration of these design variations in operations leads to inconsistent buffer (or product intermediate) preparations, which lead to incorrect product estimations and, in turn, incorrect product loading on the next unit operation. Some of the key areas to keep in mind are:
  • estimating appropriate mixing/blending time with factor of safety and incorporating those in relevant SOPs (standard operating procedures) and BPRs (batch production records).
  • understanding the relative positioning of regions 1 through 4 (regions are marked by yellow bubbles in the tank schematic figure above). Region 2 is important for estimating concentration (via offline sampling), while region 3 is important for controlling pH (or conductivity) (via online feedback control). Regions 1 and 4 are important for understanding design differences and the relative concentration gradients of solutions within the tank.
Achieving homogeneity of the liquid in these regions (with respect to the whole mix) is the key to operations with reduced variability and to eliminating troubleshooting headaches. Differences in the position, type, and size of impellers (as shown in the figure below) also considerably affect blending operations and should be considered while writing SOPs and BPRs.
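One practical way to set the mixing/blending time in an SOP is to watch an online signal (conductivity, for example) settle after an addition, then pad the settling time with a safety factor. Here is a minimal sketch of that idea on a hypothetical conductivity trace; the tolerance and safety factor are illustrative, not validated values:

```python
def blend_time(times, conductivity, tolerance=0.01, safety_factor=1.5):
    """Estimate blend time as the first point after which the trace stays
    within +/- tolerance (fractional) of its final value, then apply a
    safety factor to get the SOP/BPR minimum mix time."""
    final = conductivity[-1]
    settled = None
    for t, c in zip(times, conductivity):
        if abs(c - final) <= tolerance * abs(final):
            if settled is None:
                settled = t        # candidate settling time
        else:
            settled = None         # excursion out of band: reset
    if settled is None:
        raise ValueError("trace never settles within tolerance")
    return settled, settled * safety_factor

# Hypothetical 1-minute conductivity readings (mS/cm) after a buffer addition
times = list(range(10))            # minutes
cond = [5.0, 9.2, 11.8, 13.5, 14.6, 14.9, 15.05, 15.0, 15.02, 15.0]
settled, sop_time = blend_time(times, cond)
```

The padded value (`sop_time`) is what would go into the SOP/BPR as the minimum mix time before sampling.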


If you are observing variability (especially in intermediate-step protein or buffer concentrations) in your operations, it is highly probable that some of it is coming from inconsistent blending operations. Most of the time, blending is overlooked on scale-up and we don't invest much time and effort building good data systems around it; hence, the operating staff is not equipped with the right tool-set for troubleshooting. Usually most of our focus in operations is on getting the main unit operations right. We at Simplyfeye highly recommend investing in some extra sensors for online monitoring and in detailed recording of manual blending operations for offline historical analysis.

Let me explain this need for data systems with a small example.

Imagine you have two consecutive chromatography steps, ChromA and ChromB, within your protein production process. The eluate from ChromA is pooled into an intermediate tank, where the pool is blended (and diluted 20%) with a load buffer to prepare the load for the next step, ChromB, as shown below.


If you had invested in some extra sensors capturing data from the tank, the "Tank Weight" trend for a batch on the "Load Prep Tank" would look something like the chart below.


The five phases depicted in the chart above (marked by black bubbles) correspond to the different operations that occurred during blending, and this data can be sliced and diced accordingly for process troubleshooting:
  • Phase 1: Shows tank weight increasing while the eluate is being transferred to the tank. The slope of this curve additionally gives an estimate of the eluate flow rate.
  • Phase 2: Shows the hold/mix/idle time before the buffer is added to the tank.
  • Phase 3: Shows tank weight increasing while the buffer is being transferred to the tank. The slope of this curve additionally gives an estimate of the buffer addition flow rate.
  • Phase 4: Shows the hold/mix/idle time after the buffer is added to the tank and a homogeneous mixture is achieved. If inconsistency is seen in load concentrations, this hold time can be correlated with the measured concentrations.
  • Phase 5: Shows tank weight decreasing while the diluted ChromB load is being loaded onto the chromatography column for step ChromB. The slope of this curve additionally gives an estimate of the load flow rate.
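A tank-weight trend like this can also be segmented automatically: the sign of the slope separates fills, holds, and the drain, and the slope itself is the flow-rate estimate. A minimal sketch on hypothetical data; the units, tolerances, and trend values are illustrative:

```python
def segment_phases(times, weights, flat_tol=0.5):
    """Split a tank-weight trend into phases by slope sign:
    'fill' (rising), 'hold' (flat), 'drain' (falling).
    flat_tol is the slope magnitude (kg/min) still treated as flat."""
    def label(slope):
        if slope > flat_tol:
            return "fill"
        if slope < -flat_tol:
            return "drain"
        return "hold"
    phases, start = [], 0
    current = label((weights[1] - weights[0]) / (times[1] - times[0]))
    for i in range(1, len(times) - 1):
        nxt = label((weights[i + 1] - weights[i]) / (times[i + 1] - times[i]))
        if nxt != current:
            phases.append((current, times[start], times[i]))
            start, current = i, nxt
    phases.append((current, times[start], times[-1]))
    return phases

def flow_rate(times, weights, t0, t1):
    """Average flow rate (kg/min) over a phase from the weight change."""
    i0, i1 = times.index(t0), times.index(t1)
    return (weights[i1] - weights[i0]) / (times[i1] - times[i0])

# Hypothetical trend: eluate in, hold, buffer in, hold, load out
t = list(range(11))                                   # minutes
w = [0, 50, 100, 100, 100, 120, 120, 120, 80, 40, 0]  # kg
phases = segment_phases(t, w)                         # five (label, start, end) tuples
```

On this made-up trend the function recovers the five phases of the chart, and `flow_rate` over the first phase gives the eluate transfer rate.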
As we can make out from the chart above, offline samples from the tank for estimating protein concentration should be taken:
  • towards the end of Phase 2 for estimating ChromA eluate pool concentration and
  • towards the end of Phase 4 for estimating ChromB load pool concentration
If you have this information easily available, you can then correlate actual sampling times with the end times of Phase 2 and Phase 4 to understand the variability in your ChromA eluate pool or ChromB load pool concentrations. This helps confirm whether sampling times are set properly in the SOPs or BPRs; if not, the necessary CAPA (corrective and preventive action) can be assigned based on the actual data.

You can also sometimes observe a trend where the load pool concentration is consistently either overestimated or underestimated (as shown in the batch trend chart below); the root cause can sometimes be attributed to the mixing/blending operations.


The concentration can be over- or underestimated if:
    • sampling is done without giving sufficient mixing time
    • the pH or conductivity control range is too wide
    • the target concentration calculation is always rounded to the higher (or lower) side of the target
    • buffer is over- or under-transferred due to a faulty load scale on either the buffer or blending tank
    • there are line losses in transfer operations
    • buffer confounding causes test method error
This explains the need for an efficient data system that can not only capture but also aggregate process events, manual operations, and machine data. Otherwise, troubleshooting such problems becomes a nightmare for the scientists and engineers working on the floor.

This is just a small example; there can be other similar operational effects that are often overlooked, giving severe headaches to operations folks trying to nail down the variability in their processes. Investing in proper data acquisition and data management tools is worth the money: it can save millions of dollars in wasted effort, resources, and precious drug product going down the drain. ProcessPad from Simplyfeye is one such platform/solution that can bring some sanity to your process troubleshooting efforts. Write to us if you are facing similar issues in your operations. Our team of experts can help you solve these riddles and also help in setting up efficient data systems.

Monday, October 21, 2013

Bioreactor Contamination Control: Role of Process Data

One thing that those of us involved in large-scale biomanufacturing dread most is a contamination event in one of our cell culture bioreactors, especially in the mammalian ones that run for more than a fortnight to churn out a batch. This not only disrupts planning and scheduling activities, with a direct impact on the supply chain; a contamination event also drains a lot of resources (time, raw materials, etc.), with a very high quality and economic impact on operations.

This blog post concerns CIP/SIP stainless-steel reactors, typically from 100 L up to 20,000 L. Contamination troubleshooting and root cause identification in autoclavable lab reactors (<50 L) or single-use disposable reactors is relatively simple compared to their stainless-steel counterparts.

Understanding the machine and its sterile boundary:


Operating a bioreactor successfully is analogous to driving a car through a crowded marketplace (especially if you have driven in a developing country) without compromising passenger (or pedestrian) safety and without denting the car. You have to have a real feel for the machine you are operating in order to maneuver it safely through the threats to its integrity. The concept of the sterile boundary, and an understanding of the conditions under which it can be compromised, is not easy and is gained only through experience and by constantly monitoring bioreactor data.


As shown in the figure above the sterile boundary is constantly maintained by:

  1. sterile filtering the gas flows
  2. sterile filtering all the liquid feeds to the reactor
  3. steaming all the ports (inlet and outlet) before and after each feed or sampling event
  4. maintaining a positive pressure (usually by inert gas like Nitrogen) in the reactor

Modes of breach in sterile boundary


Below are the modes in which this sterile boundary can be compromised, categorized in order of their probability of occurrence.

Highly Probable

  • Contamination entering through feed/sampling ports: Whenever reactor ports are opened to take in a feed or to take out samples, the sterile boundary is stretched. The reactor sterility can be compromised if:
    • there is improper steaming of the ports, i.e., a port is not sterilized for the required time or the required sterilization temperature is not achieved
    • there are micro-cracks in the diaphragm elastomers on one of the port valves

Less Probable

  • Online sterile gas filter integrity breach, or filters not assembled properly prior to the run

Rare but probable

  • Most of the modes in this category are related to improper bioreactor maintenance or poor reactor design:
    • Improper seating of O-rings or warped reactor O-rings (head-plate, bottom-plate, man-hole, etc.)
    • Breach in impeller shaft seals
    • Faulty pressure relief valve
    • Faulty drain system
    • Poor reactor design with improper dead legs
    • CIP/SIP not being able to sterilize certain parts of the bioreactor
    • Faulty steam traps

Looking at process data for root cause identification


A contamination event in a bioreactor is usually spotted while monitoring the dissolved oxygen profile of the batch in progress. Some of the information needed to solve the riddle of tracking the source of contamination is listed below.
  • Dissolved oxygen profile: The sudden drop in dissolved oxygen levels (shown in the chart below) provides an estimate of the time at which the contamination likely entered the bioreactor.
  • Valve temperature profile: Suspect valve temperature profiles (see chart below) should be checked to confirm that the valve was properly sterilized before the feed/sampling event.
  • Bioreactor events during the batch: These event times, when correlated with the time of the drop in %DO, provide a preliminary estimate of the source of contamination.
  • Contaminant species identification: Investing in rapid species identification will help determine whether the contaminant is gram-positive, gram-negative, or a spore former. The doubling time of the identified species then helps in approximately estimating the time between the species entering the reactor and it overpowering the whole culture (%DO bottoming out).
  • Past bioreactor maintenance history
  • Number of CIP/SIP cycles that the elastomeric diaphragms have gone through: This check will help nail down the diaphragm valve (if above its usage limit) that may have failed/cracked due to overuse.
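Two of the items above lend themselves to simple calculations: flagging the time of the sudden %DO drop, then using the identified species' doubling time to back-estimate when it likely entered the reactor. A minimal sketch on hypothetical data; the rate threshold, cell counts, and doubling time are illustrative assumptions, not biological constants:

```python
import math

def detect_do_drop(times_hr, do_pct, drop_per_hr=5.0):
    """Return the time just before %DO starts falling faster than drop_per_hr."""
    for i in range(1, len(times_hr)):
        rate = (do_pct[i - 1] - do_pct[i]) / (times_hr[i] - times_hr[i - 1])
        if rate > drop_per_hr:
            return times_hr[i - 1]
    return None

def entry_time_estimate(drop_time_hr, doubling_time_hr,
                        detectable_cells=1e9, initial_cells=1.0):
    """Back-estimate entry time: doublings needed to reach a detectable load,
    times the doubling time, subtracted from the observed %DO drop time."""
    doublings = math.log2(detectable_cells / initial_cells)
    return drop_time_hr - doublings * doubling_time_hr

# Hypothetical %DO readings (batch hours) with a sharp drop near hour 72
times = [0, 24, 48, 72, 73, 74, 75]
do = [50, 48, 47, 46, 38, 22, 5]
drop_t = detect_do_drop(times, do)
entry_t = entry_time_estimate(drop_t, doubling_time_hr=0.5)
```

The estimated entry time, correlated with the batch event log, narrows down which feed, sampling, or maintenance event to investigate first.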

The figure below shows a sample dashboard providing some of the information listed above.



Access to these data sets comes from various sources, such as:
  • QC data
  • Data historians
  • Batch Records
  • Work order records
  • Non-conformance management systems
  • Training records

Instant access to this data helps the investigator get to the probable root cause in the fastest possible time so that appropriate CAPAs can be implemented before the next bioreactor run. The ProcessPad platform from Simplyfeye is one such tool-set that provides instant, on-demand access to this information for timely action, recovery, and prevention of such events.

Tuesday, October 15, 2013

Monitoring Outsourced Biomanufacturing: Sponsor/CMO Collaboration

Further to my previous post on using Excel as a recording tool, one of the advantages of this approach lies in data collection from diverse locations, particularly in an outsourced manufacturing environment.

As we see more and more CMO activities proliferating (with drug discovery and commercialization companies sponsoring/outsourcing their manufacturing to CMOs that provide clinical and commercial manufacturing services), there arises a need for better collaboration and continuous, almost real-time communication between the two parties. With easy-to-configure Excel-based process templates, sponsor organizations can easily collect process data from their CMO counterparts in near real time. Since most CMOs prefer to work with a paper-based recording system (it is practically unviable for them to invest in an electronic batch recording system, owing to the multi-product nature of their business and constant changes in their product manufacturing pipelines), in our view this Excel-recording approach is the most economical in an outsourced environment.

With our ProcessPad platform, sponsors can provide a secure way of uploading process data so that both parties have real-time access to the same process information. Sponsors can easily control access to the system. A schematic of such a system is shown below:



Write to us if your organization is involved in outsourcing activities/services and you are looking for effective solutions for sharing and collaborating with your sponsors or CMOs.


Monday, October 14, 2013

Process Data Management: To Excel or not?

Microsoft Excel is ubiquitous in any organization. My current context of "any organization" is a biopharmaceutical organization involved in the development and manufacturing of biotherapeutics. We all use Excel in our day-to-day activities for storing data and for analysis, mostly for personal consumption and sometimes to share information with colleagues. Excel spreadsheets offer a combination of speed and flexibility that no other software can match: speed in setting up the tables and columns in which we wish to capture data, and flexibility in how we organize them to record data. The popularity of Excel spreadsheets is primarily due to:
  • Its WYSIWYG nature (what you see is what you get, always), which is not the case with databases
  • Data capture is the same as data output
  • No involvement of IT folks when setting up capture records, which is tremendously faster and less annoying for scientists and engineers than dealing with techies
  • It has a personal flavor
  • And it's free, as it comes bundled with Microsoft Office (the lifeline of enterprise productivity): we do not need to go through long and tiring budget-approval battles for a new data system.
These personal data records (aka spreadsheets) can, however, proliferate fast and become a nightmare to manage, which many database and software companies (including ours) term "spreadsheet madness or mania". Here are a few things that do not go in Excel's favor:
  • Sharing: Ability to share data in real-time so that all my employees can have access to same/single source of truth
  • Data duplication and data authenticity: The same information is maintained individually by many employees, each in his/her own way.
  • Storing Time Series data
  • Performing Advanced queries
  • Real-time integrated data access: to visualize and trend process data in a holistic way (unified process information from all process steps both historical and current)
If your process execution recording system (experimental run records or batch production records) is still on paper, we are of the view that good old Excel can still play a very cost-effective and useful role in developing a world-class process data management system and should not be shunned just yet. Excel makes an excellent choice if used purely as a templated data-recording tool, owing to the following:

  • Employee familiarity with the tool, leading to an extremely low learning curve and little resistance to adoption
  • Recording templates can be maintained and distributed by scientists and engineers themselves without the need of IT intervention
  • Almost zero implementation time (less than 5-6 business days)
The other tasks of data storage, process aggregation, and data analysis should be handed off to advanced tools involving databases, historians, and data visualization/analysis platforms. One such solution is Simplyfeye's ProcessPad platform, as shown below:
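To illustrate this split of responsibilities, here is a minimal sketch that validates templated Excel recordings and appends them to a central database. The column names, sheet name, and table name are hypothetical examples for illustration only, not the ProcessPad schema; reading real workbooks with `pd.read_excel` additionally requires `openpyxl`:

```python
import sqlite3
import pandas as pd

# Columns every recording template is expected to carry (illustrative)
EXPECTED = ["batch_id", "step", "parameter", "value", "units", "recorded_at"]

def validate(df, source="workbook"):
    """Reject a template whose columns drifted from the agreed layout."""
    missing = set(EXPECTED) - set(df.columns)
    if missing:
        raise ValueError(f"{source}: template is missing columns {missing}")
    return df[EXPECTED]

def load_template(path, sheet="ProcessData"):
    """Read one templated recording workbook (requires openpyxl)."""
    return validate(pd.read_excel(path, sheet_name=sheet), source=path)

def aggregate(frames, db_path=":memory:"):
    """Append validated template data into a central SQLite table."""
    conn = sqlite3.connect(db_path)
    for df in frames:
        validate(df).to_sql("process_records", conn,
                            if_exists="append", index=False)
    return conn
```

Validation at ingest is what keeps "speed and flexibility" on the recording side from turning into spreadsheet madness on the storage side.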


(Write to us if you are looking to bring sanity to the process data management of your cell culture or fermentation-based bioprocesses. Our 100% browser-based solutions are very cost-effective, with extremely low implementation times. More about Simplyfeye products and solutions at http://www.simplyfeye.com)

Thursday, October 10, 2013

Bioprocess data landscape

Welcome to Simplyfeye Blog!


On this blog, the Simplyfeye team will post information regarding our products/solutions and our views on process data management and process troubleshooting in general. Simplyfeye develops products and solutions for the biopharma industry that enable process engineers and scientists to efficiently troubleshoot process issues and continuously improve processes.

The figure below shows the various data, information, and knowledge repositories, and how content flows (is generated and consumed) from one to another during the normal execution of a product batch in the commercial manufacturing life cycle. The information is maintained in systems owned by various functional groups such as process development, engineering, quality control, quality assurance, and manufacturing.


Currently this information is managed by various electronic systems (as shown in blue ovals in figure below) like:

  • ERP (Enterprise Resource Planning)
  • BMS (Building Management System)
  • MES (Manufacturing Execution System)
  • LMS (Learning Management System)
  • LIMS (Laboratory Information Management System)
  • QMS (Quality Management System) 


The "core activity" (shown in the yellow region below) of process development, technology transfer, and biomanufacturing operations still doesn't have any integrated system. Process troubleshooting and process improvement still lack a formal data system that caters to process analysis needs. This blog and Simplyfeye's products and solutions are primarily targeted at this space: efficient data management, process monitoring, and effective development/manufacturing of biopharmaceuticals.


Watch out for our posts!