The COVID-19 pandemic has shone a light on a range of challenges linked to quality management in life sciences manufacturing. On the one hand, public safety measures over the last 18+ months have put physical distance between team members – hampering the usual form-filling, manual sign-offs and Excel-based record-keeping associated with monitoring traditional manufacturing processes. And informal discussions at the watercooler, in which patterns of emerging problems might have surfaced, have simply not happened.
Increased practical barriers to quality assurance, added to the missed opportunities to spot and pre-empt issues along the supply chain using data analytics, have helped drive a renewed business case for intelligent, joined-up quality monitoring based on a single, global, real-time graphical view of all aspects of production.
Meanwhile, other parts of pharma organizations have seen first-hand the benefit of pre-emptive signal detection. This is most visible in pharmacovigilance, where smart systems present a department's best chance of accurately processing reams of incoming adverse event data and meeting deadlines, with the confidence that nothing critical will be missed.
Proactively monitoring potential manufacturing quality issues, product deviations, and process non-conformances, and establishing alerts for them, would be a logical next use case for the same kind of software solution.
The case for harnessing smart, real-time quality analytics is strong and growing. Particularly where artificial intelligence and machine learning are involved, the aim is to spot emerging patterns early, at the first sign of deviation or non-conformance. Issues might range from recurring equipment problems to varying impurity levels or product instability whose cause needs further investigation.
The Need for Continuous, Real-time Quality Monitoring
Up to now, the tendency has been to view quality monitoring as a compliance activity, linked to the regulatory requirement for a periodic Product Quality Review. Yet this approach does not invite continuous, real-time quality monitoring, nor the chance to stave off production-line issues before avoidable risks and costs are incurred. If issues do surface during preparations for a review, they are likely to be well established by then, requiring retrospective investigation to determine what went wrong, the likely root cause, the impact, and the remedial action now required.
This is a wasted opportunity, especially as much of the data to support more continuous and timely quality tracking is being gathered anyway – with a view to creating that annual review report at some later date.
The problem is that this data is not being amalgamated, compared or processed in the moment to produce actionable insights and trigger alerts.
Moving to a situation that enables continuous, active quality monitoring does not require a major upheaval. The main criterion is that systems are able to draw on data from across functional or departmental silos, so that deviation details, environmental data, complaint information, and CAPA records can be combined and cross-checked on the fly. Better still, analytics and reporting tools should be able to call on historical data too, allowing live comparisons to be made and enabling immediate smart signal detection wherever incoming data deviates from current parameters and past patterns.
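In its simplest form, this kind of smart signal detection compares each incoming reading against the historical pattern and flags outliers. The sketch below is purely illustrative: the metric, values, and three-sigma limit are assumptions, not a description of any specific product.

```python
from statistics import mean, stdev

# Hypothetical in-spec historical impurity results (%) and a live feed.
# All values and the 3-sigma limit are illustrative assumptions.
historical = [0.12, 0.11, 0.13, 0.12, 0.10, 0.12, 0.11, 0.13]
live_readings = [0.12, 0.13, 0.21, 0.12]

mu, sigma = mean(historical), stdev(historical)

def flag_deviations(readings, mu, sigma, z_limit=3.0):
    """Return readings that deviate from the historical pattern
    by more than z_limit standard deviations."""
    return [r for r in readings if abs(r - mu) > z_limit * sigma]

alerts = flag_deviations(live_readings, mu, sigma)  # the 0.21 reading is flagged
```

In a production setting the historical baseline would come from the centralized repository rather than a hard-coded list, and a flagged reading would raise a notification rather than simply being returned.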
All life sciences manufacturers are striving to be more effective and efficient, driving up quality without over-extending internal resources—and smart, real-time quality monitoring and reporting plays directly to this requirement.
Tapping into Existing Data for Immediate Returns
What’s more, such capabilities are much easier to implement in the current ‘cloud-first’, ‘platform-based’ enterprise IT environment. Here, adding new capabilities and use cases is often simply a matter of switching on additional features, or of enabling new user groups to draw on already-centralized, pre-integrated data, tailored for display and applied for their own particular purposes.
Given that, as already noted, much of the data already exists or is being captured, adding a smart analytics and reporting capability can deliver an immediate return—by saving resources and reducing waste, and ultimately by preventing a sub-standard product batch from leaving the production line.
Rather than conducting quality reviews and further investigations in hindsight, smart on-the-fly reporting gives manufacturing teams a chance to explore emerging issues and perform root-cause analyses in real time. If impurity levels are seen to exceed accepted norms, for instance, teams can swiftly move to determine whether the issue might be a variance in air humidity. This in turn might be traced back to a change of heating, ventilation and air conditioning (HVAC) system.
Amalgamating Data Sources
The starting point for the shift of emphasis towards continuous quality monitoring must be an amalgamation of data sources, ideally in a centralized, cloud-based repository that underpins multiple use cases. An integral master data source will support the ability to configure the analytics, viewing, and reporting experience to fit particular user requirements, and to set thresholds or parameters that trigger push notifications or alerts to the relevant people. The resulting efficiency, cost, and risk-reduction benefits make a compelling business case for smart, real-time quality analytics in life sciences manufacturing.