Updates Ready: Reboot Necessary
Clinical trials are now more complex than ever, and technology has not kept pace. Though a shift to digital decentralization has begun, all too often the solutions are just stitched onto already underperforming legacy systems.
Temitope Keyes | Opinion
It is news to no one in the industry that we have a history of bolting together disparate systems to eke out new functionality and meet emerging trial needs. The resulting patchwork has evolved and expanded over time, but it was never designed as a functional, cohesive whole. As such, trial data and information are, unsurprisingly, not always properly integrated. Add cascading data from high data-load sources, such as wearables, biomarker labs, and electronic patient-reported outcomes, and it falls to clinical operations and data management staff to connect the dots across this vast, unwieldy matrix.
Unsurprisingly, clinical teams compensate by creating workarounds, the most common of which is pulling extracts into Excel. In an era of inflating data volume and complexity, such practices delay the availability of data and, with it, critical trial decisions. Sponsor teams are denied the benefit of a “single source of truth,” which limits patient centricity and erodes investments in upstream analysis tools by slowing reporting and visualization. We need to accept that many of these individual systems are aging and no longer fit for purpose.
Regulators commonly support the use of eSource (direct data capture), adaptive trials, and new risk-based approaches and their supporting technologies, yet parts of the industry remain slow to embrace them. Analytics is rarely supported or even considered, which limits programming options and passively preserves a cumbersome, suboptimal process. New systems must provide effective, user-friendly visualization of trends across patients, sites, and trials. They must support multi-modal monitoring of trial conduct and provide performance metrics against key indicators – even for the most complex trials. Ultimately, the objective is to increase data quality, improve patient safety, and enable rapid data aggregation and decision-making.
In my view, sites need the opportunity and the flexibility to update their procedures and adopt new technology options, such as trial virtualization and central data monitoring. For example, instant data visualizations available 24/7 would increase clinical awareness and improve patient safety and oversight. Decentralized clinical trials (DCTs) require a single technology platform that improves efficiency through practices such as data standardization and training in data transformation processes. It’s best to look for a clinical trial platform that offers multi-modal, centralized data collection to help standardize quality across sites, and that enables full traceability of data for auditing purposes. Whichever system you choose, ensure that disparate data can be easily integrated and made immediately available for review and decision-making; sponsors will then be able to act more quickly and make optimal decisions for the clinical program, for compliance, and for the safety of patients. With the right technology in place – including the power of big data, artificial intelligence, machine learning, and natural language processing – sponsors can realize their goals of running flexible, dynamic DCTs. A radical digital transformation has long been a vision for clinical trial leaders.
Moving clinical trials forward in our new digital reality is not about loosely linked point solutions or so-called unified solutions. It is now vital to have a data strategy built around technology platforms that can truly support sponsors’ current and future health innovation – no matter where the sites or the patients might be.