Clinical Trials: Decentralization and Digital Chaos
Why modern systems for efficient data management are essential
Katrina Rice | 5 min read | Opinion
Decentralized trials (DCTs) are not a novel concept, but the disruption of nearly 68 percent of clinical trials during the height of the pandemic led to more widespread interest in and adoption of hybrid and virtual trial models. Studies have shown that DCTs can lead to shorter development cycle times, lower clinical trial screen failure rates, and fewer protocol amendments. With wider DCT adoption, however, comes an increase in data sources (particularly external data sources), leading to a surge in data volume. Good data management is crucial to control this. From acquisition and analysis to data cleaning and statistical processes, data managers are the stewards – responsible for guiding a modern data strategy amid the perfect storm of growing data complexity, digitization initiatives, and ever-increasing pressure to accelerate timelines.
In the past, data management was often siloed, focused on cleaning and querying listings of electronic data capture (EDC) data. However, non-EDC and external data sources now contribute significantly to overall data volume. The percentage of data coming from outside EDC continues to rise, while the growth of outsourced models has shifted the data management role toward oversight.
Welcome to the age of data chaos
As DCTs become more widely adopted and as the volume of disparate data continues to grow, data management processes will become even more complex. An Industry Standard Research survey in 2019 revealed that 38 percent of pharma and contract research organizations anticipated DCTs making up a large portion of their portfolios, and 48 percent expected trials to operate with the majority of activities taking place in participants’ homes. When the same questions were revisited only one year later, all of the respondents anticipated that decentralized trials would make up a significant portion of their research portfolios.
Although remote participation appeals to patients, it results in even greater data source volume and variety, which is difficult for clinical trial teams to manage. Research from the Tufts Center for the Study of Drug Development in 2019 found that 75 percent of life sciences organizations were still using SAS and Excel to integrate and analyze data. Over 80 percent of respondents reported that data management activities were time consuming and labor intensive.
The same study also found that over two-thirds of clinical trial sponsors were using or piloting at least four types of data. The number of sources has nearly doubled since then and will continue to rise as DCT models are widely operationalized. The study also reported a 40 percent increase in last patient, last visit (LPLV) to database lock cycle times for companies with five or more data sources, concluding that contending with disparate data sources was contributing to longer database lock cycle times. In our services organization, trials frequently average eight or more data sources – and many include over 15.
The trends that contributed to the Tufts study findings have only accelerated since the onset of the pandemic, which means one thing: data chaos. If the industry doesn’t adopt new approaches, data management will only get more challenging.
The modern approach to clinical trial data management
Identifying and creating a data strategy roadmap in the midst of these growing pains can present a challenge, but it is essential if you want your organization to be ready to face the future. The increased adoption of virtual and DCT approaches to clinical trials necessitates a balance between using advanced solutions that connect trials with a greater number of patients and maintaining efficient, high-quality data review and analysis. Improving the overall patient experience is a motivating factor for DCTs, as is easing the burden of traveling to and from sponsor sites. The problem is that many organizations lack the infrastructure to accommodate the shift.
Operational leaders plagued by oversight and monitoring challenges in DCTs need methods to streamline and standardize data from increasingly non-traditional sources. Fortunately, there are now a number of data solutions available in the industry that can help. Below are just two examples of how companies are using data management platforms to manage more external data streams.
With an increasing volume of external data streams, Bristol Myers Squibb (BMS) sought out a data management platform that would integrate with and support its current EDC platform, while also supporting data curation and aggregation. Implementing the platform streamlined clinical data flow, acquisition, mapping, and standardization – giving downstream teams faster access to clean data. The platform alleviated pain points experienced with BMS’ previous infrastructure by compiling all data into a unified source, giving the company the ability to create cross-study analytics reports for deeper insights.
A second example: Karyopharm Therapeutics worked on randomized clinical trials with hospital patients suffering from severe COVID-19 – the first study of an XPO1 inhibitor in patients with viral infections. To support rapid data collection, cleaning, and review for this program, Karyopharm partnered with a data management platform provider, working closely to build a fully validated database to collect data from physicians and patients in just 15 days. This accelerated timeline enabled Karyopharm to meet the first patient milestone in its critical research initiative.
Cloud-based centralized data management platforms allow clinical trial teams to manage their data more efficiently, control costs, minimize delays, and improve cycle times. Put simply, cloud-based platforms modernize data infrastructure by compiling data sources into a unified source of truth. By implementing a cloud-based platform – with expert data configuration, management, and statistical analysis – some companies saw up to a 50 percent decrease in cycle time from LPLV to database lock in 2021.
In short, the right data management tools facilitate data transformation, delivering consistent real-time updates and allowing researchers to analyze data faster and uncover insights needed for critical decision making. Moreover, identifying areas and opportunities to pivot earlier in the trial process can help prevent avoidable delays. To keep up with the evolving clinical trial landscape, companies must employ a modern strategy, which requires three core elements: an interoperable approach to DCTs and other non-traditional trial models, investment in resources to ease the data management burden, and the ability to generate meaningful insights from a good data platform.