As the COVID-19 pandemic shows no sign of abating and healthcare providers struggle to find effective treatments, valuable information is accumulating in electronic health records (EHRs). Researchers turned to this information in observational studies early in the pandemic, when an alarm was raised that blood pressure medications based on renin-angiotensin system (RAS) inhibition posed a theoretical threat to COVID-19 patients. Studies that mined EHRs found no signal of harm to patients who continued to take these medications, and expert guidelines quickly reflected that finding.
Clinical trials are difficult to run during a pandemic, so information from EHRs and other forms of real-world data (RWD) is poised to play a more prominent role in the age of COVID-19.
Could this kind of information even be used to create virtual clinical trials? That’s a question the US Food and Drug Administration (FDA) has been grappling with since the passage of the 21st Century Cures Act in 2016.
That law requires the FDA to use real-world evidence (RWE) in its regulatory decisions, including approval of new indications for previously approved drugs. These new indications can gain approval through a more streamlined process because the agency assumes that the clinical trials for the original approval established the drug’s safety.
From data to evidence
In December 2018, the agency published a “Framework for FDA’s Real-World Evidence Program” that distinguishes between RWD and RWE: “RWD are data relating to patient health status and/or the delivery of health care routinely collected from a variety of sources. RWE is the clinical evidence about the usage and potential benefits or risks of a medical product derived from analysis of RWD.”
The framework says that RWD can come from a variety of sources, including EHRs, claims and billing activities, product and disease registries, patient-generated data, and data from sources such as mobile devices.
Randomized clinical trials (RCTs) remain the criterion standard for evaluating new drugs and treatments, but they have limitations:
1 Because of their necessarily small size, they can miss rare side effects. If a side effect occurs in only one patient in a thousand, there might be only two or three reports in an RCT with 3000 participants, but if the drug is approved and used widely, there may be hundreds of reports. Analysis of EHRs or insurance claims can often identify the association (see the calculation after this list).
2 Clinical trials take place under idealized conditions, with patients supervised to ensure they not only take the drug but do so at the correct dosage and on schedule. Patients left to themselves have lots of reasons for not taking their drugs as prescribed. RCTs can show how well a drug can work, but RWD can show how likely it is to work in the hands of the average patient.
3 RCTs lack diversity in enrollment, with people of color often underrepresented, a gap that is particularly troubling given the demographics of kidney disease. RWD, drawn from routine care across the full patient population, can help close that gap.
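The arithmetic behind the first point is easy to check with a short calculation; in the sketch below, the 500,000-patient figure for post-approval use is an illustrative assumption, not a number from any particular study.

```python
from math import comb

def prob_fewer_than(n: int, p: float, k: int) -> float:
    """Probability that fewer than k of n participants experience a side effect with rate p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

p = 1 / 1000                                                        # a 1-in-1000 side effect
print("expected cases in a 3000-person RCT:", 3000 * p)             # 3.0
print("chance the RCT sees only 0-2 cases:", prob_fewer_than(3000, p, 3))   # ~0.42
print("expected cases among 500,000 treated patients:", 500_000 * p)        # 500.0 (hypothetical post-approval population)
```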
Testing virtual RCTs
RCTs are also very expensive to conduct, so a first step the FDA is studying is whether RWE can be used to facilitate approvals of new uses for already approved drugs. A brand-new drug candidate has no track record, but after approval, can researchers sift through thousands, even millions, of patient records to find well-matched participants and run a virtual clinical trial?
The FDA has launched a pair of ambitious projects to test this possibility, according to David Martin, MD, associate director for RWE analytics at the FDA’s Center for Drug Evaluation and Research: “One such project, RCT Duplicate, is attempting to duplicate the results of recently completed clinical trials using RWE studies. Approximately 40 trials have been identified for potential duplication. Another 10 ongoing trials will be duplicated before the clinical trial results are reported. A separate project with the Yale-Mayo CERSI will attempt to duplicate several more trials using medical claims and electronic health record data. This work may increase or decrease confidence in the validity of noninterventional RWE, and it may also suggest which techniques are best aligned with different types of drug effectiveness questions.”
For the RCT Duplicate (https://www.rctduplicate.org/) project, the FDA has contracted with Boston’s Brigham and Women’s Hospital and with Aetion, an RWE analysis company founded in 2013, according to company cofounder Sebastian Schneeweiss, MD, ScD. Schneeweiss is also professor of medicine and epidemiology at Harvard Medical School and chief of the division of pharmacoepidemiology at Brigham and Women’s Hospital.
The first of the ongoing-trial predictions targeted the CAROLINA trial, an RCT comparing major adverse cardiovascular outcomes in patients with type 2 diabetes taking the dipeptidyl peptidase-4 inhibitor linagliptin versus patients taking the established sulfonylurea glimepiride. The challenge for Schneeweiss’ team was to mine insurance claims data and predict the trial’s result before it was published.
Schneeweiss and his team registered a protocol at clinicaltrials.gov, submitted their article to Diabetes Care months before the CAROLINA trial findings were published, and presented their predictions at the American Diabetes Association meeting shortly before the CAROLINA findings were unveiled.
The RWE findings were “spot on” compared with the RCT findings, Schneeweiss said. “We came to the conclusion that there is no difference in the cardiovascular risk between linagliptin and glimepiride, but we also found that there is a substantial benefit of linagliptin with regard to avoiding hypoglycemic events,” he said. “So that is the exciting example of how real-world evidence may work at its best.”
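How does such an emulation work under the hood? A common ingredient in claims-based comparisons of two drugs is propensity-score matching of new users on baseline characteristics, and the sketch below gives a minimal, generic illustration of that idea. It is not the RCT Duplicate protocol: the column names, covariate handling, and greedy 1:1 caliper match are assumptions made for the example.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def emulated_risk_difference(claims: pd.DataFrame, covariates: list) -> float:
    """Compare a binary outcome between new users of drug 'A' and drug 'B'
    after 1:1 propensity-score matching on baseline covariates."""
    treated = (claims["drug"] == "A").to_numpy()
    X = claims[covariates].to_numpy()

    # Propensity score: modeled probability of receiving drug A given the covariates.
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

    # Greedy 1:1 nearest-neighbor matching with a 0.05 caliper on the score.
    comparator_pool = list(np.where(~treated)[0])
    matched_a, matched_b = [], []
    for i in np.where(treated)[0]:
        if not comparator_pool:
            break
        j = min(comparator_pool, key=lambda c: abs(ps[i] - ps[c]))
        if abs(ps[i] - ps[j]) <= 0.05:
            matched_a.append(i)
            matched_b.append(j)
            comparator_pool.remove(j)

    # Difference in outcome risk (e.g., major adverse cardiovascular events)
    # between the matched exposed and comparator cohorts.
    outcome = claims["outcome"].to_numpy()
    return outcome[matched_a].mean() - outcome[matched_b].mean()
```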
Dangers of misuse
It is not difficult, however, to find examples of RWE not at its best, when researchers use RWD to reach questionable conclusions. The goal of the CVD-REAL study, for example, was to use RWD to extend the findings of the clinical trials of sodium-glucose cotransporter-2 (SGLT2) inhibitors in patients with type 2 diabetes by comparing the inhibitors with “other glucose-lowering drugs.” The CVD-REAL study claimed that SGLT2 inhibitors were associated with a decrease in all-cause mortality that was inconsistent with the findings of the RCTs and so large as to be “unrealistic,” Schneeweiss said. An outside analysis of the results contended that the mortality discrepancy could have arisen from researchers miscounting the SGLT2 patients’ survival time in an effect known as immortal time bias (also known as survivor treatment selection bias), in which person-time during which the outcome cannot occur is credited to the treated group, making the treatment appear protective.
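A short simulation shows how immortal time bias can manufacture a survival advantage out of nothing. The sketch below is a generic illustration, not a re-analysis of CVD-REAL: the simulated drug has no effect on mortality, yet an “ever-treated” analysis that counts person-time from cohort entry makes it look protective, and reallocating the pre-treatment person-time removes the artifact.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000
horizon = 3.0                                  # years of follow-up

# Null scenario: the drug has NO effect; everyone shares the same mortality hazard.
death = rng.exponential(scale=5.0, size=n)     # time from cohort entry to death
start = rng.exponential(scale=1.0, size=n)     # time at which the drug would be started
intended = rng.random(n) < 0.5                 # half the cohort is prescribed the drug
treated = intended & (death > start) & (start < horizon)   # actually starts the drug

end = np.minimum(death, horizon)               # end of observed follow-up
died = death <= horizon

# Naive "ever-treated" analysis: all follow-up of treated patients counts as exposed,
# including the pre-treatment time they had to survive to receive the drug at all.
biased_rr = (died[treated].sum() / end[treated].sum()) / (
    died[~treated].sum() / end[~treated].sum())
print(f"biased rate ratio:    {biased_rr:.2f}")    # well below 1.0: a spurious benefit

# Corrected analysis: pre-treatment person-time is reallocated to the unexposed group,
# and only deaths and time after the drug is started count as exposed.
exposed_rate = died[treated].sum() / (end[treated] - start[treated]).sum()
unexposed_rate = died[~treated].sum() / (end[~treated].sum() + start[treated].sum())
print(f"corrected rate ratio: {exposed_rate / unexposed_rate:.2f}")   # close to 1.0
```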
One of the RAS blood pressure medication studies illustrates an even more insidious potential danger. The New England Journal of Medicine retracted a study that claimed to be an analysis of a large database when the company that provided the alleged dataset would not release the raw data to third-party auditors. Schneeweiss said the situation “is an illustration of what happens if nonexperts who don’t understand how you check whether data are real or fake” are involved and of the need for transparency. Aetion’s contracts with the FDA allow the agency to check Aetion’s data, reproduce Aetion’s findings, and do its own repeated analyses with different assumptions.
Well suited for nephrology
In a recent Perspective article in CJASN on “EHR-Based Clinical Trials” (1), Khaled Abdel-Kader, MD, of Vanderbilt University, and Manisha Jhamb, MD, MPH, of the University of Pittsburgh, state that “the field of nephrology is uniquely well suited to conduct EHR-based research” because so much relevant information is available in EHRs. They point to the high prevalence of kidney disease, the use of routinely collected biomarkers to detect acute and chronic kidney disease, the routine capture of kidney disease risk factors in EHRs, and the use of a common EHR product by the organizations that provide dialysis services.
The authors write that the “nephrology community has been vexed by the dearth of RCTs” but that RWD could be used to overcome this deficit. They note that the Isotonic Solutions and Major Adverse Renal Events Trial used EHR-based enrollment to include nearly 16,000 patients in under 2 years. “EHRs can provide a powerful platform to enroll and randomize patients and deliver interventions, thus lowering the cost and enhancing the feasibility of conducting a clinical trial,” they write.
Abdel-Kader told Kidney News that the study by Schneeweiss’ team represents a cardiovascular disease “safety study, not a study to examine the effectiveness of a medication. It is an observational cohort study that hopes through rigorous methods to create results that are akin to a trial. I’m skeptical that observational data will be useful for providing FDA-worthy data on effectiveness (i.e., not just noninferiority or safety) in the near term. My opinion is we are many years from fruition re: using observational data to ‘recreate virtual RCTs’ and for FDA labels of effectiveness (with the exception of rare diseases).”

Another example of RWD in kidney care comes from a Nature Medicine article by researchers who mined the records of more than 400,000 patients with type 1 and type 2 diabetes in the IBM Explorys database to produce a better model for predicting diabetes-related chronic kidney disease. The authors contend that their model, based on seven factors (age, body mass index, GFR, and concentrations of creatinine, albumin, glucose, and hemoglobin A1c), “outperforms published algorithms, which were derived from clinical study data.”
The actual utility of the model remains to be seen, but the huge dataset behind it and its reliance on seven measurements easily found in EHRs illustrate the point that nephrology is particularly well suited for the application of RWD.
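As a rough illustration of how those seven routinely captured fields could feed such a risk model, the sketch below trains a generic classifier on them. The column names, the gradient-boosting model, and the train/test split are assumptions for the example; this is not the published Nature Medicine algorithm.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# The seven factors named above, as hypothetical EHR column names.
FEATURES = ["age", "bmi", "gfr", "creatinine", "albumin", "glucose", "hba1c"]

def fit_ckd_risk_model(ehr: pd.DataFrame):
    """Train and evaluate a model predicting incident diabetes-related CKD
    (a 0/1 'incident_ckd' label) from routinely collected EHR measurements."""
    X, y = ehr[FEATURES], ehr["incident_ckd"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0, stratify=y)
    model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    return model, auc
```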
The FDA is hoping for more engagement from nephrologists in this effort. “It is incumbent that the nephrology community participate in the discussion on the use of real-world evidence and real-world data in drug development, learn from the experiences in other disease areas, and continue to conduct and learn from its own pilot projects,” Aliza M. Thompson, MD, MS, and Mary Ross Southworth, PharmD, of the FDA’s Center for Drug Evaluation and Research, wrote in a CJASN Perspective article (2). “There is widespread recognition that there will be a learning curve and that demonstration projects will play a critical role in defining how and when real-world data and real-world evidence can be used.”
That learning process is clearly under way.
References
1. Abdel-Kader K, Jhamb M. EHR-based clinical trials: the next generation of evidence. Clin J Am Soc Nephrol 2020; 15:1050–1052. doi: 10.2215/CJN.11860919
2. Thompson AM, Southworth MR. Real world data and evidence: support for drug approval: applications to kidney diseases. Clin J Am Soc Nephrol 2019; 14:1531–1532. doi: 10.2215/CJN.02790319