Over the past decade, numerous data sharing platforms have been launched, providing access to de-identified individual patient-level data and supporting documentation. We evaluated the characteristics of prominent clinical data sharing platforms, including the types of studies listed as available for request, the data requests received, and the rates of dissemination of research findings from data requests.

We reviewed publicly available information listed on the websites of six prominent clinical data sharing platforms: the Biological Specimen and Data Repository Information Coordinating Center, ClinicalStudyDataRequest.com, Project Data Sphere, Supporting Open Access to Researchers-Bristol Myers Squibb, Vivli, and the Yale Open Data Access Project. We recorded key platform characteristics, including listed studies and available supporting documentation, information on the number and status of data requests, and rates of dissemination of research findings from data requests (i.e., publication in peer-reviewed journals, preprints, conference abstracts, or results reported on the platform's website).

The number of clinical studies listed as available for request varied among five data sharing platforms: the Biological Specimen and Data Repository Information Coordinating Center (n = 219), ClinicalStudyDataRequest.com (n = 2,897), Project Data Sphere (n = 154), Vivli (n = 5,426), and the Yale Open Data Access Project (n = 395); Supporting Open Access to Researchers did not provide a list of Bristol Myers Squibb studies available for request. Individual patient-level data, rather than Clinical Study Reports alone, were nearly always reported as being available for request (Biological Specimen and Data Repository Information Coordinating Center = 211/219 (96.3%); ClinicalStudyDataRequest.com = 2,884/2,897 (99.6%); Project Data Sphere = 154/154 (100.0%); the Yale Open Data Access Project = 355/395 (89.9%)); Vivli did not provide downloadable study metadata. Of 1,201 data requests listed on the ClinicalStudyDataRequest.com, Supporting Open Access to Researchers-Bristol Myers Squibb, Vivli, and the Yale Open Data Access Project platforms, 586 (48.8%) were approved (i.e., data access granted). The majority of approved requests were for secondary analyses and/or the development or validation of methods (ClinicalStudyDataRequest.com = 262/313 (83.7%); Supporting Open Access to Researchers-Bristol Myers Squibb = 22/30 (73.3%); Vivli = 63/84 (75.0%); the Yale Open Data Access Project = 111/159 (69.8%)); four were for re-analyses or corroborations of previous research findings (ClinicalStudyDataRequest.com = 3/313 (1.0%); the Yale Open Data Access Project = 1/159 (0.6%)). Ninety-five (16.1%) approved data requests had results disseminated via peer-reviewed publications (ClinicalStudyDataRequest.com = 61/313 (19.5%); Supporting Open Access to Researchers-Bristol Myers Squibb = 3/30 (10.0%); Vivli = 4/84 (4.8%); the Yale Open Data Access Project = 27/159 (17.0%)). Forty-two (6.8%) additional requests reported results through preprints, conference abstracts, or on the platform's website (ClinicalStudyDataRequest.com = 12/313 (3.8%); Supporting Open Access to Researchers-Bristol Myers Squibb = 3/30 (10.0%); Vivli = 2/84 (2.4%); the Yale Open Data Access Project = 25/159 (15.7%)).

Across the six prominent clinical data sharing platforms, information on listed studies and request metrics varied in availability and format. Most data requests were for secondary analyses, and approximately one-quarter of all approved requests publicly disseminated their results.
To further promote the use of shared clinical data, platforms should increase transparency, consistently clarify the availability of the listed studies and supporting documentation, and ensure that research findings from data requests are disseminated.