Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

The Role of “Rapid Science” in Facilitating and Rewarding Collaboration in Biomedical Research

Sarah Greene, MS, Chief Executive Officer, Rapid Science, Brooklyn, NY;
Email: sg@rapidscience.org

Q: You are the leader of Rapid Science, an innovative initiative intended to speed up the process of research and the validation of new information, and its use in practice, especially in cancer. Can you briefly explain this effort for our readers, particularly the work focused on sarcoma?

A: It is now a truism that significant advances in this hypercompetitive era of personalized medicine require shared resources such as massive datasets and specialized expertise. This understanding has resulted in many more research grants for multi-institutional and interdisciplinary projects. Indeed, the term collaboration is so fashionable in the scientific literature and RFPs that it has become practically invisible, like echoes of “incentivizing” and “leveraging” on the SF-to-Palo Alto Caltrain. Consider Joe Biden’s “Make it a team sport. Collaborate… rapidly change a million lives” and Sean Parker’s goal to speed immunotherapy cures by forging collaborations.

A few years ago, I participated in two funded, multi-institutional research projects as executive director of Cancer Commons (attempting to fill the very large shoes of Dr. Lundberg). From that experience, and many others from the vantage point of my nonprofit startup Rapid Science, it became clear that massive funding, online platforms, and shared datasets do not by themselves engender true collaboration or the improved outcomes it is expected to deliver.
There is an abundance of literature pointing out that the current incentive system of ‘publish or perish’ has resulted in dysfunction in the research establishment. The pressure on scientists to publish in high-impact, elite journals often creates competition within research teams that have been funded to collaborate. Because there is no means of tracking, assigning provenance to, and rewarding the varied contributions of individual researchers, there is little incentive to share the early findings and insights that would further project goals; this failure to share is also largely responsible for today’s reproducibility crisis.

In science… what gets measured is what gets rewarded, and what gets rewarded is what gets done. (Michael Nielsen)

The Rapid Science team of scientific and medical advisors, together with a strong board of directors led by Drs. Bruce Alberts and Larry Marton, has conceptualized the C-Score to track and reward collaborative research. We intend it to serve as an antidote to publishing as the sole measure of the impact of scientists’ labors and as the primary arbiter of their careers.
The design and implementation of this algorithm have two requirements. First, a collaboration platform on which researchers can selectively share, discuss, and publish findings from a funded project, with software to track these activities. Second, editorial expertise to facilitate interaction among subgroups of the project and to reduce the burden that collaboration places on scientists’ time and attention.
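To make the idea concrete, here is a minimal, hypothetical sketch of how tracked platform activities might be rolled up into a single contribution score. The event types, weights, and function names below are illustrative assumptions only; Rapid Science has not published the actual C-Score algorithm.

    # Hypothetical contribution scoring from tracked platform activities.
    # Event types and weights are illustrative, not the actual C-Score.
    from collections import Counter

    WEIGHTS = {
        "dataset_shared": 5.0,
        "early_finding_posted": 3.0,
        "peer_comment": 1.0,
        "coauthored_preprint": 4.0,
    }

    def contribution_score(events):
        """Sum weighted counts of a researcher's tracked activities."""
        counts = Counter(events)
        return sum(WEIGHTS.get(event, 0.0) * n for event, n in counts.items())

    # Example: one shared dataset, two early findings, five comments -> 16.0
    print(contribution_score(
        ["dataset_shared"] + ["early_finding_posted"] * 2 + ["peer_comment"] * 5
    ))

Whatever the real weighting scheme, the point is that routine acts of sharing and discussion become countable, attributable events rather than invisible labor.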
The Rapid Science platform for multi-institutional research teams was launched this month. The system enables posting and discussing findings on a continuum from closed to open access, with each participant controlling when and how widely to share their findings. The members of our first pilot group, Sarcoma Central, are sharing patient data from their institutions with the goal of advancing early detection and new therapies for this rare disease, which comprises dozens of subtypes.
Strong editorial oversight is critical for facilitating meaningful discussion and for assisting in coauthoring and disseminating early findings as preprints and publications (e.g., null results, datasets, unfinished experiments, case reports, and posters that are not generally accepted in high-impact journals). In this scenario, subject experts, the PhDs who have traditionally served as gatekeepers for scholarly publications, work alongside the research team to ensure optimal trafficking of ideas and to orchestrate an internal process of “organic” peer review as findings are discussed and iterated in the closed environment. We believe this scrutiny results in far more reliable output for an open audience than does traditional peer review, which takes place only after the project has concluded.
Working with pilot groups, we will continue refining our tools and methods to test the hypothesis that realigned incentives will result in authentic collaboration, more reliable and valuable outcomes, and earlier open dissemination of research results. We are seeking additional funded research teams to participate and, most crucially, alliances with funders, administrators, and other stakeholders who currently adjudicate scientists’ careers.

Sarah Greene’s contact info is included in the author affiliations at the top of this page.
Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.