Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

Fibrolamellar Hepatocellular Carcinoma: Still Rare but Deadly

John R. Craig, MD, PhD, retired pathologist and former Medical Director, St. Jude Cancer Center, Fullerton, CA; Member, Board of Directors, FibroFoundation

Q: You were the lead author in a 1980 paper in Cancer that clearly delineated an unusual form of Hepatocellular Carcinoma that you termed “Fibrolamellar Carcinoma”. Now, 37 years later, what insights of importance can you share about this unique malignancy?

A: CRISPR/Cas9 technology, fruit flies, mice, and zebrafish are among the tools being used in numerous academic laboratories, encouraged by the Fibrolamellar Foundation, to determine whether the 400 kb deletion on chromosome 19 found in 90% of patients with Fibrolamellar Hepatocellular Carcinoma (S. Simon, PhD, Rockefeller University) is a driver mutation.
In 1980, with renowned liver pathologist co-authors Hugh Edmondson and Robert Peters, I compiled and published the first large series of Fibrolamellar Hepatocellular Carcinoma (FL-HCC) cases in the journal Cancer.
This rare cancer has an annual detection rate of approximately 100-200 patients in the USA and occurs primarily in young adults 15-30 years of age.
After our publication, we received many consultations from pathologists who were eager to share patient information and observations. Over the next 25 years, additional publications described unusual findings, such as increased serum vitamin B12-binding globulin, and other tumor markers, such as des-carboxy prothrombin and plasma neurotensin. Unfortunately, none of these observations improved either detection of the tumor or its treatment.
Some patients are cared for at academic medical centers, but neither chemotherapy nor radiation treatment has been found to be useful. Complete surgical resection offers the best hope but is usually performed late in the course of disease, since these young patients often appear to be in good health and have few symptoms.
In recent years, these young patients often connect by social media and have developed Facebook pages. They communicate about their disease, their suffering, treatment options, and acceptance of their disease. There is an annual fall meeting of patients and families to share their experiences.
In 2008, the family of one young patient, Tucker Davis, answered their son's plea and, in his honor, established the Fibrolamellar Foundation with the goal of finding a cure.
The mutation described above is the result of a fusion of the first exon of DNAJB1 with exons 2-10 of PRKACA. This mutation results in a functional chimeric protein, DNAJ-PKAc, which is highly expressed in almost all FL-HCCs. Little is understood about how the mutant PKA kinase may drive cancer formation.
Numerous academic laboratories are developing models to study this genetic deletion and to learn how it changes the hepatocyte and promotes metastasis. The hope is that a treatment may be discovered by interrupting the effect of this mutation within the malignant cells.
Protein kinases are involved in complex intracellular signaling governing cell proliferation, motility, angiogenesis, anti-tumor immune reactions, and other functions. The human genome encodes roughly 518 kinases, but the functions of most are not understood. Small-molecule kinase inhibitors are already used in current cancer treatment for chronic myelogenous leukemia, acute lymphoblastic leukemia, and several cancers of the lung and breast. However, no inhibitor is yet known for the mutant kinase in FL-HCC.
In our initial article, we suggested a possible etiology related to modern industrial life, such as exposure to pesticides or other chemicals. But a recent search of old records at a large academic referral center identified patients with this cancer long before 1940. Thus, modern industrial contamination may not explain tumorigenesis.
Similar to many other organizations representing rare diseases, the Fibrolamellar Foundation is a philanthropy that has encouraged collaboration among major academic centers and scientific research meetings that bring together diverse scientists to discuss the models and consider investigations. Ultimately, collaborative clinical trials will be necessary because this malignancy is so rare.
We believe that collaborative research among experts in diverse fields who share data and concepts will be needed to understand how this chimeric protein drives the cancer and, ultimately, to produce an effective treatment.
Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

A Proposed New FDA Drug Approval Pathway: “Conditional”

 

Al Musella, DPM, President, Musella Foundation For Brain Tumor Research & Information, Inc., Hewlett, NY. Marty Tenenbaum, PhD, Founder and Chair, Cancer Commons, Los Altos, CA

Q: The delay from discovery and observation, through validation, to approval and distribution of new cancer treatments remains excessive. With promising experimental treatments, advanced computer technology and biostatistics, creative alternatives to traditional randomized clinical trials, and a government seeking efficiencies, might it now be time for the FDA to issue “Conditional Approvals”?
A: The first advances in oncology occurred at a time when there were no regulations. Doctors had ideas, and put them to work immediately. They adjusted and combined treatments as needed until they were optimized and became standard treatments. Many types of cancer were cured by this work.
Unfortunately, for patients with glioblastoma, pancreatic cancer, and other rarer cancers, the prognosis remains dire: average survival with currently approved treatments is less than two years. These patients cannot afford to wait a decade or more for new drugs to be approved.
The good news is that the oncology drug development pipeline is full of promising targeted and immune therapies that have already demonstrated safety and at least some evidence of effectiveness. However, under current regulations, it will take years before the average patient can get access to these potentially life-saving treatments. Moreover, it is likely that a cure will involve intelligent combinations of treatments, yet under current regulations, combination testing cannot even begin until the component drugs are approved. And what if a new treatment were not effective as a monotherapy but could be an essential component of a multi-drug cocktail, say to block a resistance pathway? Catch-22. Under current regulations, many good ideas will never get to patients, and those that do get approved have to be priced so high that many patients cannot afford them.
We would like to propose a new pathway to FDA approval, the “Conditional Approval,” that addresses these issues. It would allow the FDA to approve a treatment that shows safety and a biological effect in a small group of patients. The twist is that it would require patients using these drugs to participate in a registry in which their doctors submit details on the treatments used, side effects, and outcomes.
Conditional approval would be granted to treatments that have been proven safe in one or more clinical trials with at least 25 patients and have demonstrated biologic activity: an improvement in a biomarker, brain scan, progression-free survival, or overall survival.
Once approved, the treatment could be offered as if it had a standard approval, and could not be denied by insurance as being “experimental”. However, all patients who use a conditional treatment would be required to participate in a registry for the duration of the conditional approval period, and to sign a consent form acknowledging and agreeing to the risks inherent in undergoing a treatment whose safety and efficacy have not been fully tested.
The FDA would conduct periodic reviews of the registry data, with three possible determinations:

  1. If safety is questionable or the results look worse than those of standard treatments, conditional approval would be withdrawn, and the manufacturer could continue on the standard approval pathways. The FDA could not hold these registry results against the standard approval tracks, because the patient population was not controlled and patients were combining other treatments with the drug.
  2. If the results look at least 20% better than those of standard treatments in the first 50 patients over a predetermined period of time, full approval is granted.
  3. If the results are similar to those of standard treatments, conditional approval is maintained until a later review shows the treatment is either good enough for full approval or poor enough to withdraw approval.
The decision to try a conditionally approved drug, alone or in combination, would be up to treating physicians, who could consult with peers through a network linked to the registry or use a decision-support app that exploits the registry as a database (e.g., show me all treatments and combinations that have been tried on similar patients, sorted by effectiveness, cost-effectiveness, risk/benefit ratio, cost, or side effects). Such apps could also support low-cost, point-of-care “registry trials,” in which patients are dynamically assigned to treatment arms based on expert recommendations and clinical outcomes for similar patients.
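To make the registry-backed decision support concrete, here is a minimal sketch of the kind of query such an app might run. Everything in it is an illustrative assumption: the record fields, the ranking by median reported survival, and the toy data do not come from any existing registry.

```python
# Hypothetical sketch of a registry query behind a decision-support app.
# The schema and ranking criterion are assumptions for illustration only.
from dataclasses import dataclass
from statistics import median

@dataclass
class RegistryRecord:
    diagnosis: str
    treatments: tuple           # drugs or combinations actually used
    survival_months: float      # outcome reported by the treating physician
    max_side_effect_grade: int  # CTCAE-style grade 1-5

def rank_treatments(records, diagnosis):
    """Group records for similar patients by treatment combination and
    rank combinations by median reported survival."""
    by_treatment = {}
    for r in records:
        if r.diagnosis == diagnosis:
            by_treatment.setdefault(r.treatments, []).append(r)
    ranked = []
    for treatment, group in by_treatment.items():
        ranked.append({
            "treatment": " + ".join(treatment),
            "patients": len(group),
            "median_survival_months": median(g.survival_months for g in group),
            "worst_side_effect_grade": max(g.max_side_effect_grade for g in group),
        })
    return sorted(ranked, key=lambda row: row["median_survival_months"], reverse=True)

# Toy usage with made-up records:
records = [
    RegistryRecord("glioblastoma", ("drug A",), 14.0, 2),
    RegistryRecord("glioblastoma", ("drug A", "drug B"), 21.5, 3),
    RegistryRecord("glioblastoma", ("drug A", "drug B"), 18.0, 2),
]
for row in rank_treatments(records, "glioblastoma"):
    print(row)
```

In practice such a query layer would sit behind clinician-facing tools and registry trials rather than a flat list of records, but the ranking idea is the same.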
It is painfully obvious that the way to cure our currently incurable cancers is to use a combination approach. We may well have the necessary tools available today, but we are not allowed to use them. When faced with certain death, we believe it is acceptable not to have 100% proven safety and efficacy. We will be approaching the FDA with a request to pilot conditional approval in brain cancer, because life-and-death decisions should not be made based on regulations; they should be based on what is best for the patient, as determined by the patient and his or her doctors.
We have been working on this plan for a while, but we think now is the time for it to actually be approved. Everything is coming together, like a perfect storm:

  1. We finally have a few experimental treatments in the pipeline that look really good.
  2. The new President is cutting regulations and calling for faster FDA approvals and lower drug prices.
  3. Computer technology and biostatistics have reached the point where our plan for a registry trial can be just as reliable as traditional phase 3 trials – maybe more so.

IF this proposal is put into effect, it could lead to rapid advances in the treatment of brain tumors, and the possibility of a breakthrough cure in a few years, instead of the decades it would take on the current path.
We need your support and will be reaching out in a few weeks for help with writing letters and making phone calls. Meanwhile, we welcome your thoughts.
Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
 

Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

AACR: Advances in Immunotherapy to Continue

Srivani Ravoori, PhD, Associate Director, Science Communications; American Association for Cancer Research

Intro: The American Association for Cancer Research (AACR) publishes a forecast blog post at the start of each year, asking prominent cancer research leaders what new developments they envision in areas such as immunotherapy, precision medicine, cancer prevention, and health disparities.

In this excerpt from the 2017 post in Cancer Research Catalyst, we interviewed immunotherapy expert Elizabeth Jaffee, MD, on her views on what might develop in immunotherapy this year. Dr. Jaffee is the Dana and Albert “Cubby” Broccoli Professor of Oncology and Professor of Pathology at Johns Hopkins University School of Medicine and the Sidney Kimmel Comprehensive Cancer Center at Johns Hopkins. She is also the Associate Director of the Bloomberg~Kimmel Institute for Cancer Immunotherapy at Johns Hopkins. Dr. Jaffee was recently named the President-Elect of AACR for 2017-2018.

Q: The American Association for Cancer Research (AACR) is arguably the world’s most important professional organization of volunteers in the cancer field. As we enter 2017, what does AACR consider the field’s greatest challenges and opportunities?

A: “The good news in the field of immunotherapy is that we are learning a lot more about the signals that tumors send to inhibit an effective immune response against them,” says Jaffee, who is a past board member of the AACR. We have already turned this knowledge into therapeutics that inhibit some of these signals (checkpoint inhibitors) so the T cells can be effective in attacking the cancer cells, and we are developing therapeutics that can activate certain other cells within the tumor microenvironment (checkpoint agonists) to help further activate the T cells, she says.
With these approaches, we have been able to convert some metastatic cancer patients who had only weeks to live into patients with chronic disease and a better quality of life, Jaffee adds. In 2017, we are going to see checkpoint inhibitors approved for more cancer types and as first-line treatment for some cancers, she notes.
The bad news, however, is that these drugs work in only about 20 to 25 percent of all cancers. Further, these drugs can unleash autoimmunity in patients who respond. The side effects can currently be controlled with steroids in some patients, but this year we will learn more about ways to deliver these drugs in a more targeted manner to circumvent the toxic side effects, Jaffee says.
“In 2017, I expect to see the development of new drugs that target additional immune checkpoints,” says Jaffee. One reason why almost 70 percent of cancers do not respond to checkpoint inhibitors is that the cancer cells exploit other inhibitory pathways that affect T-cell function. Therapeutics targeting immune-evasion mechanisms other than the PD-1/PD-L1 checkpoint, such as IDO, CD40, OX40, TIM-3, LAG-3, and KIR, are already in early development. We will see them progress through clinical testing, alone or in combination with PD-1/PD-L1 drugs, and some of them may be approved or come close to approval this year, Jaffee predicts.
This year, we will also see a lot of preliminary data identifying new biomarkers of immunotherapy response, according to Jaffee.

[Embedded media: “AACR Immunotherapy 2017: What Advances Can We Expect?”]
Other approaches to get more patients to respond to immunotherapies include activating T cells using vaccines, radiation therapy, or different types of immune-activating chemotherapies, Jaffee says. Combining immune checkpoint inhibitors with agents that can help uncover cancer antigens, such as PARP inhibitors that can make new tumor antigens available to the T cells, or epigenetic agents that can turn on the expression of certain proteins, is another avenue. “We will start seeing results from such studies this year,” Jaffee notes.
We are likely to make more progress this year in personalizing cancer treatment with vaccines, Jaffee predicts. “We are starting to understand the importance of neoantigens for targeting by the immune system,” Jaffee notes. Tumors of many patients who respond to immunotherapy create neoantigens constantly. If we can identify them by sequencing the tumors, we can develop vaccines against them to jump-start the immune system, she says. “We are going to see several clinical trials trying this approach this year.”
As a member of the Blue Ribbon Panel, Jaffee says, one of the 10 areas the panel identified as poised for major progress is basic research to better understand the mechanisms behind immunotherapy response. Answers to questions such as “What makes a pancreatic cancer that doesn’t respond to immunotherapy different from melanoma that responds to immunotherapy?” or “Why do some tumors that have the biomarker of response not respond while some that do not have the biomarker respond?” or “How can CAR T-cell therapy be made to work in solid tumors?” can only be found by pursuing more basic science research, she notes.
“We have the technology to find answers to many basic research questions and there is excitement among academia, industry, and federal agencies to work together; however, we need more funding to pursue such important studies,” says Jaffee. While she is concerned about the uncertainty regarding the scientific priorities of the new administration, she is cautiously optimistic.
Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

A Fully Integrated Histo-Molecular Pathology Report

Margaret L. Gulley, MD, Professor of Pathology and Laboratory Medicine, University of North Carolina at Chapel Hill

Q: The CAP, ASCO, ASCP and AMP have developed guidelines for interpretation and reporting of NGS variants in cancers. These analyses are performed in a wide range of types of labs and involve professionals from many disciplines. The reports should take into consideration the bioinformatics, molecular data, source of specimen, age, gender, and location of patient, treatment background, morphologic diagnosis, immunohistochemical tumor profile, and special stains. Who is the best qualified and positioned person to consolidate all of this information and produce a final integrated pathology report and how should this be accomplished?

A: Pathologists are the best professionals to synthesize data from all the tests done on a given tumor specimen via an integrated report that is actionable for downstream medical decision-making.
Hematopathologists enthusiastically embrace new technologies, and they serve as role models for integrating data from microscopy, flow cytometry, histochemistry (IHC, ISH, FISH), karyotyping, PCR, and sequencing.
College of American Pathologists guidance for reporting molecular test results suggests that all results on a given tissue be synthesized by one pathologist, typically the histopathologist, although increasingly the molecular pathologist who performs the ancillary genomic testing. Since molecular results are best interpreted in the context of the histomorphology of the input tissue (e.g., percent malignant cells) and of the clinical dilemma the test is meant to solve, good communication is essential to assuring that the professionals selecting the tissue block, performing the test, and interpreting the results answer the pertinent medical questions. Access to patient medical records promotes high-quality interpretation of histo-molecular findings.
Pathologists are in the best position to allocate precious (often small) specimens and to prioritize which ancillary tests are most critical in a given clinical scenario. Certain tests are feasible only on selected specimen preparations (e.g. karyotype requires fresh tissue), and the pathologist is vital to assuring that tissue is processed in a manner that maximizes success.
In many cases, ancillary tests are ordered by the surgical pathologist, who feels comfortable synthesizing the test results with the histomorphology even if different professionals (e.g., cytogeneticists, molecular pathologists) performed the test and interpreted the raw data. What about ancillary tests ordered by clinicians, sometimes months or years after the microscopist issued the histopathologic interpretation? Specimen requirements must be considered (e.g., fixative type, size, percent malignant cells), along with whether to test primary versus metastasis, a recent biopsy versus an older or larger resection, in situ versus invasive components, and so on. Better compensation is needed for the expert work the surgical pathologist performs in understanding the clinician's request and in retrieving and selecting the best archival tissue for the test. Compensation is also needed for the resources required to incorporate test findings into a revised integrated report.
Ancillary tests have benefits and limitations that are best understood by the testing laboratory, so it may be realistic for the testing laboratory to synthesize data from the surgical pathology report, rather than expecting the surgical pathologist to integrate lab data. When a histopathologist feels uncomfortable performing an integrative interpretation of lab data, the histopathologist should at least make it clear in their (addendum) pathology report what test(s) were ordered on which block in order to facilitate work by another professional in synthesizing findings and minimizing accidental repeat testing.
The criteria pathologists use for tumor diagnosis and classification are evidence-based and are updated periodically by various professional groups. Increasingly, ancillary tests are value-added components of standard-of-care surgical pathology workups. Since every patient is different and every tumor is different, pathologists should be valued for their expert judgment and for taking responsibility for translating histo-chemical findings into expert consultations.
Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

Paying for Precision Oncology – Who Decides?

Patricia Deverka, Principal Researcher, Health Care Group, American Institutes for Research, Chapel Hill, NC; Adjunct Associate Professor, Center for Pharmacogenomics & Individualized Therapy, University of North Carolina at Chapel Hill

Q: Paying for Precision Oncology – Who Decides?
A: There is tremendous enthusiasm for the scientific rationale and clinical promise of precision oncology among researchers, oncologists, industry, and subgroups of cancer patients. Nearly three-quarters of the compounds in the oncology pipeline have the potential to become personalized medicines, and over 90% of the $25 billion personalized medicine market in 2015 came from the sale of oncology drugs that require use of a companion diagnostic, such as Herceptin and HER2 testing. The typical companion diagnostic analyzes single gene mutations or abnormal gene expression profiles, in contrast to next-generation tumor sequencing (NGTS), which assays genomic alterations in tens to hundreds of genes simultaneously. Tumor profiling using NGTS now occurs frequently at major cancer centers across the U.S.; however, reimbursement for both the test and the off-label use of targeted therapies is unpredictable and challenging for most molecular tumor board staff, oncologists, and their patients. Without coverage by either private or public insurers, most patients will not have access to NGTS tests and targeted therapies.
Payers typically use a stepwise approach for coverage decision-making when evaluating the evidence supporting the technical and clinical aspects of new molecular diagnostic tests. Based on published studies, technology assessments and professional guidelines, payers assess:
1) analytic validity – whether a new test is accurate and reliable,
2) clinical validity – if the result is medically meaningful, and
3) clinical utility of the test – whether results affect clinical decisions and improve health outcomes.
Payers are one of the key stakeholder groups that require evidence of clinical utility for decision-making; however, clinical utility cannot be demonstrated until analytic and clinical validity are established. Compared to “single test/single result” assays, each of these validation steps is inherently more difficult for payers to assess with NGTS tests, due to the complexity of the technology as well as variation in informatics analyses and in procedures for interpreting and reporting sequence variants. Economic considerations, such as whether use of the test will lead to cost offsets (e.g., reductions in hospitalizations), are potential factors in test evaluations, but they also depend on evidence of clinical utility, since changes in resource utilization must be linked to an alteration in patient outcomes, such as improved response or avoidance of side effects compared to an alternative approach.
When determining medical policy for an insured group, assessment of the evidence base typically leads to a determination of whether a test is “medically necessary” and therefore covered, or “experimental/investigational” and not covered. NGTS tests create particular challenges for payers because they include both established and novel targets, and because test results are used to guide off-label therapy, thereby challenging the standard approach to evidence evaluation. There is growing recognition that traditional reliance on randomized controlled trials to demonstrate clinical utility is not feasible and that basket trials, observational studies, registries, n-of-1 studies, and modeling may be required. These study methods are less familiar to payers and will require education and published examples. Nevertheless, more payers are acknowledging that flexibility in evidence requirements is needed, and oncology is one of the most likely areas for such innovation.
However, while many payers view NGTS clinical applications as a reasonable hypothesis, they still consider this approach investigational. What is needed now is stronger evidence to support the proof of concept of using NGTS to guide treatment decisions. One example is the National Cancer Institute's basket trial, Molecular Analysis for Therapy Choice (NCI-MATCH), which pairs patients' genomically profiled tumors with treatments regardless of anatomic location. The study has recently expanded the number of targeted therapy treatment arms and is actively enrolling patients. In the meantime, the current model of health care delivery encourages standardization and the use of treatment pathways as a mechanism to promote quality of care and reduce unnecessary costs. Thus, the purported advantages of NGTS tests, such as individual patient customization, run counter to current system incentives.
Without convincing evidence that NGTS tests lead to better outcomes through better decision-making, payers will continue to struggle in the near-term to develop transparent, evidence-based affirmative coverage decisions. One example of how the Blue Cross Blue Shield Association is addressing the problem is a new subscription service called Evidence Street. Researchers in this group conduct technology assessments of new drugs, devices and tests, including molecular diagnostics. They work with test developers, commercial laboratories, drug companies and professional societies to ensure more consistent and transparent evidence standards for their technology assessments. Evidence Street does not make coverage decisions, but provides evidence to support Blues plans in their technology evaluations throughout the country.
So what is the path forward? There needs to be a combination of efforts from all stakeholders. Researchers in both for-profit and academic settings need to design and publish context-specific NGTS clinical utility studies, using a variety of appropriate methods. They should involve patients, caregivers, and payers in study planning to ensure the relevance of results, and use a range of outcome measures, including those developed by various research groups supported by NHGRI, such as the Clinical Sequencing Exploratory Research (CSER) consortium. Whenever possible, studies should be conducted in practice-based research networks to reduce data collection costs and reflect usual care.
There are efforts underway to collect test use and patient outcomes data more systematically as part of multi-stakeholder registries (ASCO's Targeted Agent and Profiling Utilization Registry; the Molecular Evidence Development Consortium) and through third-party oncology data aggregators (Syapse, Flatiron). Medicare's MolDX (molecular diagnostics) program, administered by Palmetto GBA, allows the option of provisional coverage for tests under its Coverage with Data Development program, which is used for promising tests that meet evidence standards for analytic and clinical validity but lack sufficient proof of clinical utility.
While NGTS tests are potentially disruptive technologies, if they are to be widely covered by private and public payers, stakeholders will need to demonstrate evidence of test accuracy, medical relevancy and ability to improve the process and/or outcomes of care. Assessment of clinical utility answers practical questions such as “Can and will we take action on the NGTS test results?” and “Does the outcome change in a way in which patients and other stakeholders find value relative to the outcome without the test?” Given the opportunity costs for patients, families and the health care system, we must work together to answer these questions.
Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

A Plea for Gold Standards in Precision Oncology

George Lundberg, MS, MD, ScD, MASCP, FCAP, Chief Medical Officer and Editor in Chief, CollabRx, a Rennova Health Company; Editor at Large, Medscape; Executive Adviser, Cureus; Consulting Professor of Pathology and Health Research and Policy, Stanford University; President and Chair, The Lundberg Institute; @glundberg

Q: Is Precision Oncology Accurate?
A: Precision oncology in 2017 is neither accurate nor precise.
From my clinical pathology background, accurate means correct: as close to a “Gold Standard” as one can get with regard to sensitivity and specificity. Precise means reproducible. Prasad and Gale have recently demonstrated that not even the use of the term “precision oncology” is precise; it has changed often and dramatically.
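For readers outside laboratory medicine, these terms have standard textbook definitions relative to a gold standard: sensitivity = TP / (TP + FN) and specificity = TN / (TN + FP), where TP, FN, TN, and FP are the counts of true-positive, false-negative, true-negative, and false-positive results judged against that standard; precision, in the laboratory sense, is the reproducibility of a result on repeat testing.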
We need to establish current “Gold Standards” (ones that would evolve with new information) for every step, and many sub-steps, in the brain-to-brain loop of molecular testing and clinical action in oncology.
For example:
WHICH cancers should be subjected to some sort of molecular analysis? All, or all except non-melanoma skin cancers? Only those cancers for which there are FDA approved therapies matched by molecular definition? Cancers for which there are open relevant clinical trials?
WHEN should NGS be performed on a cancer? At initial diagnosis, or not until after spread, or both, or after other treatment failures, or depends on histopathologic diagnosis?
WHERE to sample the tumor? Primary, and/or one or many metastatic sites, or depends on tumor diagnosis?
WHAT SAMPLE should be used: liquid biopsy, solid tumor biopsy, fine-needle aspiration cytology, whole tumor segments after surgical removal, a paraffin block, or other sources?
WHERE should testing be done: local pathology lab, regional reference lab, nearest academic or comprehensive cancer center, or large commercial lab company? Perhaps this should in part depend on lab accreditation, TAT, and price.
HOW TO CALL the variants consistently via bioinformatics and experienced professional judgment, reproducible from lab to lab.
HOW should the identified mutation(s) be classified? As passenger or driver? Relevant or irrelevant?
DOES a particular mutation, CNV or fusion confer “actionability”?
Would ACTIONABILITY include a clinical trial match as a criterion?
WHAT FDA-approved, or investigational, single drug or combination of drugs, concurrently or in sequence, would be the best choices?
WHEN/WHETHER should clinical trials be large and randomized, or at the level of an n-of-1 after a fully informed consent and expanded access (compassionate use)?
WHO/WHAT should pay for the costs? Patient, insurance company, government, drug company, medical laboratory, or medical treating institution?
HOW MUCH COST for the NGS testing and for the drug (which can run $100,000-$200,000 or more per drug, per patient/cancer) can be justified by QALYs or by research value? (A rough worked example follows this list.)
WHAT THERAPEUTIC STRATEGY might make scientific sense for that patient’s cancer?
WHETHER there is ANY currently available drug that is a reasonable choice (estimated 10% of cancers).
HOW to ascertain and how to report frequency and severity of unanticipated adverse clinical effects?
WHAT outcome can be anticipated and communicated to the patient? Is there a possible cure? A few weeks or months of average/median extended life seems to me a low “bar” to affirm cost-effective “success.”
Acceptable QUALITY OF LIFE during treatment? Meaningful? Useful? Worthwhile? Pain free? Pain tolerable? Desired? Comfortable? Connected? Freely chosen?
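To make the QALY arithmetic in the cost question above concrete, consider an illustrative calculation (the figures are assumptions chosen for round numbers, not data from any study): if NGS testing plus one targeted drug together cost $150,000 and the median benefit is three months of added life at a quality-of-life utility of 0.7, the gain is about 0.7 × 0.25 = 0.175 quality-adjusted life-years, or roughly $860,000 per QALY, far above commonly cited willingness-to-pay thresholds of $50,000-$150,000 per QALY.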
Roughly 1.7 million Americans a year are diagnosed with potentially lethal malignancies. About 600,000 die. Establishing GOLD STANDARDS, consistent with current knowledge, is really important for many patients, the field, the public health and many micro-economies.
Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

Precision Oncology: Requirements for the Next Leap Forward

Razelle Kurzrock, MD, Chief, Division of Hematology and Oncology, UCSD School of Medicine; Senior Deputy Director, Clinical Science; Director, Center for Personalized Cancer Therapy; Director, Clinical Trials Office, UCSD Moores Cancer Center, San Diego, California

Q: Some workers in the field of precision oncology are growing despondent because of observed limitations of therapeutic effectiveness. What do you believe are now the best approaches to consider in order to move towards potential cures?
A: The pillars of precision oncology are genomically-targeted and immune-targeted therapies. Of interest, the fields of immunotherapy and genomics, often considered to be separate, are actually married to each other. Inhibiting checkpoints exploited by the tumor to shield itself from the immune machinery can reactivate the immune system. But, once reactivated, the immune system must be able to differentiate tumor cells from normal elements. The mutanome produces neo-antigens that permit this differentiation—and the more genomically aberrant the cancer, the more likely that the reactivated immune system will eradicate it. In contrast, with genomically-targeted therapies, the fewer the aberrations, the less likely that resistance will occur.
Yet, in treating patients, we are using old clinical trial paradigms rather than adjusting to the realities unveiled by genomic interrogation.
For instance, we administer genomically targeted therapies, mainly as monotherapy, to patients with heavily pretreated, advanced metastatic disease, i.e., the equivalent of heavily pretreated chronic myelogenous leukemia (CML) blast crisis, where the response rate to imatinib is vanishingly low (see Table). These patients have highly complex molecular portfolios, and we should not expect genomically targeted monotherapies to be curative.
We should not then be wringing our hands over the limitations of our therapies, but rather over the limitations of our approach.
The real eye-opener is that so many patients with such late-stage disease respond at all. The response rate of metastatic solid tumors to genomically targeted agents is actually much higher than that of late-stage CML to imatinib (see Table). Yet CML is considered the poster child for successful genomically targeted therapy, with median survival increasing from about 4-5 years to a near-normal life expectancy.
The key ingredients for transforming CML included:

  1. Identifying the driver (Bcr-Abl)
  2. Identifying matched targeted therapy (imatinib)
  3. Moving from late-stage disease (blast crisis—the biologic equivalent of metastatic solid tumors) to newly diagnosed disease.

In solid tumors, as in CML, the cancer evolves with time and becomes more genomically complex. If we therefore want to use genomically targeted treatments to cure more patients with solid tumors, we need to fundamentally change our approach:

  1. Move to newly diagnosed disease
  2. Use customized combinations rather than monotherapy or random combinations for metastatic disease, which inevitably harbors complicated molecular alterations that differ from patient to patient.

Intriguingly, for patients whose tumors have highly chaotic genomes, immunotherapy is most effective. Indeed, in diseases such as melanoma, characterized by high tumor mutational burden, the long-term disease-free survival with immunotherapy may now be about 50%–an incredible improvement over the ~18% two-year survival seen with traditional chemotherapy.
In summary, we should be elated with the results yielded to date by precision oncology. If we, however, want to leap from improvements to cures, we need to stop relying on age-old clinical trial designs and adjust our strategy to the biology of each patient’s cancer.
 

Table: Precision Treatments and Response Rates

(All solid tumor results were obtained in late-stage, metastatic disease analogous to heavily-pretreated blast crisis of CML)

 

Cancer diagnosis | Precision treatment | Biomarker | Response rate (%)
CML (newly diagnosed) | Imatinib | BCR/ABL | ~100%
CML (heavily pretreated blast crisis) | Imatinib | BCR/ABL | ~0-10%
Colorectal cancer | Pembrolizumab | Mismatch repair deficiency | ~70-80%
Gastrointestinal stromal tumors | Imatinib | KIT | ~50-80%
Hodgkin lymphoma (refractory) | Nivolumab or Pembrolizumab | PD-L1/PD-L2 amplification | ~65-87%
Melanoma | Dabrafenib plus Trametinib | BRAF V600E | ~50-60%
Non-small cell lung cancer | Crizotinib | ALK, ROS1 | ~60-70%
Non-small cell lung cancer | Vemurafenib | BRAF | ~40%
Non-small cell lung cancer | Erlotinib | EGFR | ~70%
Non-small cell lung cancer | Osimertinib | EGFR T790M | ~70%
Ovarian cancer | Olaparib | BRCA | ~50%
Prostate cancer | Olaparib | BRCA | ~86%

Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

Clarity Still Awaited for Prostate Cancer Diagnosis


Nelson N. Stone, MD, Professor of Urology and Radiation Oncology at The Icahn School of Medicine at Mount Sinai, NY, NY; CEO and Founder, 3DBiopsy, Inc., Aurora, CO. E. David Crawford, MD, Professor of Urology and Radiation Oncology at The University of Colorado, Anschutz Campus, Aurora, CO; Medical Advisor and Founder, 3DBiopsy, Inc., Aurora, CO

Q: Whether and how to decide to perform a prostate biopsy when a patient is found to have a “high” PSA remains a debated issue. On 1/19/2017, The Lancet published a large controlled UK study (PROMIS) evaluating multi-parametric MRI (mpMRI) and transrectal ultrasound (TRUS) biopsy that some observers have found informative. Do you believe these findings will affect clinical practice?
A: Yes and no. In the landmark PROMIS study, Ahmed et al. found 88% sensitivity and 45% specificity for mpMRI in detecting clinically significant prostate cancer in biopsy-naïve men. The authors proposed mpMRI as a screening test before biopsy, which could eliminate about 25% of biopsies. The caveat is that the biopsies were performed by both a transperineal mapping (TPM) and a transrectal (TRUS) approach, with the latter being only about 50% accurate. This argues for forgoing TRUS biopsies and instead taking men with a positive MRI to the operating room for a more aggressive mapping procedure. If TPM is known to perform better than TRUS (a 70% vs. 30% cancer detection rate), why are only 5% of biopsies performed by TPM?
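To see what those operating characteristics mean in practice, here is a small illustrative calculation. The 88% sensitivity and 45% specificity are the figures quoted above; the 40% prevalence of clinically significant cancer among biopsy-naïve men is an assumption chosen for round numbers, and the results shift with it.

```python
# Back-of-the-envelope arithmetic for the mpMRI triage figures quoted above.
# Sensitivity and specificity are from the text; the prevalence is assumed.
cohort = 1000                 # hypothetical biopsy-naive men with elevated PSA
prevalence = 0.40             # assumed rate of clinically significant cancer
sensitivity = 0.88            # mpMRI, as quoted
specificity = 0.45

with_cancer = cohort * prevalence            # 400 men
without_cancer = cohort - with_cancer        # 600 men

detected = sensitivity * with_cancer         # 352 flagged by mpMRI
missed = with_cancer - detected              # 48 significant cancers with a negative MRI
spared = specificity * without_cancer        # 270 men without significant cancer spared a biopsy
false_alarms = without_cancer - spared       # 330 men still sent to biopsy unnecessarily

negative_mri = missed + spared               # all men who would skip biopsy under MRI triage
print(f"Biopsies avoided: {negative_mri:.0f} of {cohort} ({negative_mri / cohort:.0%})")
print(f"Clinically significant cancers missed by triage: {missed:.0f}")
```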
The TRUS needle takes a 17-mm specimen, and most lesions (regardless of aggressiveness) are invisible to ultrasound, which makes the TRUS procedure “semi-blind” at best. Spanning the longer sagittal length of the gland (often > 50 mm) with this needle during mapping requires multiple in-line punctures, resulting in more than 50 biopsy cores per procedure (as opposed to 12 for TRUS). Devices developed specifically for TPM could improve diagnostic accuracy and shorten the time required. They could also offer the potential to treat patients with targeted focal therapy, an option for more than one-third of men with prostate cancer. Currently under development are a biopsy needle and actuator that take a single specimen across the length of the prostate (between 20 and 60 mm), a 3-D image-guided tracking program that provides a digital map for focal therapy, and a pathology device that preserves the integrity of the longer core, allowing the pathologist to identify the location of every lesion. A system such as this would be able to accurately select patients for surgery or observation and open the door to accurate and safe focal ablation.
The initial step in evaluating prostate cancer risk is the PSA test, most of which are ordered by patients' family physicians. TRUS biopsy misses cancers in 30% of patients. Among patients with prostate cancer, 75% have low- to intermediate-risk disease, but because prostate cancer is multifocal in 70% of patients, smaller, potentially lethal cancers often coexist with the low-risk cancers. Out of an abundance of caution, physicians recommend surgery (or radiation) knowing that as many as 50% of patients could have avoided it. Of those selecting observation, 40% switch to active treatment because aggressive cancer declares itself or because of worry. It is no wonder that the USPSTF has said that PSA testing does more harm than good.
Crawford et al. have recently published data suggesting that if the PSA is < 1.5 ng/ml, the risk of prostate cancer is low. This argues for baseline testing in all men and, in the 70% with low PSAs, reducing testing to once every 5 years. For those with a slightly elevated PSA, a PHI score or a liquid marker such as SelectMDx (a urine test) could help further reduce unnecessary biopsies by identifying the men at risk for high-grade cancers.
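The triage logic described above can be sketched schematically. The PSA threshold of 1.5 ng/ml and the use of PHI or SelectMDx as a secondary marker come from the text; the function, its return strings, and the single boolean marker result are illustrative simplifications, not clinical guidance.

```python
# Schematic of the proposed PSA-first triage pathway; illustrative only, not clinical guidance.
def psa_triage(psa_ng_ml, secondary_marker_high_risk=None):
    """Return a suggested next step for a given PSA value.

    secondary_marker_high_risk: result of a PHI or SelectMDx-style test,
    True/False once available, None if not yet performed (assumed interface).
    """
    if psa_ng_ml < 1.5:
        return "Low risk: repeat baseline PSA in about 5 years"
    if secondary_marker_high_risk is None:
        return "Slightly elevated PSA: order PHI or SelectMDx before considering biopsy"
    if secondary_marker_high_risk:
        return "Marker suggests high-grade cancer risk: proceed to targeted biopsy"
    return "Marker reassuring: defer biopsy and continue periodic PSA testing"

print(psa_triage(1.1))
print(psa_triage(2.8))
print(psa_triage(2.8, secondary_marker_high_risk=True))
```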
Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

AACR: Advances in Precision Medicine to Continue

Srivani Ravoori, PhD, Associate Director, Science Communications; American Association for Cancer Research

Intro: The American Association for Cancer Research (AACR) publishes a forecast blog post at the start of each year, asking prominent cancer research leaders what new developments they envision in areas such as immunotherapy, precision medicine, cancer prevention, and health disparities.

In this excerpt from the 2017 post in Cancer Research Catalyst, we interviewed precision medicine expert George Demetri, MD, Professor of Medicine at Harvard Medical School, Director of the Ludwig Center at Dana-Farber/Harvard Cancer Center, and Director of the Center for Sarcoma and Bone Oncology at Dana-Farber Cancer Institute. Dr. Demetri is a founding member of the CollabRx Editorial Advisory Board and Chief Editor for CollabRx Sarcoma. Here’s what he had to say about developments in precision medicine for treating cancer, as well as his thoughts on aspects of the 21st Century Cures Act, federal support for cancer research, and the value of supporting basic science.

Q: The American Association for Cancer Research (AACR) is arguably the world’s most important professional organization of volunteers in the cancer research field. As we enter 2017, what does AACR consider the field’s greatest challenges and opportunities?

A: “The good news is that we are still uncovering virtually monogenic diseases – diseases that are driven by single oncogenic fusions or mutations,” says Demetri, a board member of the AACR. Therapies targeting single oncogenic drivers, such as NTRK fusions, lead to durable and dramatic responses, he notes. For the vast majority of common cancers, however, it is not a simple monogenic problem. We need more combination therapies and more research to find where the Achilles’ heel is, Demetri says. “This is where we, as professionals, need to be careful about overstating the value of precision targeting to the public without also getting too negative.”
“Cancer diagnostics are going to get better and better,” says Demetri, and predicts that we may be on the verge of putting together a composite set of predictive and prognostic biomarkers. “Our diagnostic tools are getting so sophisticated that we can put cancers into different bins at different times in a patient’s treatment course.” With treatment, cancers acquire new mutations to thrive, and with technological advances we can now, in many cases, track the different mutations that are likely driving the disease and match them with different drugs.
While Demetri notes that we have to be intellectually honest about the fact that most patients treated with targeted therapies develop resistance, he is not giving up on the aim of finding cures. It may not appear achievable now, but we are getting there through combination therapies, he adds.
We really need to hone our ability to pick combinations that are not cross-resistant and are truly synergistic or complementary, he notes. One of the many approaches is to pair targeted therapies with less targeted, more multifunctional modalities such as immuno-oncology agents, which can trigger the immune system. “Even though checkpoint inhibitors are a multibillion dollar market, I would say they are still in their infancy as far as our extent of understanding goes,” notes Demetri.

 
We are likely to see more efforts to develop very potent epigenetic drugs, according to Demetri. Drugs that target EZH1 and EZH2, and bromodomain inhibitors, offer an alternative way of addressing the bad wiring in cancer cells, notes Demetri, who expects to see more studies testing combinations of epigenetic therapies with targeted therapies and immunotherapies.
This year, Demetri predicts, we will gain further understanding into the smaller, molecularly defined subsets of cancer, and develop even better, more precisely targeted therapies.
A recent development is the work on protein degradation technologies, which make it possible to recruit the ubiquitin-proteasome system to degrade a protein of interest in a catalytic way, Demetri says. “This could be a real game changer,” he predicts. Researchers are still trying to understand how to use the protein degradation machinery appropriately to target undruggable proteins such as Myc and Ras. “I think this is an exciting area of research and this year we are likely to see some proof-of-concept studies. I feel it is just about to hit the mainstream,” Demetri notes.
Progress with cancer genomic medicine depends on a key element, data sharing—large genomic datasets made available to all so the significance of the genetic alterations present in patients’ tumors could be gleaned through collective evidence. However, data sharing comes with many challenges, such as protection of patients’ privacy and ownership of the data.
Progress with breaking the barriers of genomic data sharing will come from continued advocacy from the patients rather than the professionals, Demetri says. “It is vital to leverage our interactions with patients and patient advocates who want the same things that we do to push the kind of data sharing that will advance the field.”
In the event that the efforts from the federal government to further data-sharing initiatives are insufficient, we may see the private sector jumping in to build databases, he notes. “A lot of this will depend on the next heads of the NIH and the NCI,” he adds.
Regarding the provisions in the 21st Century Cures Act that roll back FDA regulations to accelerate drug development, “I like the idea that we can streamline and simplify and have more transparency in the rules for therapeutic development in cancer,” Demetri says. “I’m optimistic that we will keep the focus on both safety and efficacy.” Ultimately, if a drug does not work well enough to justify its use, or if it is prohibitively expensive compared with other equally effective options, Demetri says he trusts that our community of professionals will have the integrity not to use it, and to explain these choices and options to our patients.
We are in the post-genomic era where we need to overlay other elements (such as epigenetic compensatory mechanisms, metabolic or anatomic resistance mechanisms, etc.) on top of simple genotyping and basic interpretation of the genotype, Demetri says. Adding this is more complicated and will take a lot of effort and money to fund the necessary research. “I feel that the advances against cancer are very powerful and will go forward no matter what, but the question is, how fast can we get there and at what scale and scope?” Demetri asks.
“Reading the tea leaves of the new administration, we may be facing less than optimal federal support for cancer research, but I’m hopeful that we will see more private sector-based and philanthropy-based partnerships that will step up to support cancer researchers at this important juncture,” Demetri says.
“What I’d like to see this year is some time for positive reflection to realize the importance of public funding of science, which is tied to the importance of funding investigators who can follow their instincts to make new discoveries,” says Demetri. He adds, “Fundamental, curiosity-driven research is the only way we are going to get to unthinkable breakthroughs – real paradigm shifters akin to kinase inhibitors in the late 90s and immuno-oncology drugs more recently.”
“We need to recognize that there is a social good to funding basic research: without fundamental scientific understanding, applied medical research is limited to moving already existing therapies around the chessboard. We need science-based novel agents and new approaches to change the therapeutic approach and help patients in ways we might not be able to conceive of today,” Demetri notes and adds, “We have not told that story well enough and I’d like 2017 to be the year where we make that a little clearer to the public so there’s more support for basic science. That is critical.”

Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Curious Dr. George | Plumbing the Core and Nibbling at the Margins of Cancer

Why the 21st Century Cures Act is a Good Thing

Mary Woolley, President and CEO of Research!America

Q: You attended the December signing by President Obama of the 21st Century Cures Act and are recognized as a strong supporter. Yet harsh criticism of it has quickly appeared in JAMA, BMJ, and a variety of other venues, as well as on these pages. Please tell our readers why this is good legislation and how the public health will be protected from exploitation in this very different regulatory world.
A: The bipartisan 21st Century Cures Act is grounded in a commitment to assuring that our nation's research ecosystem has the capacity to accelerate the pace at which safe and effective medical advances reach patients. The Act will expand the efficiency, reach, and impact of medical discovery in a manner that sustains crucial safeguards against unsafe or ineffective products. The law finances more research, helps to reduce the administrative costs surrounding basic research, and takes additional steps to overcome challenges the Food and Drug Administration (FDA) faces. Patient groups, health care professionals, academic leaders, industry leaders, the FDA, and the National Institutes of Health (NIH) were frequently consulted regarding provisions of this bill, and their insights were incorporated. We at Research!America were closely involved throughout development of the bill and are pleased that it crossed the finish line last December.
After years of automatic spending cuts and flat funding, researchers have been under strain as they work to find solutions to deadly and complex diseases. The 21st Century Cures Act provides some relief in that regard, with an initial $352 million in FY17 to support the NIH Precision Medicine, BRAIN, and Cancer Moonshot initiatives. Congress recognizes that these dollars are targeted and temporary; they do not supplant the need to grow NIH's annual budget. As reflected in surveys that Research!America commissions regularly, Americans recognize the importance of federally funded research and support streamlining the pursuit of medical research and innovation.
The FDA, which has for years been underfunded, will also receive new funding with an initial $20 million in FY17 to improve efficiencies in the R&D pipeline. This new funding, in combination with other provisions of the law, is particularly meaningful as it will give the FDA more flexibility to recruit additional experts needed to assure that our regulatory system can properly evaluate rapidly evolving science in areas such as immunology and regenerative medicine.
One important example of rapidly evolving science is the potential to diversify the evidence base used to evaluate the safety and efficacy of medical advances by leveraging “real world evidence” (RWE). The Cures Act defines real world evidence as “data regarding the usage, or the potential benefits or risks, of a drug derived from sources other than randomized clinical trials.” While concerns have been raised that the RWE provisions would force the FDA to relax critical safety and efficacy standards, these provisions were developed with agency input. This section of the law is designed to empower, not require, the FDA to capitalize on real world data. Real world data will be used when — and only when — it is appropriate to do so.
Faster medical progress saves lives. The 21st Century Cures Act will fuel faster progress. It’s incumbent upon research advocates to engage elected officials to build on the Cures Act, and ensure that adequate funding is provided to make the promise of science and innovation a reality in our lifetime.
Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.