
The Personalized/Precision Medicine Blog

The official blog of the Annual Personalized and Precision Medicine Conference provides readers with information, insight and analysis regarding the field of personalized and precision medicine, genomics, genomic interpretation and the evolution of healthcare in the post-genomic era.

Novartis, Pfizer and Thermo Fisher to Lead Panel Discussion on Developing a Universal NGS-Based Companion Diagnostic Designed to Support Multiple Drug Development Programs

In 2015, Novartis, Pfizer and Thermo Fisher Scientific announced a long-term partnership aimed at accelerating the development and registration of several new non-small cell lung cancer (NSCLC) drugs. Over the past year, the collaboration has worked to develop and commercialize a multi-marker, universal next-generation sequencing (NGS) oncology test that will serve as a companion diagnostic for NSCLC across multiple drug development programs. 
 
Executives from Novartis, Thermo Fisher and Pfizer will be leading a panel discussion at the 8th Annual Personalized and Precision Medicine Conference, taking place October 12-13, 2016 in San Francisco, CA.  For more information on the conference, visit: http://personalizedmedicinepartnerships.com. 
 
The three executives who will lead this panel discussion are:
Mark Stevenson, MBA, Executive VP and President of Life Science Solutions, Thermo Fisher
Anne-Marie Martin, Ph.D., Senior VP, Global Head of Precision Medicine, Novartis
Hakan Sakul, Ph.D., VP and Head of Diagnostics, Pfizer
 
With growing demand for targeted therapies to replace the “one-size-fits-all” paradigm, stakeholders across the health care industry are working toward the goal of providing more personalized therapies and timely access to appropriate clinical trials. With the continued adoption of next-generation sequencing, which allows multiple genes to be tested simultaneously from a single sample, personalized and precision medicine are at the forefront of revolutionizing the health care industry.
 
The NGS-based companion diagnostic test for NSCLC is being developed using Thermo Fisher’s Ion PGM Dx System and Oncomine assays. Hakan Sakul commented on the collaboration: “The Thermo Fisher Scientific NGS panel is aligned with a number of our clinical development programs, providing us with an opportunity to accelerate the development of each of these potential therapies for NSCLC patients with targetable genetic alterations.”
 
Mark Stevenson of Thermo Fisher also commented that “the potential to generate a paradigm shift through this agreement – from one test for one drug, to one test for multiple NSCLC therapies – represents a significant step forward in realizing the promise of precision medicine.” For more information on the collaboration, visit: http://news.thermofisher.com/press-release/life-technologies/thermo-fisher-scientific-signs-development-agreement-next-generation.
 

Syapse Founder Jonathan Hirsch to Present a New Vision of Precision Medicine in Oncology at 8th Annual Personalized & Precision Medicine Conference

Syapse has been a constant presence in the precision medicine industry, and it recently deepened its involvement in oncology through a partnership with Henry Ford Health System. Henry Ford Health System, one of the nation’s largest integrated health systems, will provide cancer outcomes data to Syapse. In return, Syapse will use this data in its software platform, enabling faster, global learning from real-world experience. Through this agreement, the two organizations will launch an oncology precision medicine program, with the aim of bringing precision medicine to patients in the greater Midwest.

Jonathan Hirsch, president and founder of Syapse, will highlight his company’s efforts to revolutionize precision cancer care at the 8th Annual Personalized and Precision Medicine Conference. To learn more, visit: http://personalizedmedicinepartnerships.com/.

In the next few months, Henry Ford Health System will launch the Syapse Precision Medicine Platform software across its cancer care facilities, including the new $10 million destination cancer center in Detroit. Both organizations are optimistic that this partnership will advance innovative cancer clinical care, especially for patients in the Midwest. Jonathan Hirsch stated, “We believe that precision medicine will be a core enabling technology for health systems to transform to value-based care.” For more information on this partnership, visit: http://syapse.com/blog/henry-ford-press-release-june-2016/.

 


The Elephant in the Room: Drug-Induced Disease Diagnosis

Guest Blogger: Kristine Ashcraft, Chief Executive Officer at YouScript
 
One of our clinical pharmacists tells a story of his car breaking down in high school. He had it towed to a local mechanic as none of his attempts to get the car going were successful. The gas gauge was showing one quarter of a tank. After running numerous diagnostics to troubleshoot the issue, the mechanic finally determined that the vehicle was out of gas and the gas gauge was broken. Luckily, no needless repairs were done.
 
This story mimics a common occurrence in healthcare. We often treat the problems our patients describe to us without fully investigating the cause of those problems. For example, let’s say a patient is newly diagnosed with depression and started on the antidepressant citalopram (Celexa). That patient may or may not respond to the medication. Furthermore, the patient may experience sleeplessness, nausea, anxiety or a myriad of other symptoms. When patients experience side effects of medications, clinicians, eager to do the right thing, often perform more diagnostic tests and prescribe additional medications to treat those side effects, attributing them to a new disease or condition rather than recognizing that they were caused by a new medication, such as citalopram in the example above. In a way, it is like trying to fix the car without actually knowing what went wrong.
 
This is not the provider’s fault; the clinical decision support tools available are not good at warning when a medication is the probable cause of a new symptom. Providers are used to looking for allergies to medications as well as inappropriate combinations of drugs (e.g., drug interactions). Providers are also aware of physiologic conditions, like poor renal or hepatic function, that may alter the metabolism and clearance of certain medications. But providers were likely never taught how common it is for genetic variations to impact drug response in their patients. As a result, our search for potential adverse drug events (ADEs) is limited by systems and software that miss the diagnosis simply because they are not designed to look for it.
 
 
Take Elise Astleford, for example. An active, retired minister, she had to give up her weekly bridge games because she was experiencing memory problems. She was frightened that she was in the early stages of Alzheimer’s disease. Genetic testing was ordered, and it was determined that levels of her allergy medication, chlorpheniramine (Chlor-Trimeton), were 2-3 times higher than expected. Like 6% of Americans, she was not genetically able to produce the liver enzyme CYP (pronounced “sip”) 2D6. CYP2D6 is responsible for converting the allergy medication Elise was taking into a compound the body is able to flush out. This did not create a problem immediately, but the drug levels continued to build up over time. When she stopped the allergy medication, her memory problems resolved. But most patients like Elise never have genetic testing ordered.
 
Another patient, in Florida, was experiencing dizziness, fainting, fatigue, and shortness of breath that landed her in the ER numerous times. She was missing a lot of work and her quality of life was suffering. For her breathing issues she was referred to a pulmonologist, who learned that she had started the antidepressant citalopram (Celexa) several weeks before the issues began, on top of the beta blocker metoprolol she was already taking. Genetic testing was ordered, and the YouScript software predicted a greater than 200% increase in her metoprolol levels, caused by the combination of her CYP2D6 genetics and the antidepressant, which is a CYP2D6 inhibitor. When she was switched from metoprolol (Lopressor, Toprol XL) to bisoprolol (Zebeta), all of her symptoms resolved.
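 
To make the mechanism concrete, here is a minimal sketch of the reasoning behind such a prediction: a patient’s CYP2D6 genotype sets a baseline enzyme activity, and a co-prescribed inhibitor reduces it further (so-called phenoconversion), raising exposure to any drug cleared by that pathway. All allele scores and inhibitor factors below are invented for illustration; this is not YouScript’s actual algorithm.
 
```python
# Illustrative only: combine genotype-based CYP2D6 activity with the effect
# of a co-prescribed inhibitor to estimate the shift in drug exposure.
# Allele scores and inhibitor factors are assumptions, not clinical values.

ALLELE_ACTIVITY = {"*1": 1.0, "*4": 0.0, "*10": 0.25}   # assumed per-allele activity
INHIBITOR_FACTOR = {"citalopram": 0.5}                  # assumed residual activity

def effective_cyp2d6_activity(alleles, comedications):
    """Sum allele activity, then scale down for each co-prescribed inhibitor."""
    activity = sum(ALLELE_ACTIVITY[a] for a in alleles)
    for drug in comedications:
        activity *= INHIBITOR_FACTOR.get(drug, 1.0)
    return activity

baseline = effective_cyp2d6_activity(["*1", "*10"], [])
inhibited = effective_cyp2d6_activity(["*1", "*10"], ["citalopram"])

# If clearance scales roughly with enzyme activity, exposure scales inversely.
print(f"Estimated exposure increase: {baseline / inhibited:.1f}x")  # -> 2.0x
```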
 
Drug-drug interactions are not the only form of adverse drug events; patients may also experience drug-gene interactions.  But they cannot be found if they are not sought, much like failing to look at the gas gauge in the automobile example.  Another way of looking at the problem is to assume an automobile is traveling down a highway at a high rate of speed when a cloudburst occurs.  Sometimes, the windshield wipers cannot keep up with the volume of rain.  On other occasions, the wiper blades may be worn or otherwise inadequate.   In either of these examples the driver (clinician) cannot “see” the road (diagnosis) because the tools being used are not adequate.
 
Let’s examine what we already know about genetics and adverse drug events:
 
1. In 2009, we spent as much on medication-related issues as we did on medication.[i]
 
2. About half of medications do not work as intended.[ii]
 
3. The FDA states in drug development guidance: “Drug-gene interactions should be considered to be similar in scope to drug-drug interactions.”[iii]
 
4. The majority of commonly prescribed medications are metabolized by five main enzymes in the liver called cytochromes or CYPs (pronounced “sips”): CYP2D6, CYP2C19, CYP2C9, CYP3A4 and CYP3A5.
 
5. 93% of patients have a variation in at least one of these five main CYPs.[iv]
 
6. A recent study showed that among patients over 65 taking multiple medications, one or more of which is known to be impacted by genetics, ER visits were 71% lower and hospitalizations 39% lower than for patients of similar health and age on similar medications, drawn from a retrospective database.
 
7. Any healthcare provider you speak to will confirm that patients of the same age, health status, weight, height, and race do not respond to the same drug and dose in the same way. We know that some of that variability in response is caused by genetics and interactions with other medications. The issue is that the vast majority of the time we treat these factors as unknowns, when we actually have the tools today to treat them as knowns.[v]
 
 
I was recently discussing this with Phil Dyer, the Senior Vice President of Healthcare Management Services at Kibble and Prentice, who has over 30 years of experience in medical professional liability for physicians, surgeons, group practices and hospitals. His response was "res ipsa loquitur," Latin for "the thing speaks for itself."
 
Genetic testing of cytochrome P450 pathways and interpretation with YouScript software is akin to replacing worn wiper blades, which allows the driver/clinician to see far more clearly and react more rationally to the issues at hand. We must not assume that new symptoms are necessarily a different or new disease or even another manifestation of the current disease, nor assume that the current “windshield wipers” are sufficient to clear the view to find the real problem.  Judiciously used, genetic testing can provide a much clearer view of the diagnosis.
 
The next time you or a patient you treat complains of new symptoms, I hope you will remember to first evaluate current medications as the possible culprit before prescribing an additional test or medication to treat that symptom, so that the prescribing cascade can be avoided.

UC San Diego Research Team to Present Proof of Concept for Predicting a Drug’s Side Effects by Analyzing Red Blood Cells

Researchers at the University of California, San Diego have demonstrated a proof of concept for determining whether a drug will produce adverse side effects based on an assessment of blood samples. Their model predicts how variations in an individual’s genetic background will affect how the drug is metabolized, and therefore whether a side effect is probable.

The UCSD study focused specifically on red blood cells, as they are the simplest human cells. The study followed 24 individuals to determine why some individuals experience side effects from ribavirin, a drug used to treat hepatitis C, and others do not. Ribavirin can cause anemia, a decrease in red blood cell levels, a side effect that occurs in around 8 to 10 percent of patients.

“A goal of our predictive model is to pinpoint specific regions in the red blood cell that might increase susceptibility to this side effect and predict what will potentially happen to any particular patient on this drug over time,” said UC San Diego alumnus Aarash Bordbar, who was part of the research team as a Ph.D. student.

Aarash Bordbar will be presenting this research at the 8th Annual Personalized & Precision Medicine Conference, taking place on October 12-13, 2016 in San Francisco, CA. For more information about the conference, visit: http://personalizedmedicinepartnerships.com/.

By predicting how variations in patients’ genes impact how they metabolize a drug, pharmaceutical companies could soon have the ability to conduct predictive screenings before beginning clinical trials. This model therefore has the potential to revolutionize what is known about variability in drug metabolism.

In the future, the UCSD research team plans to develop a predictive model for platelet cells, which are vastly more complex than red blood cells, as well as a predictive liver cell model. As the liver is where the majority of drugs are metabolized, and where many drug side effects manifest, such a model would be revolutionary for pharmaceutical companies running clinical trials. To learn more about this study, visit: http://bit.ly/20oo7rp.


Genelex Releases Two Interactive High Risk PGx Drug Charts to Further Precision & Personalized Medicine, and to Reduce Adverse Drug Effects

Recently, Genelex responded to requests from various stakeholders in the personalized and precision medicine industry by creating High Risk PGx Drug Charts, which are available through the YouScript software. By providing these resources for interpreting PGx results, Genelex hopes to improve the evaluation of drug-drug, drug-gene, and cumulative interactions in real time. This interactive clinical decision support (CDS) tool analyzes a specific patient’s complete drug list and phenotype results to ensure more predictable patient responses. There are two versions of the chart available through YouScript: one sorted by major drug specialties and the other sorted alphabetically.

As a thought leader in the field of pharmacogenomics, Genelex strives to revolutionize how we can lessen the occurrence of drug side effects. CEO Kristine Ashcraft will be sharing her insights on pharmacogenomics at Arrowhead’s 8th Annual Personalized & Precision Medicine Conference.

For more information on the High Risk PGx Drug Charts, visit: http://genelex.com/blog/new-and-improved-high-risk-pgx-drug-charts/.

Kristine Ashcraft has been a critical part of the leadership team at Genelex since 2000 and has served in many roles. Her presentation will focus on Genelex’s long-standing commitment to precision medicine and pharmacogenomics, and on two critical questions: who do we test, and how do we do it?

Included in the High Risk PGx Charts are drug-gene pairs, the worst-case-scenario impact rating for each interaction, and available dosing guidelines from organizations around the world.


Sage Bionetworks Redefines Framework of Patient Data Sharing Through Release of mPower

With patients’ and researchers’ perspectives on personal data sharing often colliding, it is challenging to design clinical studies that benefit both stakeholders. Sage Bionetworks recently began sharing data from over 9,000 participants of mPower, a mobile health research study for Parkinson’s disease. As one of the first observational assessments of human health to achieve this scale, its success is attributed to a unique study design that emphasizes transparency and trust between participants and researchers.

John Wilbanks, Chief Commons Officer, Sage Bionetworks, will share this case study at Arrowhead’s 8th Annual Personalized & Precision Medicine Conference on October 12-13, 2016 in San Francisco. For more information about the conference, visit: http://personalizedmedicinepartnerships.com/.

By developing an electronic consent process that enables participants to set their individual data-sharing preferences, Sage Bionetworks addressed concerns that patients’ information would be shared broadly and without their consent. Participants could also change their data-sharing setting at any time during the study. Even so, more than 75% of the participants who qualified and consented decided to share their data broadly. Having established transparency, the research team was able to collect self-reported outcomes and quantitative sensor data, which were shared for secondary analysis.

In return, Sage Bionetworks prohibited commercial resale, marketing uses, and re-identification of data donors. By prioritizing transparency and emphasizing the return of information, mPower has redefined how patient data-sharing studies are conducted. For more information, visit: http://go.nature.com/1S3cMeY.


Key Leaders in Precision & Personalized Medicine To Meet This Fall for the 8th Annual Personalized & Precision Medicine Conference

Arrowhead Publishers has announced the addition of several new speakers to the agenda for the upcoming 8th Annual Personalized & Precision Medicine Conference, taking place on October 12-13, 2016 in the San Francisco Bay Area.
 
One of the longest-running conferences in the industry, the event will gather a variety of key thought leaders from the private and public sectors to provide multi-stakeholder perspectives on this constantly evolving field.
 
“The Annual Personalized and Precision Medicine Conference continues to be a vital meeting place for diagnostics companies, laboratories and other companies and organizations whose goal is the successful implementation of precision medicine at the local, national and global level,” said John Waslif, Managing Director of Arrowhead Publishers, the organizers of the conference.
 
“We are pleased to be able to bring together leaders from some of the most innovative life science companies in the world to address key areas related to the advancement of precision medicine, including reimbursement, value-based medicine, informatics, pharmacogenomics, pharmacogenetics, precision medicine clinical trials, and companion diagnostics.”
 
In addition to presentations, there will be a variety of extended Q&A sessions, panel discussions, and networking opportunities. One unique aspect of this conference is the pre-conference online networking suite, which provides a platform to ensure professional introductions can be established before the event.
 
With over 20 presenters, the 8th Annual Conference will focus on several key themes, including companion diagnostics, genomics data infrastructure, and emerging reimbursement policies. 
 
Kristine Ashcraft, Chief Executive Officer, Genelex, will present on the crucial shift to value-based care within the healthcare system. Against a backdrop of evolving regulatory and quality initiatives, Ashcraft will discuss the factors that will determine the success of precision medicine.
 
Dr. Jeremy Stuart of Selah Genomics will be highlighting the community health perspective of precision medicine. He will cover several case examples of precision-guided tests, and elaborate on the current solutions and challenges that exist in the precision medicine industry.
 
While genomic and related technological discoveries have enabled revolutionary advances in personalized medicine, it is crucial to keep in view the longer-term goal of replicating these trials at larger scale. Dr. Nicholas Schork of the J. Craig Venter Institute believes that we must adapt the design of clinical trials for personalized medicine with this goal in mind in order to achieve long-term success.
 
Mobile technologies have the potential to revolutionize how we monitor and collect medical data, but we are not yet able to achieve maximal impact due to growing concern over individuals’ data privacy. John Wilbanks of Sage Bionetworks believes that a mutually beneficial relationship can be created for all individuals involved. At the conference, Wilbanks will discuss his strong belief that creating social contracts between participants and researchers, while allowing maximal utility from the data systems, is key to furthering personalized and precision medicine. He will present a recent case study that leveraged Apple’s ResearchKit framework and elaborate on why it was a success for medical research.
 
Aarash Bordbar, one of the founders of Sinopia Biosciences, will share his recent advances in personalized network models and how they have led to a better understanding of the side effects of pharmaceuticals. Dr. Bordbar’s presentation will provide an excellent opportunity to gain further insight into how we understand both individual patient physiology and pharmaceutical compounds in relation to side effects.
 
For more information on the 8th Annual Personalized & Precision Medicine Conference, please visit: http://personalizedmedicinepartnerships.com
 
For more conference information, please contact:
 
John Waslif, Managing Director
Arrowhead Publishers
866-945-0263

Challenges and Opportunities in Precision Medicine Outlined at the 7th Annual Personalized and Precision Medicine Conference

(Guest Blogger: DeAunne Denmark, MD, Ph.D., Research Associate, Oregon Health and Science University)

A stellar group of speakers and participants from all over the world assembled in Baltimore on October 5-6, 2015 for the 7th Annual Personalized and Precision Medicine Conference. While broad in scope, covering diverse topics from public policy and data privacy to regulation and reimbursement, and from healthcare VC interests to cutting-edge platform innovations, the event surfaced several common themes that reflect some of the most central challenges and opportunities in PM today. In sessions thoughtfully moderated by Mollie Roth, JD, of PGx Consulting, the message that current systems are dangerously unsustainable, and that effective change will require major paradigm shifts, large infrastructure investments, and open, bilateral, transparent collaborations inclusive of the myriad stakeholder interests – most importantly, patients – seemed to resonate strongly with all.
 
DAY ONE kicked off with Elaine Lyon, Ph.D., Co-Medical Director at ARUP and Associate Professor at the University of Utah, sharply covering a core theme across areas and stakeholders – value. Pharmacogenomics requires balancing high complexity with high risk, and clearly demonstrating decision support utility for clinicians – “reasonable + necessary = useful” – to produce better responses faster, and more cost-effectively. She acknowledged important industry challenges, e.g., that the absence of supportive evidence is often interpreted as evidence against, that cultural biases run large, and that PM’s formidable task is to change the standard of care.
 
Jen Madsen, MPH, Health Policy Advisor at Arnold & Porter, skillfully covered intricacies of policy related to regulation and reimbursement for PM, providing a valuable primer on the 21st Century Cures Bill, PAMA rules, CMS reporting, ADLT payment for innovative tests, and FDA changes, including LDT guidance and authority on software. She reminded us that the upcoming election year would amplify the already daunting complexity, and to expect debates around major underlying issues: scope of practice, tests are NOT devices, and intended use, e.g., allowing computers to make clinical decisions.
 
Erick Lin, MD, Ph.D., Director of Medical Affairs at Ambry Genetics, reviewed his company’s experience in adopting NGS panels, emphasizing the need for technologically agnostic and flexible applications, as well as the acknowledgment of limitations and profound uncertainty inherent at the leading edge – “we don’t know what we don’t know.” 
 
Kristine Ashcraft, CEO at Genelex, tackled how efforts to advance PM are being affected by the move to pay-for-performance, citing a distinct incongruence in the PMI announcement arriving in the midst of an apparent “war on reimbursement” – e.g., FDA endorsement of drug-gene guidance as equal to drug-drug, yet extremely poor coverage for even the most well-validated tests. For any given intervention, answering key questions – who to target, how to further stratify, does it actually improve care AND lower costs in the relevant population – through predictive modeling can provide real evidence that such genetic risk assessment improves patient care with positive financial impact. As asserted by others, “learning healthcare systems” will be key to PM in practice. 
 
Paul Sheives, JD, VP of Reimbursement & Regulatory Policy at the ACLA, reminded us that, as LDTs are created to fill a medical need and are critical to decision-making, innovation precedes approval. He highlighted new fast-tracking strategies, including the Cures Initiative, and proposals from the Diagnostic Testing Working Group and the Association for Molecular Pathology to distinguish high- and low-risk tests, and re-framed PAMA as, ideally, a way to positively reflect value.
 
Morning presenters then convened in a panel for deeper discussion of some pressing hurdles facing PM advancement: better tests are often more complex and thus harder to get approved, current reimbursement is based not on value but on costs, gaps remain in unequivocal evidence that a test does what it says, and standardization is lacking. Shifting paradigms to incorporate algorithms as stand-alone evidence for payment support, and increasing collaboration, were offered as powerful strategies to forge the way ahead.
 
The afternoon began with straight talk from Michael F. Christman, Ph.D., President and CEO at the Coriell Institute, on the ideals of PM in practice. At Coriell, the gold standard for high-quality biomaterials, the Coriell Personalized Medicine Collaborative (CPMC) collected sequence information on millions of genomes to harvest some of the industry’s lowest-hanging fruit – drug response. GeneDose generates a complex risk report and regimen using alternative evaluation methods to model real-time drug-gene and gene-environment interactions. He concluded with another call for payor progressiveness – “the private sector provides information for physicians to make better decisions.”
 
Gitte Pedersen, CEO at Genomic Expression, enlightened the audience to the promise of RNA sequencing as another powerful tool highly amenable to PM applications, as exemplified by her work in single-payer Denmark, whose national biobank is the world’s largest. 
 
Anya Schiess, MBA, General Partner and Co-Founder at Healthy Ventures, offered a compelling perspective on PM investment. Driven by digital health, i.e., the application of technology to healthcare, a surge has been initiated by several current trends: big delivery changes that affect all players, risk-shifting as payment moves from defined benefit to defined contribution, design of panels that are both accurately representative and adequately personalized, and the “retailization” of healthcare. She wisely emphasized the industry’s nascency, and that as it matures, “customer interface is king.” Fragmentation of the stack and who will capture the value remain open questions – “everyone is waiting for the wave, and players must stay flexible if they want to catch it when it crests.”
 
Birgit Reitmaier, Ph.D., Head of Technologies & Biomarkers at Merck Serono, and Elaine Cheung, Director of Strategic Partnerships at Illumina, jointly presented their new collaboration to better address complex decision-making through a universal NGS-based cancer test. Each company can draw on the strengths and expertise of the other to meet its own challenges, while retaining autonomy and proprietary rights where appropriate. As in other areas, standards of quality, concordance, reliability, and transparency remain an unmet need that, when addressed, can coherently synthesize the currently fragmented market.
 
In a fantastic keynote, Robert Green, MD, MPH, Director of Genomes2People and Associate Professor of Medicine at Brigham and Women's Hospital/Harvard Medical School, shared some early results of the MedSeq Project, which addresses core questions related to effective and efficient use of germline sequencing in practice. Undiagnosed disease is a clear need, but only 20-40% of cases are successfully solved, while sequencing generates large numbers of incidental findings. Does having this information actually provide benefit? Could it even cause harm? How do we make sense of it and use it constructively? As he wisely summarized – “Screening for screening’s sake does not necessarily give a net positive health outcome.” The generation of a complete single-page WGS summary is an early project success and proof-of-concept that a person “doesn’t have to be a geneticist to read a report from whole genome analysis.” He outlined the need to differentiate clinical utility as broad or narrow and apart from personal and societal utility, as well as to measure value in terms of gains in prevention, i.e., the costs of NOT knowing, concluding with the assertion that “discovering a variant through screening is NOT the same as discovering it in a person with known family history.” 
 
Ken Ramos, MD, Ph.D., PharmB, Associate VP for Precision Health Sciences and Professor of Medicine at Arizona Health Sciences Center, reviewed some of his institution’s experience with PM platforms, emphasizing the need to include multiple ‘omics for optimal care contextualization. Reiterating previously raised disparity issues, he stressed the importance of identifying the subpopulations most relevant to a geographical area, then individually tailoring programs accordingly. 
 
William Knaus, MD, FACP, Director of Applied Genomics Research and the Center for Biomedical Research Informatics at NorthShore University Research Institute, introduced an exciting new project designed to engage and empower patients. Health Heritage is a personalized, consumer-controlled tool that automatically and agnostically integrates family history data into an EHR, then securely shares and distills it to derive composite assessments of disease risk. Importantly, information gets into the right hands AT the point-of-care to communicate what the risk is, why it exists, and what can be done. Providers don’t need to be involved, thereby minimizing workflow impacts and optimizing returns through alignment of payer, provider, and patient interests. 
 
DAY TWO kicked off with back-to-back presentations on highly innovative PM applications. Elaine Mardis, Ph.D., Professor and Co-Director of The Genome Institute at Washington University School of Medicine, introduced exciting work in the design of patient-unique cancer vaccines. Such immunotherapies can even be combined with existing cancer drugs, e.g., checkpoint inhibitors, in priming strategies that “wake up” the immune system prior to vaccination. She cited pediatric glioma, especially the recurrent setting, as an area of notable success to date.
 
Angela Davies, MD, FRCP (C), CMO at Champions Oncology, enlightened participants to Patient-Derived Xenografts (PDx), which show distinct promise for functional validation in drug development, allowing faster go/no-go decisions and improved mechanistic understanding to help ameliorate the massive Phase 3 failure problem. PDx models can also better pair translational clinical trials with likely responders, particularly when more diverse biobanks and increased cohort sizes are achieved. 
 
Doctoral candidate Tea Pemovska from the Wennerberg Lab at the University of Helsinki’s Institute for Molecular Medicine Finland (FIMM) spoke about individualized systems medicine, incorporating phenotype-to-genotype approaches such as unsupervised hierarchical clustering to create functional profiles and to discover relationships. 
 
In a refreshing interactive format, Richard Watts, VP of Business Development in Personalized Healthcare at Qiagen, exposed an array of complex issues surrounding the commercialization of targeted therapies and companion diagnostics. Audience members were challenged to dial in rapid responses to tricky real-world scenarios ranging from panel-based drug recommendations, to balancing test sophistication with increasing development costs and time-to-market, to using in vitro data in clinical decision-making. 
 
David Resnick, JD, Partner at Nixon Peabody, made accessible the convoluted realm of intellectual property (IP) by summarizing a few recent high-profile cases (Myriad, Mayo, and Ariosa v. Sequenom) to illustrate some critical issues raised by PM technologies. He offered the valuable take-homes that, due to IP’s rapid evolution, patents will be much narrower, i.e., specific to methodology or sequences, and even the experts still don’t know exactly what IS patent-eligible, especially regarding computer-based inventions like software or algorithms. 
 
Mark Gerstein, Ph.D., Professor of Biomedical Informatics at Yale University, delved into the murky waters of privacy’s central dilemma – how to balance risk with the imperative to share. He described major difficulties with the current framework, including lack of interoperability, large data files, encryption burdens, and vulnerability to hacking, and suggested some hybrid solutions to promote better balance: privacy licenses for scientists, reduced glory incentives for hackers, management by larger entities, e.g., the UK government, and quantification of leakage in both the private and public sectors. 
 
Bartha Knoppers, Ph.D., Director of the Centre of Genomics and Policy at McGill University, offered policy lessons from the Canadian experience, focusing on stratification in particular. An advocate for sharing, the Global Alliance for Genomics & Health lays out a framework for a “coalition of the willing” that agrees to act responsibly in the protection of human rights as a move away from patient protectionism.
 
Laura Lyman Rodriquez, Ph.D., Director of the Office of Policy, Communications and Education at the National Human Genome Research Institute, spoke to how major policy and privacy issues can be advanced by increasing participant involvement. The question of balance among multiple interests arose yet again, as scientific aims, responsible (but not over-) protection, and advancement must be reconciled with the autonomy and transparency required for participants to trust in the process.
 
Christopher Ianelli, MD, Ph.D., CEO of iSpecimen, took us full circle from participation back to protection with an overview of a highly innovative new venture dedicated entirely to the reuse of remnant clinical specimens. Prompted by the recognition that a veritable goldmine of research material was being actively wasted, iSpecimen was created as a steward to move discarded tissue and other samples into the R&D pipeline by matching them with those who need them in a diversified data-flow network, while simultaneously promoting a more progressive consent process. 
 
Participation remained the theme of the second panel discussion, among afternoon presenters who agreed that defining property and ownership rights over data or samples may not even be necessary as long as patients retain control. The consensus that participant representation should be not token, but instead provide truly meaningful input into the entire research system, was an inspiring and fitting conclusion to this highly successful, engaging, and information-rich event.

Proposed Changes to the Common Rule: Protection to Participation in Action

Guest Blogger: Christopher Ianelli, MD, Ph.D., CEO, iSpecimen (Dr. Ianelli will be giving a presentation entitled "Protection to Participation in Action: Engaging Patients for the Use of Their Clinical Remnants in Research" at the 7th Annual Personalized and Precision Medicine Conference in Baltimore on October 5-6, 2015.)
 
On September 2, the government announced a proposal to update the Federal Policy for the Protection of Human Subjects, also known as the “Common Rule.” According to the Department of Health & Human Services (HHS), the proposed changes are intended to strengthen human subjects’ protection while facilitating important research. Since the volume and landscape of research have changed substantially since the rule was first published, particularly with the rise of precision medicine, HHS determined that a review and upgrade may be in order. The proposed changes are a real-time example of the shift that has been occurring in healthcare privacy: the shift from patient protection to protection and participation.
 
Untouched since its publication in 1991, the Common Rule ensures that all individuals enrolled in biomedical and behavioral research studies are treated ethically. It consists of several subparts and dictates that informed consent and Institutional Review Board (IRB) approval be secured when human subjects (i.e., living people) interact with principal investigators as part of a study or when researchers obtain private information from them. One portion of the Rule that is especially significant to precision medicine research governs the use of de-identified, remnant human biospecimens in research.
 
The Common Rule currently says that the use of de-identified remnant specimens for research is “non Human Subjects Research” because individuals a) have no interaction with researchers and b) data are de-identified, so there is no private information transfer. Therefore, under the Common Rule, de-identified remnant biospecimens can be used for research purposes without IRB approval and informed consent. As personalized medicine continues to surge forward, the need for human biospecimens has proliferated. With that proliferation has come the emergence of many new types of research, particularly genomic research, shining a light on the ethics of using patient specimens without consent and on the very definition of protected information.
 
While current law does not require consent or IRB approval, iSpecimen feels it is best practice that consent be obtained whenever possible. As patient-centricity and consumer-directed care have continued to move to the forefront of care paradigms, consent is the best way to ensure that the ethical principles detailed in the Belmont Report on Ethical Principles and Guidelines for the Protection of Human Subjects of Research, namely beneficence (“the goal of maximizing possible benefits of research and minimizing possible harms”), respect for persons (“treating people as autonomous agents, and allowing them to make choices based on their own judgments and opinions”), and justice (“fairness in terms of who receives the benefits from research and who bears its burdens, especially when dealing with vulnerable populations”) are followed.
 
These three principles have been the impetus behind the proposed changes to the Common Rule that would affect the use of de-identified remnant biospecimens.  The proposal would require that a patient give informed consent and that an IRB review and approve the use of de-identified remnant specimens for most types of research. It would also make it more difficult for researchers to get a waiver of consent for the secondary use of biospecimens.
 
The proposed changes are open for comments through December 7, 2015. If approved and subsequently published in the Federal Register, there would be a compliance period of three years. While it would be closer to the end of the decade before these changes fully take effect, the fact that today, after nearly 25 years, we are collectively re-evaluating the rule's merits is indicative of the healthcare revolution known as precision medicine, which has indeed arrived and must be pursued with as much focus on the interests of patients as on the outcomes of the research.
 
To further your understanding of the proposed Common Rule changes, and the shift from protection to participation in action, please attend Dr. Ianelli’s session at the 7th Annual Personalized and Precision Medicine Conference. The presentation starts at 2:05 p.m. on Oct. 6, at the Sheraton Inner Harbor Hotel in Baltimore, MD.
 

Clinical Laboratories Facing Paradigm Shifts for Regulation and Reimbursement

Guest Blogger: Jen Madsen, MPH, Health Policy Advisor, Arnold & Porter LLP (Ms. Madsen will be giving a presentation entitled "Public Policy Issues with Regulation and Reimbursement for Precision Medicine" at the 7th Annual Personalized and Precision Medicine Conference in Baltimore on October 5-6, 2015.)
 
Over the next year, policymakers in Washington will be making decisions on the regulation of laboratory testing and Medicare reimbursement that will fundamentally change the industry’s relationship with government.  Companies will need a sophisticated understanding of the regulatory landscape to comply with new rules and succeed in the face of new market forces.  
 
Regulation of Clinical Testing. FDA has long regulated medical devices intended for use in humans, including in vitro diagnostics (IVDs), but exercised enforcement discretion with respect to its oversight of laboratory developed tests (LDTs).  
 
CMS regulates laboratories, including those that develop LDTs, under the 1988 Clinical Laboratory Improvement Amendments (CLIA) (42 U.S.C. § 263a).  CLIA governs the accreditation, inspection and certification process for laboratories, and establishes a process for ensuring the analytic validity of LDTs. However, accrediting bodies under CLIA do not evaluate test validation prior to marketing. The clinical validity of many LDTs is described in the medical literature, but the evidence for a new test may not be.     
 
On September 30, 2014, at the close of a 60-day notification period for Congress, the FDA formally released two draft guidance documents which detail how the Agency plans to regulate the laboratory developed tests (LDT) industry. The draft guidance would require labs to notify FDA that they are developing LDTs, report significant adverse events to FDA, obtain appropriate pre-market review from FDA based on level of risk, and adhere to the device Quality System Regulations/current Good Manufacturing Practices (GMP) requirements.
 
Since releasing the guidance, FDA has held a public meeting to seek stakeholder input and received comments from hundreds of interested groups. FDA reportedly has written a final guidance, but it has not yet been released to the public. Key committees in both the House and Senate have taken an interest in the draft guidance, with more hearings possible this fall. The American Clinical Laboratory Association, which believes FDA lacks legal authority to regulate LDTs as medical devices, has both threatened suit and asked Congress to intervene.    
 
Medicare Reimbursement Changes. Meanwhile, CMS has drafted, but not released, regulations implementing Section 216 of the Protecting Access to Medicare Act of 2014 (PAMA). The law requires laboratories to report to the federal government data on the payments they receive from private payers, Medicare Advantage plans, and Medicaid managed care plans. Failure to comply with the reporting requirements carries the potential for civil money penalties of up to $10,000 per day, so laboratories need to pay attention to these new rules and be ready to comply.  
 
CMS will use this data to calculate a weighted median of the payments for each laboratory test, and those amounts will set new Medicare payment rates for tests starting January 1, 2017. But meeting that deadline could prove difficult, as the statute envisioned final regulations for the data reporting requirement being completed by June 30, 2015. Once the proposed regulations are released, stakeholders will have 60 days to comment on them. Then CMS will need to finalize the regulations, implement an electronic system for labs to submit their payment data, train labs on how to use it, and calculate median payment rates after laboratories collect and report their data.
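 
To illustrate the weighted-median idea with hypothetical numbers (this is a sketch of the concept, not CMS’s published methodology): each reported private-payer rate is weighted by its claim volume, and the new rate is the point at which half of all claim volume falls at or below.
 
```python
# Minimal sketch of a volume-weighted median over reported payment rates.
# The (rate, volume) pairs below are invented for illustration.

def weighted_median(reports):
    """Return the rate at which cumulative claim volume first reaches half the total."""
    total = sum(volume for _, volume in reports)
    cumulative = 0
    for rate, volume in sorted(reports):
        cumulative += volume
        if cumulative >= total / 2:
            return rate

reports = [(18.00, 500), (22.50, 1200), (25.00, 300), (40.00, 100)]
print(weighted_median(reports))  # -> 22.5
```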
 
The proposed rule is expected to answer questions such as:   
 
Which laboratories are “applicable laboratories”? Laboratories that receive half or more of their Medicare revenues from the Clinical Laboratory Fee Schedule (CLFS) or Physician Fee Schedule (PFS) must comply with the reporting requirement, but CMS may establish exceptions for labs that have low volume or low Medicare spending. Independent laboratories, pathology practices, physician office laboratories and hospitals are waiting to learn whether and how they will be affected.  
 
Which tests will qualify for special payment rules? The law defines a new category of “Advanced Diagnostic Laboratory Tests” or ADLTs. New ADLTs will be paid at their “list charge” at the time of launch for the first three quarters that they are on the market, and will report payment rates annually to set new Medicare rates. Tests that use algorithms to analyze multiple biomarkers and report a single result, and tests that are FDA cleared or approved, are defined by law as ADLTs. But CMS has the discretion to include (or exclude) other tests.
 
How will new tests receive codes, coverage, and payment? CMS will be required to establish temporary HCPCS codes for new ADLTs that will be effective for two years, unless extended by the agency. CMS must establish codes for existing ADLTs, and companies may also request a unique identifier. Labs and diagnostic test makers need to understand the new rules for code assignment and how they relate to obtaining Medicare coverage. PAMA allows CMS to reduce the number of Medicare contractors processing claims for laboratory tests. Will one or two contractors’ policies define coverage across the country?
 
The broader questions facing the industry include: 
 
Will Congress act on questions of regulatory oversight? The House of Representatives passed the 21st Century Cures Act on July 10 but was silent on the LDT issue despite the ACLA’s objections. The House Committee on Energy and Commerce continues to discuss the issue, and it may be included in companion legislation in the Senate later this fall. But Congress could defer action on the issue until it begins work on a reauthorization of device user fee law, which is set to expire in 2017. FDA has already started discussions with the industry on a new user fee agreement.  
 
How will regulatory changes affect the lab industry? More certainty about the regulatory and reimbursement environment could ease concerns of reluctant investors about future returns, but require more up-front capital to prove clinical validity to the FDA. And lower Medicare payments could influence other payers’ rates and reverberate throughout the industry.  
 
Want to learn more about these upcoming regulatory and reimbursement changes? Attend the presentation “Public Policy Issues with Regulation and Reimbursement for Precision Medicine,” given by Jen Madsen, Health Policy Advisor at Arnold & Porter LLP, at the 7th Annual Personalized & Precision Medicine Conference, a satellite event of the American Society of Human Genetics Annual Meeting. The presentation starts at 9 AM on Oct. 5, at the Sheraton Inner Harbor Hotel in Baltimore, MD.

Is Diagnostics the New Black of Biotech?

Guest Blogger: Gitte Pedersen, Chief Executive Officer, Genomic Expression (Ms. Pedersen will be giving a presentation entitled "Making RNA Sequencing Actionable in The Clinic" at the 7th Annual Personalized and Precision Medicine Conference in Baltimore on October 5-6, 2015.)

Diagnostics have traditionally been the stepchild of biotech. Why develop a cheap $100 test that a patient will use once when you could develop a cancer drug that costs $5,000 per month for the duration of the patient’s life? For that reason, VCs and investors in general have not had an appetite for diagnostics.
 
It most certainly did not help that Myriad’s patents on the BRCA genes were ruled invalid by the Supreme Court. So now you can’t protect yourself from competitors by filing patents! Patents are a cornerstone of the way biotech works. It gets worse. High-value diagnostics such as Oncotype DX from Genomic Health and Corus CAD from CardioDx reduce the cost of care by eliminating other costly treatments. However, these two companies had to conduct extensive clinical studies costing hundreds of millions of dollars to get key opinion leaders on board, and even after presenting data, they faced the hurdle of first reimbursement and then adoption.
 
Adoption has proven very hard because both tests take revenue away from the same doctors who use them, and in a commercially driven health-care system, such as the one we have in the United States, doctors are not giving up revenue without a fight. Thus, it has been a long, hard road for the few companies that succeeded. Those that did enjoy a market cap of more than $500 million per diagnostic product launched.
 
For these reasons, diagnostics remain less than 2 percent of the total cost of health care, even though the answers those tests provide drive 70 percent of all clinical decisions. If we want more evidence-based health-care delivery, we have to change these dynamics! Two relatively recent companies had the guts, vision, and money to do so; both came out of nowhere and, within a very short time, made it from zero to one, to use Peter Thiel’s definition.
 
Foundation Medicine, founded by Third Rock Ventures, has redefined the diagnostic category and revolutionized cancer treatment, despite having very little IP to protect its innovation. It moved so fast and raised so much money that competition simply couldn’t keep up. Now Roche has acquired a majority share of Foundation Medicine for more than $1 billion. Foundation Medicine’s comprehensive cancer panel falls under the “One Assay” concept, assisting doctors in identifying treatments that work on the patient’s individual genetic alterations. This kind of information can save the lives of patients who otherwise do not respond to treatment. It is the new hope for cancer patients.
 
Theranos is also a “One Assay” concept but went after the existing tests that are standard to perform on a blood sample. We all go to the doctor once a year for our annual checkup. Last time, it took three vials to send my blood to three different labs. Theranos solves this problem by enabling all the tests to be performed on as little as one drop of blood, for much less than the existing tests cost. Theranos’s plan is to make standard blood tests a commodity that you get at the pharmacy, similar to your flu shot. You skip the scheduling and waiting in the doctor’s office, and you have the results when you meet him or her, making the visit more productive.
 
Both Foundation Medicine and Theranos are revolutionizing the delivery of health care, and they have also made very nice returns for their investors. Which leads me to my next point: visiting your doctor once a year is not an effective way to deliver health care. It corresponds to flying an airplane and turning on the radar once per hour. We need to monitor our health much more continuously to prevent disease. And if we can do that, we will be able, for the first time in history, to reduce the cost of care. For that reason, I believe we are about to see many more companies revolutionize the delivery of health care, either with novel diagnostic concepts or with new models for monitoring health and delivering care.
 
Genomic Expression is focused on developing cancer diagnostics that save lives and make health-care delivery more effective – and we are launching products now. We got started with a large grant in Denmark, overcoming some of the hurdles of funding with non-dilutive capital. If our RNA assay is a recording studio, then we can develop disease-prognostic algorithms (the songs) whenever we have access to historic samples with clinical outcome data. And, as a matter of fact, we do have that access. So stay tuned; we have signed up partners and are very much looking forward to our first year with revenue.
 
Learn more about Genomic Expression at the company’s website.

Precision Medicine: The Future is Now

By John C. Nelson, MD, MPH
 
The Affordable Care Act provides a blueprint for health system reform in America. While access to care increases, the quality of that care must be maintained or improved, and costs must decrease.
 
Innovation will be a key factor in bringing about change. Personalized Medicine based on genetic information (or Precision Medicine, as President Obama referred to it in his State of the Union address) has the potential to be one such key.
 
Many see the use of genetics as only a way to study and identify those predisposed to various rare diseases. Indeed, there is much promise in this approach. However, a more far-reaching use of genetic information would be to gain a better understanding of how patients are likely to respond to drugs.
 
Nearly all of us have taken, or will take, medications at some time in our lives. Every single one of us has unique drug-metabolizing capabilities centered on the cytochrome P-450 (CYP) enzymes in our liver. These enzymes are genetically determined and dictate how we break down medication – essentially, whether we are normal, rapid or poor metabolizers.
 
For example, when a “standard” dose of a drug is given to a patient, there is an expected result. However, if a drug is broken down too rapidly by a rapid metabolizer, the necessary level of medication may never be achieved. Conversely, if broken down too slowly by a poor metabolizer, unusually high concentrations of the drug may accumulate with toxic, even fatal, side effects as a result. Studies have shown that more than 75 percent of patients have variations in at least one CYP pathway, and therefore may not metabolize medications the way a caregiver might expect.
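 
A back-of-the-envelope calculation shows why this matters: at steady state, a drug’s average concentration is roughly the dosing rate divided by clearance, so a poor metabolizer with a quarter of normal clearance accumulates roughly four times the drug level on the same dose. The numbers below are invented for illustration and are not dosing guidance.
 
```python
# Toy steady-state model: C_ss ~ dose / (clearance * dosing interval).
# All values are hypothetical.

def steady_state_concentration(dose_mg, interval_h, clearance_l_per_h):
    return dose_mg / (clearance_l_per_h * interval_h)

normal = steady_state_concentration(dose_mg=100, interval_h=12, clearance_l_per_h=10.0)
poor = steady_state_concentration(dose_mg=100, interval_h=12, clearance_l_per_h=2.5)

print(f"Normal metabolizer: {normal:.2f} mg/L")  # 0.83 mg/L
print(f"Poor metabolizer:   {poor:.2f} mg/L")    # 3.33 mg/L, a 4x accumulation
```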
 
This is knowledge that should make all of us responsible for patient care stand up and pay attention. If a clinician knew before a drug was prescribed how a patient would react, prescribing would take a major leap toward becoming a more exact science.
 
Enter the solution: genetic testing that tells us with more certainty, before a drug is prescribed, whether it will be effective. The science of pharmacogenomics – the study of the many genes that can affect individual differences in drug response – has made this possible today.
 
EHRs and ePrescribing software already check for drug-drug interactions; however, without the ability to identify drug-gene interactions, important interactions may be missed.  Genetic testing, therefore, improves the ability to determine which patients are at risk.
 
Impacting Costs and Patient Outcomes
Approximately $3.5 billion is spent annually on extra medical costs resulting from adverse drug events (ADEs), which occur when a patient is harmed by a reaction to or complication from a medication. ADEs account for more than 700,000 ER visits annually among Medicare patients, and 2-8 percent of hospital admissions for Medicare patients are due to ADEs, resulting in tremendous costs.
 
Per a 2014 study, current methods to detect ADEs miss more than a third of potential drug interactions because those interactions are masked by unknown patient genetics. This kind of information has prompted the FDA to recommend that drug-gene interactions be considered as important as drug-drug interactions.
 
The Triple Aim, a concept developed and supported by the Institute for Healthcare Improvement, describes a roadmap for improving the healthcare system by focusing on three factors:
 
Improving the patient experience of care (including quality and satisfaction)
Improving the health of populations
Reducing the per capita cost of health care
 
All of these objectives may be accomplished by the appropriate use of genetic testing.
 
The Role of Technology
Currently, most EHR, HIE and other healthcare software systems don’t have the ability to store and mine individual genetic data in a meaningful way. Yet clinical decision support software already exists that can compare a patient’s genetic capacity to metabolize medications against their medication list to determine the best medications and dosages for that patient.
 
A greater sense of urgency is needed to integrate this kind of information and technology into EHRs in a user-friendly, patient-centered way. This will improve the drug interaction alerts physicians consider when prescribing. It’s not enough to store this information merely as a lab report. Ideally, clinically actionable genetic results like these should be part of the patient face sheet, as discrete data, so they can quickly and easily be accessed and used, for example, when a patient visits the emergency room.
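What discrete, queryable storage might look like is sketched below; the structure loosely follows an HL7 FHIR Observation but is simplified for illustration, and the patient identifier and note text are hypothetical.

# A pharmacogenomic result stored as discrete, queryable fields rather than
# free text. Loosely modeled on an HL7 FHIR Observation; not conformant.
cyp2d6_result = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"text": "CYP2D6 metabolizer phenotype"},
    "subject": {"reference": "Patient/example-123"},  # hypothetical patient ID
    "valueString": "Poor metabolizer",
    "note": [{"text": "Consider alternatives to CYP2D6-dependent prodrugs."}],
}

# Because each field is discrete, an ER ePrescribing system can read the
# phenotype directly instead of parsing a scanned lab report.
print(cyp2d6_result["valueString"])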
 
There is currently no magic bullet to transform the healthcare system in America. However, a large step toward accomplishing the goals of the Triple Aim can be taken through the appropriate use and availability of genetic testing, and by making that useful clinical information available to physicians throughout the healthcare software ecosystem.
 
Dr. John C. Nelson is a practicing physician in obstetrics and gynecology at the Health Clinic of Utah. In practice since 1975, he became Medical Director at Genelex Corporation in 2013, and served as President of the American Medical Association from 2004-2005. 
 
Learn More
Want to learn more about how precision medicine, and pharmacogenetic testing, can change the healthcare landscape? Attend the presentation “Precision Medicine in a Pay-for-Performance World,” given by Genelex Chief Executive Officer Kristine Ashcraft at the 7th Annual Personalized & Precision Medicine Conference. The presentation starts at 10 a.m. on Oct. 5, at the Sheraton Inner Harbor Hotel in Baltimore, MD. 

Can Personalized Medicine Scale?

Guest Blogger: Anya Schiess, MBA, General Partner and Co-Founder, Healthy Ventures. (Ms. Schiess will be giving a presentation entitled "Healthy Ventures: A New VC Perspective" at the 7th Annual Personalized and Precision Medicine Conference in Baltimore on October 5-6, 2015.)

For genomics to achieve its expected market growth, the industry will evolve from custom hardware/software/applications, where companies are almost fully vertically integrated, into an ecosystem where entrepreneurs can leverage each other’s innovations. The total amount of genome sequence data generated is doubling every 7 months, and the market is expected to quintuple, to over $15 billion, by 2020 [1,2]. While some skeptics doubt the everyday utility of genomic information, it’s difficult to deny that, with so much data being generated, new clinically important applications will be discovered. Commercializing new applications will require a re-organization of the industry.
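To put that doubling rate in perspective, a quick back-of-the-envelope calculation (assuming the 7-month doubling period holds steady) shows the implied growth over five years:

# If sequence data doubles every 7 months, the implied multiplier over
# five years (60 months) is 2**(60/7), i.e. roughly 380x.
months = 60
doubling_period_months = 7
print(round(2 ** (months / doubling_period_months)))  # ~380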
 
As industries evolve, they often fragment. The early computer companies were vertically integrated, manufacturing everything from the processors to the software applications. As the industry matured, it fragmented into layers, and specialized companies came to dominate each layer. Early examples include Intel and microprocessors, and Microsoft and operating software. Recently, we’ve seen the same fragmentation with web infrastructure businesses, e.g. Twilio and cloud communications, AWS and cloud storage, Slack and managed collaboration.
 
The genomics industry will do the same. First, we are seeing the hardware/software separation with the sequencing layer led by Illumina, and the computing layer led by AWS. Next, we will see bioinformatics fragment. For example, Spiral Genetics and data compression, DNAnexus and genome assembly, Bina (Roche) and data processing, SolveBio and the reference data layer. The companies that currently own the customer channel – like Invitae, Counsyl, and Foundation Medicine – will go from vertical integration to a focus on the application layer, outsourcing the rest of the stack. This will free tremendous working capital to focus on the new applications enabled by the data explosion. Without focusing on the new apps, these pioneers of genomic testing will be disrupted by nimbler competitors building the future. We’ll also see new categories of applications like genomic interpretation apps and pharmaceutical R&D applications. These will be built on top of an outsourced, fragmented, and more nimble stack.
 
At Healthy Ventures, we are investing in this fragmentation. We love companies that are creating the infrastructural backbone for the new genomics industry. We look for layers/fragments that are least likely to be commoditized and where the time-to-maturity is shorter; the reference data layer is one example. The company must have more than just a product; it must have the vision and ability to ‘own’ its layer, creating a durable business with a large absolute market, as OneCodex is doing in microbial genomics. Perhaps most difficult, the company must be relevant today, in a highly concentrated market with few end-users doing very high throughput analysis, and also relevant tomorrow in a much larger but much more distributed market where bioinformaticians are in short supply yet everyone has access to a sequencer.
 
[1] Stephens, Zachary D., et al. “Big Data: Astronomical or Genomical?” PLOS Biology, 7 July 2015.
[2] Genetic testing for cancer alone is expected to be a $9B market. Source: Foundation Medicine.

GONE FISHING and Other Things To Do While Waiting For The Fallout From The Ariosa Decision To Settle Out

Guest Bloggers: Ron Eisenstein, Partner, Nixon Peabody, and David Resnick, JD, Partner, Nixon Peabody. (Mr. Resnick will be giving a presentation entitled "Balancing Public and Private Intellectual Property Interests in a Post Mayo-Myriad Age: What is Likely to Be Patentable in Diagnostics - and What Should Be Patentable - to Best Serve the Public Interest" at the 7th Annual Personalized and Precision Medicine Conference in Baltimore on October 5-6, 2015.)

By now most people have read about the U.S. Court of Appeals for the Federal Circuit's long awaited decision in Ariosa v. Sequenom. Whether or not you agree with the decision, almost all, including the Federal Circuit, agree that the technology in question revolutionized prenatal diagnostics. Sequenom developed the relevant technology to use fetal DNA in the mother’s blood to detect genetic abnormalities. This technological advance allowed diagnostic tests, for example, screening for Down’s Syndrome, to be done with a small sample of the mother’s blood, avoiding the serious risks of amniocentesis.  

The parties agreed that the underlying discovery upon which the patent (U.S. Patent No. 6,258,540) was based was that a small portion of maternal blood from a pregnant woman contains cell-free fetal DNA (cffDNA). This cffDNA was found in serum and plasma, portions of the blood previously discarded in any analysis of maternal blood samples. It was this cffDNA, obtained from a minimally invasive blood draw performed on the mother, that could be used to determine if the fetus had certain birth defects. The parties further agreed that the patent does not claim cffDNA or paternally inherited DNA themselves. Rather, the patent claims methods of using cffDNA. The district court, on summary judgment, had found that since the claims used conventional methods such as PCR amplification, they were directed to a natural phenomenon and not eligible for patent protection under 35 U.S.C. § 101 in light of the Supreme Court's decision in Mayo. On June 12th, the Federal Circuit affirmed this decision.
 
Many patent practitioners have denounced the decision. One argument is that it flies in the face of common sense. If serum and plasma were routinely discarded, their use can hardly be considered conventional. 
 
Others will point to Judge Linn's strange concurrence. After explaining why this case differs from Mayo, and that the claims were worthy of patent protection, he states that he is constrained by the Mayo decision to find the claims unpatentable. As Judge Linn points out, what he and the others are relying on is dicta, a judge's editorializing.  While persuasive, dicta is not binding. 
 
Second, Judge Linn ignores the Supreme Court's subsequent opinions in Myriad and Alice.  In Myriad the Court found that altering a natural material, even in a conventional manner, was sufficient for patent eligibility. In Alice the Court discussed how significant contributions to the field could result in patent eligibility.  Finally, while mentioning that the Mayo Court had pointed to the claims in Diamond v. Diehr as involving conventional steps that when considered as a whole integrated the statutory exception (an algorithm) and transformed the method into a patent eligible invention, Judge Linn failed to explain why the present claims failed to accomplish the same transformation. 
 
So what are we supposed to make of this decision? Some will just "go fishing" and stop trying to get method patents, certainly methods directed to diagnostics. Indeed, practitioners have seen a decrease in filings of patent applications directed to diagnostics over the last few years. There has also been a decrease in patent litigation. In fact, one recent study suggests that the decline in patent litigation is a result of the Supreme Court's recent decisions on patent eligibility.
     
While no one wants unnecessary litigation, the question is what effect these decisions will have on health care and other scientific advances. We know for certain that these decisions will hurt licensing revenue for universities and nonprofits. Their impact on investment for the biotech industry will be somewhat more nuanced.  While innovators are likely to be negatively impacted by less valuable patent protection, others in the industry will enjoy reduced concern about being blocked by patents.  Will this help the consumer? Maybe. The current structure of the biotech industry places substantial emphasis on private investment. That investment is based upon a robust U.S. patent system that, for a limited time, blocks others from offering competing products  and undercutting attempts to recoup the costs of R&D.  
 
One can ask whether Sequenom would have made the same investment if it knew that others would be able to offer competing tests without the same development costs. What about Myriad Genetics? If there had been no Myriad to undertake the substantial clinical testing burden, would we have the data that permits women with "bad" tests to take preemptive action and prevent the development of breast cancer? These are the questions that industry is likely to be posing to Congress in the coming weeks and months. The outcome of such efforts is unclear.
 
The pharmaceutical industry has not helped itself by providing opponents with opportunities to criticize the effects of patents, such as substantially raising the cost of medications that have been in use for years. As a result of the current state of patent law, the research community may need to look to a different system of funding translational research. Given the climate in DC for at least the last decade, we know the money is unlikely to come from the government. 
 
In view of the problems with starting a new paradigm, we suggest that the patent system, even with decisions like Ariosa and Myriad, remains the most viable alternative. While the Federal Circuit's most recent decision in Ariosa does not help, it does not change the fact that the Supreme Court held that some patents involving laws of nature or natural phenomena remain patent eligible. Certainly it does not affect alternative claiming strategies. For example, kit claims, device claims, or even method-of-treatment claims remain viable approaches.
 
One key to successfully obtaining meaningful patent protection is to truly understand how a given method will be used commercially and to focus the claims fairly narrowly, providing narrow but essential protection. Remember that claims which may no longer issue in the United States remain available in Europe and many other countries. Additionally, if the most dire predictions prove true, a legislative solution becomes increasingly likely. Moreover, the Federal Circuit could hear the Ariosa case en banc. Alternatively, the Federal Circuit could reach a different decision in other cases having different records. Such things have happened before.
 
While the pendulum on patent eligibility has taken a fairly significant swing in the direction of patent ineligibility, history suggests that these pendulum swings reflect a constant readjustment of competing concerns and that the situation is unlikely to remain static. We suggest you stay tuned.

Design, Access & Obama’s Precision Medicine Initiative

NIH is Off and Running 
NIH would seem to be wasting no time in getting President Obama’s Precision Medicine Initiative off the ground. Having begun planning in February, it announced just two days ago the formation of a panel of precision medicine experts to lead initial steps toward designing the project. Notably absent in any of the announcements to date is how the government will succeed in engaging Americans. Perhaps these experts will propose engagement strategies behind closed doors? Even if so, are they the right folks to create the ticket to success? There is, after all, no initiative without Americans’ enthusiastic support. So, engagement is of vital importance.
 
Wherefore Public Engagement? 
Citizen science and crowdsourced funding have proven to be successful strategies for engaging and exciting average folks about scientific research. But they can’t hold a candle to what celebs have done for public awareness of genetics. And yes, of course I’m thinking about Angelina Jolie; who isn’t? She certainly has been credited with doing ‘a lot’ to raise awareness about the clinical benefit of BRCA testing and preventative actions. Face it: if you want to get something noticed, America’s beloved celebs perhaps do it best. Well, maybe not because they’re such geniuses at it, but because when they talk, we pay attention. And so, how will the government grab enough attention to get its million DNA donors?
 
Will Engaging Be Fun? 
While commentators have questioned the government’s technological ability to successfully undertake the project, the challenge of recruiting a million folks (I mean regular folks) to complete the project in a timely manner is a bit, dare I say, thought provoking. Clinical research is serious business, to be sure. Yet the American appetite for fun and fluff suggests that a successful engagement strategy will need to appeal to this appetite.
 
I can well imagine the public’s imagination being captivated by Apple announcing that all iPhone 7 (for example) purchasers will get, free of charge, a spit cup to submit a DNA sample, along with a fingerprint-protected new app that contains their genome sequence, on the condition that purchasers share their genome, albeit encrypted and secured, with government researchers. New iPhone 7 users would then, literally, have their genome at their fingertips, but so too would the initiative. I can also imagine great public interest resulting from a Justin Bieber public service announcement saying he’ll give away a million (of the same) spit cups to fans so they can see if they have predispositions to Bieber-like reckless behavior, encouraging young people to behave sensibly while giving his image a more mature makeover.
 
And so, in the context I just sketched, I’m wondering what the government will come up with to engender public enthusiasm, that is, enthusiasm which converts to participation. I admit I don’t recall any federal efforts to engage the public, save for military marketing, replete with catchy jingles. Remember “Uncle Sam wants YOU for the U.S. Army”?
 
 
(The 1917 poster was used to recruit soldiers for both WWI and WWII.) Or the catchy jingle “Be all you can be: Join the Army”? So, does NIH need a genome slogan and a genetic jingle? Maybe. Perhaps they should hire Jon Stewart to write the slogan and Jay Z the jingle…
 
Let’s Talk Ethics
Sarcasm aside, engaging the public will be more challenging than in countries with universal health insurance and a national health service, like the UK or Iceland. Yes, our fragmented health care system poses obvious technological complexities that must be solved to enable access to clinical (phenotypic) data. Arguably equally, if not more, important is the issue of how access to project participation will be worked out, given that there are people who do and don’t have insurance, whose medical records are and aren’t digitized, who are or aren’t proficient in English, who do or don’t believe that their genetic risks are preordained by God, etc. In other words, the million-person cohort needs to be diverse. The project needs to collect data from people who live in all types of geographical places and situations. But how will diverse be defined? What will the inclusion/exclusion criteria be? Will the inclusion criteria exclude persons whose medical record is not digitized? That is, will those with a paper record be unable to participate? Will those without health insurance be ineligible? What database biases could ensue from the implementation of said criteria?
 
Difference is important to democracy. It’s also vital for genetics research. So, integral to study design are some important ethical considerations. For one, is excluding individuals who lack an EHR, or even health insurance, from participating fair or even ethically defensible?
 
Thinking More Globally
Many have worried about personalized medicine becoming available only to the rich. This argument is based on the high cost of new diagnostics and therapeutics. A more fundamental issue is whether it’s ethically defensible to introduce certain exclusion criteria and thereby bias the dataset. As I’ve argued elsewhere, the principle of beneficence may require maximal data sharing to permit genomic benefit sharing amongst all persons, not just an entitled few. If the initiative’s data set excludes persons lacking an EHR or lacking insurance, we will be faced, in my view, with the unfortunate consequence that not only will enthusiastically engaged persons be ineligible to participate, but the resulting bias could limit the knowledge generated in some important ways. For a western industrialized country, this would be an unfortunate outcome, because it would suggest that the principles of democracy and social justice do not extend to precision medicine.

Precision Medicine: The Start of a Whole New Era

LIMITS TO PROGRESS

Nearly 30 years after the start of planning for the Human Genome Project, 25 years after the Project began, and 12 years after its completion was announced, genomic research progress is becoming impactful. Lives are being radically changed for the better thanks to more precise diagnostics and targeted treatments. Patients are starting to get the right medicine in the right dose at the beginning of treatment, not halfway through. However, although tumor profiling is regularly done to determine which of the available cancer treatments is most appropriate, and pharmacogenetic/pharmacogenomic testing is more available thanks to an increasingly crowded market, personalized medicine has yet to help the majority of patients, e.g. those suffering from common and complex diseases.

This is largely because the US government does not yet have large studies linking genomic data to clinical data. Numerous other countries, however, do. The British 100,000 Genomes Project, the Saudi Arabian Genome Mapping Project, and the Genome of the Netherlands, with similar efforts starting up in Belgium and other European countries, are cases in point. Many reasons for the US lag have been cited, including the lack of data management standards and system interoperability. Notably, the private sector is leading the way. The largest current repository is privately owned by 23andMe (800,000 samples), Craig Venter is spearheading an ambitious aging project that will sequence a million genomes, and some pharmaceutical companies are establishing in-house sequencing projects (e.g. “The Search for Exceptional Genomes”).

THE LONG-AWAITED LARGE-SCALE PROJECT

Recognizing the power of personalized medicine to prevent, treat and in some cases cure disease, President Obama announced that the US will get into the game with a $215 million precision medicine initiative to expand initial successes into a large-scale effort. While the initiative will use data from existing cohort studies, widespread participation, including the public and the breadth of stakeholders, is vital to the second core objective: creating a research database of 1 million genomes with related clinical and other types of information, which will generate a new taxonomy of disease based on what the National Academy of Sciences (http://www.nap.edu/catalog/13284/toward-precision-medicine-building-a-knowledge-network-for-biomedical-research) calls an information commons and knowledge network of disease and treatments.

Excitement is high, and planning began within weeks of the initiative’s announcement. The NIH hosted a two-day workshop where experts representing a broad range of disciplines shared white papers, presentations and thoughts about opportunities, challenges and strategies for successful implementation (http://www.nih.gov/precisionmedicine/workshop.htm); 2,500 viewers engaged through WebEx. Regulatory officials recently announced plans to use the expected $10 million to build full or hybrid cloud storage, open-source data sharing platforms and Google-like search tools.

CAUTIOUS OPTIMISM

It is easy to get excited about this new initiative, particularly because it promises to propel precision medicine efficiently and effectively. But combining and mining data from disparate third-party sources will be no small feat, particularly considering the slow adoption of electronic medical records and existing interoperability problems in linking existing databases. Cautious optimism may be prudent, given that existing smaller-scale sequencing projects are moving at a snail’s pace or stalling out. The Million Veteran Program (http://www.whitehouse.gov/sites/default/files/microsites/ostp/kupersmith.pdf), launched in 2011 to combine individual genetic information and determine associations between genetics and health status in order to better screen, diagnose, and prognose disease and develop targeted treatments, only recently (less than 6 months ago) awarded its genomic analyses contract. As of this time last year, i.e. three years into the project, only 200,000 veterans had enrolled. To be fair, though, a recent report indicates that the Program already has 343,000 samples and has partially analyzed 200,000 participants (http://www.technologyreview.com/news/534591/us-to-develop-dna-study-of-one-million-people/). Progress has been, in other words, slow. If you are unaware of the project’s status, that could be because the government has said little. Another relatively large-scale project, the National Children’s Study, which was designed to collect 100,000 genomes at birth and was slated to start next year, was shut down at the end of last year based on a report that found that the design was not feasible. Congressional battles about funding did not help matters. These events will hopefully inform planning efforts so that the current initiative does not succumb to the same fate.

Pundits have expressed concern about the government’s ability to manage such a large dataset given the technological failures of HealthCare.gov. The insurance exchange site crashed and shut down for 5 hours upon launch; a technological glitch related to income verification prevented an unknown number of members of the public from submitting applications; and a hacker claimed to have obtained 70,000 records containing personal identifying information.

DATA SHARING

A separate concern is whether the initiative will permit broad based data sharing. Currently, while publicly funded research is deposited into publicly accessible databases, as a practical matter only research institutions with the resources to devote to filing laborious applications can hope to gain access. Small labs and institutions with far lower operating budgets cannot devote scarce resources to securing access to data, and thus their research is severely limited. Without data access, they simply cannot pursue certain research and thus talented researchers are beginning to redirect their careers to plant and animal genomes, since they are more readily accessible. http://www.sciencedirect.com/science/journal/22120661/3/4

ENGAGING THE PUBLIC

Public engagement is clearly vital. And media coverage may be greasing the skids for success. After all, when the President of the United States announces a biomedical initiative in his (or perhaps someday her) State of the Union address, many take notice (http://www.whitehouse.gov/the-press-office/2015/01/30/fact-sheet-president-obama-s-precision-medicine-initiative). When USA Today reports on details of the initiative, it’s clear that precision medicine is trying to make its way into every home in America (http://www.usatoday.com/story/news/nation/2015/01/30/obama-precision-medicine-initiative-white-house/22547019/).

A NEW RESEARCH PARADIGM?

If the precision medicine initiative can overcome both of these impediments, it will have achieved a great deal. Good planning is a start, and timely execution is important. If the initiative succeeds in being transparent, engaging the public and enabling broad-based data sharing, then a new paradigm of open and collaborative research will have been established. Such a paradigm will undoubtedly propel precision medicine significantly forward and, ideally, support precision medicine efforts underway in various places around the globe.

The opportunity for thoughtful public input is great, given the discussions that will occur at the plethora of upcoming precision medicine conferences. Hopefully, the NIH and the FDA will invite public comment, because those discussions are likely to generate valuable insights.


New Policy Primer on Genomic Sequencing: Who Will Pay? Who Should Pay?

Genomic sequencing is moving into clinical care, but this technological advance threatens to outpace our ability to use it effectively in clinical practice and to address the associated health policy issues. A key issue is whether payers will cover sequencing and what evidence will be needed to document its value. The TRANSPERS Center (Center for Translational and Policy Research on Personalized Medicine) at UC San Francisco has just released a Policy Primer for Genomic Sequencing, the press release for which can be viewed here: http://tinyurl.com/owbzaya


Bridging the Education Gap in Precision and Personalized Medicine

Uptake of genomic and precision medicine is hindered in part by an undereducated health care workforce. Last spring, UCSF launched the first large-scale open online course on this topic for health care providers, reaching an international audience of 13,000.
 
The course aims to provide participants with baseline knowledge of genomics, an overview of the clinical applications of genomic medicine, the skills to evaluate the clinical validity and utility of new tests, and an appreciation of the ethical and social issues inherent in this field.
 
Precision Medicine has the potential to fundamentally change how health care is practiced, but it requires a health care workforce that understands the complexities of this field. Jeanette McCarthy, one of the course instructors, has published over 50 peer-reviewed papers on the genetic underpinnings of infectious and chronic diseases.

Review and Interpretation of FDA’s Draft Guidance for Regulation of Lab-Developed Tests

Foley & Hoag’s Bruce Quinn, MD, Ph.D., a national expert on Medicare policy and health reform, has offered his thoughts on the FDA’s recent notice of intent to issue draft guidance for regulation of lab-developed tests (LDTs) in a couple of forums recently. Here are the URLs:

http://tinyurl.com/l2awysc

http://tinyurl.com/kqaejrm


Medicine’s Future: Genomics for Practicing Doctors

New discoveries in genomics research are exciting to physicians because they can see that genomic targeting may make patient treatments more effective. In collaboration with the Genomics Medicine Institute (GMI) at El Camino Hospital and Genetic Alliance, NCHPEG (the National Coalition for Health Professional Education in Genetics) has developed a CME curriculum on genomic medicine.


About Us

Our events give attendees a conference experience that encompasses learning, networking and professional growth. We strive to facilitate connections. At our events, attendees, speakers, sponsors and exhibitors have opportunities to network and then to utilize those connections to further their professional goals. At Arrowhead Publishers, our focus is on bringing life sciences industry professionals together to help move research forward. Learn more about us at www.arrowheadpublishers.com.

Contact Us

For general information about the conference, please contact us at:
 +866.945.0263
 5780 Lincoln Drive, Suite 205, Edina, Minnesota 55436 USA