
In 2003, the US Food and Drug Administration (FDA) approved the first targeted therapy for non-small cell lung cancer (NSCLC). AstraZeneca’s Iressa (gefitinib) had shrunk some patients’ tumors in small clinical trials, and despite some concern about the robustness of the results, regulators OK’d the therapy as an option for patients when chemotherapy and other generalized cancer-attacking treatments had failed. But after only a few months on the market, it was clear that something was amiss: for up to 90 percent of patients, the drug simply didn’t work.

While AstraZeneca struggled to explain this disappointing outcome, within a year three academic groups independently came to the same conclusion: gefitinib, it appeared, was only effective in patients harboring certain mutations in EGFR, a gene coding for a cell-surface protein receptor involved in cell signaling; patients without these mutations had no noticeable response to the drug. Based on...

In retrospect, gefitinib’s initial failure hardly seems surprising: patients are genetically diverse, and such variation can affect how people respond to particular therapies. But until recently, doctors didn’t have much of a choice in treatments, says Ronenn Roubenoff, head of global translational medicine in musculoskeletal diseases at Novartis Institutes for Biomedical Research. “In the old days, we used to just have a few drugs and so we’d give them to everybody,” he explains. As a result, outcomes were often poor—prompting some patients to stop taking (and purchasing) their meds altogether—and neither physicians nor drug companies could explain why, he adds. “Everybody’s kind of bumbling around in the dark.”

Today’s emerging field of precision medicine—previously referred to as personalized medicine—takes advantage of this genetic diversity to tailor treatments to those patients most likely to respond. Rather than administering therapies based solely on the disease, new strategies to identify biomarkers, such as EGFR mutations in NSCLC patients, help doctors select the best therapies for individual patients.
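The logic of this kind of biomarker-guided selection can be sketched in a few lines of code. The mapping and function names below are hypothetical and greatly simplified, intended only to illustrate the idea of matching a patient's molecular profile to a targeted option rather than to describe any actual clinical decision tool:

```python
# Illustrative sketch of biomarker-guided therapy selection.
# The mapping below is hypothetical and simplified; real treatment
# decisions weigh many more factors and require clinical judgment.

BIOMARKER_GUIDED_OPTIONS = {
    ("NSCLC", "EGFR_activating_mutation"): "EGFR inhibitor (e.g., gefitinib)",
}

def suggest_therapy(disease, biomarkers):
    """Return a targeted option if a predictive biomarker is present,
    otherwise fall back to standard of care."""
    for marker in biomarkers:
        option = BIOMARKER_GUIDED_OPTIONS.get((disease, marker))
        if option:
            return option
    return "standard chemotherapy / clinical-trial referral"

if __name__ == "__main__":
    print(suggest_therapy("NSCLC", {"EGFR_activating_mutation"}))  # targeted therapy
    print(suggest_therapy("NSCLC", {"no_known_driver"}))           # standard of care
```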

This evidence-based approach could reduce trial-and-error prescribing and help avoid adverse reactions to drugs—and might even lower the ultimate cost of health care. In 2015, in recognition of this potential, President Obama announced a nationwide research effort, the Precision Medicine Initiative, to support the approach’s use in oncology (where precision medicine has had the most success so far) and for diverse other conditions, including neurodegenerative, autoimmune, and cardiovascular diseases.

This enthusiasm is mirrored by the drug development industry. That same year, 28 percent of drugs approved by the FDA were considered precision therapies, and an October 2016 report predicted the global precision medicine market will reach nearly $113 billion by 2025. But such a transition inevitably brings challenges. For companies built on the traditional, so-called blockbuster model of drug development—which aims to produce drugs that are applicable to many patients, thus generating high returns—precision medicine presents a very different commercial landscape.

By definition, precision drugs target only a subset of a particular patient population, reducing pharma’s ability to recoup development costs through sales. Moreover, as gefitinib’s setbacks illustrate, drug design is now just part of a therapy’s development. Diagnostic tests to identify responsive patients must be created alongside a new targeted therapy, and determining which biomarkers are reliable indicators of a therapeutic response can often involve sifting through unprecedentedly large volumes of data early in preclinical and clinical testing.


Faced with these challenges, leaders in pharma are looking for opportunities to partner with other drug companies, tech firms, regulators, and even patients themselves. For precision medicine to work, “as an industry, we need to collaborate more,” says Sy Pretorius, chief scientific officer of life sciences consultancy firm Parexel. “It’s rare that one team can do it all by itself.”

“There’s a recognition that the problem requires a broad swath,” agrees Peter Bergethon, vice president for quantitative medicine at Pfizer’s neuroscience and pain research unit. “The idea that a single person, single group, or single discipline has the answers to solve a problem like personalized medicine is now appreciated as not being true. It’s in the nature of the problem.”

Timing it right

To realize the potential of precision medicine, health-care providers need biomarkers predictive of a therapy’s impact, as well as companion diagnostics to detect them. Ideally, the availability of those diagnostics matches up with the availability of the therapies they complement.  “Biomarkers are like pizza delivery,” Roubenoff notes. “You need it fresh, hot, and on time. It doesn’t do anybody any good to have the biomarker developed five years after the drug.”

But synchronizing drug and diagnostic development is easier said than done. To date, only a handful of targeted therapies have been approved simultaneously with a companion diagnostic. “What most people don’t appreciate is that companion diagnostic development takes several years and adds an additional layer of complexity to what is already a pretty complex development process,” says Pretorius. The resulting dilemma is known as precision medicine’s chicken-and-egg problem. On the one hand, waiting to explore companion diagnostics risks holding up effective therapies. On the other, investing heavily in diagnostics early on is risky, as many drug candidates never make it to market. (While a diagnostic test may be useful on its own, it offers lower return on investment without the accompanying drug.)  

To tackle this problem head-on, many companies are making diagnostics expertise a more integral part of drug discovery, whether through in-house development, acquisitions, or partnerships. In 2013, for example, pharmaceutical giant Eli Lilly signed a collaborative agreement with Qiagen, maker of the gefitinib diagnostic test and of a diagnostic for the cancer drug Erbitux (cetuximab), which Lilly developed in partnership with Bristol-Myers Squibb. And last year, AstraZeneca cut a deal with diagnostics company Abbott to develop a test to identify patients who might benefit from an asthma drug; AstraZeneca noted that it expects half of its drug launches to include companion diagnostics by 2020.

The result of these partnerships is that pharma can adopt a more integrated, evidence-based approach to drug development. “We have been able to drive the biomarker process earlier into the drug discovery paradigm,” says Roubenoff. At Novartis, whose collaborators include biomarker discovery company SomaLogic, “we start looking for the biomarkers basically at the time that we start finalizing the drug candidate,” he adds. “There’s a lot of pre-work that wasn’t really on the agenda 10 or 15 years ago that’s now become pretty routine.”

The FDA is also now working to help smooth the transition to synchronous development, in part by clarifying its expectations. In 2014, the agency issued guidance aimed at “stimulating collaboration” between diagnostics and drug companies, and last year, the agency published updated guidelines on how companies can merge drug and diagnostic pipelines to increase the likelihood of simultaneous approval. But there’s still a way to go, Pretorius says. “I firmly believe regulators in general are trying to come to terms as quickly as possible with how to deal with it,” he says. “It’s just so new, I don’t think as a society we’ve got our arms completely around it yet.”

Dealing with data

Before a drug and its companion diagnostic can be approved, both must pass muster for efficacy and safety. But testing a precision therapy differs markedly from testing blockbuster drugs. Patients can no longer be drawn from the population at large, but must be carefully selected—perhaps based on their DNA—to match the aims of the trial. “There’s a school of thought in the industry that randomized clinical trials have served their purpose,” says Pretorius. Newer, stratified approaches to clinical testing are becoming increasingly common, with the promise of reducing costs by increasing the probability of success.

See “Clinical Matchmaker,” The Scientist, June 2015.
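In practice, stratification means grouping enrollees by biomarker status before randomizing them to treatment arms, so that a drug's effect can be estimated separately in biomarker-positive and biomarker-negative patients. The sketch below illustrates the idea under simplified assumptions (a single binary biomarker and a 1:1 allocation); it is not a description of any specific trial protocol:

```python
# Minimal sketch of biomarker-stratified randomization (illustrative only).
# Patients are grouped by biomarker status and randomized to drug or control
# within each stratum, so effects can be estimated per subgroup.
import random
from collections import defaultdict

def stratified_randomize(patients, seed=0):
    """patients: list of dicts with 'id' and 'biomarker_positive' (bool).
    Returns {stratum: [(patient_id, arm), ...]}."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for p in patients:
        key = "biomarker_positive" if p["biomarker_positive"] else "biomarker_negative"
        strata[key].append(p["id"])

    assignments = {}
    for stratum, ids in strata.items():
        rng.shuffle(ids)
        half = len(ids) // 2
        assignments[stratum] = (
            [(pid, "targeted_drug") for pid in ids[:half]]
            + [(pid, "control") for pid in ids[half:]]
        )
    return assignments

if __name__ == "__main__":
    cohort = [{"id": i, "biomarker_positive": i % 3 == 0} for i in range(12)]
    for stratum, arms in stratified_randomize(cohort).items():
        print(stratum, arms)
```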


This approach presents its own set of hurdles in terms of both trial design and statistical analysis. To draw meaningful conclusions, researchers need more clinical data, including genomic and potentially other omics information from every patient involved. “The sheer volume of data associated with precision medicine is itself a significant challenge,” says Pretorius. “Collecting, managing, interpreting these mega–data sets is more difficult than most people appreciate.”

One way that pharma is addressing this issue is through sharing data and analyses. “For genomics, virtually all companies are now taking an open approach,” says David Goldstein, director of Columbia University’s Institute for Genomic Medicine, which recently entered a $30 million alliance with drug developer Biogen Idec. This openness perhaps “reflects an awareness in companies that they don’t have the ability to do it all,” he adds. The government is getting involved, too. In 2015, the FDA launched a cloud-based portal called precisionFDA to foster a collaborative approach to developing the tools needed to evaluate DNA sequence data. Participants already include health-care providers, academic medical centers, pharmaceutical companies, and patients.

Right now, the main obstacles to data sharing are technological, says Michael Blum, director of the Center for Digital Health Innovation (CDHI) at the University of California, San Francisco. He and his team are working towards “an information commons [to] describe not only millions of individuals, but life science, molecular, and omics data,” he explains. In addition to being able to host massive data sets, such platforms must be designed to ensure that personal medical data are secure. “The model for sharing the data with industry is still being developed,” he says.

But even as the tools for sharing and processing such large volumes of data evolve, researchers continue to up the ante, developing new medical devices that can record patient data regularly, even outside the clinic. With wearable technology, “you can now collect heart rate and blood pressure data on an ongoing basis, instead of when you happen to be in the office and checking it,” says Blum. “That really changes the paradigm of what we learn.”

Looking further ahead, Pfizer has teamed up with IBM to launch the Blue Sky project, which aims to develop technologies such as Internet-connected motion sensors and other devices to monitor patients in their homes and upload the data in real time. The technologies are being evaluated in trials with Parkinson’s patients, with the view that the almost continuous data series could allow researchers to track the effect of a treatment and pick out relevant variation in patient responses. “You’d be identifying the noise, rather than hoping it gets washed out in a population study,” explains Pfizer’s Bergethon. “That’s the precision medicine approach.” Such technologies could also “allow you to do faster, less expensive trials,” he adds.
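One simple way to use such near-continuous data is to compare each patient's own readings before and after treatment begins, rather than pooling everyone into a single average. The sketch below is illustrative only; the sensor stream, sampling interval, and endpoint are hypothetical stand-ins for whatever a given monitoring program actually records:

```python
# Illustrative summary of continuous home-monitoring data for one patient.
# The signal, sampling rate, and endpoint here are hypothetical.
from statistics import mean

def per_patient_response(readings, treatment_start):
    """readings: list of (timestamp, value) tuples from a wearable sensor.
    Compares the mean signal before vs. after treatment_start, so an
    individual's response isn't averaged away in a population summary."""
    before = [v for t, v in readings if t < treatment_start]
    after = [v for t, v in readings if t >= treatment_start]
    if not before or not after:
        return None
    return mean(after) - mean(before)

if __name__ == "__main__":
    # Hypothetical hourly tremor-amplitude readings, treatment starting at t=48.
    stream = [(t, 1.0 if t < 48 else 0.7) for t in range(96)]
    print(per_patient_response(stream, treatment_start=48))  # negative => improvement
```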

Like companion diagnostics, these new digital solutions to data handling are likely to involve collaboration among diverse companies if they are to become a regular part of precision medicine in the future. “I think that we will wind up with companies partnering with the guys who are good at doing the different pieces,” says Roubenoff. “Everything is now moving, or has moved, toward an integrated approach that requires biomarkers, requires consideration of digital endpoints, and requires that the core aspect of the chemical or biological medicine has a lot more around it than was the case even a few years ago.”

If successful, it’s an approach that ultimately promises to benefit patients—the motivation for advancing precision medicine in the first place, Roubenoff adds. After all, he says, “we’re not really interested in giving people drugs that don’t work.” 
