Amy Abernethy, MD, joined Verily just last week and she is already talking about the next generation of clinical research. That is only natural given her work as Principal Deputy Commissioner at the FDA and, prior to that, Chief Medical Officer of Flatiron Health.  

Now, as President of Verily’s Clinical Research Platforms, Abernethy will be able to focus on shaping smarter trial designs backed by the raw compute power necessary to drive clinical research into the modern era.

Health Evolution Editor-in-Chief Tom Sullivan spoke with Abernethy about why she chose to join Verily, her vision for clinical research with the company, how the controversial FDA decision to accelerate approval of Biogen’s Alzheimer’s drug shapes her thinking about drug discovery, development and approval, and more.  

What inspired you to join Verily? I imagine you had a lot of opportunities …  

Abernethy: While I was at FDA, one of the things I had been thinking a lot about was the importance of updating our clinical research infrastructure and our approach to clinical research in America and beyond. During the pandemic, one of the things that was evident was that our clinical development engines for discovery and drug development are humming along pretty well. Our regulatory apparatus was one of the things I worked on at FDA, to make sure that regulatory operations could scale up as more drugs come down the pipeline. But it was really clear that our clinical research capabilities are very 1995 and we have to update that infrastructure.

So then the question became: where do I point my arrows? I looked at a number of different approaches, everything from running a health system to going to large pharma to becoming an investor to starting a company on my own. My belief is that we really need to put software and hardware to work in order to revise how clinical trials and clinical research are done. And that has to align with a regulatory perspective as well as with the compelling study designs of 2021 and the study designs of the future. So I chose Verily because I felt like it had the engineering and software development prowess, as well as the data management capabilities. The company had already built some of the core elements and is poised to take them to the next level. I felt like it was a great match between what I know how to do and the organization’s vision to pull these pieces together.

Following that, what is your vision for achieving that future state of clinical research? Or what’s your charge going into the role? 

Abernethy: I have a ton of work to do to understand what has already been done and what needs to be done next. You can imagine that I am going to spend very detailed time getting under the hood and building a 100-day plan to identify what we need to do and how we’re going to do it. That being said, I think the areas of focus will likely be four critical elements.

First, making it easier for all people to participate in clinical research. That’s certainly been a massive focus at Verily and it was one of the things that attracted me to the company. Now I see the opportunity to look at participant recruitment and outreach capabilities to understand what other horsepower we can add and how to make that as inclusive as possible for everyone.  

The second one is what I call Wick Away the Work. We should be leveraging software and data and putting them to work so that anything that can be automated, whether through software or algorithms, is automated within the context of clinical research. That’s something many companies are poised to do, but Verily is particularly well situated to do because of the Alphabet history and ongoing focus on using, for example, sensors, devices and smart capabilities for data collection as well as remote patient evaluation with video. There are so many things in clinical trials we do by hand because we’ve always done it that way, but it’s just not necessary anymore.

Third is better use of data. We should be thinking about how we use data from all sources to complete the clinical research dataset, but we have to do so in a way that is transparent and fully traceable, and we have to understand the implications of data quality. There are a lot of elements to being smarter about the data we collect. The data that we use is going to be an important part of improving clinical trial infrastructure.

And then the last one is something that I don’t think gets enough discussion, which is improving study design. We saw during the pandemic, for example, the use of platform trials like the RECOVERY trial in the UK to test multiple interventions simultaneously. These are not brand-new trial designs, we’ve been talking about them for a decade or more, but they’re complex to put in place and we should be using software and data to reduce the complexity. We should be trying to understand how to overcome some of the cultural and contractual types of barriers. We need to be smarter about study design, and we should be much smarter about things like eligibility criteria and the use of biomarkers. So those are other critical design features we need to be better at.

What are some of the big projects at Verily and what should our CEO readers understand about the potential for breaking new ground in the short term?  

Abernethy: The big one is Project Baseline. It started as an epidemiological study to collect all kinds of critical biologic and patient-reported data, biosensor-type details, about a cohort of participating individuals. Project Baseline also aims to build the software and capabilities that make it easier for people to participate as well as to collect and analyze data and generate research- and regulatory-grade datasets. Project Baseline is going to be a huge, critical, foundational moonshot and, in fact, has been pointing the way for how to build new clinical research infrastructure. And now I’m joining the Baseline and Verily team to take this to the next level.

Relative to the future of clinical research, and from your perspective in the private sector and with FDA, how does the controversial decision about Biogen’s Alzheimer’s drug change the way you think about drug discovery and development or clinical trials for the future?

Abernethy: Like everyone, I watched carefully to see what was going to happen and I think there are a couple of things that are particularly important about leveraging the accelerated approval pathway. I’m an oncologist … so let me just say that out loud, because to me the accelerated approval pathway has been critical to some of the spectacular progress we’ve made in cancer. So I see it as an interesting development here that signals part of where the future is headed. It set us up in a situation just like we’ve seen over and over again in oncology, where we have a biological hypothesis, and therefore biological plausibility for a clinical outcome based on that hypothesis, but then the expectation is that the sponsors are going to come in and present the confirmatory information. That’s what we’ve seen over and over again with accelerated approval in cancer, and what we see now with the Biogen drug is that pathway being leveraged unexpectedly. That’s okay as a mechanism to strike a balance when there is presumably compelling clinical and biologic data but more information is needed to confirm the findings.


Read more: Biogen’s Alzheimer’s drug gets FDA approval: 5 things to know


What does this tell me about the future? A couple of things. The first is that I think we will be seeing more and more continuous evaluation of products across their life cycle. We’re starting to see signals of that on the device side, and FDA’s CDRH [Center for Devices and Radiological Health] has been very clear that it sees total product lifecycle evaluation of devices as the future. We’ve been starting to see that more and more on the drug and biologic side, and I think that the Biogen Alzheimer’s drug approval is pointing in that direction. Is this pointing in the direction of conditional approval? I don’t know. I’ve always believed the Europeans are onto something but, then again, that could come from me being an oncologist.

In the Health Evolution Forum Work Group meetings, you explained that one of the biggest challenges is that people developing AI, algorithms and other technologies need access to real world data to produce accurate models. So with Verily, what’s your vision for enabling that access to real world data? 

Abernethy: And I would actually add access to unbiased real-world data. That’s one of the things we have to really understand: how we’re going to do that. I think, though, that the first part of what you hit on is that access to the datasets to build the algorithms is going to be a key element. That’s absolutely true whether those datasets are held by Verily or held by somebody else and we learn together to responsibly build the algorithms. I’m not really sure what that’s going to look like yet. Certainly one of the things we’ve been talking about and helping with is how to bring these pieces together in a responsible way. I wouldn’t be surprised if this is a conversation I keep finding myself in the middle of, because you cannot leverage these datasets if you can’t access them. But in accessing the data you have a huge responsibility to get it right, and I don’t know exactly how we’re going to solve it yet. Verily may have already solved it and I don’t know yet, but as we move further and further down the path toward using these datasets, we’re going to need to figure that out.

In doing your homework to decide whether to join Verily, what surprised you most about the enterprise?  

Abernethy: Several things surprised me. The first one, when I started looking, was discovering that there have been a number of really critical senior leadership hires over the last six to twelve months: a chief revenue officer, a chief marketing officer, a CFO from Tesla, and a general counsel from Amgen. These examples point in the direction of a company that’s putting in place all of the operational acumen that goes along with its engineering bench strength and its scientific and medical bench.

The second thing I was surprised at is how many things they had already built. I was struck by that especially on the research side. I had already written down a laundry list of things I thought needed to be built for the clinical research of tomorrow and Verily already has built quite a few of them. So I found that very compelling.  

The third thing was something somebody said to me in one of my interviews, which is that ‘you can have all these elements but at the end of the day, you absolutely need raw compute power.’ A company that is built on that foundation is able to create something that delivers that horsepower. That’s critical because I want to make sure that we’re very thoughtful about managing the privacy, the security and the unique regulatory function of many elements in the right way. That’s particularly important in a world where we’re all kind of on edge around privacy.

This wasn’t a surprise, but the company has access to the people who know how to build that kind of scale as it relates to compute power. You can’t do the things we’re talking about in clinical research 2.0 without really intensive computational capability. So there’s the importance of figuring out how to have enough compute power to pull off complex adaptive trial designs. I just don’t think people have really thought about how hard some of these things are going to be if you’re trying to run them as a single academic medical center or operating on a shoestring budget, which is what I’ve seen in the past.

Tom Sullivan

Tom Sullivan brings more than two decades of editing and journalism experience to Health Evolution. Sullivan most recently served as Editor-in-Chief at HIMSS, leading Healthcare IT News, Healthcare Finance and MobiHealthNews. Prior to HIMSS Media, Sullivan was News Editor of IDG’s InfoWorld, directing a dozen reporters’ coverage for the weekly print publication and daily website.
