Multi-institutional meeting centers on making the ‘impossible’ possible in cancer research

March 22, 2019

Lawrence Livermore National Laboratory Chief Computational Scientist Fred Streitz (holding microphone) and other panelists discussed advances in the Joint Design of Advanced Computing Solutions for Cancer (JDACS4C) program. Part of the Cancer Moonshot project, JDACS4C is a partnership among the Department of Energy, the National Cancer Institute and four DOE national laboratories. Photos by Julie Russell/LLNL


Computer scientists, biomedical engineers, cancer biologists and bioinformaticians from eight Department of Energy (DOE) national laboratories, health-related government agencies and universities converged at Lawrence Livermore National Laboratory (LLNL) March 6-7 to discuss ongoing efforts to advance cancer research through computation, identify existing challenges and brainstorm future multidisciplinary projects.

During the two-day scoping meeting at LLNL’s High Performance Computing Innovation Center (HPCIC), panelists updated attendees on the progress of DOE collaborations with the National Cancer Institute (NCI) and the Frederick National Laboratory for Cancer Research (FNLCR). Presenters spoke of the importance of converging cancer biology and computation and the impact of emerging techniques such as artificial intelligence and machine learning on cancer research and precision medicine. Attendees participated in breakout discussion groups designed to encourage researchers across various institutions to engage with each other, draft “lean-in” challenges to focus attention and discuss potential collaborations.

“It’s exciting for DOE researchers to get to work on something really meaningful that touches people’s lives — everyone knows somebody who has cancer, so it brings our best thinking to the table,” said Carolyn Lauzon, program manager in the DOE Office of Science’s Advanced Scientific Computing Research program. “I’m hoping long term that this kind of meeting is part of a much longer process to inject high-performance computing into NCI in ways they currently aren’t using HPC. We don’t see a lot of cancer researchers using our leadership computing facilities, even though we know they have challenges that could benefit from them. It would be great if we could participate and engage with them in a way that creates a cultural shift in how they use this technology.”

Panelists and presenters discussed advances in ongoing cancer-related projects including the Joint Design of Advanced Computing Solutions for Cancer (JDACS4C) program. Part of the Cancer Moonshot project, JDACS4C is a partnership among DOE, NCI, FNLCR and four DOE national laboratories. It comprises three pilot projects: a cellular-level pilot to develop computational models to predict how tumors will respond to anti-cancer drugs; a molecular-level pilot simulating the molecular interactions and dynamics of the RAS protein, whose mutations are tied to about a third of human cancers; and a population-level pilot aimed at applying deep learning and other advanced computing technologies to cancer surveillance data to predict patient outcomes. The effort is supported by a collaborative open-source deep learning software platform called the CANcer Distributed Learning Environment (CANDLE).

LLNL Chief Computational Scientist Fred Streitz, who is co-leading the RAS protein pilot, said although “some people said [the project] was impossible,” it has made tremendous strides in multiscale modeling on LLNL’s Sierra supercomputer, allowing researchers to analyze the protein interactions with plasma membranes on both a macroscopic and microscopic level using machine learning.

“We now have this capability and it actually works, and we are seeing the buy-in of the biology community because they understand now what’s possible,” Streitz said. “Before Sierra was taken away (moved to classified research), we managed to run a very large simulation campaign — literally between 100,000 and 200,000 simulations, representing many hundreds of milliseconds of time on this system that we can explore, and that’s amazing. That’s what gets people up in the morning — getting some of the answers. We can now start to explore the science, and that’s what’s very exciting.”

The project is transitioning to Lassen, Sierra’s unclassified counterpart. Project co-lead Dwight Nissley, director of the Cancer Research Technology Program at the Frederick National Laboratory, said the team is moving to the second stage, attempting to understand the activation process of RAF proteins and the interaction of RAS and RAF proteins. In a broad sense, he said, NCI is interested in the interface between computation and biomedical research to help close gaps in experimentation.


“There are things that are just technically not possible to do experimentally now, either because of lack of the technology or the resolution in the experimental outputs isn’t sufficient to understand molecular details,” Nissley said. “Right now, we’re spreading the good news of what’s possible to do with these types of collaborations, and I’m hoping these kinds of efforts will spread across the NCI so that other groups will also take advantage of the computational capabilities at DOE and here at Livermore. Whenever you come to a meeting like this, you see collaborative possibilities and it stimulates new ideas and new thinking.”

NCI Program Director Emily Greenspan said the vision of the NCI/DOE partnership is a cancer-focused national learning health care system that derives knowledge from all relevant patient data. The goal, she said, is to develop predictive models and simulations for what cancer researchers should see, and prescriptive analytics for what researchers should do. Achieving this goal, she said, would require building a scalable data infrastructure, creating new forms of data analysis and predictive modeling, and developing a large and capable data science workforce.

“We have this starting point of our interagency collaboration and we feel like now is the opportune time to expand and broaden the impact, engagement and potential collaborations that are possible,” Greenspan said. “We hope to use this scoping meeting to build a community at this intersection and really start to develop the vision for what cancer challenges can compel innovation and advanced computing, and on the flip side, what computing innovations can drive new knowledge and innovation in cancer.”

Frederick National Laboratory took a lead role in organizing the meeting as part of an ongoing effort supported by the NCI. The effort looks to expand involvement in the broader cancer research community and to define leading-edge efforts that push the limits of cancer research, employing the capabilities and experts at the DOE laboratories in computing, light sources, advanced materials and other areas, according to FNLCR Director of Biomedical Informatics and Data Science Eric Stahlberg.

“Even as the collaboration between NCI and DOE started formally with a signed memo of understanding in June 2016, it was clear that there would be many future opportunities to explore how the agencies and their respective communities could join together and truly lean in to key challenges both in computing and in cancer research,” Stahlberg said. “This meeting represents a major step in that direction, catalyzing the formation of ideas and starting to build the community that will carry new ideas and concepts forward.”

An afternoon panel featured representatives from eight DOE national laboratories, who discussed DOE’s capabilities and resources in computing and expertise, and how those resources have led to partnerships and long-term projects.

“DOE has the largest collection of the fastest machines on the planet of any single organization,” said Argonne National Laboratory Associate Lab Director Rick Stevens, principal investigator for CANDLE and co-lead of the cellular-level JDACS4C pilot project. “We have the largest number of computer scientists working in such an organization that does science [and] that means we also have a lot of mathematicians and people that do graphics and statistics, so there’s this huge resource that’s been created over the last 50 years. While we didn’t create exascale to do cancer, now having created it we want to apply it to cancer.”

Peter Nugent, a division deputy for Scientific Engagement in the Computational Research Division at Lawrence Berkeley National Laboratory, gave the first day’s keynote speech, detailing his research into using machine learning analysis of telescopic images of space to automate the search for supernovae. Combined with a faster data pipeline, the work dramatically reduced the amount of time researchers needed to spend searching for supernova candidates. The case study served as an example of a “multidisciplinary triumph” that successfully applied advanced technologies to sort through massive amounts of data, resulting in unprecedented breakthroughs.

“High-performance computing and working with people in computer science, databases, networking and I/O helped improve a pipeline and go from a very simplistic idea of just processing to getting an answer the next day and moving on, to opening up completely new avenues of science people didn’t even think were possible,” Nugent said.

In the following panel, professors and researchers from multiple universities including Harvard, Stanford and the University of Pittsburgh, along with several health-related institutions, discussed challenges in cancer research, including barriers to accessing medical data, optimizing prevention and health, and gaining the trust of physicians who may someday use the models to determine treatment plans.

“We don’t just want to do the big simulations because it’s something we want to do, we want it to be usable,” said Amanda Randles, a former Lawrence Fellow in LLNL’s Applied Scientific Computing program who is currently a biomedical engineering professor at Duke University.

The scoping meeting drew about 80 attendees. On the meeting’s second day, Warren Kibbe, chief for Translational Biomedical Informatics in the Department of Biostatistics and Bioinformatics and chief data officer for the Duke Cancer Institute, presented a keynote on “Blue Sky Possibilities at the Intersection of Oncology and Computing.” The day was primarily devoted to attendees assembling in writing groups that addressed each “lean-in” challenge area and gave presentations on the results.

LLNL’s Director of Strategic Engagements and Alliance Management for the Physical & Life Sciences Directorate Amy Gryshuk, a member of the meeting planning team, said she was pleased with the turnout and with discussions that gave people from diverse areas of research a forum to meet.

“A scoping meeting like this provides a forum to discuss future strategic directions and lean in with forward thinking about what critical challenges we want to tackle in the next 10-15 years,” Gryshuk said. “As we continue to bring these communities together, we can push the limits and ask, ‘what cancer solution do we want for the physician and patient, and then with the DOE computing capabilities that we have and with what advancements are being made, what can we do together collectively?’ The attendees were challenged to get out of their seats, have dynamic interactions and produce some tangible ideas for potential future collaboration.”

DOE Office of Science and National Nuclear Security Administration (NNSA) representatives said the meeting, the first in a series with the cancer research community, was important to allow researchers to explore existing challenges, identify new possible areas for further research and envision how solving complex problems like cancer would push DOE’s advanced computing systems and technologies such as AI and machine learning, ultimately serving DOE’s mission interests.

“There are some very interesting and unique things about trying to solve problems in the cancer space that I think excites the community and inspires new directions on how to attack problems in a completely different way. Without the traditional constraints, maybe a new capability can be created that can then broadly benefit more of the DOE mission,” said Michael Cooke, program manager in DOE Office of Science’s High Energy Physics program. “I’m looking forward to where things might head next.”

NNSA Advanced Simulation and Computing (ASC) Program Manager Thuc Hoang said she hopes that by testing massive, hard-to-classify cancer data sets with some of the ASC program’s uncertainty quantification techniques, researchers will be able to test the boundaries of existing computing technologies, while at the same time helping to advance the entire cancer research community.

“We within the NNSA program feel like we are the leader in the HPC business, and we have the tools to help other federal agencies and other communities that aren’t as advantaged in computing to benefit from all the investigations we’ve done,” Hoang said. “We’re essentially sharing the fruits of our labor.”