HPC's potential to transform bioscience explored at conference
That was the theme that emerged from the "Current Challenges in Computing 2013: Biomedical Research" conference, or CCubed, held earlier this month in Napa. Sponsored by IBM with support from the Laboratory, the meeting brought together thought leaders in biomedicine from government, academia and industry to discuss opportunities for accelerating the development of biomedical tools using petaflop (quadrillion floating point operations per second) class supercomputers.
"Computing is at a tipping point where it can play a much larger role in biomedical research," said Fred Streitz, director of LLNL's Institute for Scientific Computing Research and the High Performance Computing Innovation Center. "This is because of the level of computing power we're now reaching and the fact that the biomedical community is becoming aware of HPC potential."
Streitz noted that "the promise of computing in the biology space was overhyped 10 years ago" because supercomputers at the time lacked the number crunching power to handle models and simulations that were accurate enough to advance the field. However, developments in biology and computing over the last 10 years have changed that.
"Computing technology has advanced to where we can model organs, like the human heart beating in near real time," he said. "And that's just the tip of the iceberg. The field is starting to realize they can use computing as a tool in a way they were not able to five years ago."
Kirk Jordan, distinguished engineer in IBM's Research Division, said biomedical research "is an area that has emerged over the last 10 years and on which HPC can have an impact. That impact can be much greater when you bring together multi-disciplinary teams of biomedical researchers and computational scientists. One of our goals was to raise awareness of the medical and computational expertise that can be applied to biomedical research."
Today's supercomputers can also help the biology and biomedical fields address one of their biggest challenges - the enormous amounts of information that need to be processed and correlated, data measured in terabytes and petabytes (trillions and quadrillions of bytes). "Bioscientists are realizing that they're sitting on a gold mine of data and they don't know quite how to get to it. It's all there and they paid for it all. How do you get the information back out?" said Ken Turtletaub, Bioscience and Biotechnology division leader at LLNL. "It's truly a 'big data' problem."
Applying big data analysis to bioscience is an emerging field called bioinformatics.
The 2013 edition of CCubed attracted leaders in the field including: Anna Barker, director of Transformative Healthcare Knowledge Networks, co-director of the Complex Adaptive Systems Initiative, professor in the School of Life Sciences at Arizona State University, and former deputy director of the National Cancer Institute; Peter Hunter, Fellow of the Royal Society and Physiome Project leader; Peter Sorger of Harvard and MIT; Hugo Vargas, head of Safety and Exploratory Pharmacology for Amgen; Sian Ratcliffe, executive director and head of Global Safety Pharmacology at Pfizer; and Kirk Jordan of IBM Research, to name a few. Computational scientists representing many different institutions also were in attendance, including IBM's Jeremy Rice and LLNL's Dave Richards of the joint Cardioid project to model the human heart.
Previous CCubed conferences covered global climate modeling, energy resource modeling and network science. The CCubed conference series brings together thought leaders from industry, government and academia to explore what it takes to expand the computational capabilities of a particular field. Kept small and informal, the meetings pair a vibrant mix of field leaders with a casual setting that encourages lively conversation, yielding opportunities to establish potential collaborations.
The two days of meetings this year generated some exciting, even futuristic ideas, such as the notion that researchers could write computer programs smart enough to understand the nuances of language - programs that could not only analyze data but also generate hypotheses.
But discussion also focused on using HPC to address practical problems and challenges, such as those faced by the pharmaceutical industry in developing new drugs. In the drug development cycle, a drug becomes exponentially more expensive as it advances through successive trial phases. Consequently, pharmaceutical companies want to establish as early as possible that a drug will work without harming patients. Attendees agreed that CCubed has kick-started the kind of interactions that are likely to evolve into collaborations.
"I think we're currently at an inflection point in the biomedical community where we're generating enormous amounts of data not only at the genomic level, but also at the translational, proteomic and epigenomic levels," Barker said. "We don't, however, have the computational expertise to work with it in a meaningful way. Events like CCubed help advance biomedical research by bringing together eclectic thinkers who offer fresh perspective on how we can begin to manage and analyze this data, and how we can turn it into real knowledge."
For Jordan, CCubed did exactly what it was designed to do. "This conference's goal was primarily to bring together a variety of people to foster new collaborations, and based on the feedback, several new ones have already begun."