Supercomputers Recruited to Work on COVID-19 Research

A consortium forms to crunch data that might help researchers get a better understanding of the virus faster.

A convergence of technology resources is being put to work to find answers in the fight against COVID-19. The White House Office of Science and Technology Policy and the U.S. Department of Energy reached out to the technology sector, bringing together IBM and other supercomputing powerhouses to support research into the virus.

The combination of private industry, academic resources, and government entities has so far assembled 16 supercomputer systems boasting some 775,000 CPU cores and 34,000 GPUs. That computing power is tasked with running huge calculations for molecular modeling, epidemiology, and bioinformatics in order to shorten the time spent researching the virus.

Spearheaded by IBM, the COVID-19 High Performance Computing Consortium counts among its key partners Amazon Web Services, Google Cloud, Microsoft, the Massachusetts Institute of Technology, Rensselaer Polytechnic Institute, NASA, and others. The consortium is accepting research proposals online, then matching researchers with the computing resources that might best accelerate their efforts.

John Kolb, vice president for information services and technology and chief information officer at Rensselaer Polytechnic Institute (RPI), says high-performance computing is an area of expertise for the university. “We’re on our third-generation supercomputer, an IBM DCS system that we put in place in November,” he says. “It’s the most powerful supercomputer for a private university in the country.”

Kolb says the supercomputer’s architecture is meant to move data in and out of memory very quickly in large quantities. That lets users take on data-intensive problems. “It’s also very well-suited for some of the machine learning and AI things our researchers are involved with,” he says.

The effort to fight COVID-19, Kolb says, may include a lot of modeling of very large data sets once they become available. “You can start to look at issues around the spread of the virus and mitigation of the spread,” he says. “There could be some drug repurposing and perhaps development of new therapeutic candidates.”

There may also be opportunities to develop new materials that filter out the virus, Kolb says, or to create items that are now in short supply.

RPI’s machine uses the same Summit supercomputer architecture found at some of the Department of Energy labs, he says. “It will be interesting to see if we can have runs here that scale up on Summit, or do we have runs on Summit that we could take over.”

Kolb believes most of the problems the consortium will deal with may be multivariate, taking into account, for example, the number of people, population density, the effectiveness of social distancing, and the capacity of hospitals. “We’re clearly trying to explore some things that may have some great promise, but there’s some great computing and science that need to come into play here,” Kolb says.
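To make the idea of a multivariate calculation concrete, the sketch below shows a toy SIR (susceptible-infected-recovered) model in which the variables Kolb mentions, such as population size, social-distancing effectiveness, and hospital capacity, all shape the outcome. It is purely illustrative; the function name and every parameter value are hypothetical placeholders, not consortium code or results.

```python
# Illustrative only: a toy discrete-time SIR model in which distancing scales the
# contact rate and a hospital-capacity threshold is tracked. All values are made up.

def simulate_sir(population=1_000_000, initial_infected=100,
                 beta=0.30,            # baseline transmission rate per day
                 gamma=0.10,           # recovery rate (1 / infectious period in days)
                 distancing=0.4,       # fraction by which distancing reduces contacts
                 hospitalization_rate=0.05,
                 hospital_beds=3_000,
                 days=180):
    s = population - initial_infected  # susceptible
    i = float(initial_infected)        # infected
    r = 0.0                            # recovered
    peak_infected, days_over_capacity = 0.0, 0
    for _ in range(days):
        effective_beta = beta * (1.0 - distancing)      # distancing lowers the contact rate
        new_infections = effective_beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak_infected = max(peak_infected, i)
        if i * hospitalization_rate > hospital_beds:    # demand exceeds bed supply
            days_over_capacity += 1
    return peak_infected, days_over_capacity

if __name__ == "__main__":
    for d in (0.0, 0.2, 0.4, 0.6):   # sweep social-distancing effectiveness
        peak, over = simulate_sir(distancing=d)
        print(f"distancing={d:.1f}  peak infected={peak:,.0f}  days over capacity={over}")
```

Even in this toy form, sweeping a single parameter multiplies the number of runs quickly; sweeping several at once is what pushes such studies onto supercomputers.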

The greater emphasis in recent years on technology and computing across the public, private, and academic sectors means more hands on deck to support research into the virus. “COVID-19 is going to see a fair amount of data analytics and the use of AI and machine learning tools to think through what are the most promising possibilities going forward,” Kolb says. “Across the country and world, we’re developing much more expertise in this area.”

IBM got involved in this fight believing it could coalesce a team around bringing computational capability to bear on investigating the virus, says Dave Turek, vice president of technical computing at IBM Cognitive Systems. “It was prompted by experiences IBM’s had applying computational biology, molecular dynamics, and material science to a variety of scientific problems,” he says.

Bringing scientific perspective and computing expertise together, Turek says, could create a set of resources that can be used broadly. It also gives researchers access to supercomputing they might not otherwise have. “It’s a massive, massive amount of computing,” he says.

The way the consortium is established, other interested organizations can make their resources available as well, Turek says. “This is really a clearinghouse,” he says. “We have scientists and computer scientists sitting on review committees on proposals that are coming in to ensure the science is dedicated to the most appropriate platform to the task at hand.”

The momentum already building behind technologies such as supercomputing could help shorten the time research efforts take. “Even inside IBM, we did modeling on the evolutionary pathways of H1N1,” Turek says. “Those skills and experiences have been scaled up and leveraged over time.”

 
