SALT LAKE CITY — President-elect Donald Trump says he likes to win, which means that being in second place in supercomputing — behind China — may be a non-starter for him. That’s the hope, at least, among some at the SC16 supercomputing conference here.
Today, China has the world’s two fastest supercomputers. China also has 171 systems on the Top 500 supercomputer list, which was updated this week. That’s the same number as the U.S., and China is moving swiftly to build big systems.
A decade ago, China had only 10 systems on the list. It now expects to deliver an exascale system in 2020, three years before the U.S.
“The fact that the U.S. doesn’t have the fastest computers might motivate him (Trump) for leadership there,” said Earl Joseph, a high-performance computing (HPC) analyst at IDC. “I think he likes to be a winner.”
It’s not yet known what Trump is planning for HPC or for science investments, who his science advisors will be or their philosophies. Vendors are still trying to figure out what a Trump administration will do.
Barry Bolding, chief strategy officer at Cray, said his firm prepared for any outcome in the election, but added that officials knew more about Clinton’s plans for supercomputing than Trump’s.
“I think we just have to sit down and talk to them — that’s the most important thing,” said Bolding, referring to the upcoming Trump administration. “We have worked through many different administrations and we will work through this administration.”
The government’s role is critical. It is the only entity capable of funding the research needed to advance HPC and cover the cost of systems that will run into the hundreds of millions of dollars. Advances made in these systems help vendors develop smaller systems accessible to a broader range of users.
Joseph believes that Trump may like the business case for HPC investment. Research firm IDC surveyed HPC users and found that during a three-year period every dollar invested generated an average of $551 in revenues, he said.
During the campaign, Trump told Science Debate, in response to written questions, that “scientific advances do require long-term investment.” Supercomputing is used across the sciences, from space exploration to climate research. Trump has expressed support for “a viable space program” but has labeled climate change a hoax.
The U.S. plan is to have an exascale system ready by 2023. That will require ongoing research funding and, eventually, $200 million to $300 million to build each machine.
At the SC16 conference, Martha (Marti) Head, a top computational researcher at GlaxoSmithKline Pharmaceuticals, told attendees that developing new drugs, particularly with personalized or precision medicine, requires increasing HPC capabilities.
“The drug discovery industry is facing a crisis at this time,” said Head. It now takes five to seven years to go from a disease hypothesis to a drug candidate, followed by years of clinical trials, she said.
The drug discovery process can’t be accelerated simply by shrinking the methods already in use, said Head. Delivering personalized medicine will require new processes that combine high-performance computing, simulation and data analytics to compress the disease-hypothesis-to-treatment timeline, she said.
This article was written by Patrick Thibodeau from Computerworld and was legally licensed through the NewsCred publisher network.