Supercomputers

Age of the super brains

February 26, 2018

Supercomputers are already today lightning-fast analyzers. Now the next great breakthrough is just around the corner. And the potential is enormous.

The answer to the ultimate question of life, the universe and everything is actually very simple: it is 42. It was calculated over 7.5 million years by Deep Thought, the supercomputer in the science fiction novel “The Hitchhiker’s Guide to the Galaxy.”

 

Unlike the machine in a 40-year-old work of fiction, today’s powerful computers provide usable results. In chemistry, for example, they help in molecular simulation for finding new active agents. They make water and energy supplies more efficient and are important helpers in predicting epidemics and earthquakes or in diagnosing illnesses. For example, oncologists in Japan were groping in the dark in the case of a 60-year-old woman until they enlisted IBM’s Watson. This supercomputer required just 10 minutes to compare the woman’s diagnostic data against millions of cancer studies and identify an extremely rare type of leukemia. The doctors adjusted their therapy and the woman was treated successfully with the help of “Dr. Watson.”

Pepper, the little robot from the Japanese mobile communications company Softbank, speaks 20 languages and even recognizes emotions, thanks to the technology of IBM’s supercomputer, Watson.

Record-breaking supercomputers

 

Supercomputers that achieve top computing power with several thousand processors could play a key part in meeting the challenges of the future. “We are facing changes that will prove to be revolutionary,” the U.S. computer science professor and supercomputer expert Thomas Sterling predicts. Thanks to their computing power, Sterling places supercomputers on a par with innovations that have given a decisive impetus to human development, such as the discovery of fire. Competition in the market is correspondingly fierce. China and the United States, in particular, are engaged in a race among high-performance computers.

 

The world’s first supercomputer came onto the market in 1964 in the United States, in the form of the CDC 6600. The Americans dominated the scene for many years, but recently computers from China have made their way to the top. At 93 petaflops – that’s 93,000,000,000,000,000 calculations per second – the Sunway TaihuLight is the fastest supercomputer by far (as of November 2017). “With its help, complex climate models, for example, can be calculated nearly a hundred times faster than by a computer capable of one petaflops, which would need one year for the task. This adds a whole new dimension to the fight against climate change,” Sterling says. The Sunway is followed by the Tianhe-2, which still has almost twice as much computing power as the Piz Daint from Switzerland, which comes third. The fastest U.S. computer, Titan, is fifth.
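The figures quoted above can be checked with simple arithmetic. The sketch below uses only numbers from the article (one petaflops = 10¹⁵ calculations per second, TaihuLight at 93 petaflops) to confirm that a one-year workload on a one-petaflops machine finishes in a matter of days on TaihuLight:

```python
# Back-of-the-envelope check of the petaflops figures in the article.
PETA = 10**15  # one petaflops = 10^15 calculations per second

taihulight_flops = 93 * PETA  # Sunway TaihuLight peak (Nov 2017)
assert taihulight_flops == 93_000_000_000_000_000

# A climate model that keeps a one-petaflops machine busy for a full
# year finishes 93 times faster on TaihuLight:
one_year_seconds = 365 * 24 * 3600
speedup = taihulight_flops / (1 * PETA)
days_on_taihulight = one_year_seconds / speedup / (24 * 3600)
print(f"speedup: {speedup:.0f}x, runtime: {days_on_taihulight:.1f} days")
```

A 93-fold speedup turns a one-year computation into roughly four days, consistent with the article’s “nearly a hundred times faster.”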

“We are facing changes that will be revolutionary.”

Thomas Sterling

Professor of Computer Science, Indiana University, USA

The high-performance computer at TU Dresden, Germany, fills a whole hall of its own.
Its peak performance is more than 1.5 quadrillion computing operations per second.

93,000,000,000,000,000

calculations per second performed by the world’s fastest
supercomputer TaihuLight (as of November 2017).

But performance rankings often involve great simplifications. A high level of computing power alone does not help with every scientific question. Memory size also plays a big part – and above all the programming. Nevertheless, computing power is a major requirement for these super brains to exploit their abilities to the full. For this reason, researchers around the world are already working on the next stage of supercomputers: the exascale computer. With a capacity of 1,000 petaflops, it will be able to perform one quintillion – meaning 10 to the power of 18 – computing operations per second. China says it has already begun building a prototype, with the United States following suit. To avoid falling behind, the U.S. Department of Energy announced $258 million this summer to support companies in advancing the exascale computer over the next three years.
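Putting the exascale target into the same units as the article’s other figures (standard SI prefixes: peta = 10¹⁵, exa = 10¹⁸) shows how far it sits beyond today’s leader:

```python
# The exascale target expressed in the article's units.
PETA = 10**15
EXA = 10**18

exascale_flops = 1000 * PETA  # "a capacity of 1,000 petaflops"
assert exascale_flops == EXA  # i.e. 10 to the power of 18 ops/s

# Relative to the 93-petaflops Sunway TaihuLight:
print(f"{exascale_flops / (93 * PETA):.1f}x TaihuLight")
```

An exascale machine would thus be more than ten times faster than the TaihuLight, the fastest computer at the time of writing.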

 

Meanwhile, the European Union, which has so far been lagging some way behind, is likewise planning to invest heavily in breaking the exascale barrier by 2022, according to Andrus Ansip, the Commissioner for the Digital Single Market. The E.U. estimates that €5 billion will be needed for this. At present, E.U. states are much too reliant on the computing power of supercomputers based elsewhere, for example in China and the United States. As recently as spring 2017, E.U. industry supplied only around 5 percent of global high-performance computing power but consumed one-third of it. Japan, too, is joining this catch-up race and is aiming to top the supercomputer league as early as 2018 with its AI Bridging Cloud Infrastructure.

Digital Industry

 

The digital transformation is making ever greater advances and permeating the value chains of industry.
Here are some examples.

Digital logistics
Autonomous vehicles are directed around the plant site by transponders in the ground. This saves time. At BASF in Ludwigshafen, it currently takes around 22 hours for a conventional railway tank car to be delivered from the plant’s train station to one of more than 150 loading stations. With these autonomously driven, driverless vehicles, the delivery takes only an hour.
Smart energy network
BASF’s power plants are pioneers in the use of large volumes of data to boost efficiency. Information on production and sales volumes, weather data and business cycle indices is evaluated using special software. The program uses this information to project energy requirements. Its method is to identify new connections and independently draw conclusions from them.
Digital research and development
The linking and intelligent use of internal and external data supports researchers in identifying new and promising fields for development even more quickly. Ultramodern supercomputers enable them to perform more, and more complex, simulations and modeling in a shorter time. In this way, they provide greater scope for the creativity of employees.
Integration with customers
For automobile coatings, BASF is already using online data from the painting lines of customers with which it is integrated to set the correct shade and make immediate adjustments in the event of discrepancies.
Predictive maintenance
In the steam cracker, where many important basic chemical building blocks for subsequent use in production at BASF are produced, several thousand sensors capture process data such as pressure and temperature around the clock. This information is evaluated by analysis software to predict the best time for maintenance work, avoiding unplanned downtime and operating the plant in the best possible way.
Information on the ground
At production plants, increasing use is being made of industry-specific tablets that make it possible to access information on-site in the plant (augmented reality). Operating instructions or measurements, among other things, are shown on the display. Employees are taught how to handle new digital technologies as part of their initial and continuing training.
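The predictive-maintenance example above – sensors capturing pressure and temperature around the clock, with software flagging the best time for maintenance – can be sketched in miniature. This is purely illustrative: the function name, window size, and drift threshold are hypothetical and do not describe BASF’s actual analysis software.

```python
# Illustrative sketch of the predictive-maintenance idea: sensor
# readings are compared against a rolling baseline, and a sustained
# deviation flags the unit for maintenance. All names and thresholds
# here are hypothetical, not BASF's real system.
from collections import deque
from statistics import mean

def maintenance_due(readings, window=5, drift_limit=0.10):
    """Flag maintenance when a reading drifts more than `drift_limit`
    (here 10%) away from the rolling mean of the last `window` values."""
    recent = deque(maxlen=window)
    for value in readings:
        if len(recent) == window:
            baseline = mean(recent)
            if abs(value - baseline) / baseline > drift_limit:
                return True  # deviation detected -> schedule maintenance
        recent.append(value)
    return False

# Example: steady pressure readings (bar), then a sudden upward drift.
print(maintenance_due([30.1, 30.0, 30.2, 29.9, 30.1, 34.0]))  # True
```

In a real plant, such logic would run continuously over thousands of sensor streams; the point of the sketch is only the principle of comparing live data against a learned baseline to time maintenance before an unplanned shutdown.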

Helpers for scientists

 

“Especially in the natural sciences, powerful supercomputers are already indispensable for simulating molecular processes one to one,” says the German philosopher of science and expert in artificial intelligence (AI), Professor Klaus Mainzer. Out of the many possible combinations of molecular building blocks, they help to single out those that offer the prospect of surprising discoveries and new products. The supercomputer is capable of learning and performs an initial selection, so that only the most promising substances find their way into the laboratory. Accordingly, BASF has, since fall 2017, relied on just such a powerful digital helper to run virtual experiments and answer complex questions. It shortens the time taken to obtain usable results from several months to a few days (see BASF supercomputer QURIOSITY).


“The challenging problems in the field of chemistry could become drivers for supercomputing,” Sterling believes. In his view, they could contribute to investigating the critical boundaries of technology – and how to overcome them. The bottlenecks between processors and memory pose increasing problems to the industry. These become more serious as the masses of data which have to be shifted around, in simulations for example, grow larger. “This bottleneck in traditional von Neumann computer architecture needs to be eliminated,” Sterling says. A new way of thinking is required to bring together computing and memory operations in a smart way. Another technology has already assimilated the elementary logic by which chemical processes work: the quantum computer, which could open up new horizons of knowledge. The next dimension of super brains – thinking in several states at once – is in the starting blocks.

 

Next part: Quantum computers