Archive for the ‘supercomputing’ category: Page 61

Jul 17, 2020

New learning algorithm should significantly expand the possible applications of AI

Posted by in categories: information science, robotics/AI, supercomputing

The high energy consumption of artificial neural networks’ learning activities is one of the biggest hurdles for the broad use of Artificial Intelligence (AI), especially in mobile applications. One approach to solving this problem can be gleaned from knowledge about the human brain.

Although the human brain has the computing power of a supercomputer, it needs only 20 watts, a millionth of the energy a supercomputer consumes.

One of the reasons for this is the efficient transfer of information between neurons in the brain. Neurons send short electrical impulses (spikes) to other neurons—but, to save energy, only as often as absolutely necessary.
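The spike-only-when-necessary principle can be illustrated with a minimal leaky integrate-and-fire neuron. This is a textbook model chosen for illustration, not the learning algorithm from the research described above, and all parameter values are arbitrary:

```python
# Minimal leaky integrate-and-fire neuron: it accumulates input into a
# membrane potential that leaks over time, and emits a spike only when
# the potential crosses a threshold, staying silent (and cheap) otherwise.
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = potential * leak + x   # leaky integration of input
        if potential >= threshold:
            spikes.append(1)               # fire a spike
            potential = 0.0                # reset after firing
        else:
            spikes.append(0)               # stay silent, save energy
    return spikes

# Sub-threshold input must accumulate before a single spike is emitted.
print(simulate_lif([0.3, 0.3, 0.3, 0.3, 0.9, 0.1]))
```

Because the neuron communicates only at threshold crossings, most time steps cost no output activity at all, which is the energy-saving property the brain exploits.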

Jul 16, 2020

Supercomputer reveals atmospheric impact of gigantic planetary collisions

Posted by in categories: space, supercomputing

The giant impacts that dominate late stages of planet formation have a wide range of consequences for young planets and their atmospheres, according to new research.

Research led by Durham University and involving the University of Glasgow, both UK, has developed a way of revealing the scale of atmosphere loss during planetary collisions based on 3D supercomputer simulations.

The simulations show how Earth-like planets with thin atmospheres might have evolved, depending on how they are impacted by other objects.

Jul 9, 2020

The biggest flipping challenge in quantum computing

Posted by in categories: quantum physics, supercomputing

Such noise nearly drowned out the signal in Google’s quantum supremacy experiment. Researchers began by setting the 53 qubits to encode all possible outputs, which ranged from zero to 2⁵³. They implemented a set of randomly chosen interactions among the qubits that in repeated trials made some outputs more likely than others. Given the complexity of the interactions, a supercomputer would need thousands of years to calculate the pattern of outputs, the researchers said. So by measuring it, the quantum computer did something that no ordinary computer could match. But the pattern was barely distinguishable from the random flipping of qubits caused by noise. “Their demonstration is 99% noise and only 1% signal,” Kuperberg says.
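The “99% noise, 1% signal” characterization can be made concrete with a back-of-envelope sketch. This is our own illustration, not Google’s analysis: mix a structured output distribution with uniform noise at a 1%/99% ratio and measure how little the mixture deviates from pure noise.

```python
# Illustrative sketch: a structured "signal" distribution diluted by
# uniform noise. With signal fraction f, the mixture's total variation
# distance from uniform shrinks by exactly the factor f.
n = 8  # 8 outcomes for readability; the real experiment had 2**53
signal = [2.0 / n if i < n // 2 else 0.0 for i in range(n)]  # half the outcomes favored
noise = [1.0 / n] * n                                        # uniform noise
f = 0.01                                                     # 1% signal fraction

mixture = [f * s + (1 - f) * u for s, u in zip(signal, noise)]

# Total variation distance between the mixture and pure noise.
tv = 0.5 * sum(abs(m - u) for m, u in zip(mixture, noise))
print(tv)
```

The pure signal here sits at total variation distance 0.5 from uniform, but the 1% mixture sits at only 0.005, which is why distinguishing the pattern from noise required enormous numbers of repeated measurements.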

To realize their ultimate dreams, developers want qubits that are as reliable as the bits in an ordinary computer. “You want to have a qubit that stays coherent until you switch off the machine,” Neven says.

Scientists’ approach of spreading the information of one qubit—a “logical qubit”—among many physical ones traces its roots to the early days of ordinary computers in the 1950s. The bits of early computers consisted of vacuum tubes or mechanical relays, which were prone to flip unexpectedly. To overcome the problem, famed mathematician John von Neumann pioneered the field of error correction.
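The redundancy idea von Neumann pioneered has a simple classical form: store one logical bit in several physical bits and decode by majority vote. The sketch below is this classical analogy only; quantum error-correcting codes are far more subtle, since qubits cannot be copied outright, but they share the core idea of spreading one unit of information across many unreliable carriers.

```python
import random

# Classical repetition code: one logical bit stored in three physical
# bits, decoded by majority vote (von Neumann-style redundancy).
def encode(bit):
    return [bit, bit, bit]

def decode(bits):
    return 1 if sum(bits) >= 2 else 0

def transmit(bits, flip_prob, rng):
    # Each physical bit flips independently with probability flip_prob.
    return [b ^ (rng.random() < flip_prob) for b in bits]

rng = random.Random(0)
p = 0.1          # per-bit flip probability
trials = 100_000
errors = sum(decode(transmit(encode(1), p, rng)) != 1 for _ in range(trials))

# A logical error needs 2+ flips: rate ~ 3p^2 - 2p^3 = 0.028, below the raw 0.1.
print(errors / trials)
```

The logical error rate falls below the physical one whenever p < 0.5, and adding more physical bits suppresses it further; that trade of redundancy for reliability is exactly what spreading a logical qubit over many physical qubits aims for.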

Jul 7, 2020

Clever Wiring Architecture Enables Bigger and Better Quantum Computers

Posted by in categories: quantum physics, supercomputing

Wiring a New Path to Scalable Quantum Computing

Last year, Google produced a 53-qubit quantum computer that could perform a specific calculation significantly faster than the world’s fastest supercomputer. Like most of today’s largest quantum computers, this system boasts tens of qubits—the quantum counterparts to bits, which encode information in conventional computers.

To make larger and more useful systems, most of today’s prototypes will have to overcome the challenges of stability and scalability. The latter will require increasing the density of signaling and wiring, which is hard to do without degrading the system’s stability. I believe a new circuit-wiring scheme developed over the last three years by RIKEN’s Superconducting Quantum Electronics Research Team, in collaboration with other institutes, opens the door to scaling up to 100 or more qubits within the next decade. Here, I discuss how.

Jun 30, 2020

It happened in just zeptoseconds

Posted by in categories: physics, supercomputing

Australian and US physicists say they have calculated the speed of the most complex nuclear reactions and found that they’re, well, really fast. We’re talking as little as a zeptosecond – a billionth of a trillionth of a second (10⁻²¹ seconds).

The finding follows a comprehensive project to calculate detailed models of the energy flow during nuclear collisions.

Cedric Simenel from the Australian National University worked with Kyle Godbey and Sait Umar from Vanderbilt University to model 13 different pairs of nuclei, using supercomputers at ANU and in the US.

Jun 28, 2020

Ohio Supercomputer Center Researchers Analyse Twitter Posts Revealing Polarization in Congress on COVID-19

Posted by in categories: biotech/medical, government, robotics/AI, supercomputing

June 25, 2020 — The rapid politicization of the COVID-19 pandemic can be seen in messages members of the U.S. Congress sent about the issue on the social media site Twitter, a new analysis found.

Using artificial intelligence and resources from the Ohio Supercomputer Center, researchers conducted an analysis that covered all 30,887 tweets that members sent about COVID-19 from the first one on Jan. 17 through March 31.

Jun 28, 2020

World’s fastest supercomputer fights coronavirus

Posted by in categories: biotech/medical, supercomputing

The newly crowned Fugaku system is analysing droplet spread in offices and public transport.

Jun 28, 2020

Nvidia confirms AMD-powered supercomputer now part of 5-exaflops giant

Posted by in category: supercomputing

Nvidia has built its very own supercomputer, for some reason.

Jun 23, 2020

Fugaku, world’s fastest supercomputer, searches for coronavirus treatment

Posted by in categories: biotech/medical, supercomputing

Japanese machine can perform more than 415 quadrillion computations a second and has already worked out how breath droplets spread.

Jun 23, 2020

Fifty perfect photons for ‘quantum supremacy’

Posted by in categories: quantum physics, supercomputing

Fifty is a critical number for quantum computers capable of solving problems that classical supercomputers cannot solve. Proving quantum supremacy requires at least 50 qubits. For quantum computers working with light, it is equally necessary to have at least 50 photons. What’s more, these photons have to be perfect, or else they will degrade their own quantum capabilities. It is this demand for perfection that makes such a system hard to realize. Not impossible, however, as scientists of the University of Twente have demonstrated by proposing modifications of the crystal structure inside existing light sources. Their findings are published in Physical Review A.

Photons are promising in the world of quantum computing, with its demands of entanglement, superposition and interference. These are properties of qubits as well. They enable building a computer that operates in a way that is entirely different from making calculations with standard bits that represent ones and zeroes. For many years now, researchers have predicted quantum computers able to solve very complex problems, like instantly calculating all vibrations in a complex molecule.

The first proof of quantum supremacy is already there, accomplished with qubits on very complicated theoretical problems. About 50 quantum building blocks are needed as a minimum, whether they are in the form of photons or qubits. Using photons may have advantages over qubits: they can operate at room temperature and they are more stable. There is one important condition: the photons have to be perfect in order to get to the critical number of 50. In their new paper, UT scientists have now demonstrated that this is feasible.
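Why per-photon perfection matters so much at 50 photons can be seen with simple arithmetic. This is our own back-of-envelope model, not the UT paper’s analysis: if each of N photons meets the required quality independently with probability q, an N-photon experiment succeeds with probability roughly q to the power N, so tiny per-photon imperfections compound severely.

```python
# Back-of-envelope model: independent per-photon quality q compounds
# across n photons as q**n, so small imperfections are amplified.
def n_photon_quality(q, n=50):
    return q ** n

for q in (0.99, 0.999, 1.0):
    print(q, round(n_photon_quality(q), 3))
```

Even 99% per-photon quality leaves only about a 60% chance that all 50 photons are usable, while 99.9% recovers about 95%, which is why the proposed crystal-structure modifications target near-perfect single-photon sources.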
