By decree of an anonymous university “supercomputer,” Victoria’s Dan Andrews has opted to extend stage 4 lockdowns. This is once again stalling the economic recovery of the region and plundering the wealth and liberty of millions across the state.

Physicists make electrical nanolasers even smaller
Researchers from the Moscow Institute of Physics and Technology and King’s College London have cleared the obstacle that had prevented the creation of electrically driven nanolasers for integrated circuits. The approach, reported in a recent paper in Nanophotonics, enables the design of coherent light sources on a scale not only hundreds of times smaller than the thickness of a human hair but even smaller than the wavelength of the light the laser emits. This lays the foundation for ultrafast optical data transfer in the many-core microprocessors expected to emerge in the near future.
Light signals revolutionized information technologies in the 1980s, when optical fibers started to replace copper wires, making data transmission orders of magnitude faster. Since optical communication relies on light—electromagnetic waves with a frequency of several hundred terahertz—it allows transferring terabytes of data every second through a single fiber, vastly outperforming electrical interconnects.
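As a rough sanity check on the “several hundred terahertz” figure, the carrier frequency of light is just the speed of light divided by its wavelength. The sketch below assumes a 1550 nm wavelength (the standard telecom band, not a figure stated in the article):

```python
# Rough check of the "several hundred terahertz" figure: frequency = c / wavelength.
# The 1550 nm wavelength is an assumption (standard telecom C-band), not from the article.
c = 299_792_458            # speed of light, m/s
wavelength = 1550e-9       # 1550 nm, a typical fiber-optic carrier
frequency = c / wavelength
print(f"{frequency / 1e12:.0f} THz")   # ~193 THz
```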
Fiber optics underlies the modern internet, but light could do much more for us. It could be put into action even inside the microprocessors of supercomputers, workstations, smartphones, and other devices. This requires using optical communication lines to interconnect the purely electronic components, such as processor cores. As a result, vast amounts of information could be transferred across the chip nearly instantaneously.


Scientists use reinforcement learning to train quantum algorithm
Recent advancements in quantum computing have driven the scientific community’s quest to solve a certain class of complex problems for which quantum computers would be better suited than traditional supercomputers. To improve the efficiency with which quantum computers can solve these problems, scientists are investigating the use of artificial intelligence approaches.
In a new study, scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have developed a new algorithm based on reinforcement learning to find the optimal parameters for the Quantum Approximate Optimization Algorithm (QAOA), which allows a quantum computer to solve certain combinatorial problems such as those that arise in materials design, chemistry and wireless communications.
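To make the setup concrete, here is a minimal sketch of depth-1 QAOA for MaxCut on a single edge, simulated with a two-qubit statevector. A naive random parameter search stands in for the reinforcement-learning agent used in the study; the single-edge graph, circuit depth, and search strategy are illustrative assumptions, not the paper’s actual setup.

```python
# Minimal illustration (not the Argonne method): p=1 QAOA for MaxCut on one edge,
# with naive random search standing in for the reinforcement-learning agent.
import numpy as np

rng = np.random.default_rng(0)

# Cut values for bitstrings 00, 01, 10, 11 of the one-edge graph (0)-(1).
cut = np.array([0.0, 1.0, 1.0, 0.0])

def qaoa_expectation(gamma, beta):
    """Return <C> for the p=1 QAOA state with parameters (gamma, beta)."""
    state = np.full(4, 0.5, dtype=complex)            # |+>|+> initial state
    state *= np.exp(-1j * gamma * cut)                # cost layer exp(-i*gamma*C)
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    state = np.kron(rx, rx) @ state                   # mixer exp(-i*beta*X) on each qubit
    return float(np.sum(np.abs(state) ** 2 * cut))    # expected cut value

# Stand-in "learner": sample parameter pairs and keep the best one found.
best = max(((qaoa_expectation(g, b), g, b)
            for g, b in rng.uniform(0, np.pi, size=(500, 2))), key=lambda t: t[0])
print(f"best <C> = {best[0]:.3f} at gamma = {best[1]:.3f}, beta = {best[2]:.3f}")
```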
“Combinatorial optimization problems are those for which the solution space gets exponentially larger as you expand the number of decision variables,” said Argonne computer scientist Prasanna Balaprakash. “In one traditional example, you can find the shortest route for a salesman who needs to visit a few cities once by enumerating all possible routes, but given a couple thousand cities, the number of possible routes far exceeds the number of stars in the universe; even the fastest supercomputers cannot find the shortest route in a reasonable time.”
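The growth Balaprakash describes is easy to see numerically. The sketch below is a toy illustration, not code from the study, and the distance matrix is made up: for a symmetric traveling-salesman problem there are (n-1)!/2 distinct tours, so brute-force enumeration stops being feasible almost immediately.

```python
# Brute-force traveling-salesman enumeration, to illustrate the combinatorial blow-up.
# Toy example with an invented distance matrix; not the Argonne study's code.
import math
from itertools import permutations

def shortest_tour(dist):
    """Exhaustively check every tour over the cities in a distance matrix."""
    n = len(dist)
    best_len, best_route = float("inf"), None
    for perm in permutations(range(1, n)):          # fix city 0 as the start
        route = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if length < best_len:
            best_len, best_route = length, route
    return best_len, best_route

# A tiny 5-city instance is solved instantly...
dist = [[0, 2, 9, 10, 7],
        [2, 0, 6, 4, 3],
        [9, 6, 0, 8, 5],
        [10, 4, 8, 0, 6],
        [7, 3, 5, 6, 0]]
print(shortest_tour(dist))

# ...but the number of distinct tours grows factorially with the city count.
for n in (5, 10, 20, 2000):
    tours = math.factorial(n - 1) // 2
    print(f"{n} cities: about 10^{len(str(tours)) - 1} possible tours")
```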


The quantum state of play — cloud-based QCaaS and Covid-19
Quantum computing requires meticulously prepared hardware and big budgets, but cloud-based solutions could make the technology available to broader business audiences.
Several tech giants are racing to achieve ‘quantum supremacy’, but reliability and consistency in quantum output is no simple trick.
Covid-19 has prompted some researchers to look at how quantum computing could mitigate future pandemics with scientific precision and speed.
Quantum computing (QC) has been theorized for decades and has evolved rapidly over the last few years. An escalation in spend and development has seen powerhouses IBM, Microsoft, and Google race for ‘quantum supremacy’, whereby a quantum machine reliably and consistently outperforms existing computers. But do quantum computers remain a sort of elitist vision of the future, or are we on course for more financially and infrastructurally viable applications across industries?
Getting to grips with qubits
How much do you know? Ordinary computers (even supercomputers) deploy bits, the 0s and 1s of traditional binary code, and computer processes are made up of countless combinations of them. Quantum computers, however, work with qubits. Qubits are capable of ‘superposition’: effectively adopting both 1 and 0 simultaneously, or any point on the spectrum between those two formerly binary values. The key to a powerful, robust, and reliable quantum computer is more qubits: each added qubit doubles the machine’s state space, so its processing capacity grows exponentially with the qubit count.
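As a back-of-the-envelope illustration of that exponential growth (a sketch of my own, not from the article): an n-qubit state is described by 2^n complex amplitudes, so simply storing it on a classical machine doubles in size with every added qubit.

```python
# Rough illustration: an n-qubit state has 2**n complex amplitudes, so classically
# storing it doubles with every added qubit. 16 bytes per amplitude assumes
# double-precision complex numbers.
for n in (10, 20, 30, 40, 53):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9
    print(f"{n:2d} qubits -> {amplitudes:>20,d} amplitudes, ~{gigabytes:,.1f} GB to store")
```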
Qubits and the impact of the superposition give quantum computers the ability to process large datasets within seconds, doing what it would take humans decades to do. They can decode and deconstruct, hypothesize and validate, tackling problems of absurd complexity and dizzying magnitude — and can do so across many different industries.
Wherein lies the issue, then? Quantum computing for everybody is still a way off: the general consensus is that it will be at least five years before this next big wave of computing is seen widely across industries and use cases, unless your business is bustling with the budget of a tech giant like Google or IBM. But expense isn’t the only challenge.
Frail and demanding — the quantum hardware
Quantum computers are immensely intricate machines. It doesn’t take much at all to knock a qubit out of its delicate state of superposition. They’re powerful, but not reliable. The slightest interference or frailty leads to high error rates in quantum processing, delaying more widespread use and rendering ‘quantum supremacy’ a touch dubious.

A Quintillion Calculations a Second: DOE Calculating the Benefits of Exascale and Quantum Computers
A quintillion calculations a second. That’s one with 18 zeros after it. It’s the speed at which an exascale supercomputer will process information. The Department of Energy (DOE) is preparing for the first exascale computer to be deployed in 2021. Two more will follow soon after. Yet quantum computers may be able to complete more complex calculations even faster than these up-and-coming exascale computers. But these technologies complement each other much more than they compete.
It’s going to be a while before quantum computers are ready to tackle major scientific research questions. While quantum researchers and scientists in other areas are collaborating to design quantum computers to be as effective as possible once they’re ready, that’s still a long way off. Scientists are figuring out how to build qubits for quantum computers, the very foundation of the technology. They’re establishing the most fundamental quantum algorithms that they need to do simple calculations. The hardware and algorithms need to be far enough along for coders to develop operating systems and software to do scientific research. Currently, we’re at the same point in quantum computing that scientists in the 1950s were with computers that ran on vacuum tubes. Most of us regularly carry computers in our pockets now, but it took decades to get to this level of accessibility.
In contrast, exascale computers will be ready next year. When they launch, they’ll already be five times faster than our fastest computer – Summit, at Oak Ridge National Laboratory’s Leadership Computing Facility, a DOE Office of Science user facility. Right away, they’ll be able to tackle major challenges in modeling Earth systems, analyzing genes, tracking barriers to fusion, and more. These powerful machines will allow scientists to include more variables in their equations and improve models’ accuracy. As long as we can find new ways to improve conventional computers, we’ll do it.
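To make the “five times faster” comparison concrete, here is the back-of-the-envelope arithmetic. Summit’s roughly 200 petaflops peak is a commonly quoted figure assumed here, not one stated in the article:

```python
# Back-of-the-envelope comparison of an exascale machine with Summit.
# Summit's ~200 petaflops peak is an assumption (commonly quoted), not from the article.
exaflop = 1e18                 # one quintillion (10**18) calculations per second
summit_peak = 200e15           # ~200 petaflops
print(f"speedup: ~{exaflop / summit_peak:.0f}x")   # ~5x
```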

Supercomputer to scan ‘entire sky’ for signs of aliens
Scientists are ramping up their efforts in the search for signs of alien life.
Experts at the SETI Institute, an organization dedicated to tracking extraterrestrial intelligence, are developing state-of-the-art techniques to detect signatures from space that indicate the possibility of extraterrestrial existence.
These so-called “technosignatures” range from the chemical composition of a planet’s atmosphere, to laser emissions, to structures orbiting other stars, they said.


D-Wave’s Path to 5000 Qubits; Google’s Quantum Supremacy Claim
On the heels of IBM’s quantum news last week come two more quantum items. D-Wave Systems today announced the name of its forthcoming 5000-qubit system, Advantage (yes, the name choice isn’t serendipity), at its user conference being held this week in Newport, RI. Last week a Google draft paper, discovered by the Financial Times, claimed to have attained quantum supremacy using a 53-qubit superconducting processor. The paper, found on NASA’s website, was later withdrawn. Conversation around it has been bubbling in the QC community since.
More on D-Wave’s announcements later; the Advantage system isn’t expected to be broadly available until mid-2020, which is roughly in keeping with the company’s stated plans. The Google work on quantum supremacy is fascinating. Google has declined to comment on the paper, and how the FT became aware of it isn’t clear. A few observers suggest it looks like an early draft.
Quantum supremacy, of course, is the notion of a quantum computer doing something that classical computers simply can’t reasonably do. In this instance, the reported Google paper claimed it was able to perform a task (a particular kind of random number generation) on its QC in 200 seconds, versus on the order of 10,000 years on a supercomputer. In an archived copy of the draft that HPCwire was able to find, the authors say they “estimated the classical computational cost” of running supremacy circuits on Summit and on a large Google cluster. (For an excellent discussion of quantum supremacy, see Scott Aaronson’s (University of Texas) blog post from yesterday, Scott’s Supreme Quantum Supremacy FAQ.)
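For intuition about the kind of task involved, the sketch below samples bitstrings from a small random quantum circuit simulated as a statevector. It is a toy illustration only: a 4-qubit numpy simulation with a generic gate set, not Google’s 53-qubit processor or its actual circuits.

```python
# Toy illustration of random circuit sampling, the kind of task behind the claim.
# 4-qubit numpy statevector simulation with a generic gate set; not Google's circuits.
import numpy as np

rng = np.random.default_rng(1)
n_qubits = 4      # toy size; the reported experiment used 53
n_layers = 8

def random_unitary(dim):
    """Haar-random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def cz_entangling_layer(n):
    """Diagonal of CZ gates between neighbouring qubits (q, q+1)."""
    diag = np.ones(2 ** n)
    for z in range(2 ** n):
        bits = [(z >> (n - 1 - q)) & 1 for q in range(n)]
        for q in range(n - 1):
            if bits[q] and bits[q + 1]:
                diag[z] *= -1
    return diag

# Start in |0...0> and alternate random single-qubit layers with CZ entanglers.
state = np.zeros(2 ** n_qubits, dtype=complex)
state[0] = 1.0
entangler = cz_entangling_layer(n_qubits)
for _ in range(n_layers):
    layer = random_unitary(2)
    for _ in range(n_qubits - 1):
        layer = np.kron(layer, random_unitary(2))
    state = entangler * (layer @ state)

# "Measure": sample output bitstrings from the distribution |amplitude|**2.
probs = np.abs(state) ** 2
probs /= probs.sum()
samples = rng.choice(2 ** n_qubits, size=10, p=probs)
print([format(int(s), f"0{n_qubits}b") for s in samples])
```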