Archive for the ‘computing’ category: Page 746
Jul 21, 2016
Researchers make leap in measuring quantum states
Posted by Karen Hurst in categories: computing, quantum physics
Another major leap forward in controlling system noise in QC.
A breakthrough in the full characterisation of quantum states was published today as an Editors’ Suggestion in the journal Physical Review Letters.
The full characterisation (tomography) of quantum states is a necessity for future quantum computing. However, standard techniques are inadequate for the large quantum bit-strings needed in full-scale quantum computers.
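The scaling problem is easy to make concrete: an n-qubit density matrix is a 2^n × 2^n Hermitian matrix with unit trace, so full tomography must estimate 4^n − 1 independent real parameters, an exponentially growing cost. A minimal sketch of that count (not taken from the article):

```python
# Full state tomography must estimate every entry of the density matrix:
# a 2^n x 2^n Hermitian, unit-trace matrix, i.e. 4^n - 1 real parameters.
def tomography_parameters(n_qubits: int) -> int:
    """Independent real parameters in an n-qubit density matrix."""
    return 4 ** n_qubits - 1

for n in (1, 2, 10, 50):
    print(n, tomography_parameters(n))
# One qubit needs 3 parameters, two qubits 15, ten qubits already over a
# million -- which is why standard techniques break down for long bit-strings.
```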
Continue reading “Researchers make leap in measuring quantum states” »
Jul 21, 2016
World’s most powerful quantum computer now online at USC
Posted by Karen Hurst in categories: computing, information science, quantum physics, robotics/AI
Good for USC.
Following a recent upgrade, the USC-Lockheed Martin Quantum Computing Center (QCC) based at the USC Information Sciences Institute (ISI) is now the leader in quantum processing capacity.
With the upgrade — to 1,098 qubits from 512 — the D-Wave 2X™ processor is enabling QCC researchers to continue their efforts to close the gap between academic research in quantum computation and real-world critical problems.
Continue reading “World’s most powerful quantum computer now online at USC” »
Jul 21, 2016
Carbon Nanospheres Overcome Electron Spin Decoherence
Posted by Karen Hurst in categories: computing, quantum physics
Another spin on spin in QC.
Maintaining electron spin coherence for prolonged periods is a major barrier in quantum computing. Scientists from EPFL have discovered that carbon nanospheres can overcome this barrier, even at room temperature.
Jul 20, 2016
Atom-scale storage holds 62TB in a square inch
Posted by Shailesh Prasad in categories: computing, mobile phones, particle physics
Storage tech doesn’t get much better than this. Scientists at TU Delft have developed a technique that uses chlorine atom positions as data bits, letting the team fit 1KB of information into an area just 100 nanometers wide. That may not sound like much, but it amounts to a whopping 62.5TB per square inch — about 500 times denser than the best hard drives. The scientists coded their data by using a scanning tunneling microscope to shuffle chlorine atoms around a surface of copper atoms, creating data blocks with QR-code-style markers that indicate both their location and whether or not they’re in good condition.
Not surprisingly, the technology isn’t quite ready for prime time. At the moment, this storage only works in extremely clean conditions, and then only in extreme cold (77 kelvin, or −321°F). However, the approach can easily scale to large data sizes, even if the copper is flawed. Researchers suspect that it’s just a matter of time before their storage works in normal conditions. If and when it does, you could see gigantic capacities even in the smallest devices you own — your phone could hold dozens of terabytes in a single chip.
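The headline density is easy to sanity-check from the figures in the piece. Taking 1KB stored in roughly a 100 nm × 100 nm block (an approximation; the actual block in the published work is slightly larger, which yields the quoted 62.5TB/in²), a back-of-the-envelope check lands in the same ballpark:

```python
# Back-of-the-envelope density check; the ~100 nm square block size is an
# approximation taken from the article, so the result is order-of-magnitude.
bits = 8_000                   # 1 KB of data
area_nm2 = 100 * 100           # ~100 nm x 100 nm block
nm2_per_inch2 = (25.4e6) ** 2  # 1 inch = 25.4 mm = 25.4e6 nm

bits_per_inch2 = bits / area_nm2 * nm2_per_inch2
tb_per_inch2 = bits_per_inch2 / 8 / 1e12  # terabytes per square inch
print(round(tb_per_inch2, 1))  # ~64.5, consistent with the quoted 62.5 TB/in^2
```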
Continue reading “Atom-scale storage holds 62TB in a square inch” »
Jul 20, 2016
Killer ‘legobots’ are coming: US Military to build brickbots
Posted by Karen Hurst in categories: computing, drones, military
Plug and play is preparing to launch.
DARPA hopes to shrink traditional military machines into single ‘chiplets’ to build a library of components to aid everything from smart drone building to instant language translation. Shown, an artist’s impression of the components that could be shrunk onto a single chip.
Jul 20, 2016
One of the First Real-World Quantum Computer Applications Was Just Realized
Posted by Karen Hurst in categories: computing, engineering, quantum physics
Luv it; and this is only the beginning too.
In the continued effort to make a viable quantum computer, scientists assert that they have made the first scalable quantum simulation of a molecule.
Continue reading “One of the First Real-World Quantum Computer Applications Was Just Realized” »
Jul 20, 2016
Here’s How Google Is Racing to Protect You From Quantum Hackers
Posted by Karen Hurst in categories: computing, encryption, quantum physics
This is a real question, especially since China launches its new quantum communications satellite in the next few weeks. I do believe some systems will be protected; for the broader majority, however, that will be a stretch.
The encryption of today will be broken by the computers of tomorrow, even retroactively.
Continue reading “Here’s How Google Is Racing to Protect You From Quantum Hackers” »
Jul 20, 2016
Computer ‘fingerprints’ may give out your identity and location
Posted by Karen Hurst in categories: computing, electronics
Some folks will be freaked out by this while others will luv it.
A visitor tries out an HP Spectre XT laptop computer featuring an Intel Ultrabook processor at the Internationale Funkausstellung (IFA) 2012 consumer electronics trade fair on August 31, 2012 in Berlin, Germany. (Getty Images — Representational Image)
Jul 20, 2016
New study uses computer learning to provide quality control for genetic databases
Posted by Karen Hurst in categories: biotech/medical, computing, genetics, robotics/AI
AI and Quality Control in Genome data are made for each other.
A new study published in The Plant Journal helps to shed light on the transcriptomic differences between different tissues in Arabidopsis, an important model organism, by creating a standardized “atlas” that can automatically annotate samples to recover lost metadata such as tissue type. By combining data from over 7000 samples and 200 labs, this work represents a way to leverage the increasing amounts of publicly available ‘omics data while improving quality control, allowing for large-scale studies and data reuse.
“As more and more ‘omics data are hosted in the public databases, it becomes increasingly difficult to leverage those data. One big obstacle is the lack of consistent metadata,” says first author and Brookhaven National Laboratory research associate Fei He. “Our study shows that metadata might be detected based on the data itself, opening the door for automatic metadata re-annotation.”
The study focuses on data from microarray analyses, an early high-throughput genetic analysis technique that remains in common use. Such data are often made publicly available through tools such as the National Center for Biotechnology Information’s Gene Expression Omnibus (GEO), which over time accumulates vast amounts of information from thousands of studies.
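The core idea — recovering lost metadata from the expression data itself — amounts to training a classifier that maps an expression profile to a tissue label. A toy sketch using a nearest-centroid rule on invented profiles (the study’s actual models and Arabidopsis data are not reproduced here):

```python
# Toy metadata re-annotation: predict a sample's tissue type from its
# expression profile. The profiles and tissue labels below are invented
# for illustration; the real study trains on thousands of annotated
# Arabidopsis microarray samples.
import math

def centroid(samples):
    """Mean expression profile across a list of equal-length samples."""
    return [sum(col) / len(samples) for col in zip(*samples)]

def annotate(profile, centroids):
    """Label a profile with the tissue whose centroid is nearest (Euclidean)."""
    return min(centroids, key=lambda t: math.dist(profile, centroids[t]))

# Hypothetical training data: expression levels of three genes per sample.
training = {
    "root": [[5.1, 0.9, 0.2], [4.8, 1.1, 0.3]],
    "leaf": [[0.4, 6.2, 1.0], [0.6, 5.9, 1.2]],
}
centroids = {tissue: centroid(s) for tissue, s in training.items()}

# Re-annotate a sample whose tissue metadata was lost.
print(annotate([5.0, 1.0, 0.2], centroids))  # a root-like profile
```

The published work uses far richer models and thousands of genes, but the pipeline shape is the same: learn tissue signatures from well-annotated samples, then assign labels to samples missing that metadata.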