Today, a group of scientists — John A. Rogers, Eric Seabron, Scott MacLaren and Xu Xie from the University of Illinois at Urbana-Champaign; Slava V. Rotkin from Lehigh University; and William L. Wilson from Harvard University — are reporting the discovery of an important method for measuring the properties of nanotube materials using a microwave probe. Their findings have been published in ACS Nano in an article titled “Scanning Probe Microwave Reflectivity of Aligned Single-Walled Carbon Nanotubes: Imaging of Electronic Structure and Quantum Behavior at the Nanoscale.”
The researchers studied single-walled carbon nanotubes. These are one-dimensional, wire-like nanomaterials whose electronic properties make them excellent candidates for next-generation electronics technologies. In fact, the first prototype of a nanotube computer has already been built by researchers at Stanford University, and the IBM T.J. Watson Research Center is currently developing nanotube transistors for commercial use.
Nanotechnologists at the University of Twente research institute MESA+ have discovered a new fundamental property of electrical currents in very small metal circuits. They show how electrons can spread out over the circuit like waves and cause interference effects at places where no electrical current is driven. The geometry of the circuit plays a key role in this so-called nonlocal effect. The interference is a direct consequence of the quantum mechanical wave character of electrons and the specific geometry of the circuit. For designers of quantum computers, it is an effect to take into account. The results are published in the British journal Scientific Reports.
Interference is a common phenomenon in nature and occurs when two or more propagating waves interact coherently. Interference of sound, light or water waves is well known, but the carriers of electrical current — electrons — can interfere too. This shows that electrons must also be treated as waves, at least in nanoscale circuits at extremely low temperatures: a canonical example of the quantum mechanical wave-particle duality.
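The wave picture can be made concrete with a short numerical sketch. This is a generic textbook calculation, not the specific circuit studied by the MESA+ group: two equal-amplitude coherent waves with a relative phase phi superpose to a total amplitude 1 + e^(i·phi), so the observed intensity is |1 + e^(i·phi)|² = 2 + 2·cos(phi).

```python
import math

# Superpose two equal-amplitude coherent waves with relative phase phi.
# The observed intensity is |1 + e^{i*phi}|^2 = 2 + 2*cos(phi).
def intensity(phi: float) -> float:
    total = 1 + complex(math.cos(phi), math.sin(phi))
    return abs(total) ** 2

print(intensity(0.0))      # in phase (constructive): 4.0, not the classical 1 + 1 = 2
print(intensity(math.pi))  # out of phase (destructive): ~0.0
```

Note that constructive interference gives four times one wave's intensity rather than twice, and destructive interference gives zero: the amplitudes add before the intensity is taken, which is exactly the coherent behavior the electrons in these circuits exhibit.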
DNA is similar to a hard drive or storage device in that it contains the memory of every cell of every living organism, along with the instructions for how to build that cell. DNA is made of four molecules, called bases, that can combine in any order to form one long chain molecule. If you can read that chain of bases, you have a sequence of characters, like a digital code. Over the years the price of sequencing a human genome has dropped significantly, much to the delight of scientists. And since DNA is a sequence of four letters, if we can manipulate DNA, we could insert a message and use DNA as a storage device.
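The "four letters as a digital code" idea can be sketched in a few lines. The mapping below (two bits per base) is the simplest possible scheme, chosen here for illustration; real DNA-storage systems use more elaborate encodings with error correction and constraints on base repeats.

```python
# A minimal sketch: each DNA base carries two bits, so any byte
# string maps to a string over the four-letter alphabet A, C, G, T.
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {v: k for k, v in BASE_FOR_BITS.items()}

def encode(message: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in message)
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    bits = "".join(BITS_FOR_BASE[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

print(encode(b"Hi"))  # CAGACGGC
```

At two bits per base, a message of n bytes needs 4n bases; decoding is just the inverse lookup, which is why a readable chain of bases really is equivalent to a digital file.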
At this point in time, we are at the height of the information age, and computers have had an enormous impact on all of our lives. Any information can be represented as a collection of bits, and with Moore's law, the observation that computing power doubles roughly every 18 months, our ability to manipulate and store these bits has continued to grow. Moore's law has been driven by scientists making transistors and integrated circuits ever smaller, but eventually we reach a point where they cannot be made any smaller, since some features are already approaching the size of a single atom. This inevitably leads us into the quantum world.

Quantum mechanics has rules which are, in many ways, hard for us to truly comprehend, yet which have nevertheless been tested again and again. Quantum computing looks to make use of these strange rules and process information in a totally different way: it replaces the classical bit, which is either a 0 or a 1, with the quantum bit, or qubit, which can be both 0 and 1 at the same time. This ability to be two different things at once is referred to as a superposition. The scaling is exponential: a register of a few hundred qubits, roughly 270, can describe more basis states than the estimated 10^80 particles in the observable universe. A useful quantum computer will still require thousands or even millions of physical qubits. Almost any quantum system, such as an atom, can serve as a qubit, and superconducting circuits can even be used to build "artificial atoms" for the purpose. At this point we have a few working quantum transistors, and scientists are working on developing the quantum integrated circuit. Quantum error correction is the biggest problem encountered in the development of the quantum computer, and quantum computer science is a field still in its very early stages, since scientists have not yet been able to develop large-scale quantum hardware.
Weekend Reads: Even tiny fly brains can do many things computers can’t. This 2014 feature showed why making machines much smarter might require processors that more closely mimic brains.
____________________________________________
This weekend we revisit stories from MIT Technology Review’s archives that weigh the question of how far AI can go—and when.
“Apple Inc. has purchased Emotient Inc., a startup that uses artificial-intelligence technology to read people’s emotions by analyzing facial expressions.”
While development is happening everywhere, these companies are the next big things to shoot past the stratosphere.
While a lot of end-of-the-year, turn-of-the-calendar roundups try to focus on the year that was or the year ahead, the space industry is very different. Developments are planned further in advance, so some of the qualifying news that gets companies on this list isn’t scheduled to happen until 2017. The industry is small compared to cloud computing or cybersecurity, for example, but its rate of growth is tremendous. There seems to be a cultural solidarity in spacetech, owing to its tight-knit history of cooperation and the still-limited number of private companies that can facilitate spaceflight.
Nvidia took pretty much everyone by surprise when it announced it was getting into self-driving cars; it’s just not what you expect from a company that made its name selling graphics cards to gamers.
At this year’s CES, it’s taking the focus on autonomous cars even further.
The company today announced the Nvidia Drive PX2. According to CEO Jen-Hsun Huang, it’s basically a supercomputer for your car. Hardware-wise, it’s made up of 12 CPU cores and four GPUs, all liquid-cooled. That amounts to about 8 teraflops of processing power; Huang said it’s as powerful as six Titan X graphics cards and equivalent to ‘about 150 MacBook Pros’ for self-driving applications.
We humans take for granted our remarkable ability to predict things that happen around us. For example, consider Rube Goldberg machines: one of the reasons we enjoy them is that we can watch a chain reaction of objects fall, roll, slide and collide, and anticipate what happens next.
But how do we do it? How do we effortlessly absorb enough information from the world to be able to react to our surroundings in real-time? And, as a computer scientist might then wonder, is this something that we can teach machines?
That last question has recently been partially answered by researchers at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL), who have developed a computational model that is just as accurate as humans at predicting how objects move.
Russian scientists from the Siberian Institute of Geology and Mineralogy have succeeded in growing modified diamonds, a step closer to faster computers that run on light, the head of the institute said Monday.
Deep Learning in Action | A talk by Juergen Schmidhuber, PhD, at the Deep Learning in Action talk series in October 2015. He is a professor of computer science at the Dalle Molle Institute for Artificial Intelligence Research, part of the University of Applied Sciences and Arts of Southern Switzerland.
Juergen Schmidhuber, PhD | I review three decades of our research on both gradient-based and more general problem solvers that search the space of algorithms running on general-purpose computers with internal memory.