Archive for the ‘computing’ category: Page 577

Oct 10, 2019

Biology Really May Be Our Future

Posted in categories: bioengineering, biotech/medical, computing, singularity

Many of us are fascinated by our various computing devices — our smartphones, our smart watches, and an ever-growing array of smart devices. What we sometimes forget is that we are biological creatures (at least, until The Singularity), and that even though biology as a discipline has been around much longer than computing, biology may yet supersede it.

If the 20th century was the era of computers, the 21st century may be the era of biology. And the two may even merge. Hello, synthetic biology and biological computing!

Last week SynBioBeta hosted The Global Synthetic Biology Summit, “where tech meets bio and bio meets tech.” People were urged to attend “to see how synthetic biology is disrupting consumer products, food, agriculture, medicine, chemicals, materials, and more.”

Oct 9, 2019

New horizons for connecting future quantum computers into a quantum network

Posted in categories: computing, internet, quantum physics

Researchers led by a team at Delft University of Technology have taken two steps toward converting quantum states between signals in the microwave and optical domains. This is of great interest for connecting future superconducting quantum computers into a global quantum network. They report their findings this week in Nature Physics and in Physical Review Letters.

Conversion between signals in the microwave and optical domains is of great interest, particularly for connecting future superconducting quantum computers into a global quantum network. Many leading efforts in quantum technologies, including superconducting qubits and quantum dots, share quantum information through photons in the microwave regime. While this allows for an impressive degree of quantum control, it also limits the distance the information can realistically travel before being lost: a mere few centimeters.

At the same time, the field of optical quantum communication has already seen demonstrations over distance scales capable of providing real-world applications. By transmitting information in the optical telecom band, fiber-based quantum networks over tens or even hundreds of kilometers can be envisaged. “In order to connect several quantum computing nodes over large distances into a quantum internet, it is therefore vital to be able to convert quantum information from the microwave to the optical domain, and back,” says Prof. Simon Groeblacher of Delft University of Technology. “This will not only be extremely interesting for quantum applications, but also for highly efficient, low-noise conversion between classical optical and microwave signals.”
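
Why the optical telecom band? Fiber loss there is low and well characterized. As a rough illustration (not taken from the paper, and assuming a typical attenuation of about 0.2 dB/km at 1550 nm), the sketch below estimates the fraction of photons that survive a fiber link of a given length:

```python
# Rough illustration (not from the paper): photon survival in standard
# telecom fiber, assuming a typical loss of ~0.2 dB/km at 1550 nm.

def survival_probability(distance_km: float, loss_db_per_km: float = 0.2) -> float:
    """Fraction of photons that survive a fiber link of the given length."""
    return 10 ** (-loss_db_per_km * distance_km / 10)

for d in (1, 10, 50, 100):
    print(f"{d:>3} km: {survival_probability(d):.3f}")
# 1 km: 0.955, 10 km: 0.631, 50 km: 0.100, 100 km: 0.010
```

Even at 100 km, roughly one photon in a hundred arrives intact, which is why fiber-based quantum networks over such distances are plausible; microwave photons degrade over centimeters, hence the need for conversion.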

Oct 9, 2019

Brain tunes itself to criticality, maximizing information processing

Posted in categories: biological, computing, neuroscience

Researchers have long wondered how the billions of independent neurons in the brain come together to reliably build a biological machine that easily outperforms the most advanced computers. All of those tiny interactions appear to be tied to something that guarantees an impressive computational capacity.

Over the past 20 years, evidence has mounted in support of a theory that the brain tunes itself to a point where it is as excitable as it can be without tipping into disorder, similar to a phase transition. This criticality hypothesis asserts that the brain is poised on the fine line between quiescence and chaos. At exactly this line, information processing is maximized.

However, one of the key predictions of this theory—that criticality is truly a set point, and not a mere inevitability—had never been tested. Until now. New research from Washington University in St. Louis directly confirms this long-standing prediction in the brains of freely behaving animals.
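
A common way to picture the criticality hypothesis (a textbook toy model, not the analysis used in this study) is a branching process: each spike triggers, on average, m follow-on spikes. With m < 1 activity dies out (quiescence), with m > 1 it blows up (disorder), and at m = 1, the critical set point, avalanches of all sizes occur:

```python
# Toy branching-process picture of criticality (illustrative only, not
# the method used in the study). Each active spike triggers a random
# number of follow-on spikes with mean m.
import random

def avalanche_size(m: float, cap: int = 10_000) -> int:
    """Total spikes in one avalanche seeded by a single spike."""
    active, total = 1, 1
    while active and total < cap:
        # Each active spike gets two chances to trigger a child, each
        # with probability m/2, so the mean branching ratio is m.
        children = sum(random.random() < m / 2 for _ in range(2 * active))
        active, total = children, total + children
    return total

random.seed(0)
for m in (0.8, 1.0, 1.2):  # subcritical, critical, supercritical
    sizes = [avalanche_size(m) for _ in range(500)]
    print(f"m = {m}: mean size {sum(sizes) / len(sizes):8.1f}, max {max(sizes)}")
```

Subcritical runs fizzle in a handful of spikes, supercritical runs slam into the cap, and only at m = 1 do avalanche sizes span the full range of scales.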

Oct 8, 2019

Meet America’s newest military giant: Amazon

Posted in categories: computing, military

The Pentagon’s controversial $10bn JEDI cloud computing deal is one of the most lucrative defense contracts ever. Amazon’s in pole position to win—and its move into the military has been a long time coming.

Oct 7, 2019

‘Breakthrough’ microchip helps heal wounds and damaged organs

Posted in categories: biotech/medical, computing, genetics

Circa 2017

The cells are converted by a small microchip, similar in size to a penny, which injects genetic code into skin cells, transforming them into other types of cell.

Oct 7, 2019

Rare ‘Lazarus superconductivity’ observed in promising, rediscovered material

Posted in categories: computing, quantum physics

Researchers from the University of Maryland, the National Institute of Standards and Technology (NIST), the National High Magnetic Field Laboratory (NHMFL) and the University of Oxford have observed a rare phenomenon called re-entrant superconductivity in the material uranium ditelluride. The discovery furthers the case for uranium ditelluride as a promising material for use in quantum computers.

Nicknamed “Lazarus superconductivity” after the biblical character who rose from the dead, the phenomenon occurs when a superconducting state arises, breaks down, then re-emerges in a material due to a change in a specific parameter—in this case, the application of a very strong magnetic field. The researchers published their results on October 7, 2019, in the journal Nature Physics.

Once dismissed by physicists for its apparent lack of interesting physical properties, uranium ditelluride is having its own Lazarus moment. The current study is the second in as many months (both published by members of the same research team) to demonstrate unusual and surprising superconducting states in the material.

Oct 6, 2019

This Brain Computer Uses Your Jugular Like a USB Cable

Posted in categories: computing, neuroscience

Unlike Neuralink, Synchron’s stent-like neural implant is a brain-computer interface that’s inserted through the jugular vein.

Oct 5, 2019

How Will We Store Three Septillion Bits of Data? Your Metabolome May Have the Answer

Posted in categories: biological, computing, information science, neuroscience

For the “big data” revolution to continue, we need to radically rethink our hard drives. Thanks to evolution, we already have a clue.

Our bodies are jam-packed with data, tightly compacted inside microscopic structures within every cell. Take DNA: with just four letters we’re able to generate every single molecular process that keeps us running. That sort of combinatorial complexity is still unheard of in silicon-based data storage in computer chips.
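
Four letters per position means each DNA base can carry two bits. Here is a minimal sketch of the naive mapping (real DNA storage codecs layer GC-balance and homopolymer constraints plus error correction on top of this):

```python
# Naive 2-bits-per-base encoding: illustrative only; practical DNA
# storage schemes add sequence constraints and error correction.
BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
INV = {v: k for k, v in BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(INV[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hi")
print(strand)                    # CGGACGGC -- four bases per byte
assert decode(strand) == b"hi"   # round-trips losslessly
```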

Add this to the fact that DNA can be dehydrated and kept intact for eons—500,000 years and counting—and it’s no surprise that scientists have been exploiting its properties to encode information. To famed synthetic biologist Dr. George Church, looking to biology is a no-brainer: even the simple bacterium E. coli has a data storage density of 10^19 bits per cubic centimeter. Translation? Just a single cube of DNA measuring one meter on each side could meet all of the world’s current data storage needs.
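
Those headline figures are easy to sanity-check. Taking the quoted density of 10^19 bits per cubic centimeter at face value:

```python
# Back-of-the-envelope check of the quoted figures.
density_bits_per_cm3 = 1e19        # quoted storage density for E. coli DNA
cube_volume_cm3 = 100 ** 3         # a cube one meter on each side = 1e6 cm^3

capacity_bits = density_bits_per_cm3 * cube_volume_cm3
print(f"{capacity_bits:.0e} bits") # 1e+25 bits

# The title's "three septillion bits" is 3e24, so one cubic meter
# would cover it with room to spare:
print(capacity_bits / 3e24)        # ~3.3x the world's data
```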

Oct 3, 2019

A cyborg magician implanted 26 microchips and magnets in her body

Posted in categories: computing, cyborgs, media & arts

LAS VEGAS — At a biohacker conference convened here the other day, panelists took to the stage, settled into their chairs, and launched into their slide decks. Not Anastasia Synn.

https://www.youtube.com/watch?v=cn-v5XUl35c#t=1h41m20s

With Frank Sinatra crooning “I’ve Got You Under My Skin” over the loudspeakers, Synn pulled out a giant needle and twisted it deeper and deeper into her left forearm as the music played on. It was only after finishing her routine, capped off by loud applause from the crowd of biohackers, that Synn sat down for a fireside chat about her work as a “cyborg magician.”

Oct 3, 2019

Secret Life of a Full-Time Cyborg

Posted in categories: computing, cyborgs, wearables

Steve Mann invented a precursor to Google Glass in the 1990s—which he now uses almost 24/7. But “the father of wearable computing” has an ominous warning about where technology is taking us next.