Archive for the ‘robotics/AI’ category: Page 1258

Jun 20, 2020

Combining AI and biology could solve drug discovery’s biggest problems

Posted in categories: biotech/medical, robotics/AI

There’s a lot of hope that artificial intelligence could shorten the time it takes to make a drug and also increase the rate of success. Several startups have emerged to capitalize on this opportunity. But Insitro is a bit different from some of these other companies, which rely more heavily on machine learning than on biology.


Machine learning can speed up the creation of new drugs and unlock the mysteries of major diseases, says Insitro CEO Daphne Koller.

Jun 20, 2020

Engineers Put Tens of Thousands of Artificial Brain Synapses on a Single Chip for Portable AI Devices

Posted in categories: robotics/AI, supercomputing

MIT engineers have designed a “brain-on-a-chip,” smaller than a piece of confetti, that is made from tens of thousands of artificial brain synapses known as memristors — silicon-based components that mimic the information-transmitting synapses in the human brain.

The researchers borrowed from principles of metallurgy to fabricate each memristor from alloys of silver and copper, along with silicon. When they ran the chip through several visual tasks, the chip was able to “remember” stored images and reproduce them many times over, in versions that were crisper and cleaner compared with existing memristor designs made with unalloyed elements.

Their results, published on June 8, 2020, in the journal Nature Nanotechnology, demonstrate a promising new memristor design for neuromorphic devices — electronics that are based on a new type of circuit that processes information in a way that mimics the brain’s neural architecture. Such brain-inspired circuits could be built into small, portable devices, and would carry out complex computational tasks that only today’s supercomputers can handle.
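
The appeal of memristors is easiest to see in how a crossbar array computes: each device’s conductance stands in for a synaptic weight, and Ohm’s and Kirchhoff’s laws turn an applied voltage pattern into a weighted sum in a single analog step. Here is a minimal NumPy sketch of that idea (the array size and conductance values are illustrative, not taken from the MIT chip):

```python
import numpy as np

# Hypothetical conductances: each entry is one memristor's conductance
# (in siemens), playing the role of a synaptic weight.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 8))  # 4 output rows x 8 input columns

# Voltages applied to the crossbar's input columns.
v_in = rng.uniform(0.0, 0.5, size=8)

# Ohm's law (I = G * V) plus Kirchhoff's current law (currents sum along
# each row) mean the analog array computes a matrix-vector product in one
# step -- the core operation of a neural-network layer.
i_out = G @ v_in
print(i_out)  # row currents = weighted sums of the inputs
```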

Jun 20, 2020

Quickly Embed AI Into Your Projects With Nvidia’s Jetson Nano

Posted in categories: robotics/AI, transportation

When opportunity knocks, open the door: No one has taken heed of that adage like Nvidia, which has transformed itself from a company focused on catering to the needs of video gamers to one at the heart of the artificial-intelligence revolution. In 2001, no one predicted that the same processor architecture developed to draw realistic explosions in 3D would be just the thing to power a renaissance in deep learning. But when Nvidia realized that academics were gobbling up its graphics cards, it responded, supporting researchers with the launch of the CUDA parallel computing software framework in 2006.

Since then, Nvidia has been a big player in the world of high-end embedded AI applications, where teams of highly trained (and paid) engineers have used its hardware for things like autonomous vehicles. Now the company claims to be making it easy for even hobbyists to use embedded machine learning, with its US $100 Jetson Nano dev kit, which was originally launched in early 2019 and rereleased this March with several upgrades. So, I set out to see just how easy it was: Could I, for example, quickly and cheaply make a camera that could recognize and track chosen objects?

Embedded machine learning is evolving rapidly. In April 2019, Hands On looked at Google’s Coral Dev Board, which incorporates the company’s Edge tensor processing unit (TPU), and in July 2019, IEEE Spectrum featured Adafruit’s software library, which lets even a handheld game device do simple speech recognition. The Jetson Nano is closer to the Coral Dev Board: like the Coral, its 128 parallel processing cores make it powerful enough to handle a real-time video feed, and both boards have Raspberry Pi–style 40-pin GPIO connectors for driving external hardware.
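
For a sense of how little code the object-recognition experiment requires, here is a sketch along the lines of the detectnet sample from Nvidia’s open-source jetson-inference library. It assumes a Jetson Nano with the library installed and a CSI camera attached; exact module and model names can vary between library releases:

```python
import jetson.inference
import jetson.utils

# Load a pretrained detection model and open the camera and display.
net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)
camera = jetson.utils.videoSource("csi://0")      # or "/dev/video0" for USB
display = jetson.utils.videoOutput("display://0")

while display.IsStreaming():
    img = camera.Capture()
    detections = net.Detect(img)                  # boxes are drawn onto img
    for d in detections:
        print(net.GetClassDesc(d.ClassID), f"{d.Confidence:.2f}")
    display.Render(img)
    display.SetStatus(f"detectNet | {net.GetNetworkFPS():.0f} FPS")
```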

Jun 20, 2020

FaceApp’s Gender Swap Is a Scary Insight Into AI and Privacy Concerns

Posted in categories: privacy, robotics/AI

FaceApp looks pretty harmless. However, when you realise that you are uploading your photos for an AI to work on, things start to look bleak.

Jun 20, 2020

Engineers Design Ion-Based Device That Operates Like an Energy-Efficient Brain Synapse

Posted in category: robotics/AI

Ion-based technology may enable energy-efficient simulations of the brain’s learning process for neural-network AI systems.

Teams around the world are building ever more sophisticated artificial intelligence systems of a type called neural networks, designed in some ways to mimic the wiring of the brain, for carrying out tasks such as computer vision and natural language processing.

Using state-of-the-art semiconductor circuits to simulate neural networks requires large amounts of memory and high power consumption. Now, an MIT team has made strides toward an alternative system, which uses physical, analog devices that can much more efficiently mimic brain processes.
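
The article doesn’t include code, but the behavior being mimicked is easy to caricature: a programmable resistor whose conductance is nudged up or down by voltage pulses, with the conductance read out directly as a synaptic weight. A toy Python model of such a device (all constants are hypothetical):

```python
import numpy as np

class AnalogSynapse:
    """Toy model of an ion-based programmable resistor.

    Each positive or negative voltage pulse nudges the device's
    conductance, which the network reads as a synaptic weight; no
    digital memory access is needed, which is where the energy
    savings of analog hardware come from.
    """

    def __init__(self, g_min=1e-6, g_max=1e-4, step=1e-6):
        self.g_min, self.g_max, self.step = g_min, g_max, step
        self.g = g_min  # start fully depressed

    def pulse(self, n):
        """Apply n pulses: positive potentiates, negative depresses."""
        self.g = float(np.clip(self.g + n * self.step, self.g_min, self.g_max))
        return self.g

syn = AnalogSynapse()
print(syn.pulse(+50))   # strengthen the connection
print(syn.pulse(-20))   # weaken it
```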

Jun 19, 2020

Teaching physics to neural networks removes ‘chaos blindness’

Posted in categories: biotech/medical, drones, robotics/AI

Researchers from North Carolina State University have discovered that teaching physics to neural networks enables those networks to better adapt to chaos within their environment. The work has implications for improved artificial intelligence (AI) applications ranging from medical diagnostics to automated drone piloting.

Neural networks are an advanced type of AI loosely based on the way that our brains work. Our natural neurons exchange electrical impulses according to the strengths of their connections. Artificial neural networks mimic this behavior by adjusting numerical weights and biases during training sessions to minimize the difference between their actual and desired outputs. For example, a neural network can be trained to identify photos of dogs by sifting through a large number of photos, making a guess about whether each photo is of a dog, seeing how far off it is, and then adjusting its weights and biases until they are closer to reality.
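
That guess-measure-adjust cycle is just gradient-descent training. A minimal PyTorch sketch of the loop described above, with random tensors standing in for real dog photos:

```python
import torch
import torch.nn as nn

# Toy binary classifier: guess, measure the error, nudge the weights
# and biases to shrink it.
net = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.SGD(net.parameters(), lr=0.1)
loss_fn = nn.BCEWithLogitsLoss()

x = torch.randn(100, 64)                    # stand-in for image features
y = torch.randint(0, 2, (100, 1)).float()   # 1 = "dog", 0 = "not dog"

for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(net(x), y)   # how far off are the guesses?
    loss.backward()             # gradients w.r.t. weights and biases
    opt.step()                  # adjust them to reduce the error
```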

The drawback to this approach is something called “chaos blindness”—an inability to predict or respond to chaos in a system. Conventional AI is chaos blind. But researchers from NC State’s Nonlinear Artificial Intelligence Laboratory (NAIL) have found that incorporating a Hamiltonian function into neural networks better enables them to “see” chaos within a system and adapt accordingly.
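
A Hamiltonian neural network learns a scalar energy function H(q, p) and derives the system’s motion from it via Hamilton’s equations, so conservation of energy is built into the model rather than learned from scratch. This is a minimal PyTorch sketch of that core idea (following Greydanus et al.’s 2019 formulation, not the NAIL team’s code):

```python
import torch
import torch.nn as nn

class HamiltonianNN(nn.Module):
    """Learn a scalar energy H(q, p); Hamilton's equations give dynamics."""

    def __init__(self, dim=1, hidden=64):
        super().__init__()
        self.H = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def time_derivative(self, x):
        # x = (q, p); differentiate H and apply dq/dt = dH/dp, dp/dt = -dH/dq.
        x = x.requires_grad_(True)
        H = self.H(x).sum()
        dH = torch.autograd.grad(H, x, create_graph=True)[0]
        dHdq, dHdp = dH.chunk(2, dim=-1)
        return torch.cat([dHdp, -dHdq], dim=-1)

# Train against observed derivatives of, e.g., a pendulum trajectory:
model = HamiltonianNN()
x = torch.randn(32, 2)        # fake (q, p) samples
dx_true = torch.randn(32, 2)  # fake measured (dq/dt, dp/dt)
loss = ((model.time_derivative(x) - dx_true) ** 2).mean()
loss.backward()               # then step an optimizer as usual
```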

Jun 19, 2020

Innovative dataset to accelerate autonomous driving research

Posted in categories: robotics/AI, transportation

How can we train self-driving vehicles to have a deeper awareness of the world around them? Can computers learn from past experiences to recognize future patterns that can help them safely navigate new and unpredictable situations?

These are some of the questions researchers from the AgeLab at the MIT Center for Transportation and Logistics and the Toyota Collaborative Safety Research Center (CSRC) are trying to answer by sharing an innovative new open dataset called DriveSeg.

Through the release of DriveSeg, MIT and Toyota are working to advance research in autonomous driving systems that, much like human perception, perceive the driving environment as a continuous flow of visual information.
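
In practice, working with a frame-level segmentation dataset like DriveSeg amounts to iterating over video frames paired with per-pixel label masks. A hypothetical sketch (the directory layout and class encoding here are invented for illustration; check the actual release for specifics):

```python
from pathlib import Path
import numpy as np
from PIL import Image

# Hypothetical local layout after downloading the dataset.
FRAME_DIR = Path("driveseg/frames")
MASK_DIR = Path("driveseg/masks")
CLASSES = ["vehicle", "pedestrian", "road", "sidewalk"]  # illustrative subset

for frame_path in sorted(FRAME_DIR.glob("*.png")):
    frame = np.asarray(Image.open(frame_path))
    mask = np.asarray(Image.open(MASK_DIR / frame_path.name))  # class IDs

    # Because labels are continuous across frames, temporal statistics
    # (e.g., how the drivable-road region evolves) are easy to compute.
    road_fraction = (mask == CLASSES.index("road")).mean()
    print(frame_path.name, f"road pixels: {road_fraction:.1%}")
```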

Jun 19, 2020

Deep learning-based surrogate models outperform simulators and could hasten scientific discoveries

Posted in categories: physics, robotics/AI

Surrogate models supported by neural networks can perform as well as, and in some ways better than, computationally expensive simulators, and could lead to new insights in complicated physics problems such as inertial confinement fusion (ICF), Lawrence Livermore National Laboratory (LLNL) scientists reported.

In a paper published by the Proceedings of the National Academy of Sciences (PNAS), LLNL researchers describe the development of a deep learning-driven Manifold & Cyclically Consistent (MaCC) surrogate model incorporating a multi-modal neural network capable of quickly and accurately emulating complex scientific processes, including the high-energy density physics involved in ICF.

The research team applied the model to ICF implosions performed at the National Ignition Facility (NIF), in which a computationally expensive numerical simulator is used to predict the energy yield of a target imploded by shock waves produced by the facility’s high-energy laser. Comparing the results of the neural network-backed surrogate to the existing simulator, the researchers found the surrogate could adequately replicate the simulator, and significantly outperformed the current state-of-the-art in surrogate models across a wide range of metrics.
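
Stripped of the multi-modal and cyclic-consistency machinery of the MaCC model, the basic surrogate idea is plain regression: run the expensive simulator to build a training set, then fit a network that answers in microseconds what the simulator answers in hours. A toy PyTorch sketch (the “simulator” here is a cheap stand-in function, not an ICF code):

```python
import torch
import torch.nn as nn

# Stand-in for an expensive simulator: maps design parameters to a
# scalar "yield". The cost gap between this and a real physics code is
# what makes a learned surrogate worthwhile.
def expensive_simulator(x):
    return torch.sin(3 * x[:, :1]) * torch.exp(-x[:, 1:2] ** 2)

surrogate = nn.Sequential(
    nn.Linear(2, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 1),
)
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

x_train = torch.rand(2048, 2) * 2 - 1    # sampled simulator inputs
y_train = expensive_simulator(x_train)   # run the simulator once, offline

for step in range(2000):
    opt.zero_grad()
    loss = ((surrogate(x_train) - y_train) ** 2).mean()
    loss.backward()
    opt.step()

# The trained surrogate can now be queried cheaply across huge
# parameter sweeps that the simulator could never afford.
```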

Jun 19, 2020

Amazon says it mitigated the largest DDoS attack ever recorded

Posted in categories: cybercrime/malcode, robotics/AI

Amazon Web Services recently had to defend against a DDoS attack with a peak traffic volume of 2.3 Tbps, the largest ever recorded, ZDNet reports. Detailing the attack in its Q1 2020 threat report, Amazon said that the attack occurred back in February, and was mitigated by AWS Shield, a service designed to protect customers of Amazon’s on-demand cloud computing platform from DDoS attacks, as well as from bad bots and application vulnerabilities. The company did not disclose the target or the origin of the attack.

To put that number into perspective, ZDNet notes that the largest DDoS attack previously recorded was in March 2018, when NetScout Arbor mitigated a 1.7 Tbps attack. The month before that, GitHub disclosed it had been hit by an attack that peaked at 1.35 Tbps.

Jun 18, 2020

The startup making deep learning possible without specialized hardware

Posted in category: robotics/AI

The discovery that led Nir Shavit to start a company came about the way most discoveries do: by accident. The MIT professor was working on a project to reconstruct a map of a mouse’s brain and needed some help from deep learning. Not knowing how to program graphics cards, or GPUs, the most common hardware choice for deep-learning models, he opted instead for a central processing unit, or CPU, the most generic computer chip found in any average laptop.

“Lo and behold,” Shavit recalls, “I realized that a CPU can do what a GPU does—if programmed in the right way.”

This insight is now the basis for his startup, Neural Magic, which launched its first suite of products today. The idea is to allow any company to deploy a deep-learning model without the need for specialized hardware. It would not only lower the costs of deep learning but also make AI more widely accessible.
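
The article doesn’t detail Neural Magic’s engine, but one well-known reason CPUs can compete is weight sparsity: prune most of a layer’s weights to zero, and a CPU can skip them with sparse kernels instead of brute-forcing a dense matrix multiply. A toy comparison in Python (illustrative only; real inference engines are far more sophisticated):

```python
import time
import numpy as np
from scipy import sparse

# One layer of a deep net as a matrix-vector product.
rng = np.random.default_rng(0)
dense_w = rng.standard_normal((4096, 4096)).astype(np.float32)
mask = rng.random((4096, 4096)) < 0.05            # keep ~5% of weights
sparse_w = sparse.csr_matrix(np.where(mask, dense_w, 0))

x = rng.standard_normal(4096).astype(np.float32)

t0 = time.perf_counter(); _ = dense_w @ x
t_dense = time.perf_counter() - t0
t0 = time.perf_counter(); _ = sparse_w @ x        # skips the zeros
t_sparse = time.perf_counter() - t0
print(f"dense: {t_dense*1e3:.2f} ms, 95%-sparse: {t_sparse*1e3:.2f} ms")
```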