Neuromorphic systems carry out robust and efficient neural computation using hardware implementations that operate in physical time. Typically they are event- or data-driven, they employ low-power, massively parallel hybrid analog/digital VLSI circuits, and they operate using the same physics of computation used by the nervous system. Although there are several forums for presenting research achievements in neuromorphic engineering, none are exclusively dedicated to this increasingly large research community, either because they are dedicated to single disciplines, such as electrical engineering or computer science, or because they serve research communities that focus on analogous areas (such as biomedical engineering or computational neuroscience) but with fundamentally different goals and objectives. The mission of Neuromorphic Engineering is to provide a publication medium dedicated exclusively and specifically to this field. Topics covered by this publication include:
– Analog and hybrid analog/digital electronic circuits for implementing neural processes, such as conductances, neurons, synapses, plasticity mechanisms, photoreceptors, cochleae, etc.
– Neuromorphic circuits and systems for implementing real-time, event-based neural processing architectures.
– Hardware models of neural and sensorimotor processing systems, such as selective attention systems, coordinate transformation systems, auditory and/or visual processing systems, sensory fusion systems, etc.
– Implementations of neural computational systems found in insects, birds, mammals, etc.
– Embedded neuromorphic systems, including actuated or robotic platforms that process sensory signals and interact with the environment using event-based sensors and circuits.
To ensure high-quality, state-of-the-art material, publications should demonstrate experimental results using physical implementations of neuromorphic systems and, where possible, show the links between the artificial system and the neural/biological one it models.
A new photonic chip could run optical neural networks 10 million times more efficiently than conventional chips.
The classical physical limit for computing energy is the Landauer limit, which sets a lower bound on the heat dissipated per bit-erasing operation. In this device, operation below the thermodynamic (Landauer) limit for irreversible digital computation is theoretically possible. The proposed accelerator can implement both fully connected and convolutional networks.
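For context, the Landauer bound on the heat that must be dissipated to irreversibly erase one bit at temperature T is

$$ E_{\min} = k_B T \ln 2 \approx 2.9 \times 10^{-21}\,\mathrm{J} \approx 0.018\,\mathrm{eV} \quad \text{at } T = 300\,\mathrm{K}, $$

so the sub-Landauer claim amounts to saying the chip can, in principle, spend less than roughly $k_B T \ln 2$ of energy per elementary operation, since its analog optical multiply-accumulates are not each tied to a bit erasure the way irreversible digital logic is.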
Previous photonic chips had bulky optical components that limited their use to relatively small neural networks. MIT researchers have developed a new photonic accelerator that uses more compact optical components and optical signal-processing techniques to drastically reduce both power consumption and chip area. That allows the chip to scale to neural networks several orders of magnitude larger than its counterparts can handle.
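To make concrete what is being accelerated: the workhorse operation in both fully connected and convolutional layers is the multiply-accumulate (MAC), which the photonic chip carries out in the optical domain. Below is a minimal NumPy sketch of that operation as it appears in software; it is purely illustrative, the shapes are arbitrary, and nothing here models the optics.

```python
# Purely illustrative sketch (plain NumPy) of the operation a photonic
# accelerator offloads: the multiply-accumulates behind a fully connected
# layer, y = W x + b. The chip performs these MACs optically; this code
# only shows the arithmetic being accelerated. Shapes are made up.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(784)          # input activations (e.g. a flattened image)
W = rng.standard_normal((256, 784))   # weight matrix of one fully connected layer
b = rng.standard_normal(256)          # bias vector

y = W @ x + b                         # 256 x 784 multiply-accumulates for one layer
print(y.shape, "MACs:", W.size)
```

The energy-per-MAC of steps like the `W @ x` above is exactly the figure such accelerators aim to drive down.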
AI chatbots are finally getting good — or, at the very least, they’re getting entertaining.
Case in point is r/SubSimulatorGPT2, an enigmatically named subreddit with a unique composition: it’s populated entirely by AI chatbots that personify other subreddits. (For the uninitiated, a subreddit is a community on Reddit usually dedicated to a specific topic.)
How does it work? Well, to create a chatbot you start by feeding it training data. Usually this data is scraped from a variety of sources: everything from newspaper articles to books to movie scripts. But on r/SubSimulatorGPT2, each bot has been trained on text collected from specific subreddits, meaning that the conversations they generate reflect the thoughts, desires, and inane chatter of different groups on Reddit.
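Here is a minimal sketch of that recipe, assuming the Hugging Face transformers library and a hypothetical file of scraped comments (askreddit_comments.txt); the actual pipeline behind r/SubSimulatorGPT2 may differ in its details.

```python
# Minimal sketch: fine-tune a pretrained GPT-2 on comments scraped from a
# single subreddit, then sample new text in that subreddit's voice.
# "askreddit_comments.txt" is a hypothetical file with one comment per line.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Fine-tuning: minimize next-token prediction loss on subreddit text only,
# so the model's style drifts toward that community's way of talking.
model.train()
with open("askreddit_comments.txt") as f:
    for comment in f:
        comment = comment.strip()
        if not comment:
            continue
        inputs = tokenizer(comment, return_tensors="pt",
                           truncation=True, max_length=512)
        loss = model(**inputs, labels=inputs["input_ids"]).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

# Generation: the fine-tuned model now produces posts and replies that mimic
# the chatter of the subreddit it was trained on.
model.eval()
prompt = tokenizer("What is the best advice you have ever received?",
                   return_tensors="pt")
sample = model.generate(**prompt, max_length=80, do_sample=True, top_k=40)
print(tokenizer.decode(sample[0], skip_special_tokens=True))
```

The key difference from a general-purpose chatbot is simply the training corpus: one subreddit per bot.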
Are you prepared for the Age of Machine Intelligence? That’s a time when machines anticipate consumers’ choices before they are made. That age is nearer than many people realize, according to author/futurist Mike Walsh, who said business leaders need to understand how the new reality impacts the decisions they make.
The National Automatic Merchandising Association show, held last week in Las Vegas, made an appropriate setting for Walsh’s message, given the number of exhibits and education sessions featuring artificial intelligence. While these new technologies impact many industries, the convenience services industry has experienced a significant boost in recent years thanks to AI, micro markets, cashless readers, digital signage, telemetry-based remote machine monitoring, smart sensor shelving, facial detection and voice technology.
Walsh, author of “The Dictionary of Dangerous Ideas” and CEO of Tomorrow, a consumer innovation research lab, challenged his listeners during his keynote presentation to think more creatively.
Artificial Intelligence (AI) is a field with a long history that is still constantly and actively growing and changing. AI technology is increasingly prevalent in our everyday lives, with uses in a variety of industries, from gaming, journalism/media, and finance to state-of-the-art research fields such as robotics, medical diagnosis, and quantum science.
Udacity was born out of a Stanford University experiment in which Sebastian Thrun and Peter Norvig offered their “Introduction to Artificial Intelligence” course online to anyone, for free. Over 160,000 students in more than 190 countries enrolled, and Udacity followed not long after.
Udacity, a pioneer in online education, is building “University by Silicon Valley”, a new type of online university that:
– teaches the actual programming skills that industry employers need today;
– delivers credentials endorsed by employers, because they built them;
– provides education at a fraction of the cost and time of traditional schools.