Quasicrystals Are Nature’s Impossible Matter

VICE.


What do a frying pan, an LED light, and the most cutting-edge camouflage in the world have in common? Well, that largely depends on who you ask. Most people would struggle to find the link, but for University of Michigan chemical engineers Sharon Glotzer and Michael Engel, there is a substantial connection, indeed one that has flipped the world of materials science on its head since its discovery over 30 years ago.

The magic ingredient common to all three items is the quasiperiodic crystal, the “impossible” atomic arrangement discovered by Dan Shechtman in 1982. Basically, a quasicrystal is a crystalline structure that breaks the periodicity of a normal crystal (its translational symmetry, i.e. the ability to shift the crystal by one unit cell without changing the pattern) in favor of an ordered yet aperiodic arrangement. This means that quasicrystalline patterns will fill all available space, but in such a way that their atomic arrangement never repeats. Glotzer and Engel recently managed to simulate the most complex quasicrystal ever, a discovery which may revolutionize the field of crystallography by blowing open the door to a whole host of applications that were previously inconceivable outside of science fiction, like making yourself invisible or shape-shifting robots.
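The idea of order without repetition can be illustrated with the classic one-dimensional analogue of a quasicrystal, the Fibonacci word (a textbook toy model, not the structure Glotzer and Engel simulated). It is generated by a simple substitution rule, fills the line completely, and yet never settles into a repeating period:

```python
# 1D analogue of a quasicrystal: the Fibonacci word. Built by repeated
# substitution (L -> LS, S -> L), it tiles the line with two interval
# types in a perfectly ordered but never-repeating sequence.
def fibonacci_word(iterations):
    word = "L"
    for _ in range(iterations):
        word = "".join("LS" if c == "L" else "L" for c in word)
    return word

word = fibonacci_word(10)  # length grows like the Fibonacci numbers

# A periodic pattern with period p satisfies s[i] == s[i + p] everywhere;
# check that no small period fits this sequence.
def has_period(s, p):
    return all(s[i] == s[i + p] for i in range(len(s) - p))

aperiodic = not any(has_period(word, p) for p in range(1, 50))
print(len(word), aperiodic)
```

The ratio of L to S intervals approaches the golden ratio, an irrational number, which is exactly what forbids any repeating unit cell.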

Early Bird uses 10 times less energy to train deep neural networks

Rice University’s Early Bird couldn’t care less about the worm; it’s looking for megatons of greenhouse gas emissions.

Early Bird is an energy-efficient method for training deep neural networks (DNNs), the form of artificial intelligence (AI) behind self-driving cars, intelligent assistants, facial recognition and dozens more high-tech applications.

Researchers from Rice and Texas A&M University unveiled Early Bird April 29 in a spotlight paper at ICLR 2020, the International Conference on Learning Representations. A study by lead authors Haoran You and Chaojian Li of Rice’s Efficient and Intelligent Computing (EIC) Lab showed Early Bird could use 10.7 times less energy to train a DNN to the same level of accuracy or better than typical training. EIC Lab director Yingyan Lin led the research along with Rice’s Richard Baraniuk and Texas A&M’s Zhangyang Wang.

OpenAI Finds Machine Learning Efficiency Is Outpacing Moore’s Law

Eight years ago a machine learning algorithm learned to identify a cat, and it stunned the world. A few years later AI could accurately translate languages and take down world champion Go players. Now, machine learning has begun to excel at complex multiplayer video games like StarCraft and Dota 2 and subtle games like poker. AI, it would appear, is improving fast.

But how fast is fast, and what’s driving the pace? While better computer chips are key, AI research organization OpenAI thinks we should measure the pace of improvement of the actual machine learning algorithms too.

In a blog post and paper, authored by OpenAI’s Danny Hernandez and Tom Brown and published on the arXiv (an open repository for pre-print, or not-yet-peer-reviewed, studies), the researchers say they’ve begun tracking a new measure for machine learning efficiency (that is, doing more with less). Using this measure, they show AI has been getting more efficient at a wicked pace.
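To see what such an efficiency measure implies, here is a rough back-of-the-envelope calculation. The numbers are illustrative, drawn from OpenAI’s reported finding of roughly a 44x drop (2012 to 2019) in the compute needed to reach AlexNet-level image-classification accuracy, not from the excerpt above:

```python
import math

# Illustrative figures: compute needed for a fixed benchmark fell
# about 44-fold over roughly 7 years (per OpenAI's "AI and Efficiency").
reduction = 44.0
years = 7.0

doublings = math.log2(reduction)             # number of efficiency doublings
halving_time_months = 12 * years / doublings # months per doubling of efficiency

print(round(halving_time_months, 1))
```

A halving time near 16 months is far faster than Moore’s Law’s roughly two-year cadence, which is the comparison the headline is making.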

This machine is a safe countermeasure to drones

The DroneGun Tactical by Australian-based company DroneShield is like something out of a video game. The rifle-shaped, high-powered antenna “blasts” drones out of the sky with frequency waves.

DroneShield designed the technology to thwart unmanned aerial vehicles (UAV) with explosives or weapons strapped to them. It works by blocking video transmission and GPS information, making it nearly impossible for its pilot to regain control.

“Most modern drones are equipped with a protocol that they come back to their operator when the radio frequency signal is jammed and land when radio frequency and GPS are both jammed,” company spokesman Oleg Vornik told the Daily Mail.

Artificial intelligence is energy-hungry—new hardware could curb its appetite

Just solving a puzzle or playing a game with artificial intelligence can require software running on thousands of computers, consuming as much energy as three nuclear plants produce in one hour.

A team of engineers has created hardware that can learn skills using a type of AI that currently runs on software platforms. Sharing intelligence features between hardware and software would offset the energy needed to use AI in more advanced applications such as self-driving cars or drug discovery.

“Software is taking on most of the challenges in AI. If you could incorporate intelligence into the circuit components in addition to what is happening in software, you could do things that simply cannot be done today,” said Shriram Ramanathan, a professor of materials engineering at Purdue University.

The Future of Employment with AI

Does artificial intelligence jeopardize employment for humans? What will people do when smart robots join the workforce? AI already plays a role in many of our jobs, and if you have ever searched for information online, you have interacted with an AI. If we extrapolate the evolution of search, we can imagine that soon AIs will become even better at helping us learn which solutions have worked in the past and remember which approaches have failed. In this way, working with AIs can be like having a really smart colleague or expert old-timer on our team. These AI coworkers can also help us experiment with new approaches, because AIs can be creative as well. Their creativity is unlike human creativity, and that difference is precisely what makes it valuable. AIs can also make valuable team members by performing rote tasks that bore humans. The share of work that AIs perform is likely to shift over time, but I cannot think of a single job or occupation that will not benefit from collaborating with and delegating to AIs. If we reframe our fears about robots taking human jobs, if we can utilize the AI over our shoulder, if we can see AIs as team members, we will find the future of work holds opportunities for all of us.

This video on “The Future of Employment with AI” was commissioned by China Mobile as part of an online course. It is one of 36 lecture videos. A version with Chinese subtitles is available at Citic Migu: http://citic.cmread.com/zxHtml/listenBook/listenDetail/liste…&channel=1

A transcript of the lecture in English is available here: https://drive.google.com/file/d/16dYZ4Vwm796ScRQ0lHrEauwC4M3…sp=sharing

Whirling, glassy magnets found to be new state of matter

Most of us are familiar with the four classical states of matter – solid, liquid, gas and plasma – but there’s a whole world of exotic states out there. Now, physicists at Radboud and Uppsala Universities have identified a new one named “self-induced spin glass,” which could be used to build new artificial intelligence platforms.

Magnetism usually arises when the electrons in the atoms of a material all spin in the same direction. But in a spin glass, the atomic magnets have no order, all spinning in random directions. The “glass” part of the name comes from the similarities to how atoms are arranged amorphously in a piece of regular old glass.
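A minimal numerical sketch (not the researchers’ model) shows why randomly oriented spins produce no net magnetism: many aligned unit vectors add up to a large total, while randomly pointed ones nearly cancel.

```python
import math
import random

random.seed(0)
N = 10_000  # number of atomic "spins", modeled as unit vectors in the plane

# Ordinary magnet: every spin points the same way, so the total is N.
aligned_mag = N / N  # net magnetization per spin = 1.0

# Spin-glass-like disorder: each spin points in a random direction.
angles = [random.uniform(0, 2 * math.pi) for _ in range(N)]
mx = sum(math.cos(a) for a in angles)
my = sum(math.sin(a) for a in angles)
random_mag = math.hypot(mx, my) / N  # scales like 1/sqrt(N), nearly zero

print(aligned_mag, round(random_mag, 3))
```

The disordered case leaves a residual magnetization on the order of 1/sqrt(N), which is why a macroscopic spin glass shows no overall magnetism despite every atom carrying a magnetic moment.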

So far spin glasses have only been found in certain alloys, but now, researchers have discovered that the state occurs naturally in the pure element neodymium. To differentiate it from the alloy version, they’ve called the new state self-induced spin glass.

New ‘Whirling’ State of Matter Discovered: Self-Induced Spin Glass

The strongest permanent magnets today contain a mix of the elements neodymium and iron. However, neodymium on its own does not behave like any known magnet, confounding researchers for more than half a century. Physicists at Radboud University and Uppsala University have shown that neodymium behaves like a so-called ‘self-induced spin glass,’ meaning that it is composed of a rippled sea of many tiny whirling magnets circulating at different speeds and constantly evolving over time. Understanding this new type of magnetic behavior refines our understanding of elements on the periodic table and eventually could pave the way for new materials for artificial intelligence. The results will be published in Science on May 29, 2020.

“In a jar of honey, you may think that once-clear areas that have turned milky yellow mean it has gone bad. But rather, the honey has started to crystallize. That’s how you could picture the ‘aging’ process in neodymium,” said Alexander Khajetoorians, professor of scanning probe microscopy, who, together with professor Mikhail Katsnelson and assistant professor Daniel Wegner, found that neodymium behaves in a complex magnetic way never before seen in an element on the periodic table.