Automating semiconductor research with machine learning

The semiconductor industry has been growing steadily ever since its first steps in the mid-twentieth century and, thanks to the high-speed information and communication technologies it enabled, it has paved the way for the rapid digitalization of society. Today, amid tightening global energy demand, there is a growing need for faster, more integrated, and more energy-efficient semiconductor devices.

However, modern semiconductor processes have already reached the nanometer scale, and the design of novel high-performance materials now involves the structural analysis of semiconductor nanofilms. Reflection high-energy electron diffraction (RHEED) is a widely used analytical method for this purpose. RHEED can be used to determine the structures that form on the surface of thin films at the atomic level and can even capture structural changes in real time as the thin film is being synthesized!

Unfortunately, for all its benefits, RHEED is sometimes hindered by the fact that its output patterns are complex and difficult to interpret. In virtually all cases, a highly skilled experimenter is needed to make sense of the huge amounts of data that RHEED can produce in the form of diffraction patterns. But what if we could make machine learning do most of the work when processing RHEED data?
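One plausible first step toward that goal is unsupervised grouping of diffraction patterns, so an experimenter only has to label one representative pattern per group rather than every frame. The sketch below is purely illustrative, not the researchers' actual method: it simulates toy 1-D RHEED intensity profiles (Gaussian streaks plus noise, a made-up stand-in for real data) and clusters them with a minimal k-means written in NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulated_profile(peak_positions, n_points=200, noise=0.02):
    """Toy stand-in for a RHEED intensity profile: Gaussian streaks plus noise.

    Real RHEED data would be 2-D diffraction images; this 1-D version only
    illustrates the clustering idea.
    """
    x = np.linspace(0.0, 1.0, n_points)
    y = sum(np.exp(-((x - p) ** 2) / 0.001) for p in peak_positions)
    return y + rng.normal(0.0, noise, n_points)

# Two hypothetical surface phases with different streak spacings
phase_a = [simulated_profile([0.25, 0.50, 0.75]) for _ in range(20)]
phase_b = [simulated_profile([0.20, 0.40, 0.60, 0.80]) for _ in range(20)]
data = np.array(phase_a + phase_b)

def kmeans(X, iters=25):
    """Minimal 2-cluster k-means; seeds centers from the first and last rows."""
    centers = X[np.array([0, len(X) - 1])]
    for _ in range(iters):
        # Assign each profile to its nearest center (squared distance)
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # Recompute each center as the mean of its assigned profiles
        centers = np.array([X[labels == j].mean(axis=0) for j in range(2)])
    return labels

labels = kmeans(data)
print(labels)  # profiles from the same simulated phase share a cluster label
```

In practice one would replace the toy profiles with real diffraction frames (flattened or passed through a learned feature extractor) and choose the number of clusters from the data, but the division of labor is the same: the algorithm sorts the bulk of the frames, and the skilled experimenter interprets only the representatives.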

MEDUSA ‘dual robot’ drone flies and dives to collect aquatic data

Researchers at Imperial College London have developed a new dual drone that can both fly through air and land on water to collect samples and monitor water quality. The researchers designed the drone to make environmental monitoring in aquatic settings faster and more versatile.

The ‘dual robot’ drone, tested at Empa and the aquatic research institute Eawag in Switzerland, has successfully measured water in lakes for signs of microorganisms and algal blooms, which can pose hazards to human health, and could in the future be used to monitor climate clues like temperature changes in Arctic seas.

The unique design, called Multi-Environment Dual robot for Underwater Sample Acquisition (MEDUSA), could also facilitate monitoring and maintenance of offshore infrastructure such as subsea pipelines and floating wind turbines.

Meet The High-Tech Urban Farmer Growing Vegetables Inside Hong Kong’s Skyscrapers

Hong Kong, a densely populated city where agriculture space is limited, is almost totally dependent on the outside world for its food supply. More than 90% of the skyscraper-studded city’s food, especially fresh produce like vegetables, is imported, mostly from mainland China. “During the pandemic, we all noticed that the productivity of locally grown vegetables is very low,” says Gordon Tam, cofounder and CEO of vertical farming company Farm66 in Hong Kong. “The social impact was huge.”

Tam estimates that only about 1.5% of vegetables in the city are locally produced. But he believes vertical farms like Farm66, with the help of modern technologies, such as IoT sensors, LED lights and robots, can bolster Hong Kong’s local food production—and export its know-how to other cities. “Vertical farming is a good solution because vegetables can be planted in cities,” says Tam in an interview at the company’s vertical farm in an industrial estate. “We can grow vegetables ourselves so that we don’t have to rely on imports.”

Tam says he started Farm66 in 2013 with his cofounder Billy Lam, who is COO of the company, as a high-tech vertical farming pioneer in Hong Kong. “Our company was the first to use energy-saving LED lighting and wavelength technologies in a farm,” he says. “We found out that different colors on the light spectrum help plants grow in different ways. This was our technological breakthrough.” For example, red LED light will make the stems grow faster, while blue LED light encourages plants to grow larger leaves.

LaMDA and the Sentient AI Trap

“Quite a large gap exists between the current narrative of AI and what it can actually do,” says Giada Pistilli, an ethicist at Hugging Face, a startup focused on language models. “This narrative provokes fear, amazement, and excitement simultaneously, but it is mainly based on lies to sell products and take advantage of the hype.”

The consequence of speculation about sentient AI, she says, is an increased willingness to make claims based on subjective impression instead of scientific rigor and proof. It distracts from “countless ethical and social justice questions” that AI systems pose. While every researcher has the freedom to research what they want, she says, “I just fear that focusing on this subject makes us forget what is happening while looking at the moon.”

What Lemoine experienced is an example of what author and futurist David Brin has called the “robot empathy crisis.” At an AI conference in San Francisco in 2017, Brin predicted that in three to five years, people would claim AI systems were sentient and insist that they had rights. Back then, he thought those appeals would come from a virtual agent that took the appearance of a woman or child to maximize human empathic response, not “some guy at Google,” he says.


Arguments over whether Google’s large language model has a soul distract from the real-world problems that plague artificial intelligence.

Harnessing machine learning to analyze quantum material

Electrons and their behavior pose fascinating questions for quantum physicists, and recent innovations in sources, instruments and facilities allow researchers to potentially access even more of the information encoded in quantum materials.

However, these research innovations are producing unprecedented—and until now, indecipherable—volumes of data.

“The information content in a piece of material can quickly exceed the total information content in the Library of Congress, which is about 20 terabytes,” said Eun-Ah Kim, professor of physics in the College of Arts and Sciences, who is at the forefront of both quantum materials research and harnessing the power of machine learning to analyze data from quantum material experiments.

Introducing ZiGGY: An autonomous robot that saves you a parking spot then charges your EV

Charging technology provider EV Safe Charge has unveiled ZiGGY – a mobile robot that can charge an EV wherever it’s parked. Through its ability to recharge itself via different energy sources and its summoning feature, ZiGGY can alleviate the need to install specific parking stalls for EV charging, as any spot can now become a spot to recharge.

EV Safe Charge currently provides end-to-end charging solutions, particularly as it pertains to mobile charging. The company created a mobile rental charging solution for the launch of Jaguar’s I-PACE EV and works with several other OEMs like Audi, Mercedes-Benz, Nissan, Porsche, and Stellantis.

Previously, EV Safe Charge has helped find its clients ideal charging solutions based on their needs, recommending charging technology from a multitude of partners including ABB, Enel X, evconnect, and Bosch.

/* */