
AI system leverages standard security cameras to detect fires in seconds

Fire kills nearly 3,700 Americans annually and destroys $23 billion in property, with many deaths occurring because traditional smoke detectors fail to alert occupants in time.

Now, the NYU Fire Research Group at NYU Tandon School of Engineering has developed an artificial intelligence system that could significantly improve fire safety by detecting fires and smoke in real time using ordinary security cameras already installed in many buildings.

Published in the IEEE Internet of Things Journal, the research demonstrates a system that can analyze video and identify fires within 0.016 seconds per frame—faster than the blink of an eye—potentially providing crucial extra minutes for evacuation and emergency response. Unlike conventional smoke detectors that require significant smoke buildup and proximity to activate, this AI system can spot fires in their earliest stages from video alone.
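The paper's model itself isn't reproduced here, but a minimal sketch shows where that 0.016-second figure sits in a per-frame video pipeline; `detect_fire` and the camera index are placeholders, not the authors' code:

```python
import time
import cv2  # OpenCV, for reading frames from an ordinary camera

def detect_fire(frame):
    """Placeholder for a trained fire/smoke vision model (not the authors' code)."""
    return False

cap = cv2.VideoCapture(0)  # 0 = default camera; a security-camera stream URL also works
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    t0 = time.perf_counter()
    alarm = detect_fire(frame)
    latency = time.perf_counter() - t0  # the paper reports ~0.016 s per frame
    if alarm:
        print(f"fire detected in {latency:.3f} s -- raise the alarm")
cap.release()
```

At 0.016 s per frame the detector keeps pace with standard video (roughly 1/60 s, or about 60 frames per second), which is what makes real-time monitoring of existing camera feeds plausible.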

Elon Musk: Robotaxis Will Replace Personal Cars, Not Just Uber

Questions to inspire discussion.

🧠 Q: How does Tesla’s upcoming AI chip compare to the current one? A: Tesla’s AI5 chip is projected to be 40 times more capable than the current AI4 chip, which Musk says can already achieve self-driving safety at least 2–3 times that of a human.

💰 Q: What is the expected pricing for Tesla’s robotaxi service? A: Tesla’s robotaxi service is projected to cost $2 per mile at launch, which is cheaper than Uber rides in high-cost areas like Seattle.

Impact on Transportation.

🚘 Q: How will robotaxis affect car ownership? A: Robotaxis are expected to become a viable alternative to car ownership, especially when prices reach $1 per mile, making them cheaper than options like airport parking.

💼 Q: How does Tesla’s robotaxi cost compare to competitors? A: Tesla’s robotaxi can be built and deployed for half the cost of competitors like Waymo, potentially offering more competitive pricing.

A Systems View of LLMs on TPUs

Training LLMs often feels like alchemy, but understanding and optimizing the performance of your models doesn’t have to. This book aims to demystify the science of scaling language models: how TPUs (and GPUs) work and how they communicate with each other, how LLMs run on real hardware, and how to parallelize your models during training and inference so they run efficiently at massive scale. If you’ve ever wondered “how expensive should this LLM be to train” or “how much memory do I need to serve this model myself” or “what’s an AllGather”, we hope this will be useful to you.
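The book answers these questions rigorously; purely as a taste, here is a minimal JAX sketch (our illustration, not an excerpt from the book) of the serving-memory arithmetic and of an AllGather, the collective in which every device ends up with the concatenation of all devices' shards:

```python
import jax
import jax.numpy as jnp

# Back-of-envelope serving memory: weights alone for a 70B-parameter model
# in bf16 (2 bytes per parameter) take roughly 70e9 * 2 / 1e9 = 140 GB.

n = jax.local_device_count()  # most interesting with several devices; runs with 1 too

def f(shard):
    # AllGather: after this call, every device holds the full concatenated array.
    return jax.lax.all_gather(shard, axis_name='i', tiled=True)

shards = jnp.arange(n * 4.0).reshape(n, 4)  # one (4,)-vector shard per device
full = jax.pmap(f, axis_name='i')(shards)
print(full.shape)  # (n, n * 4): each device now sees all n shards
```

This same collective is how a sharded model reassembles full weights or activations on demand, which is why its cost features so prominently in scaling analyses.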

Mathematical model of memory suggests seven senses are optimal

Skoltech scientists have devised a mathematical model of memory. By analyzing the new model, the team came to surprising conclusions that could prove useful for robot design, artificial intelligence, and a better understanding of human memory. Published in Scientific Reports, the study suggests there may be an optimal number of senses—if so, those of us with five senses could use a couple more.

“Our conclusion is, of course, highly speculative in application to human senses, although you never know: It could be that humans of the future would evolve a sense of radiation or magnetic field. But in any case, our findings may be of practical importance for robotics and the theory of artificial intelligence,” said study co-author Professor Nikolay Brilliantov of Skoltech AI.

“It appears that when each object retained in memory is characterized in terms of seven features—as opposed to, say, five or eight—the number of distinct objects held in memory is maximized.”

Atom-thin crystals provide new way to power the future of computer memory

Picture the smartphone in your pocket, the data centers powering artificial intelligence, or the wearable health monitors that track your heartbeat. All of them rely on energy-hungry memory chips to store and process information. As demand for computing resources continues to soar, so does the need for memory devices that are smaller, faster, and far more efficient.

A new study by Auburn physicists has taken an important step toward meeting this challenge.

The study, “Electrode-Assisted Switching in Memristors Based on Single-Crystal Transition Metal Dichalcogenides,” published in ACS Applied Materials & Interfaces, shows how memristors—ultra-thin devices that “remember” past electrical signals—switch their state with the help of electrodes and subtle atomic changes inside the material.

Brain cells simulated in the electronic brain

Europe now has an exascale supercomputer that runs entirely on renewable energy. Of particular interest: one of the 30 inaugural projects for the machine focuses on realistic simulations of biological neurons (see https://www.fz-juelich.de/en/news/effzett/2024/brain-research).

https://www.nature.com/articles/d41586-025-02981-1


Large language models (LLMs) are built on artificial neural networks inspired by the way the brain works. Dr. Thorsten Hater (JSC) focuses on the biological originals of these models: the neurons that communicate with each other in the human brain. He wants to use the exascale computer JUPITER to perform even more realistic simulations of the behaviour of individual neurons.

Many models treat a neuron merely as a point that is connected to other points. The spikes, or electrical signals, travel along these connections. “Of course, this is overly simplified,” says Hater. “In our model, the neurons have a spatial extension, as they do in reality. This allows us to describe many processes in detail on the molecular level. We can calculate the electric field across the entire cell. And we can thus show how signal transmission varies right down to the individual neuron. This gives us a much more realistic picture of these processes.”
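To make the contrast concrete, here is a minimal sketch (a generic illustration, not Arbor code) of a passive cable discretized into compartments; a point-neuron model would simply drop the axial-coupling term. All parameter values are illustrative placeholders:

```python
import numpy as np

N, dt, T = 100, 0.01, 20.0            # compartments, time step (ms), duration (ms)
cm, g_leak, E_leak = 1.0, 0.1, -65.0  # capacitance, leak conductance, resting potential (mV)
g_axial = 5.0                         # coupling between neighboring compartments

v = np.full(N, E_leak)                # membrane potential per compartment
i_inj = np.zeros(N); i_inj[0] = 2.0   # inject current at one end of the cable

for _ in range(int(T / dt)):
    v_pad = np.pad(v, 1, mode='edge')                   # zero-flux boundaries
    axial = g_axial * (v_pad[:-2] - 2 * v + v_pad[2:])  # current from neighbors
    v += dt * (axial - g_leak * (v - E_leak) + i_inj) / cm

print(v[0], v[-1])  # the depolarization decays with distance from the injection site
```

Because each compartment carries its own potential, the simulated signal attenuates as it propagates along the cell, which is exactly the kind of spatial detail a point model cannot express.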

For the simulations, Hater uses a program called Arbor, which allows more than two million individual cells to be interconnected computationally. Such models of natural neural networks are useful, for example, in the development of drugs to combat neurodegenerative diseases like Alzheimer’s. The physicist and software developer would like to use the exascale computer to simulate and study the changes that take place in the neurons of the brain.
