
Doing The Math On CPU-Native AI Inference

A number of chip companies — most notably Intel and IBM, but also the Arm collective and AMD — have recently come out with new CPU designs that feature native support for artificial intelligence (AI) and machine learning (ML) workloads. The need for math engines specifically designed to support machine learning algorithms, particularly for inference workloads but also for certain kinds of training, has been covered extensively here at The Next Platform.

Just to rattle off a few of them, consider the impending “Cirrus” Power10 processor from IBM, which is due in a matter of days from Big Blue in its high-end NUMA machines and which has a new matrix math engine aimed at accelerating machine learning. Or IBM’s “Telum” z16 mainframe processor coming next year, which was unveiled at the recent Hot Chips conference and which has a dedicated mixed precision matrix math core for the CPU cores to share. Intel is adding its Advanced Matrix Extensions (AMX) to its future “Sapphire Rapids” Xeon SP processors, which should have been here by now but which have been pushed out to early next year. Arm Holdings has created future Arm core designs, the “Zeus” V1 core and the “Perseus” N2 core, that will have substantially wider vector engines that support the mixed precision math commonly used for machine learning inference, too. Ditto for the vector engines in the “Milan” Epyc 7003 processors from AMD.

All of these chips are designed to keep inference on the CPUs, where in a lot of cases it belongs because of data security, data compliance, and application latency reasons.
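For a concrete sense of the kind of arithmetic these matrix engines accelerate, here is a minimal sketch, assuming NumPy, of an int8-quantized matrix multiply with a wide accumulator, the basic pattern behind much CPU-side inference. The function names and the per-tensor scaling scheme are illustrative only, not any vendor’s API.

```python
import numpy as np

def quantize_int8(x):
    """Map a float32 tensor to int8 values plus a per-tensor scale (illustrative scheme)."""
    scale = np.max(np.abs(x)) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_matmul(a, b):
    """Quantize both operands, multiply with int32 accumulation, then rescale to float32."""
    qa, sa = quantize_int8(a)
    qb, sb = quantize_int8(b)
    acc = qa.astype(np.int32) @ qb.astype(np.int32)   # narrow inputs, wide accumulator
    return acc.astype(np.float32) * (sa * sb)

activations = np.random.randn(4, 64).astype(np.float32)
weights = np.random.randn(64, 16).astype(np.float32)
# The quantized product closely tracks the float32 reference:
print(np.max(np.abs(int8_matmul(activations, weights) - activations @ weights)))
```

Matrix units such as Intel’s AMX or IBM’s matrix math engines exist precisely so that this inner multiply-accumulate loop runs in dedicated hardware rather than in scalar code; the sketch above only mimics the numerics.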

Machine learning unravels quantum atomic vibrations in materials

Caltech scientists have developed an artificial intelligence (AI)–based method that dramatically speeds up calculations of the quantum interactions that take place in materials. In new work, the group focuses on interactions among atomic vibrations, or phonons—interactions that govern a wide range of material properties, including heat transport, thermal expansion, and phase transitions. The new machine learning approach could be extended to compute all quantum interactions, potentially enabling encyclopedic knowledge about how particles and excitations behave in materials.

Scientists like Marco Bernardi, professor of applied physics, physics, and materials science at Caltech, and his graduate student Yao Luo (MS ’24) have been trying to find ways to speed up the gargantuan calculations required to understand such particle interactions from first principles in real materials—that is, beginning with only a material’s atomic structure and the laws of quantum mechanics.

Last year, Bernardi and Luo developed a data-driven method based on a technique called singular value decomposition (SVD) to simplify the enormous mathematical matrices scientists use to represent the interactions between electrons and phonons in a material.
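The excerpt does not give the method’s details, but the core idea of SVD-based compression can be sketched in a few lines of NumPy: keep only the largest singular values of a big interaction matrix and work with the much smaller factors. The matrix below is synthetic, built to be approximately low rank purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Synthetic stand-in for a large interaction matrix: low-rank structure plus small noise.
M = rng.standard_normal((n, 64)) @ rng.standard_normal((64, n)) + 0.01 * rng.standard_normal((n, n))

U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 64                                    # keep only the k largest singular values
M_k = (U[:, :k] * s[:k]) @ Vt[:k, :]      # rank-k approximation of M

# Storage drops from n*n numbers to roughly 2*n*k + k, with a small, measurable error:
print(np.linalg.norm(M - M_k) / np.linalg.norm(M))
```

The payoff comes when downstream calculations (matrix-vector products, sums over modes, and so on) can be done with the thin factors instead of the full matrix.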

Systematic fraud uncovered in mathematics publications

An international team of authors led by Ilka Agricola, professor of mathematics at the University of Marburg, Germany, has investigated fraudulent practices in the publication of research results in mathematics on behalf of the German Mathematical Society (DMV) and the International Mathematical Union (IMU), documenting systematic fraud over many years.

The results of the study were recently posted on the arXiv preprint server and published in the Notices of the American Mathematical Society, and have since caused a stir among mathematicians.

To help address the problem, the study also provides recommendations for the publication of research results in mathematics.

Tesla AI5 & AI6 Chips “Compressing Reality”?! What Did Elon See?!

Elon Musk has revealed Tesla’s new AI chips, AI5 and AI6, which will drive the company’s shift towards AI-powered services, enabling significant advancements in Full Self-Driving capabilities and potentially revolutionizing the self-driving car industry and beyond.

## Questions to inspire discussion

Tesla’s AI Chip Advancements

🚀 Q: What are the key features of Tesla’s AI5 and AI6 chips? A: Tesla’s AI5 and AI6 chips are inference-first designs built for high-throughput, efficient processing of AI models on devices such as vehicles, Optimus robots, and Grok voice agents, and are 40x faster than previous models.

💻 Q: How do Tesla’s AI5 and AI6 chips compare to previous models? A: Tesla’s AI5 chip is a 40x improvement over AI4, with 500 TOPS expanding to 5,000 TOPS, enabling excellent performance in Full Self-Driving and Optimus humanoid robots.

🧠 Q: What is the significance of softmax in Tesla’s AI5 chip? A: AI5 is designed to run softmax natively in a few steps, unlike AI4, which relies on the CPU and runs softmax in 40 steps in emulation mode.
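For readers unfamiliar with the operation, softmax turns a vector of raw scores into a probability distribution, and the standard numerically stable form takes only a handful of elementwise steps. This is a generic sketch, not a description of any Tesla hardware path.

```python
import numpy as np

def softmax(scores):
    """Numerically stable softmax: shift by the max so the exponentials cannot overflow."""
    shifted = scores - np.max(scores)
    exp = np.exp(shifted)
    return exp / np.sum(exp)

print(softmax(np.array([2.0, 1.0, 0.1])))  # probabilities that sum to 1
```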

Mathematical model of memory suggests seven senses are optimal

Skoltech scientists have devised a mathematical model of memory. By analyzing its new model, the team came to surprising conclusions that could prove useful for robot design, artificial intelligence, and a better understanding of human memory. Published in Scientific Reports, the study suggests there may be an optimal number of senses—if so, those of us with five senses could use a couple more.

“Our conclusion is, of course, highly speculative in application to human senses, although you never know: It could be that humans of the future would evolve a sense of radiation or magnetic field. But in any case, our findings may be of practical importance for robotics and the theory of artificial intelligence,” said study co-author Professor Nikolay Brilliantov of Skoltech AI.

“It appears that when each object retained in memory is characterized in terms of seven features—as opposed to, say, five or eight—the number of distinct objects held in memory is maximized.”

Mathematical ‘sum of zeros’ trick exposes topological magnetization in quantum materials

A new study addresses a foundational problem in the theory of driven quantum matter by extending the Středa formula to non-equilibrium regimes. It demonstrates that a superficially trivial “sum of zeros” encodes a universal, quantized magnetic response—one that is intrinsically topological and uniquely emergent under non-equilibrium driving conditions.

Imagine a strange material being rhythmically pushed—tapped again and again by invisible hands. These are periodically driven quantum systems, or Floquet systems, where energy is no longer conserved in the usual sense. Instead, physicists speak of quasienergy—a looping spectrum with no clear start or end.

When scientists measure how such a system responds to a magnetic field, every single contribution seems to vanish—like adding an infinite list of zeros. And yet, the total stubbornly comes out finite, quantized, and very real.
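For orientation, the equilibrium Středa formula that the study generalizes relates the Hall conductivity to how the particle density changes with magnetic field at fixed chemical potential; the non-equilibrium (Floquet) extension itself is not reproduced in this excerpt.

```latex
\sigma_{xy} = e \left( \frac{\partial n}{\partial B} \right)_{\mu}
```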

Clocks created from random events can probe ‘quantumness’ of universe

A newly discovered set of mathematical equations describes how to turn any sequence of random events into a clock, scientists at King’s College London reveal. The paper is published in the journal Physical Review X.

The researchers suggest that these formulas could help to understand how cells in our bodies measure time and to detect the effects of quantum mechanics in the wider world.

Studying these timekeeping processes could have far-reaching implications, helping us to understand proteins with rhythmic movements which malfunction in motor neuron disease or chemical receptors that cells use to detect harmful toxins.
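The formulas themselves are not quoted above, but the simplest illustration of a clock built from random events is counting ticks of a Poisson process with a known average rate: the count divided by the rate estimates elapsed time, and the relative error shrinks as one over the square root of the number of events. This is a generic sketch, not the authors’ formalism.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_event_clock(rate_hz, true_time_s):
    """Estimate elapsed time by counting random (Poisson) events of known average rate."""
    n_events = rng.poisson(rate_hz * true_time_s)
    return n_events / rate_hz

estimates = [random_event_clock(rate_hz=100.0, true_time_s=10.0) for _ in range(1000)]
# Mean close to 10 s; spread close to sqrt(1000)/100 = 0.32 s, i.e. about 3% relative error.
print(np.mean(estimates), np.std(estimates))
```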

What Is Superposition and Why Is It Important?

Imagine touching the surface of a pond at two different points at the same time. Waves would spread outward from each point, eventually overlapping to form a more complex pattern. This is a superposition of waves. Similarly, in quantum science, objects such as electrons and photons have wavelike properties that can combine and become what is called superposed.

While waves on the surface of a pond are formed by the movement of water, quantum waves are mathematical. They are expressed as equations that describe the probabilities of an object existing in a given state or having a particular property. The equations might provide information on the probability of an electron moving at a specific speed or residing in a certain location. When an electron is in superposition, its different states can be thought of as separate outcomes, each with a particular probability of being observed. An electron might be said to be in a superposition of two different velocities or in two places at once. Understanding superposition may help to advance quantum technology such as quantum computers.


One of the fundamental principles of quantum mechanics, superposition explains how a quantum state can be represented as the sum of two or more states.
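As a purely illustrative example of the idea, here is a minimal sketch assuming NumPy: two Gaussian wavepackets are added as complex amplitudes, and the resulting probability density shows interference structure in the region where they overlap, something neither packet has on its own.

```python
import numpy as np

x = np.linspace(-10, 10, 2001)

def gaussian_packet(x, center, momentum):
    """Complex Gaussian wavepacket centered at `center` with a plane-wave phase."""
    return np.exp(-(x - center) ** 2 / 2) * np.exp(1j * momentum * x)

psi = gaussian_packet(x, -1.5, 2.0) + gaussian_packet(x, 1.5, -2.0)  # superposition of two states
psi /= np.sqrt(np.trapz(np.abs(psi) ** 2, x))                        # normalize total probability to 1

prob = np.abs(psi) ** 2    # probability density for finding the particle at each position x
print(np.trapz(prob, x))   # -> 1.0
```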

Physicists demonstrate controlled expansion of quantum wavepacket in a levitated nanoparticle

Quantum mechanics theory predicts that, in addition to exhibiting particle-like behavior, particles of all sizes can also have wave-like properties. These properties can be represented using the wave function, a mathematical description of quantum systems that delineates a particle’s movements and the probability that it is in a specific position.
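The experiment’s specifics are not included above, but for orientation, the textbook result for how the position spread of a free Gaussian wavepacket grows with time, which is the kind of expansion such experiments aim to control, is:

```latex
\sigma(t) = \sigma_0 \sqrt{1 + \left( \frac{\hbar t}{2 m \sigma_0^{2}} \right)^{2}}
```

where \(\sigma_0\) is the initial width and \(m\) is the particle’s mass; heavier objects spread far more slowly, which is why observing and controlling the wavepacket expansion of a levitated nanoparticle is demanding.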
