Archive for the ‘supercomputing’ category

Apr 24, 2024

Nvidia DGX H200

Posted by in categories: robotics/AI, supercomputing

First @NVIDIA DGX H200 in the world, hand-delivered to OpenAI and dedicated by Jensen “to advance AI, computing, and humanity”: v/ @gdb.


Apr 24, 2024

Russia Is Working on a 128-Core Supercomputing Platform: Report

Posted by in category: supercomputing

The country’s notoriously ancient computer systems are due for an upgrade.

Apr 24, 2024

Supercomputer simulation reveals new mechanism for membrane fusion

Posted by in categories: biotech/medical, supercomputing

An intricate simulation performed by UT Southwestern Medical Center researchers using one of the world’s most powerful supercomputers sheds new light on how proteins called SNAREs cause biological membranes to fuse.

Their findings, reported in the Proceedings of the National Academy of Sciences, suggest a new mechanism for this ubiquitous process and could eventually lead to new treatments for conditions in which this process is thought to go awry.

“Biology textbooks say that SNAREs bring membranes together to cause fusion, and many people were happy with that explanation. But not me, because membranes brought into contact normally do not fuse. Our simulation goes deeper to show how this important process takes place,” said study leader Jose Rizo-Rey (“Josep Rizo”), Ph.D., Professor of Biophysics, Biochemistry, and Pharmacology at UT Southwestern.

Apr 24, 2024

The basis of the universe may not be energy or matter but information

Posted by in categories: particle physics, supercomputing

In this radical view, the universe is a giant supercomputer processing particles as bits.

Apr 22, 2024

NVIDIA To Collaborate With Japan On Their Cutting-Edge ABCI-Q Quantum Supercomputer

Posted by in categories: quantum physics, robotics/AI, supercomputing

NVIDIA is all set to aid Japan in building the nation’s hybrid quantum supercomputer, fueled by the immense power of its HPC & AI GPUs.

Japan To Rapidly Progress In Quantum and AI Computing Segments Through Large-Scale Developments With The Help of NVIDIA’s AI & HPC Infrastructure

Nikkei Asia reports that Japan’s National Institute of Advanced Industrial Science and Technology (AIST) is building a quantum supercomputer to establish the country in this segment. The new project, called ABCI-Q, will be entirely powered by NVIDIA’s accelerated and quantum computing platforms, pointing to strong performance and efficiency from the system. The Japanese supercomputer will also be built in collaboration with Fujitsu.

Apr 22, 2024

Russia prepares 128-core server platform for supercomputers: Report

Posted by in category: supercomputing

But where do they plan to manufacture them?

Apr 19, 2024

Tesla’s Dojo Supercomputer: A Game-Changer in AI Computation

Posted by in categories: robotics/AI, supercomputing

Tesla’s Dojo supercomputer represents a significant investment and commitment to innovation in the field of AI computation, positioning Tesla as a key player in shaping the future of neural net hardware.

Questions to inspire discussion.


Apr 16, 2024

AI-powered ‘digital twin’ of Earth could make weather predictions at super speeds

Posted by in categories: robotics/AI, supercomputing

An AI-driven supercomputer dubbed Earth’s ‘digital twin’ could help us avoid the worst impacts of climate catastrophes headed our way.

Apr 16, 2024

Los Alamos Pushes The Memory Wall With “Venado” Supercomputer

Posted by in categories: military, supercomputing

Today is the ribbon-cutting ceremony for the “Venado” supercomputer. The system was first hinted at back in April 2021, when Nvidia announced plans for its first datacenter-class Arm server CPU, and was discussed in some detail – though not really enough to suit our taste for speeds and feeds – in May 2022 by the folks at Los Alamos National Laboratory, where Venado is situated.

Now we can finally get more details on the Venado system and get a little more insight into how Los Alamos will put it to work, and more specifically, why a better balance of memory bandwidth and compute that depends upon it is perhaps more important to this lab than it is in other HPC centers of the world.

Los Alamos was founded back in 1943 as the home of the Manhattan Project that created the world’s first nuclear weapons. We did not have supercomputers back then, of course, but plenty of very complex calculations have always been done at Los Alamos; sometimes by hand, sometimes by tabulators from IBM that used punch cards to store and manipulate data – an early form of simulation. The first digital computer to do such calculations at Los Alamos was called MANIAC and was installed in 1952; it could perform 10,000 operations per second and ran Monte Carlo simulations, which use randomness to simulate what are actually deterministic processes.
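The Monte Carlo idea mentioned above can be made concrete with the classic textbook example of estimating π by random sampling. This is an illustrative sketch of the general technique, not MANIAC’s actual workload:

```python
import random

def monte_carlo_pi(samples: int, seed: int = 42) -> float:
    """Estimate pi by throwing random points into the unit square
    and counting how many land inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The quarter circle covers pi/4 of the unit square,
    # so the hit fraction times 4 approximates pi.
    return 4 * inside / samples

print(monte_carlo_pi(1_000_000))  # converges toward 3.14159 as samples grow
```

The answer (the area ratio) is fully deterministic, yet randomness gets us there: the estimate’s error shrinks as the number of samples grows, which is why such methods reward ever-faster machines.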

Apr 13, 2024

OpenAI and Microsoft are reportedly planning a $100B supercomputer

Posted by in categories: robotics/AI, supercomputing

Microsoft is reportedly planning to build a $100 billion data center and supercomputer, called “Stargate,” for OpenAI.
