
Using reinforcement learning (RL) to train robots directly in real-world environments has been considered impractical due to the huge number of trial-and-error interactions typically required before the agent finally gets it right. Deep RL in simulated environments has thus become the go-to alternative, but this approach is far from ideal, as it requires designing simulated tasks and collecting expert demonstrations. Moreover, simulations can fail to capture the complexities of real-world environments, are prone to inaccuracies, and the resulting robot behaviours will not adapt to real-world environmental changes.

The Dreamer algorithm proposed by Hafner et al. at ICLR 2020 introduced an RL agent capable of solving long-horizon tasks purely via latent imagination. Although Dreamer has demonstrated its potential for learning from small amounts of interaction in the compact state space of a learned world model, learning accurate real-world models remains challenging, and it was unknown whether Dreamer could enable faster learning on physical robots.

In the new paper DayDreamer: World Models for Physical Robot Learning, Hafner and a research team from the University of California, Berkeley leverage recent advances in the Dreamer world model to enable online RL for robot training without simulators or demonstrations. The novel approach achieves promising results and establishes a strong baseline for efficient real-world robot training.
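As a rough illustration of the latent-imagination idea behind Dreamer (a toy sketch only; the random linear maps below are hypothetical stand-ins for the paper's learned networks), the agent rolls out its policy entirely inside a learned latent dynamics model and scores the imagined trajectory with a learned reward head, never touching the real robot:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for Dreamer's learned components (all hypothetical,
# not the paper's actual networks): a latent transition model, a
# reward head, and a policy, each just a random linear map here.
LATENT, ACTION, HORIZON = 4, 2, 15

W_dyn = rng.normal(scale=0.1, size=(LATENT, LATENT + ACTION))  # latent transition
w_rew = rng.normal(scale=0.1, size=LATENT)                     # reward head
W_pi = rng.normal(scale=0.1, size=(ACTION, LATENT))            # policy

def imagine(z0, horizon=HORIZON, gamma=0.99):
    """Roll out the policy purely in latent space and sum discounted
    imagined rewards: the 'latent imagination' idea, in miniature."""
    z, ret = z0, 0.0
    for t in range(horizon):
        a = np.tanh(W_pi @ z)                        # action from policy
        z = np.tanh(W_dyn @ np.concatenate([z, a]))  # imagined next latent
        ret += gamma**t * float(w_rew @ z)           # imagined reward
    return ret

z0 = rng.normal(size=LATENT)
print(imagine(z0))
```

In the real algorithm the policy is then updated to maximize these imagined returns, which is what makes learning so sample-efficient: most "experience" is generated by the world model rather than the physical robot.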

Research in the field of machine learning and AI, now a key technology in practically every industry and company, is far too voluminous for anyone to read it all. This column, Perceptron, aims to collect some of the most relevant recent discoveries and papers — particularly in, but not limited to, artificial intelligence — and explain why they matter.

In this batch of recent research, Meta open-sourced a language system that it claims is the first capable of translating 200 different languages with “state-of-the-art” results. Not to be outdone, Google detailed a machine learning model, Minerva, that can solve quantitative reasoning problems including mathematical and scientific questions. Microsoft, meanwhile, released a language model, GODEL, for generating “realistic” conversations along the lines of Google’s widely publicized LaMDA. And then we have some new text-to-image generators with a twist.

Meta’s new model, NLLB-200, is part of the company’s No Language Left Behind initiative to develop machine-powered translation capabilities for most of the world’s languages. Trained to understand languages such as Kamba (spoken by the Kamba people, a Bantu ethnic group in Kenya) and Lao (the official language of Laos), as well as 55 African languages not supported well or at all by previous translation systems, NLLB-200 will be used to translate languages on the Facebook News Feed and Instagram in addition to the Wikimedia Foundation’s Content Translation Tool, Meta recently announced.

The recent James Webb Space Telescope (JWST) guide camera’s test image looks really similar to Hubble’s deep fields, which are my favorites. I decided to take a long exposure of the same target to see what my telescope can see and compare it to JWST’s image. I found one really faint galaxy 26–32 million light-years away, and a cute planetary nebula called Abell 39. Pause and see if you can find it in my image.

- Scope: Celestron RASA 8.
- Mount: iOptron CEM40.
- Camera: ZWO ASI183MM Pro.
- Guide scope: ZWO Mini Guide Scope (120mm).
- Guide camera: ZWO ASI224MC.
- Filter: Astronomik MaxFR 12nm Ha filter.
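A deep long-exposure image like this is normally built by stacking many short subframes rather than one continuous exposure. A minimal sketch with NumPy, using simulated frames (the scene, noise levels, and "satellite trail" below are all made up for illustration), shows why median stacking both deepens the image and rejects outliers:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical example: 50 short "subframes" of the same field,
# a single faint source buried in noise in any one frame.
truth = np.zeros((64, 64))
truth[32, 32] = 5.0  # a faint one-pixel "galaxy"

frames = truth + rng.normal(scale=2.0, size=(50, 64, 64))  # sky/read noise
frames[10, 5, :] = 100.0  # a bright satellite trail ruining one frame

# Median stacking: per-pixel median across the 50 frames. Noise drops
# roughly with sqrt(N), and the one-frame trail vanishes entirely
# because the median ignores a single outlier out of 50 samples.
stacked = np.median(frames, axis=0)

print(stacked[32, 32])  # source now well above the residual noise
```

In practice the individual frames would first be dark-subtracted, flat-fielded, and aligned, but the core stacking step really is this simple.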

NASA article: https://www.nasa.gov/image-feature/countdown-to-the-webb-telescopes-first-images.

More of my astrophotography work on Instagram.

Multiple angles of Booster 7 experiencing an unexpected ignition during Raptor engine testing.

Video and Pictures from the NSF Robots. Edited by Jack (@theJackBeyer).

All content copyright to NSF. Not to be used elsewhere without explicit permission from NSF.



Quite the inspirational week, this one, with the complete set of JWST first images. Loads of Starship and Starbase news. Last week I mentioned that it was fire time for Starbase, and…WOW… I was not wrong there. SpaceX’s Starship Booster 7 has gone for repair after an explosion. Falcon 9 launched for both Starlink and, finally, CRS-25. We also had the very first launch of Vega C, Rocket Lab firing off another Electron, and more. So enough of this intro. Let’s crack on with it!

Everyday Astronaut — Elon Musk Explains SpaceX’s Raptor Engine!

End Screen Music — Isle of Rain by Savfk.

A new language model similar in scale to GPT-3 is being made freely available and could help to democratise access to AI.

BLOOM (which stands for BigScience Large Open-science Open-access Multilingual Language Model) has been developed by 1,000 volunteer researchers from over 70 countries and 250 institutions, supported by ethicists, philosophers, and legal experts, in a collaboration called BigScience. The project, coordinated by New York-based startup Hugging Face, used funding from the French government.

The new AI took more than a year of planning and training, including a final run of 117 days (11 March – 6 July) on Jean Zay, one of Europe’s most powerful supercomputers, located just south of Paris, France.

An international team of physicists has developed a new technique that allows researchers to study the interactions between neutrons inside an atom. In their paper published in the journal Nature, the group describes their laser spectroscopy measurement technique and how it can be used.

It has been nearly 100 years since scientists discovered that inside of every atom are protons, which give atoms their identity, as well as neutrons. And despite much study of subatomic particles, scientists still do not know exactly what sorts of interactions go on inside an atom. In this new effort, the researchers modified laser spectroscopy measurement techniques to study such interactions.

In this new work, the researchers began by looking at elements with a magic number (those that have highly stable configurations of protons and neutrons) and wound up using indium-131, which has a magic number of neutrons and also a proton hole, in which a nuclide has one fewer proton than a magic number. Indium-131 is, unfortunately, also notoriously unstable, which means it only exists for a short time before breaking down: it tends to last for just 0.28 seconds.
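Reading the quoted 0.28 seconds as indium-131's half-life (an assumption; the article only says how long it "tends to last"), the standard exponential-decay law shows why measurements must be so fast:

```python
import math

HALF_LIFE = 0.28  # seconds, the figure quoted above (assumed to be a half-life)

def fraction_remaining(t, t_half=HALF_LIFE):
    """N(t)/N0 = 2^(-t / t_half), the standard radioactive decay law."""
    return 2.0 ** (-t / t_half)

# After one half-life, half the sample survives; after a single
# second, well under a tenth remains -- hence the need for very
# rapid spectroscopic measurements.
print(round(fraction_remaining(0.28), 3))  # 0.5
print(round(fraction_remaining(1.0), 3))
```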