
Astronomers have identified an exoplanet named Enaiposha, also known as GJ 1214 b, located 47 light-years from Earth. It was initially classified as a mini-Neptune, but further observations suggest it may belong to a different planetary category.

A white dwarf and a red dwarf star have been discovered closely orbiting each other and emitting radio pulses every two hours. The discovery shows that it isn’t just neutron stars that emit such pulses, though these pulses are spaced unusually far apart.

An international team of astronomers led by Dr Iris de Ruiter, now at the University of Sydney, has shown that a white dwarf and a red dwarf star orbiting each other every two hours are emitting radio pulses.

Thanks to follow-up observations using optical and X-ray telescopes, the researchers were able to determine the origin of these pulses with certainty. For the first time, the findings explain the source of such radio emissions, which have been found across the Milky Way galaxy.

Imagine an automated delivery vehicle rushing to complete a grocery drop-off while you are hurrying to meet friends for a long-awaited dinner. At a busy intersection, you both arrive at the same time. Do you slow down to give it space as it maneuvers around a corner? Or do you expect it to stop and let you pass, even if normal traffic etiquette suggests it should go first?

“As autonomous driving becomes a reality, these everyday encounters will define how we share the road with intelligent machines,” says Dr. Jurgis Karpus from the Chair of Philosophy of Mind at LMU. He explains that the arrival of fully automated self-driving cars signals a shift from us merely using AI—like Google Translate or ChatGPT—to actively interacting with it. The key difference? In busy traffic, our interests will not always align with those of the self-driving cars we encounter. We have to interact with them, even if we ourselves are not using them.

In a study published recently in the journal Scientific Reports, researchers from LMU Munich and Waseda University in Tokyo found that people are far more likely to take advantage of cooperative artificial agents than of similarly cooperative fellow humans. “After all, cutting off a robot in traffic doesn’t hurt its feelings,” says Karpus, lead author of the study.

Why would anyone need this level of wavelength detail in an image? There are many reasons. Car manufacturers want to predict exactly how paint will look under different lighting. Scientists use spectral imaging to identify materials by their unique light signatures. And rendering specialists need it to accurately simulate real-world optical effects like dispersion (rainbows from prisms, for example) and fluorescence.

For instance, past Ars Technica coverage has highlighted how astronomers analyzed spectral emission lines from a gamma-ray burst to identify chemicals in the explosion, how physicists reconstructed original colors in pioneering 19th century photographs, and how multispectral imaging revealed hidden, centuries-old text and annotations on medieval manuscripts like the Voynich Manuscript, sometimes even uncovering the identities of past readers or scribes through faint surface etchings.

The current standard format for storing this kind of data, OpenEXR, wasn’t designed with these massive spectral requirements in mind. Even with built-in lossless compression methods like ZIP, the files remain unwieldy for practical work as these methods struggle with the large number of spectral channels.
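To make the scale of the problem concrete, here is a minimal sketch, not taken from the article, of how a densely sampled spectral image might be written with the classic Python OpenEXR bindings. The file name, the 31-band wavelength grid, and the "S0.<wavelength>nm" channel-naming scheme are assumptions chosen purely for illustration; only the ZIP compression setting corresponds to the built-in lossless method mentioned above.

import numpy as np
import OpenEXR
import Imath

width, height = 640, 480
wavelengths = range(400, 701, 10)  # 31 bands from 400 nm to 700 nm (illustrative)
channel_names = [f"S0.{wl}nm" for wl in wavelengths]  # hypothetical naming scheme

# Build a header with one half-float channel per wavelength band,
# using OpenEXR's built-in lossless ZIP compression.
header = OpenEXR.Header(width, height)
header["compression"] = Imath.Compression(Imath.Compression.ZIP_COMPRESSION)
half = Imath.Channel(Imath.PixelType(Imath.PixelType.HALF))
header["channels"] = {name: half for name in channel_names}

# Dummy spectral data: one 16-bit float plane per band.
planes = {
    name: np.random.rand(height, width).astype(np.float16).tobytes()
    for name in channel_names
}

out = OpenEXR.OutputFile("spectral_test.exr", header)
out.writePixels(planes)
out.close()

Even this modest 31-band frame carries roughly ten times the pixel data of an ordinary RGB image, which illustrates why general-purpose lossless schemes such as ZIP leave spectral files unwieldy in practice.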

The warning came from one of the co-authors of Severe space weather impacts on UK critical national infrastructure, a government-funded report produced by the Space Weather Instrumentation, Measurement, Modelling and Risk (SWIMMR) S6 project group.

The report said the government, regulators and CNI operators must “develop space weather preparedness plans” for CNI.

Space weather “is caused by disturbances from active regions of the Sun”, the report says.