Researchers at Intel Labs have modded Grand Theft Auto V using a neural network and a dataset of photos of German cities. The results look unsettlingly photorealistic.
Experimental network connects 40 leading universities to prepare for an AI-driven society five to 10 years down the track.
The network will serve as a backbone for the China Environment for Network Innovations (CENI), a national research facility connecting the largest cities in China, allowing researchers to verify the performance and security of future network communications technology before commercial use.
NASA’s newest Mars rover is beginning to study the floor of an ancient crater that once held a lake.
NASA’s Perseverance rover has been busy serving as a communications base station for the Ingenuity Mars Helicopter and documenting the rotorcraft’s historic flights. But the rover has also been busy focusing its science instruments on rocks that lie on the floor of Jezero Crater.
What insights they turn up will help scientists create a timeline of when an ancient lake formed there, when it dried, and when sediment began piling up in the delta that formed in the crater long ago. Understanding this timeline should help date rock samples – to be collected later in the mission – that might preserve a record of ancient microbes.
Researchers in Singapore have found a way of controlling a Venus flytrap using electric signals from a smartphone, an innovation they hope will have a range of uses from robotics to employing the plants as environmental sensors.
Luo Yifei, a researcher at Singapore’s Nanyang Technological University (NTU), showed in a demonstration how a signal from a smartphone app sent to tiny electrodes attached to the plant could make its trap close as it does when catching a fly.
“Plants are like humans, they generate electric signals, like the ECG (electrocardiogram) from our hearts,” said Luo, who works at NTU’s School of Materials Science and Engineering.
Despite years of hype, virtual reality headsets have yet to topple TV or computer screens as the go-to devices for video viewing.
One reason: VR can make users feel sick. Nausea and eye strain can result because VR creates an illusion of 3D viewing although the user is in fact staring at a fixed-distance 2D display. The solution for better 3D visualization could lie in a 60-year-old technology remade for the digital world: holograms.
Holograms deliver an exceptional representation of the 3D world around us. Plus, they’re beautiful. (Go ahead — check out the holographic dove on your Visa card.) Holograms offer a shifting perspective based on the viewer’s position, and they allow the eye to adjust focal depth, alternately focusing on foreground and background.
Cosmologists love universe simulations. Even models covering hundreds of millions of light years can be useful for understanding fundamental aspects of cosmology and the early universe. There’s just one problem: they’re extremely computationally intensive. A 500 million light year swath of the universe could take more than three weeks to simulate. Now, scientists led by Yin Li at the Flatiron Institute have developed a way to run these cosmically huge models 1,000 times faster. That same 500 million light year swath could then be simulated in 36 minutes.
Older algorithms took such a long time in part because of a tradeoff. Existing models could either simulate a very detailed, very small slice of the cosmos or a vaguely detailed larger slice of it. They could provide either high resolution or a large area to study, not both.
To overcome this dichotomy, Dr. Li turned to an AI technique called a generative adversarial network (GAN). A GAN pits two competing networks against each other: a generator that produces candidate outputs and a discriminator that tries to distinguish those outputs from real data. With each training iteration both networks improve, until the generator’s output becomes accurate enough to routinely fool the discriminator.
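The adversarial loop can be illustrated with a deliberately tiny sketch — not the researchers’ actual super-resolution model. Here a one-parameter generator g(z) = a·z + b learns to imitate samples from a target Gaussian, while a logistic-regression discriminator tries to tell its output from the real data; every name and number below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data: samples from a 1-D Gaussian the generator must imitate.
def real_batch(n):
    return rng.normal(4.0, 1.25, n)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a, b = 1.0, 0.0   # generator: g(z) = a*z + b
w, c = 0.0, 0.0   # discriminator: D(x) = sigmoid(w*x + c)

lr, n = 0.05, 64
for step in range(2000):
    # --- Discriminator update: raise D on real samples, lower it on fakes ---
    xr = real_batch(n)
    z = rng.normal(size=n)
    xf = a * z + b
    dr, df = sigmoid(w * xr + c), sigmoid(w * xf + c)
    grad_w = np.mean((1 - dr) * xr) - np.mean(df * xf)
    grad_c = np.mean(1 - dr) - np.mean(df)
    w += lr * grad_w   # gradient ascent on the discriminator objective
    c += lr * grad_c

    # --- Generator update: make D(fake) look real (non-saturating loss) ---
    z = rng.normal(size=n)
    xf = a * z + b
    df = sigmoid(w * xf + c)
    # d/dg of -log D(g) is -(1 - D) * w; chain rule through g = a*z + b
    a -= lr * np.mean(-(1 - df) * w * z)
    b -= lr * np.mean(-(1 - df) * w)

# After training, generated samples should cluster near the real mean of 4.
samples = a * rng.normal(size=10_000) + b
print(round(float(np.mean(samples)), 2))
```

In the full-scale version both players are deep neural networks and the “real data” are high-resolution simulation patches, but the alternating update structure is the same.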
Last August, several dozen military drones and tanklike robots took to the skies and roads 40 miles south of Seattle. Their mission: Find terrorists suspected of hiding among several buildings.
So many robots were involved in the operation that no human operator could keep a close eye on all of them, so the robots were given instructions to find—and eliminate—enemy combatants when necessary.
The mission was just an exercise, organized by the Defense Advanced Research Projects Agency, a blue-sky research division of the Pentagon; the robots were armed with nothing more lethal than radio transmitters designed to simulate interactions with both friendly and enemy robots.
Using neural networks, Flatiron Institute research fellow Yin Li and his colleagues simulated vast, complex universes in a fraction of the time it takes with conventional methods.
Using a bit of machine learning magic, astrophysicists can now simulate vast, complex universes in a thousandth of the time it takes with conventional methods. The new approach will help usher in a new era in high-resolution cosmological simulations, its creators report in a study published online on May 4, 2021, in Proceedings of the National Academy of Sciences.
“At the moment, constraints on computation time usually mean we cannot simulate the universe at both high resolution and large volume,” says study lead author Yin Li, an astrophysicist at the Flatiron Institute in New York City. “With our new technique, it’s possible to have both efficiently. In the future, these AI-based methods will become the norm for certain applications.”
Blue Robotics, a leading developer of marine robotics systems and components, has partnered with Unmanned Systems Technology (“UST”) to demonstrate their expertise in this field. The ‘Silver’ profile highlights how their underwater ROVs (remotely operated vehicles), thrusters and accessories enable a wide range of missions for commercial, research and exploration applications.
The BlueROV2 is a high-performance, highly configurable ROV designed for underwater inspections, research and ocean exploration. With open-source hardware and software, the platform offers an unprecedented level of flexibility and expandability, allowing users to easily make improvements and upgrades to take on a huge variety of missions down to depths of 100 m (330 ft).
The ROV incorporates six Blue Robotics T200 thrusters in a vectored configuration, delivering an excellent thrust-to-weight ratio and the ability to move precisely in any direction. The system can be expanded to eight thrusters via a Heavy Configuration Retrofit Kit, and features adjustable gain levels for precision control at extremely low speeds as well as high power to overcome currents and carry heavy loads. The BlueROV2 is supplied with a Fathom ROV tether, with length options from 25 m (82 ft) up to 300 m (984 ft).
Money will be used to support innovation in core technologies, the city’s Communist Party chief Wang Weizhong says.
Local media reports quoted Wang as saying that artificial intelligence, 6G, quantum technology, driverless vehicles, intelligent networks and other “frontier areas” would be the focus of Shenzhen’s investment plans, while the value of its digital economy would account for more than 31 per cent of GDP by 2025.