
AI that scans a construction site can spot when things are falling behind

Construction sites are vast jigsaws of people and parts that must be pieced together just so at just the right times. As projects get larger, mistakes and delays get more expensive. The consultancy McKinsey estimates that on-site mismanagement costs the construction industry $1.6 trillion a year. But typically you might only have five managers overseeing construction of a building with 1,500 rooms, says Roy Danon, founder and CEO of British-Israeli startup Buildots: “There’s no way a human can control that amount of detail.”

Danon thinks that AI can help. Buildots is developing an image recognition system that monitors every detail of an ongoing construction project and flags up delays or errors automatically. It is already being used by two of the biggest building firms in Europe, including UK construction giant Wates in a handful of large residential builds. Construction is essentially a kind of manufacturing, says Danon. If high-tech factories now use AI to manage their processes, why not construction sites?

AI is starting to change various aspects of construction, from design to self-driving diggers. Some companies even provide a kind of overall AI site inspector that matches images taken on site against a digital plan of the building. Now Buildots is making that process easier than ever by using video footage from GoPro cameras mounted on the hard hats of workers.
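To make the idea concrete, here is a minimal sketch of what "matching site footage against a digital plan" might boil down to once the image-recognition step has done its work: checking detected room-by-room progress against a planned schedule and flagging overdue stages. The class names, fields and dates below are illustrative assumptions, not Buildots' actual data model or API.

```python
# Hypothetical sketch: compare observed room states (as a vision system might
# report them) against a digital plan and flag stages that are falling behind.
from __future__ import annotations
from dataclasses import dataclass
from datetime import date


@dataclass
class PlannedTask:
    room: str
    stage: str   # e.g. "drywall", "wiring", "painting"
    due: date    # date the stage should be finished by


@dataclass
class Observation:
    room: str
    completed_stages: set[str]  # stages the vision model judged complete


def flag_delays(plan: list[PlannedTask],
                observed: dict[str, Observation],
                today: date) -> list[str]:
    """Return a warning for every planned stage that is past its due date
    but not seen as complete in the latest site scan."""
    warnings = []
    for task in plan:
        obs = observed.get(task.room)
        done = obs is not None and task.stage in obs.completed_stages
        if today > task.due and not done:
            warnings.append(f"{task.room}: '{task.stage}' overdue since {task.due}")
    return warnings


if __name__ == "__main__":
    plan = [
        PlannedTask("Room 101", "drywall", date(2020, 10, 1)),
        PlannedTask("Room 102", "wiring", date(2020, 10, 5)),
    ]
    observed = {
        "Room 101": Observation("Room 101", {"drywall"}),
        "Room 102": Observation("Room 102", set()),
    }
    for warning in flag_delays(plan, observed, today=date(2020, 10, 10)):
        print(warning)  # -> Room 102: 'wiring' overdue since 2020-10-05
```

The hard part in practice is of course the vision model that produces the observations; the point of the sketch is only that once detections exist, surfacing delays is a straightforward comparison against the plan.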

All-terrain microrobot flips through a live colon

A rectangular robot as tiny as a few human hairs can travel throughout a colon by doing back flips, Purdue University engineers have demonstrated in live animal models.

Why the back flips? Because the goal is to use these robots to transport drugs in humans, whose colons and other organs have rough terrain. Side flips work, too.

Why a back-flipping robot to transport drugs? Getting a drug directly to its target site could remove side effects, such as hair loss or stomach bleeding, that the drug may otherwise cause by interacting with other organs along the way.

A virtual reality game that integrates tactile experiences using biometric feedback

Over the past few decades, technological advances have enabled the development of increasingly sophisticated, immersive and realistic video games. One of the most noteworthy among these advances is virtual reality (VR), which allows users to experience games or other simulated environments as if they were actually navigating them, via the use of electronic wearable devices.

Most existing VR systems primarily focus on the sense of vision, using headsets that allow users to see what is happening in a game or in another simulated environment right before their eyes, rather than on a screen placed in front of them. While this can lead to highly engaging visual experiences, these experiences are not always matched by other types of sensory input.

Researchers at Nagoya University’s School of Informatics in Japan have recently created a new VR game that integrates immersive audiovisual experiences with tactile ones. This game, presented in a paper published in the Journal of Robotics, Networking and Artificial Life, uses a player’s biometric data to create a spherical object in the VR space that beats in alignment with their heart. The player can thus perceive the beating of their own heart visually, auditorily and tactually via this object.
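As a rough illustration of the mechanic, the sketch below maps a measured heart rate onto a pulsing sphere scale and a matching haptic amplitude. The function names and the decaying-pulse shape are assumptions for illustration, not the Nagoya team's implementation.

```python
# Illustrative sketch (not the published game's code): drive a sphere's scale
# and a controller vibration from a player's heart rate so both pulse in sync.
import math
import time


def beat_phase(t: float, bpm: float) -> float:
    """Fraction (0..1) of the way through the current heartbeat at time t."""
    period = 60.0 / bpm
    return (t % period) / period


def sphere_scale(t: float, bpm: float, base: float = 1.0, amp: float = 0.2) -> float:
    """Sphere swells sharply at each beat, then relaxes back to its base size."""
    return base + amp * math.exp(-8.0 * beat_phase(t, bpm))


def haptic_amplitude(t: float, bpm: float) -> float:
    """Vibration strength (0..1) to send to a controller, synced to the beat."""
    return math.exp(-12.0 * beat_phase(t, bpm))


if __name__ == "__main__":
    bpm = 72.0  # in the real game this would come from the biometric sensor
    start = time.time()
    for _ in range(10):
        t = time.time() - start
        print(f"t={t:4.1f}s  scale={sphere_scale(t, bpm):.3f}  "
              f"haptic={haptic_amplitude(t, bpm):.3f}")
        time.sleep(0.1)
```

In an actual VR engine the two functions would be called once per frame to update the sphere's transform and the controller's rumble, which is how the visual, auditory and tactile channels stay aligned with the same heartbeat signal.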

SpaceX targeting this weekend for Starlink launch from Kennedy Space Center

SpaceX is targeting this weekend for its next Falcon 9 rocket launch from Kennedy Space Center, this time with another batch of Starlink internet satellites.

If schedules hold, teams will give the go-ahead for the 230-foot rocket to launch from pad 39A at 8:27 a.m. Sunday, the opening of an instantaneous window. It must launch at that time or delay to another day.

About eight minutes after liftoff, the rocket’s 162-foot first stage will target an autonomous landing on the Of Course I Still Love You drone ship in the Atlantic Ocean. SpaceX’s fleet of ships and the booster should return to Port Canaveral a few days later.

Artificial Intelligence Used to ‘Redefine’ Alzheimer’s Disease

Summary: New artificial intelligence technology will analyze clinical data, brain images, and genetic information from Alzheimer’s patients to look for new biomarkers associated with the neurodegenerative disease.

Source: University of Pennsylvania

As successful Alzheimer’s disease drugs remain elusive, experts believe that identifying biomarkers — early biological signs of the disease — could be key to solving the treatment conundrum. However, the rapid collection of data from tens of thousands of Alzheimer’s patients far exceeds the scientific community’s ability to make sense of it.
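To give a flavor of the kind of analysis involved, the sketch below ranks candidate measurements drawn from clinical, imaging and genetic data by how well each separates patients from controls. It is a crude stand-in for data-driven biomarker screening; the records, feature names and scoring rule are made up and are not the Penn project's pipeline.

```python
# Illustrative sketch only: rank toy multimodal features by how strongly each
# one separates Alzheimer's patients (diagnosis=1) from controls (diagnosis=0).
from statistics import mean, pstdev

# Toy records mixing a clinical score, an imaging value and a genetic flag.
records = [
    {"diagnosis": 1, "memory_score": 18, "hippocampal_volume": 2.9, "apoe_e4": 1},
    {"diagnosis": 1, "memory_score": 20, "hippocampal_volume": 3.0, "apoe_e4": 1},
    {"diagnosis": 0, "memory_score": 27, "hippocampal_volume": 3.6, "apoe_e4": 0},
    {"diagnosis": 0, "memory_score": 29, "hippocampal_volume": 3.5, "apoe_e4": 1},
]


def separation(feature: str) -> float:
    """Absolute difference in group means, scaled by the overall spread."""
    cases = [r[feature] for r in records if r["diagnosis"] == 1]
    controls = [r[feature] for r in records if r["diagnosis"] == 0]
    spread = pstdev([r[feature] for r in records]) or 1.0
    return abs(mean(cases) - mean(controls)) / spread


features = ["memory_score", "hippocampal_volume", "apoe_e4"]
for name in sorted(features, key=separation, reverse=True):
    print(f"{name}: separation = {separation(name):.2f}")
```

Real biomarker discovery works at vastly larger scale and with far more careful statistics, which is exactly the gap between data volume and human analysis that the new AI effort is meant to close.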