Over the weekend, experts on military artificial intelligence from more than 80 world governments converged on the U.N. offices in Geneva for the start of a week’s talks on autonomous weapons systems. Many of them fear that after gunpowder and nuclear weapons, we are now on the brink of a “third revolution in warfare,” heralded by killer robots — the fully autonomous weapons that could decide who to target and kill without human input. With autonomous technology already in development in several countries, the talks mark a crucial point for governments and activists who believe the U.N. should play a key role in regulating the technology.

The meeting comes at a critical juncture. In July, Kalashnikov, the Russian government’s main defense contractor, announced it was developing a weapon that uses neural networks to make “shoot/no-shoot” decisions. In January 2017, the U.S. Department of Defense released a video showing an autonomous swarm of 103 drones flying over California. Nobody was in control of the drones; their flight paths were choreographed in real time by an algorithm. The drones “are a collective organism, sharing one distributed brain for decision-making and adapting to each other like swarms in nature,” a spokesman said. The drones in the video were not weaponized — but the technology to weaponize them is rapidly maturing.
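The Pentagon has not published the swarm’s control algorithm, so any concrete code is necessarily a guess. The sketch below is a minimal boids-style flocking loop in Python, the classic model of the decentralized, neighbor-following behavior the spokesman describes; every name and parameter here is illustrative, not taken from the actual system.

```python
# Minimal boids-style flocking sketch -- an illustration of decentralized
# swarm control, NOT the actual (unpublished) algorithm from the DoD demo.
# Each drone steers using only information about nearby drones; there is
# no central controller, which is roughly what "one distributed brain"
# describes.
import random

NUM_DRONES = 103          # swarm size from the DoD demo
NEIGHBOR_RADIUS = 10.0    # assumed sensing range (arbitrary units)
SEPARATION_RADIUS = 2.0   # assumed minimum comfortable spacing

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

class Drone:
    def __init__(self):
        self.pos = [random.uniform(0, 50), random.uniform(0, 50)]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

def step(drones, dt=0.1):
    for d in drones:
        neighbors = [o for o in drones
                     if o is not d and dist(d.pos, o.pos) < NEIGHBOR_RADIUS]
        if not neighbors:
            continue
        n = len(neighbors)
        # Cohesion: steer toward the neighbors' center of mass.
        cx = sum(o.pos[0] for o in neighbors) / n - d.pos[0]
        cy = sum(o.pos[1] for o in neighbors) / n - d.pos[1]
        # Alignment: match the neighbors' average velocity.
        ax = sum(o.vel[0] for o in neighbors) / n
        ay = sum(o.vel[1] for o in neighbors) / n
        # Separation: push away from any drone that is too close.
        sx = sum(d.pos[0] - o.pos[0] for o in neighbors
                 if dist(d.pos, o.pos) < SEPARATION_RADIUS)
        sy = sum(d.pos[1] - o.pos[1] for o in neighbors
                 if dist(d.pos, o.pos) < SEPARATION_RADIUS)
        d.vel[0] += 0.01 * cx + 0.05 * ax + 0.1 * sx
        d.vel[1] += 0.01 * cy + 0.05 * ay + 0.1 * sy
    for d in drones:
        d.pos[0] += d.vel[0] * dt
        d.pos[1] += d.vel[1] * dt

swarm = [Drone() for _ in range(NUM_DRONES)]
for _ in range(200):      # run a short simulation
    step(swarm)
```

Replace the hand-tuned weights with learned policies and add a shared objective, and the line between choreography and autonomous decision-making starts to blur, which is exactly what worries the Geneva delegates.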

This April also marks five years since the launch of the International Campaign to Stop Killer Robots, which called for “urgent action to preemptively ban the lethal robot weapons that would be able to select and attack targets without any human intervention.” The 2013 launch letter — signed by a Nobel Peace Prize laureate and the directors of several NGOs — warned that such weapons could be deployed within the next 20 years and would “give machines the power to decide who lives or dies on the battlefield.”

Read more

ALBUQUERQUE, NM – Applied Research Associates, Inc. (ARA) is developing a compact, completely self-contained directed-energy weapon that it describes as the first of its size and specifications, setting it apart from existing systems.

The Silent Saber is a high-power, 1.5-kilowatt fiber laser packaged in a backpack along with its power supplies and a unique thermal-control system. A telescope with multiple possible mounting configurations attaches to the existing Picatinny rails of a soldier’s rifle, and the laser is connected to the telescope by a fiber-optic cable.

“This tool provides options to the warfighter to support explosive ordnance disposal, counter-infrastructure, and counter-drone missions,” said Principal Engineer Joseph Paranto, ARA’s Director of Directed Energy Systems.

Read more

“Be very, very afraid. As this extraordinary book reveals, we are fast sailing into an era in which big life-and-death decisions in war will be made not by men and women, but by artificial intelligence” — @stavridisj’s review of @paul_scharre’s upcoming book. Pre-order yours now:


A Pentagon defense expert and former U.S. Army Ranger explores what it would mean to give machines authority over the ultimate decision of life or death.

What happens when a Predator drone has as much autonomy as a Google car? Or when a weapon that can hunt its own targets is hacked? Although it sounds like science fiction, the technology already exists to create weapons that can attack targets without human input. Paul Scharre, a leading expert in emerging weapons technologies, draws on deep research and firsthand experience to explore how these next-generation weapons are changing warfare.

Scharre’s far-ranging investigation examines the emergence of autonomous weapons, the movement to ban them, and the legal and ethical issues surrounding their use. He spotlights artificial intelligence in military technology, spanning decades of innovation from German noise-seeking Wren torpedoes in World War II―antecedents of today’s homing missiles―to autonomous cyber weapons, submarine-hunting robot ships, and robot tank armies. Through interviews with defense experts, ethicists, psychologists, and activists, Scharre surveys what challenges might face “centaur warfighters” on future battlefields, which will combine human and machine cognition. We’ve made tremendous technological progress in the past few decades, but we have also glimpsed the terrifying mishaps that can result from complex automated systems―such as when advanced F-22 fighter jets experienced a computer meltdown the first time they flew over the International Date Line.
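The F-22 failure is a telling example: public accounts attribute it to navigation software that mishandled the longitude discontinuity at the International Date Line. The avionics code itself was never published, so the Python sketch below shows only the generic failure mode, a naive east-west difference that jumps wildly when a path crosses ±180°, alongside the standard wraparound fix.

```python
# Generic illustration of the date-line failure mode attributed to the
# F-22 incident. The actual avionics code was never published, so this is
# a hypothetical reconstruction of the bug class, not the real flaw.

def naive_delta_lon(lon_a, lon_b):
    """Naive difference: treats longitude as an ordinary number line."""
    return lon_b - lon_a

def wrapped_delta_lon(lon_a, lon_b):
    """Correct difference, wrapped into (-180, 180], so crossing the
    antimeridian is a small step rather than a near-360-degree jump."""
    return (lon_b - lon_a + 180.0) % 360.0 - 180.0

# A jet flying east across the date line: 179.5 E -> 179.5 W.
print(naive_delta_lon(179.5, -179.5))    # -359.0  (nonsensical jump)
print(wrapped_delta_lon(179.5, -179.5))  # 1.0     (the actual 1-degree move)
```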

Read more

‘Google should not be in the business of war’: Over 3,000 employees pen letter urging CEO to pull out of the Pentagon’s controversial AI drone research, citing firm’s ‘Don’t Be Evil’ motto…


More than 3,000 Google employees have penned an open letter calling upon the internet giant’s CEO to end its controversial ‘Project Maven’ deal.

Calling the deal the ‘business of war’, they said Google boss Sundar Pichai should ‘cancel this project immediately’.

It was revealed last month that Google is allowing the Pentagon to use some of its artificial intelligence technologies to analyze drone footage.

Read more

On the outskirts of Beijing, a policeman peers over his glasses at a driver stopped at a motorway checkpoint. As he looks at the man’s face, a tiny camera in one of the lenses of his glasses records his features and checks them against a national database.

The artificial intelligence-powered glasses are what Chinese citizens refer to as “black tech”, because they spot delinquents on the country’s “blacklist”. Other examples include robots for crowd control, drones that hover over the country’s borders, and intelligent systems to track behaviour online. Some reports claim the government has installed scanners that can forcibly read information from smartphones.
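The reports don’t describe how the matching works internally, but systems of this kind typically reduce a face image to a numeric embedding vector and search a database for the most similar stored vector. A minimal sketch of that lookup step, with every threshold and record purely illustrative:

```python
# Minimal sketch of embedding-based face lookup of the kind such systems
# typically use. Purely illustrative: the deployed system's details are
# not public. A real pipeline would get embeddings from a trained neural
# network; here they are random stand-ins.
import math
import random

EMBED_DIM = 128
MATCH_THRESHOLD = 0.6   # assumed cosine-similarity cutoff

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def lookup(probe, database):
    """Return the best-matching identity, or None if nothing clears the threshold."""
    best_id, best_sim = None, MATCH_THRESHOLD
    for identity, embedding in database.items():
        sim = cosine_similarity(probe, embedding)
        if sim > best_sim:
            best_id, best_sim = identity, sim
    return best_id

# Stand-in database: identity -> embedding vector.
database = {f"person_{i}": [random.gauss(0, 1) for _ in range(EMBED_DIM)]
            for i in range(1000)}
probe = database["person_42"][:]   # pretend the camera just saw person_42
print(lookup(probe, database))     # -> "person_42"
```

In a deployed system the linear scan would be replaced by an approximate nearest-neighbor index to handle millions of identities, but the comparison logic stays this simple.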

In the last two weeks, Facebook has been mired in a privacy storm in the UK and US over potential misuse of personal data. But such an event might baffle many in China, where the country’s surveillance culture eclipses anything Facebook has done.

Read more

In the United States, unmanned aerial vehicles, or drones, can’t be flown higher than 400 feet, a limit meant to keep them well clear of larger aircraft carrying pilots and passengers. Regulations in Russia, however, aren’t as strict, enabling drone pilot Denis Koryakin to fly his homebuilt, 2.3-pound craft to a staggering height of almost 33,000 feet.

For comparison, a 747 has a maximum ceiling of just over 45,000 feet, and most airliners cruise at roughly the altitude this tiny drone reached. As amazing as the view from 33,000 feet is, it’s a dangerous stunt that would land you in heaps of trouble in the US if you were caught. But Koryakin’s flight took place near the city of Strezhevoy, in a part of Siberia that is notoriously frigid and sparsely populated. Russia also doesn’t appear to regulate how high a drone can be flown, but hopefully stunts like this don’t become too commonplace.

[YouTube via DPReview].

Read more