Swarms Of AI Drones To Patrol Europe’s Borders

Threat analysis and decisions will be made autonomously, with border patrol agents merely notified; however, this is a slippery slope that could all too easily be adopted into broader law enforcement practice. ⁃ TN Editor

Imagine you’re hiking through the woods near a border. Suddenly, you hear a mechanical buzzing, like a gigantic bee. Two quadcopters have spotted you and swoop in for a closer look. Antennae on both drones and on a nearby autonomous ground vehicle pick up the radio frequencies coming from the cell phone in your pocket. They send the signals to a central server, which triangulates your exact location and feeds it back to the drones. The robots close in.
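The article does not describe the actual algorithm behind that location fix; the sketch below is only an illustration of how a position could be estimated from range measurements taken at three known receiver positions, such as two drones and a ground vehicle. Every name and value in it is hypothetical, not Roborder's design.

```python
# Illustrative sketch only: estimate a 2D position from range measurements
# taken at three known receiver locations (e.g. two drones and a ground
# vehicle). A real system would use calibrated RF propagation models,
# time-of-arrival measurements, and noise filtering; none of that is shown.
import numpy as np

def trilaterate(receivers, distances):
    """Least-squares position fix from receiver coordinates and ranges."""
    receivers = np.asarray(receivers, dtype=float)   # shape (n, 2)
    distances = np.asarray(distances, dtype=float)   # shape (n,)
    x0, y0 = receivers[0]
    d0 = distances[0]
    # Linearize by subtracting the first range equation from the others.
    A = 2 * (receivers[1:] - receivers[0])
    b = (receivers[1:, 0] ** 2 - x0 ** 2
         + receivers[1:, 1] ** 2 - y0 ** 2
         + d0 ** 2 - distances[1:] ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Hypothetical example: receiver coordinates in metres, ranges inferred
# from the strength or timing of the phone's radio signal.
receivers = [(0.0, 0.0), (120.0, 10.0), (40.0, 150.0)]
distances = [95.0, 88.0, 70.0]
print(trilaterate(receivers, distances))
```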


Cameras and other sensors on the machines recognize you as human and try to ascertain your intentions. Are you a threat? Are you illegally crossing a border? Do you have a gun? Are you engaging in acts of terrorism or organized crime? The machines send video feeds to their human operator, a border guard in an office miles away, who checks the videos and decides that you are not a risk. The border guard pushes a button, and the robots disengage and continue on their patrol.


This is not science fiction. The European Union is financing a project to develop drones piloted by artificial intelligence and designed to autonomously patrol Europe’s borders. The drones will operate in swarms, coordinating and corroborating information among fleets of quadcopters, small fixed-wing airplanes, ground vehicles, submarines, and boats. Developers of the project, known as Roborder, say the robots will be able to identify humans and independently decide whether they represent a threat. If they determine that you may have committed a crime, they will notify border police.


President Donald Trump has used the specter of criminals crossing the southern border to stir nationalist political sentiment and energize his base. In Europe, two years after the height of the migration crisis that brought more than a million people to the continent, mostly from the Middle East and Africa, immigration remains a hot-button issue, even as the number of new arrivals has dropped. Political parties across the European Union are winning elections on anti-immigrant platforms and enacting increasingly restrictive border policies. Tech ethicists and privacy advocates worry that Roborder and projects like it outsource too much law enforcement work to nonhuman actors and could easily be weaponized against people in border areas.


“The development of these systems is a dark step into morally dangerous territory,” said Noel Sharkey, emeritus professor of robotics and artificial intelligence at Sheffield University in the U.K. and one of the founders of the International Committee for Robot Arms Control, a nonprofit that advocates against the military use of robotics. Sharkey lists examples of weaponized drones currently on the market: flying robots equipped with Tasers, pepper spray, rubber bullets, and other weapons. He warns of the implications of combining that technology with AI-based decision-making and using it in politically charged border zones. “It’s only a matter of time before a drone will be able to take action to stop people,” Sharkey told The Intercept.


Roborder’s developers also may be violating the terms of their funding, according to documents about the project obtained via European Union transparency regulations. The initiative is mostly financed by an €8 million EU research and innovation grant designed for projects that are exclusively nonmilitary, but Roborder’s developers acknowledge that parts of their proposed system involve military technology or could easily be converted for military use.


Much of the development of Roborder is classified, but The Intercept obtained internal reports related to ethical considerations and concerns about the program. That documentation was improperly redacted and inadvertently released in full.

In one of the reports, Roborder’s developers sought to address ethical criteria that are tied to their EU funding. Developers considered whether their work could be modified or enhanced to harm humans and what could happen if the technology or knowledge developed in the project “ended up in the wrong hands.” These ethical issues are raised, wrote the developers, when “research makes use of classified information, materials or techniques; dangerous or restricted materials[;] and if specific results of the research could present a danger to participants or to society as a whole.”


Roborder’s developers argued that these ethical concerns did not apply to their work, stating that their only goal was to develop and test the new technology, and that it would not be sold or transferred outside of the European Union during the life cycle of the project. But in interviews with The Intercept, project developers acknowledged that their technology could be repurposed and sold, even outside of Europe, after the European project cycle has finished, which is expected to happen next year.


Beyond the Roborder project, the ethics reports filed with the European Commission suggest a larger question: When it comes to new technology with the potential to be used against vulnerable people in places with few human rights protections, who decides what we should and should not develop?


Roborder won its funding grant in 2017 and has set out to develop a marketable prototype — “a swarm of robotics to support border monitoring” — by mid-2020. Its developers hope to build and equip a collection of air, sea, and land drones that can be combined and sent out on border patrol missions, scanning for “threats” autonomously based on information provided by human operators, said Stefanos Vrochidis, Roborder’s project manager.


The drones will employ optical, infrared, and thermal cameras; radar; and radio frequency sensors to determine threats along the border. Cell phone frequencies will be used to triangulate the location of people suspected of criminal activity, and cameras will identify humans, guns, vehicles, and other objects. “The main objective is to have as many sensors in the field as possible to assist patrol personnel,” said Kostas Ioannidis, Roborder’s technical manager.
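Roborder’s own detection models are not public. As a rough illustration of the camera side of such a pipeline, the sketch below runs a generic COCO-pretrained detector over a single frame and flags people and vehicles above a confidence threshold. The model choice, labels, threshold, and file name are all assumptions made for the example, not the project’s design; detecting guns, for instance, would require a custom-trained model, since weapons are not a standard COCO category.

```python
# Illustrative sketch only: screening one camera frame for people and
# vehicles with an off-the-shelf, COCO-pretrained detector. This stands in
# for the kind of object recognition the article describes; it is not
# Roborder's classifier, and all thresholds and labels are arbitrary.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# COCO class indices for the categories this sketch cares about.
COCO_LABELS_OF_INTEREST = {1: "person", 3: "car", 8: "truck"}

def flag_detections(image_path, score_threshold=0.7):
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
    model.eval()
    frame = to_tensor(Image.open(image_path).convert("RGB"))  # (3, H, W) in [0, 1]
    with torch.no_grad():
        output = model([frame])[0]
    hits = []
    for label, score, box in zip(output["labels"], output["scores"], output["boxes"]):
        name = COCO_LABELS_OF_INTEREST.get(int(label))
        if name and float(score) >= score_threshold:
            hits.append((name, float(score), [round(v, 1) for v in box.tolist()]))
    # In the system the article describes, a human operator would review these.
    return hits

print(flag_detections("frame_0001.jpg"))  # hypothetical file name
```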


Read full story here…

