While discussing AI and its development, we all worry about the 'doomsday' scenario, where intelligent machines could go rogue and turn against humanity.
The concern is long-standing, and now a former Google engineer has highlighted the worst possible impact of killer robots, claiming they could one day cause mass atrocities, even global wars.
Here's more on what she said.
Ever since AI became mainstream, governments have started looking at the tech as a way to make autonomous machines.
They have invested heavily in such weapons, prompting leading tech figures, and now Google's former top engineer Laura Nolan, to call for action against the development of fully autonomous weaponized machines.
Notably, Google has since shelved Project Maven.
In a conversation with The Guardian, Nolan, who quit Google after being asked to work on a project to enhance US military drones, argued that weaponized machines should always remain under human control.
She emphasized that if there's no human element in the deployment process, the machines could accidentally start a war or do "calamitous things that they were not originally programmed for."
Nolan has briefed UN diplomats in New York and Geneva about the threat of killer robots.
She says these machines should be outlawed by an international treaty, like the one banning chemical weapons, as they could one day become 'unpredictable and dangerous'.
"What you are looking at are possible atrocities and unlawful killings even under laws of warfare," the engineer emphasized.
Nolan illustrated the risk of killer robots with a hypothetical scenario in which machines sent to confront enemy radar installations mistake armed locals hunting for food for the 'enemy' and open fire on them.
If such a law comes into force, the development, testing, and deployment of killer robots could be stopped for good.
However, it is important to act as soon as possible, because once these weapons get out into the wild, there would be no stopping them.
Then, as Nolan says, they "could start a flash war, destroy a nuclear power station and cause mass atrocities."