Anchored by Jake Tapper, The Lead airs at 4 p.m. ET on CNN.
(CNN) – "The Jetsons" gave us the dream of a robot designed to help.
"The Terminator" gave us the nightmare of a machine designed to kill.
The future is apparently here.
Machines that are able to select and engage targets without a human being at the controls now exist in real life, not just in IMAX 3D.
This week, the United Nations is holding its first-ever convention on "lethal autonomous weapons systems," or killer robots.
"We're living in an incredibly important moment when it comes to the history of weapons and war," said Peter Singer of The Brookings Institution. "Now we're having to compare weapons by their intelligence, they're autonomy. That's something new that we haven't measured before."
Machines, including ones developed by the Pentagon, still require human direction, but can also take the brunt of physical risk on the battlefield.
Unmanned drones have enabled U.S. pilots to remotely strike targets as far away as Afghanistan.
Both South Korea and Israel have operated semi-autonomous lethal sentry robots near their borders. But what happens when robots can control themselves entirely, including taking lethal action? Who's accountable for those actions?
"Chronologically, the human decision may be something that's made hours, days, weeks months beforehand," says Singer. "That's the part that's truly complicating in terms of not just the politics of this, but the legal, the moral, the ethical side."
Movies like "The Terminator" show Hollywood's version of a machine without human ethics and emotion: uncontrolled and dangerous.
But proponents of real military technology argue the current laws of war would sufficiently apply to so-called "killer robots."
The United Nations will discuss whether this technology should be banned or restricted in any way from causing "unnecessary or unjustifiable suffering."
Groups like the Campaign to Stop Killer Robots don't want to give the technology that chance. The campaign uses a "friendly robot campaigner" to promote its message that any future "kill functions" should be prohibited outright.
"We're certainly not in the world of "The Terminator" yet, but it is true that the technologies are starting to do more on their own," says Singer. "The software programming may be the important part of the decision, and that's the part we're really not well equipped to deal with."