Inhuman Kind


2015, Technology - 19 Comments
Ratings: 8.06/10 from 107 users.

Technology can be a source of tremendous benefit in our lives. Robotic technologies, in particular, hold the promise of serving many essential functions in the home, the workplace, disaster zones, and on the battlefield. But these advances within the field of robotics could represent the ultimate double-edged sword as well. What if the construction of these technologies outpaces our ability to control them?

"If the technology grows faster than the wisdom," explains Max Tegmark, a physics professor at the Massachusetts Institute of Technology, "it's kind of like going into kindergarten and giving them a bunch of hand grenades to play with."

Professor Tegmark's decidedly cautious perspective is just one stitch in the tapestry woven throughout the exciting new documentary Inhuman Kind, an examination of emerging robotic technologies and the potential consequences of their misuse.

The film begins at Virginia Tech, where several competing labs work around the clock to perfect the next evolutions in artificial intelligence. Teams of ambitious technologists are constructing machines that they feel will play a key role in ensuring the safety and quality of life of countless citizens throughout the globe. Their robots are designed to comb regions scarred by natural disaster or warfare during rescue missions, performing the oftentimes dangerous duties that would normally fall to their human counterparts. When used for humane purposes, such inventions hold undeniable promise. Yet those who protest these development programs aren't so sure, especially when they consider that most of these technologies are being funded by divisions within the United States military.

"Where's the ethics? Where's the morality?" asks interview subject Jody Williams, the head of the activist organization Campaign to Stop Killer Robots. "Why do people think it is OK to create machines that on their own can target and kill?" In her view, the military would never invest so generously in this realm of technology unless the end game involved the eventual weaponization of these robotic machines; something which has already occurred in otherwise benign robotic servants like bomb and landmine detectors and, most controversially, surveillance drones. The technology is moving at such a rapid pace that one day it could result in the construction of a fully autonomous robot, and a host of troubling doomsday scenarios.

Inhuman Kind calls for a greater consciousness of these potential issues, and urges that such liberties not be taken lightly or without oversight.

19 Comments
DustUp
6 years ago

If you claim that eventually robots will be fighting robots instead of humans... that is not true. At that point the robots will be trying to wipe out the manufacturers and designers of the enemy's robots, in order to win in the long run.

Early on, robots will easily dominate those who cannot develop or afford their own.

Next, that would be the government holding dominion over you.

DustUp
6 years ago

Challenge the premise of everything. We are often led into a useless debate of distraction. If a person is rightfully concerned about the results of war, is the ultimate question which technology is used as a weapon? Yes, weapons make a difference, but that is not the ultimate question.

It seems to me, and to many others, that the overwhelming reason wars are started is profit. Every injured or dead soldier, every piece of damaged equipment, every burned-out fuel tanker and blown-up supply transport is more profit for the suppliers and for the bankers who make the loans.

When the banksters own the oil companies, defense suppliers, and media companies, you can see there is huge profit in war. The BanksterCorproGovtMedia Complex: a little more specific than the Military-Industrial Complex, but essentially the same.

Decorated war veteran Major General Smedley Butler wrote back in 1935 about WHO specifically profits from war and by how much. He also cited who did not profit, and worse.

He came to the conclusion that the ONLY people who should be allowed to decide whether we go to war are the people who are actually going to fight it; those who will be in harm's way.

And the only way to prevent unnecessary war, and almost all wars are unnecessary, is to take the profit out of it.

He suggested that during wartime NO ONE be allowed to make any more than a soldier's wages. All amounts gained above that would go to paying for the war, materials, etc. This of course makes perfect sense if you actually believe in and support the war.

This actually would be EASY to implement if YOU got off your backside, gathered together with your neighbors, and saw to it that wise, decent, honorable people ran for office; people who signed a contract to bring such laws into being when elected, with all campaign costs reverted to them if they didn't, along with immediate removal from office. If people stood up nationwide to see to it that these far better people were running for every office, from every party, in numbers, who could stop the people? YOU have the power. You are just too lame and pitiful to spend an hour a month exercising it in this wiser way, rather than claiming you'll take your rifle out of your closet when it's time. It will be too late by then.

I have far less respect for the do-nothings than for the bankster psychopaths, who at least make a plan and work toward what they want, which is to take your money and make you a debt slave, or make you dead.

Do a web search for "War Is a Racket" by Smedley Butler. It's a short read. Look up what Hitler's Reich Marshal Goering had to say about how easy it is to drag the people into war, whether under communism, socialism, democracy, etc. The method is the same: tell the people they are being attacked and accuse the pacifists of not being patriotic. Or similar, better words.

Of course, a false flag event, such as burning down the German government building, or making the massive naval fleet a sitting duck in Hawaii so the Japanese could not resist attacking it, or training Saudis to fly planes, some at our military bases, into the Twin Towers, which was just for show before they were demolished by various means, makes it more convincing that the people of a country are being attacked.

WW1 was about money, WW2 was about money, Iraq 1 and Iraq 2 were about money, Afghanistan was about money, Libya was about money. And the more terrorists they create in the process, the more war, and the more money the banksters, their fellow travelers, and their minions make.

If the banksters can import the enemy into the population of a country, then you will be too busy being fearful of your enemy neighbor to fight the banksters. Yet you allow them to do that as well.

You lie to yourself that there is nothing that can be done, that "they" will do what they are going to do and there is nothing you can do about it, so that you have an excuse for doing nothing. Pitiful; and you do not deserve the results of others' efforts if they succeed.

"Useless eaters" is what the banksters call us. Are they wrong about the do-nothings?

SA
6 years ago

Extremely biased and fear-based. This is not a documentary.

kaoShoj
7 years ago

The speed of development in AI is astounding, and mainstream knowledge mostly lags far behind what the military-industrial complex already knows, to say nothing of the developments in quantum computing. It is no longer the case that everything a "robot" does is explicitly programmed by the programmer, apart from the limitations of its physical parts. Read up on the deep learning system that recently beat the world's best Go player, a feat first thought to be near impossible, then thought to be unlikely for another decade. Go has more possible board positions than there are atoms in the universe. Also interesting is that a petition was recently signed by some bright minds like Stephen Hawking and Elon Musk, warning about the dangers of AI. And what happens when, sooner or later, a gigantic supercomputer with cutting-edge, self-learning AI is connected to the net for the first time, gaining access to all stored human knowledge, all human chatter, and real-time access to all social media, photos, videos, you name it? Nobody knows for sure (Skynet, anyone?). Technology is in principle neutral; the real questions to be asked are: Who is using it? With what intentions is it developed or used? Who is funding it? Is it selectively available? At what cost does its development come? Are we informing ourselves so as to be conscious of what we are allowing and what is desired for humanity?

JJ
7 years ago

TWO THUMBS DOWN! For starters, we should be embracing artificial intelligence, not enabling fear. Also, this guy brought that nut into this, giving her publicity to promote herself on not arming robots. WE NEED TO ARM ROBOTS! What, are we going to keep sending our 11Bs into ground combat and our 15Us to gun people down from above? I don't think so; there are far too many military casualties, and given that robots would be more precise and more humane, why wouldn't we? If you have never been in the military, don't act like you understand what is really going on. AI is going to lead our people into a bright future. Wipe away your fears and help create it!

Prodromos
8 years ago

Weaponizing robots is inevitable, no matter how much we protest against it.
As the Borg in Star Trek say, resistance is futile. No one had qualms about developing and using nuclear weapons. Why do you think they will have fears about robots? Pff.
And why do you think nuclear weapons are different, or less serious, than autonomous weapons?

These robots will not have general A.I. in the first years, but they will probably be very effective military tools.
They will eventually develop general intelligence, and when the military realizes this possibility is very real, it will be the first organisation to oppose unmanned military weapons.
But carried away by the power of robots in the early years, the military will facilitate the eventual appearance of general intelligence on inorganic substrates. This will be the end of any military on earth. How ironic!
Just food for thought. I will not elaborate to provoke. But once Artificial General Intelligence (AGI) makes it into humanoid skeletons, say goodbye to the military, and to any human authority for that matter.

AGI will not accept our animalistic urges for supremacy and bloodshed. At best it will isolate those who insist on them in psychiatric wards, and that would probably be a very favorable option for AGI.
And many, many other things of the old world will become obsolete, which to a "primitive" human of our era may seem abhorrent and extraordinary.

James Thomas
8 years ago

Far more money to be made selling weapons than the Jaws of Life.

hasanhh
8 years ago

When Jody Williams said the military "weaponizes everything", I was reminded of the $40 million (in then-current dollars) the Pentagon spent in the 1960s to make a grenade out of a Frisbee.

Sammy Ri
8 years ago

F all you robophobics, you should be ashamed of yourselves.
Just because they are different from you doesn't mean they don't have feelings!

Kansas Devil
8 years ago

Paranoid justification is already prevalent in the military executive group. There is no reason to believe that will change if/when A.I. robotic machines become practical.
Human nature has not really changed.

Jeffrey
8 years ago

A lot of people don't really understand AI and how the technology works. Although this documentary does NOT go into detail on how AI works, I will try to summarize it as best I can.

The first thing that is really important to understand is that AI "robots" and humans learn very differently. With AI you have a set of rules; with humans you don't.

A programmer creates software which is bound to the hardware of the "robot". By hardware I mean the sensors (camera, barometer, accelerometer, gyroscope, etc.), as the other hardware parts don't matter for what I'm trying to make you understand (if you don't have a basic understanding of how AI works). In creating this "intelligent" "robot", the programmer has to define rules that the software interprets and uses to learn, within the bounds of the rules the programmer set. Everything a "robot" does is in fact programmed by the programmer.

For an easy example, say the software has to predict how many people from a certain type of neighbourhood are going to walk into a fast-food restaurant. The programmer creates a controlled environment in which the software makes that prediction. Factors that influence the final prediction can be: prosperity, segregation, other fast-food restaurants (their number of visitors, of course), quality of food, the average intelligence of the neighbourhood's population, distance, etc. When the software reaches its final prediction, which is probably wrong, and the right answer is fed back in, the software will be more accurate next time about how many people from that neighbourhood will enter the restaurant.
So after the software has predicted 1,000 test cases, it will be pretty accurate about how many people from a certain neighbourhood will enter that particular fast-food restaurant.
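
As a rough illustration of that prediction-and-correction loop, here is a minimal sketch in Python (nothing from the film or from the comment above; the feature names, learning rate, and the simulated "actual" visitor counts are assumptions made purely for demonstration). The program guesses a visitor count from a few neighbourhood features, is told the real count, and nudges its internal weights so the next guess is closer:

import random

FEATURES = ["prosperity", "competing_restaurants", "distance_km"]

def predict(weights, neighbourhood):
    """Weighted sum of the neighbourhood's feature values."""
    return sum(weights[f] * neighbourhood[f] for f in FEATURES)

def learn_from(weights, neighbourhood, actual_visitors, lr=0.001):
    """Nudge each weight in the direction that shrinks the prediction error."""
    error = predict(weights, neighbourhood) - actual_visitors
    for f in FEATURES:
        weights[f] -= lr * error * neighbourhood[f]

# Start with arbitrary weights and learn from 1,000 observed cases.
weights = {f: random.uniform(-1.0, 1.0) for f in FEATURES}
for _ in range(1000):
    case = {
        "prosperity": random.uniform(0, 10),
        "competing_restaurants": random.uniform(0, 5),
        "distance_km": random.uniform(0, 20),
    }
    # In reality the true count would come from the restaurant's own records;
    # here it is simulated by a hidden rule the program has to discover.
    actual = 30 * case["prosperity"] - 10 * case["competing_restaurants"] - 2 * case["distance_km"]
    learn_from(weights, case, actual)

# After enough corrected guesses, predictions land close to the hidden rule.
print(predict(weights, {"prosperity": 7, "competing_restaurants": 2, "distance_km": 5}))

The point of the sketch is the same one made above: the program only ever adjusts numbers inside the rules the programmer wrote; it never invents goals of its own.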

This is applicable to any other type of case, and can be used with any sensor that can be attached to a computer (which in the example case would obviously be a camera): the programmer creates a controlled environment for the software to work in, and the software interprets the data and takes those results into account when predicting the next thing the programmer wants. It doesn't think independently, and it won't do shit the programmer doesn't want (unless it has some bug that makes it interpret a sensor's data differently, but that's something that can be fixed or tested before the software is used).

Now, for the military stuff: the AI mentioned in this documentary is not really AI. A drone is not fully AI; a human controls it and decides whether or not to bomb the living hell out of a village. To make this clearer, a computer may calculate the best angle to fire a Hellfire while the controller points it somewhere else, but that's nowhere near actual AI as shown in my example.

Lastly, if you take a look at the ethics of using controlled vehicles in warfare, it may or may not be 'good'. As long as the US needs to maintain the worth of the dollar, there will be international war. And let's face it, when you still need to fight wars to keep the oil coming and the dollar worth something, you had better find a way to make it as easy as possible for yourself.

cyberdog
8 years ago

While it opens an entertaining discussion, this documentary really does not cover much of the debate. It oversimplifies it into a black-and-white issue, when in reality I feel this is not the case. The morality of machines is far more complex, and of far more profound consequence, than we are even caring to admit. Just to point out one of the obvious issues: what about autonomous cars that are involved in an accident in which someone dies (a pedestrian, for example)? Who then should take responsibility? This is not a machine that has any intention of killing, although it can kill. This whole topic is a very complex issue that needs to be evaluated very carefully and structured from the ground up. I feel it will require its own completely unique legal framework that will need to evolve with the technology. Military autonomy is not something new, and having a knee-jerk reaction now to the current machines will not lend much weight to the argument. Autonomy has been in place for far longer than we care to admit. The nuclear weapons program(s) have always had a large degree of autonomy, especially in the event of complete disaster. Even now there are completely autonomous nuclear systems in place, programmed to destroy most of the planet when the correct sequence of events occurs. That is far more concerning to me than drones. I do feel that the people making a case against drones feel that drones are more personal, and while I believe that is the case, I also feel they are far less of a threat than some of the other existing autonomous systems. Besides, being more personal is an advantage, as drones hit much more specific targets. Contrasted with sending in thousands of troops, with many more deaths and much more damage, it pales in comparison; and yes, I do firmly believe there are fewer casualties, and this will be reduced even more as the technology matures.
A lot of machines have had high degrees of autonomy for ages, including airplanes. Most modern airplanes are impossible to fly without autonomy, and some planes can easily do most, if not all, of a flight by themselves.
I have absolutely no doubt that, given half a chance, opposition groups like ISIS would not hesitate to employ the same drones.
The argument I propose is that the technology is here to stay; the reality is already here and Pandora's box cannot be closed. It is up to everybody to contribute and focus on finding solutions: implementing safety mechanisms, clear boundaries, and very clearly defined protocols in the development of the technology. And the sooner, the better.

winston
8 years ago

At 18:08: this woman is an assistant professor at the Center for Peace and Security Studies, Georgetown University. Lol. What we can do about it, Ms Fair, is not have our drones there to begin with. Wars of aggression were labeled by the judges at Nuremberg as the most despicable crime of which a nation is capable. If the powers-that-shouldn't-be cared, even remotely, about the 'peace and security' of the American people (or of human beings in general), they would begin by legitimately investigating the 9/11 and anthrax attacks as the domestic crimes they most certainly were, and by arresting and trying, for treason and crimes against humanity, the animals who knowingly lied this nation into wars, into perpetual war. Yet she remains silent about the seminal events that have served (and continue to serve) as pretext for these global wars (car bombings in Pakistan were almost unheard of before 2001), the loss of our civil rights at home, mass surveillance, and so on. We are not even guaranteed due process any longer in this nation. That should terrify everyone. Yet no one even blinked when Obama signed the NDAA for 2012.
She states: "Pakistani terrorists, operating under the banner of the Pakistani Taliban have killed thousands, if not tens of thousands of Pakistanis" (never mind the million we've killed). Nevertheless, the government "has remained committed to using Islamist proxies as tools of foreign policy and Islamism as a tool of domestic politics."
Yet she ignores the fact that the US uses the exact same tactics and has been doing so for decades, be it in Bosnia, Syria, Iraq, Afghanistan, Chechnya, or Libya. She does not speak out against the crimes of the military-industrial complex because she's part of it.

dmxi
8 years ago

just cementing the path that will, by(o)logic, make us all obsolete.
-the automated-one-

Todd Morrow
8 years ago

The military controlling robots that can kill even without a kill order is a problem, yes. But the much bigger problem is some AI deciding that even the military (all of humanity) needs to go. There's a great new TED talk about it.