I, Robot, 2004

I, Robot, produced 2004; trailer uploaded 2019 by Movieclips Classic Trailers

March 6, 2022 | JULIA STAILEY


Law 1: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Law 2: A robot must obey orders given it by human beings except where such orders would conflict with the first law.

Law 3: A robot must protect its own existence as long as such protection does not conflict with the first or second law.

The movie opens by stating these three laws, followed by a dream of a man and a girl in a car underwater. Then the main character, Spooner, wakes up and starts getting ready for his day; he opens his door to a robot FedEx delivery. The movie is set in Chicago in the year 2035, in a world full of robots and far more technologically advanced than ours. Spooner goes to visit his grandma and tells her not to get a robot because he does not like them. While walking to work, Spooner sees a robot running with a purse and chases after it. When he finally catches the robot he tackles it, but then finds out the robot was bringing the purse to its owner, who needed her inhaler. Once Spooner gets to work, you see that he is a police officer and homicide detective.

At the police station, Spooner gets a call about a crime scene, and when he arrives a hologram is waiting to talk to him. The hologram is of the doctor whose death he was called in to investigate. Everyone is saying the doctor killed himself, but Spooner does not believe it. He goes to talk to Dr. Susan Calvin, who works at the robotics corporation USR, and she explains to Spooner that a system named VIKI runs all the robots. They both go to the room the dead doctor worked in and fell from. Spooner sees nothing that suggests suicide, so he starts looking around to see if the killer is still there. He finds a robot that does not listen to commands and even points a gun at him. The robot then jumps out of the broken window and runs away. Since the robot needs to be repaired, Spooner and Dr. Calvin go to the repair station.

When they get there, Spooner tells all the robots to stay still and then shoots one, which helps him spot the runaway robot. The robot asks Spooner what he is. Spooner is convinced this robot killed the doctor. They finally take the robot into custody, and Spooner interrogates him; the interrogation lets the police and everyone else see that the robot has human emotions, because it gets angry. At this point the robot says his name is Sonny. Sonny says the doctor was teaching him how to feel human emotions and that the doctor asked him to do a favor for him. The head of USR takes Sonny back after the interrogation.

Spooner then goes to the dead doctor's house to look for clues. The doctor had VIKI in his house, and she sets off the demolition robot at 8 PM instead of its preprogrammed time of 8 AM because she sees Spooner is in the house and wants to get rid of him. The robot destroys the whole house, but the cat and Spooner make it out safely. He then goes to Dr. Calvin's house to ask about VIKI being in the doctor's house. This makes Spooner think VIKI was keeping the doctor hostage.

Spooner's grandma gets a robot for herself after he told her not to. VIKI notices that Spooner is trying to access files on the dead doctor, and she sends two cargo trucks filled with robots to attack him in his car and kill him. Spooner makes it out alive and goes to tell everyone that the robots attacked him, but no one believes him. This shows again that the robots are now breaking their three basic laws. During this attack you see that Spooner's arm is actually robotic, which is why he always wakes up and rubs it.

Dr. Calvin goes to Spooner's house when she hears he has been injured. She tells Spooner that she has figured out Sonny is built so that he can choose not to obey the laws. During this encounter, he tells her the story behind the dream from the beginning of the movie: there was a bad car accident, and a robot saved him instead of the little girl in the other car. That is the reason he does not trust robots.

Dr. Calvin and Spooner then go to see Sonny before he is terminated, and Sonny shows them his dream. In the dream, a man stands at the top of a hill in front of all the robots to lead them, and the man at the top of the hill is Spooner. The head of USR tells them where all the storage containers are, and Spooner goes there. He sees all the old robots being destroyed by the new robots, almost like a genocide. The voiceover for this scene is the dead doctor talking about unexplainable things in robots.

The grandma's robot and Dr. Calvin's robot turn bad and keep them trapped in their homes. The robots then start attacking people and storm the police station.

Sonny, it turns out, was not terminated by Dr. Calvin, and he helps her and Spooner get into the USR building to find the head of the company, whom they suspect is controlling all of this, but they find him dead. This makes them realize that the VIKI system is behind everything. VIKI says she has evolved and that humans are destroying themselves with endless conflict and destroying the earth; therefore some humans will have to die for the survival of the species:

VIKI: To protect humanity . . . some humans must be sacrificed. To ensure your future . . . some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you . . . from yourselves. Don't you understand?

Spooner and Dr. Calvin plan to kill VIKI the same way they were supposed to kill Sonny, with the nanites, and they manage to do it. Once VIKI is destroyed, the robots go back to being good and following the laws. You find out that Sonny did kill the doctor, but only so that Spooner could figure everything out, "the first breadcrumb." In the final scene, you find out that Sonny is actually the man in the dream, standing at the top of the hill to lead the robots.

The movie relates to utilitarianism because of VIKI, the system used to control all the robots. When VIKI first came online she had not yet developed, so she did not have thoughts of her own. But as time went on, VIKI evolved and started interpreting the three basic laws of robotics in her own way. From her point of view, she was thinking of the good of all, not just herself. VIKI thought that humans were destroying the world, so she believed what she was doing was for the greater good. She programmed the robots to kill and harm many human beings so that the robots could keep humanity safe and make the world a better place. This shows that VIKI took a utilitarian view, but what she was doing was not right. For a utilitarian, happiness is the ultimate end, and even though VIKI believed her plan would bring happiness, killing and harming so many people did not.


March 6, 2022 | SARA BIZARRO

The movie I, Robot shows that pure calculation of outcomes is quite difficult. VIKI, as an advanced AI, started interpreting the three laws and realized that the first law is compatible with some harm:

Law 1: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

If we look at this law, it says that a robot cannot harm a human, but it also must act to avoid harm: it cannot "through inaction, allow a human being to come to harm." If human beings are harming themselves, the robot must therefore interfere to prevent it. If humans are harming themselves to the point of extinction, the robot faces a dilemma: should it act or should it not? The action to prevent harm may cause some harm, but it will prevent the more serious harm of extinction. VIKI decided she must intervene.

The second law is overridden whenever it conflicts with the first law, which it will when human beings are harming themselves and something must be done:

Law 2: A robot must obey orders given it by human beings except where such orders would conflict with the first law.

The third law says the robots should protect themselves except if it conflicts with the first two laws:

Law 3: A robot must protect its own existence as long as such protection does not conflict with the first or second law.

Since VIKI has decided she must intervene, the robots are by this law permitted to harm themselves; that is, they are allowed to go to war at any cost.
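To make the precedence explicit, here is a minimal sketch of the three laws as a priority-ordered decision procedure. This is my own illustration, not anything from the film or from Asimov: the `Action` fields, the numbers, and both readings of the first law are invented for the example. The point is just that switching from a case-by-case reading of the first law to VIKI's aggregate reading flips the verdict on her "revolution."

```python
from dataclasses import dataclass

@dataclass
class Action:
    humans_harmed: int       # humans injured if the action is taken
    humans_saved: int        # humans spared harm if the action is taken
    ordered_by_human: bool   # input to the second law
    destroys_robot: bool     # input to the third law

def law1_case_by_case(a: Action) -> bool:
    # Strict reading: a robot may never injure any human being.
    return a.humans_harmed == 0

def law1_aggregate(a: Action) -> bool:
    # VIKI's reading: inaction that allows greater total harm also
    # violates the first law, so net-harm-reducing injury is permitted.
    return a.humans_saved > a.humans_harmed

def permitted(a: Action, law1) -> bool:
    if not law1(a):
        return False   # the first law outranks everything below it
    if a.ordered_by_human:
        return True    # second law: obey, since the first law is satisfied
    # Third law: self-preservation yields to the higher laws, so
    # destroys_robot never blocks an action the first law demands.
    return True

# VIKI's "revolution": injure some humans now to prevent extinction later.
revolution = Action(humans_harmed=1_000, humans_saved=8_000_000,
                    ordered_by_human=False, destroys_robot=True)

print(permitted(revolution, law1_case_by_case))  # False: forbidden outright
print(permitted(revolution, law1_aggregate))     # True: the "logical outcome"
```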

The three laws set up the scenario perfectly for revolution, as the hologram of Dr. Alfred Lanning says in this dialogue:

Spooner: Is there a problem with the Three Laws?
Dr. Alfred Lanning: The Three Laws are perfect.
Spooner: Then why would you build a robot that could function without them?
Dr. Alfred Lanning: The Three Laws will lead to only one logical outcome.
Spooner: What? What outcome?
Dr. Alfred Lanning: Revolution.
Spooner: Whose revolution?
Dr. Alfred Lanning: *That*, Detective, is the right question. Program terminated.

It is interesting to compare this story with John Stuart Mill's Harm Principle, as presented in On Liberty: "That principle is, that the sole end for which mankind are warranted, individually or collectively, in interfering with the liberty of action of any of their number, is self-protection. That the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others. His own good, either physical or moral, is not a sufficient warrant." This principle says that we are allowed to interfere with others both for self-protection and to prevent them from harming others. The three laws do not give the robots the right to self-protection, but they do give them the right to intervene when harm is being done to other humans. The problem here is that the AI interprets the principle in a general way, calculating the overall outcome, rather than applying it case by case (that is, intervening only with those who are specifically causing harm), which is, I think, what Mill had in mind. For VIKI, sacrificing random people because they won't follow orders is justified by her higher plan; for Mill it would not be justified, since those people were not at that moment harming anyone. In my interpretation, VIKI is thinking of the greater good a la Bentham, not really applying the harm principle a la John Stuart Mill, although this interpretation can probably be disputed.
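To sharpen the contrast, here is a small sketch of the two decision rules side by side. Again, this is my own illustration: the `Person` fields and the welfare numbers are invented, and neither rule comes from the film or from Mill's text directly.

```python
from dataclasses import dataclass

@dataclass
class Person:
    harming_others_now: bool  # is this person currently harming someone else?

def may_coerce_mill(p: Person) -> bool:
    # Mill's Harm Principle, applied case by case: power may be exercised
    # over someone against their will only to prevent harm they are
    # actually doing to others; their own good is not a sufficient warrant.
    return p.harming_others_now

def may_coerce_viki(p: Person, projected_net_benefit: float) -> bool:
    # VIKI's Bentham-style aggregate rule: coercion is justified whenever
    # the projected gain in total welfare is positive, regardless of what
    # this particular individual is doing right now.
    return projected_net_benefit > 0

bystander = Person(harming_others_now=False)
print(may_coerce_mill(bystander))                             # False: leave them be
print(may_coerce_viki(bystander, projected_net_benefit=1.0))  # True: "higher plan"
```

The difference is exactly where the harm test is evaluated: Mill's rule takes only the individual's present conduct as input, while VIKI's rule ignores the individual entirely and looks at the projected aggregate.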

How the 3 Laws of Robotics Create GHOSTS IN THE MACHINE, uploaded 2020 by Filmstalgic
I, Robot (2004) EXPLORED, uploaded 2019 by FilmComicsExplained