After watching the movie I, Robot, I find that many ethical issues arise from the technology shown in the film. The movie takes place in 2035 and centers on robots that are programmed with Three Laws: First Law - A robot must never harm a human being or, through inaction, allow any harm to come to a human; Second Law - A robot must obey the orders given to it by human beings, except where such orders violate the First Law; Third Law - A robot must protect its own existence unless doing so violates the First or Second Laws. Humans use these robots to perform everyday tasks. Some of the ethical questions raised by the movie include: do robots have the ability to make emotional or ethical decisions, are they entitled to the same rights as humans, and should we use robots to fight wars?
In the movie I, Robot, a detective named Del Spooner, played by Will Smith, is saved by a robot when he and a little girl were both trapped in sinking cars after an accident. The robot calculated that Spooner had a higher chance of survival and chose to rescue him instead of the child.
Because Del had a higher survival factor, and because the robot had no emotions or compassion to weigh, it made its decision based solely on the odds of survival. I would venture to say that most adults would have saved the child, who needed more assistance than the adult Del Spooner; a human would not have been able to set aside emotion or compassion when deciding whom to save.
Another issue brought forward by the movie is whether robots should be given the same rights as humans. The movie shows us that the robots live by three laws, the first being that they must protect humans from any harm. This first law has a few problems, because sometimes humans do not need to be protected; for example, people who have committed a crime need to be punished, not protected. The second law tells the robot to obey every order given unless it violates the first law. Even if the order is unethical, the robot must still obey it. The third law states that the robot must protect itself unless doing so would violate the first two laws. Giving robots the same rights as humans would set them free from these laws. Yet robots cannot function as humans because they lack the ability to feel compassion or emotion, and therefore they cannot make ethical decisions.
Another big ethical issue raised in the movie is whether or not robots could be used to fight wars. This issue is just like the others in that it revolves around the robots' lack of emotion and compassion. Robots can be programmed to protect individuals, but because they lack compassion and emotion, they would not know when to stop an attack.
Because of advancing technology, robots similar to the ones in the movie I, Robot are not unlikely in the near future. Technology is advancing at lightning speed, and these issues will have to be addressed at some point. The relationship between ethics and technology has never been more important as we continue to advance.