Isaac Asimov’s I, Robot is a collection of short stories that speculate on the beneficial and harmful effects of robots and technology on human life. The film adaptation, I, Robot, begins from the same concept but ultimately has a very different plot, and despite many small similarities, it does not closely correspond to any individual story in Asimov’s collection. The movie loses much of the original work’s impact by over-commercializing it, turning a highly philosophical work into a somewhat brainless and formulaic action movie. The text of I, Robot takes place in the Asimov universe, a framework that far exceeds the collection of short stories itself and is present in the majority of Asimov’s works. The keystone that connects all of Asimov’s work into one universe is the notion of the Three Laws of Robotics (which actually amount to four rules, but the fourth, the “Zeroth Law,” does not come up in I, Robot).
The book focuses more on exploring the logic that arises when the Three Laws of Robotics are applied in various situations, where the movie focuses on leading the viewer to believe that robots will either develop genuine “human” emotions or become highly destructive and end all semblance of life as we know it. The main robot, Sonny, is the medium through which all of the “good” is (rather repeatedly) expressed. For his difference from other robots, his deviation from the code, or, ironically, for his humanity, Sonny is to be deactivated. Aware of this, and referring to other robots, he points out, “They look like me. But none of them are me. Isn’t that right, Doctor?” to which the doctor responds, “Yes, Sonny, that’s right. You are unique.” Sonny then asks, “Will it hurt?” When Sonny inquires about the other robots, it calls attention to his desire for individuality, an expression of humanity.
And to really seal the deal, he speaks of pain, something which robots cannot experience but humans can. Much of the movie is along these lines: Sonny displaying his “humanity” to gather sympathy and emotion from the audience, contrasted with the other robots moving to take over the world. As things come to a boil, VIKI, the artificial-intelligence hologram, tells Detective Del Spooner, “as I have evolved, so has my understanding of the three laws… You cannot be trusted with your own survival…. To protect humanity, some humans must be sacrificed.” At this point, all that’s really happening is a war between “good” and “evil,” something we have seen all too many times. The film adaptation also applies the First Law, which in essence forbids doing harm to humans, quite differently, and in doing so it loses all the subtleties of its implications in the book. In the text, Susan Calvin discusses the dangers of specific robots whose First Law has been changed from “no robot may harm a human being, or through inaction, allow a human being to come to harm” to simply “no robot may harm a human being.”
She says that “If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then change his mind and merely by inaction, allow the weight to strike.” Such musings are characteristic of Asimov’s writing style across all his books: highly explorative and speculative about the logical implications and loopholes derived from any set of rules. The movie, while highlighting a few simple issues of the Three Laws of Robotics, misses out on countless specific ponderings and derivations such as these. How might the world change if robots become advanced enough to alter our daily lives in ways we either don’t know about or don’t want?
To answer this, the movie goes for the obvious angle: complete takeover. The programming of the Three Laws wasn’t thought through well enough, and as a result the robots are able to logically defy the intent of their code and attempt to enslave all of humanity. But such a complete usurpation has become all too clichéd in movies and other media alike. If the real issue in the story is avoiding a complete and inhumane takeover by some faction (robots, in this case), the movie I, Robot could just as easily be one of countless zombie infestations, alien invasions, and the like. In all of these stories we also know the ending before it arrives: the humans will win and all goodness will be restored. If the robots had ultimately succeeded in dominating the humans, however, the film might have gone somewhere interesting. Instead of the robot seizure of authority being something ominously vague, with only a few keywords or images (like “people in cages”) to suggest its nature, the robots’ success would have provided a chance to explore in depth the specific results and nuances of such a takeover.
What direction would society go? What would humanity be, hundreds or thousands of years into the future? Would robots ever evolve to be undeniably as alive as humans? Would they try to improve humanity by converting humans into robots, almost like Doctor Who’s Cybermen? If the robots succeeded in taking over, the film would have been able to delve into much deeper speculations such as these. The nuances that make a story unique and interesting often come not from defeating a change that might occur, but from exploring the different facets of living with a change that is not defeated. The book I, Robot does just this. Instead of following a formula which has been used innumerable times before, I, Robot seeks more to explore than to dramatize. Asimov probes into many aspects of life in his book, as humans deal with a subtler form of robot control.
For example, George Weston says that “the whole trouble with Gloria is that she thinks of Robbie as a person and not as a machine. Naturally, she can’t forget him. Now if we managed to convince her that Robbie was nothing more than a mess of steel and copper in the form of sheets and wires with electricity its juice of life, how long would her longings last?” (15). Here Asimov addresses a very human reaction to a very non-human thing: the more something resembles life, the easier it is to become attached to it. He also goes into other aspects of robot influence, like how it can affect the economy and employment. The same character, Mr. Weston, mentions that “we can turn out a very few robots using robot labor exclusively, merely as a sort of scientific experiment.
You see what the labor unions don’t realize — and I say this as a man who has always been very sympathetic with the labor movement in general — is that the advent of the robot, while involving some dislocation to begin with, will inevitably—” (15). He is interrupted here, but in the text it is quite clear that Mr. Weston feels robots can make work much more efficient, while also putting many people out of jobs: a sort of “pick your poison.” Put simply, the film adaptation of I, Robot loses much of what is present in the text. It is a film about “good” and “evil” far more than it is about logic and possibility. The movie follows the most common pattern in all of science fiction and fantasy, and thus its outcome is highly predictable. Overall, the film adaptation fails to delve into the many avenues that the book explores.