The following is a post for an Educational Policy and Theory class at the University of Illinois, which asked students to evaluate Frankenstein in relation to the neutrality or power of technology and how humans interact with that power.
Understanding “Frankenstein” in the context of technology and politics requires that we consider both the monster and his maker, both acceptance of responsibility and awareness of potential consequences. On its surface, “Frankenstein” may be read as a commentary on humanity and on the value of knowledge. I will instead evaluate the monster first as a “technology” and second as a created being with access to and ownership of knowledge, itself a kind of “technology.” Further, I will argue that, much like the OLPC project’s founders and consumers, Victor Frankenstein and his creation both lacked the two requirements necessary for positive technology use: acceptance of responsibility and awareness of potential consequences.
Victor Frankenstein embarked upon creating a new technology the world of science had not yet seen. Animating a lifeless body was unheard of, much like many real-world technological endeavors: creating a computing system with a language of ones and zeros, or a small green laptop that promises to be a “salvation” of sorts. Being the sole possessor, or among a minority of “power holders,” as Winner (1986) dubs them, of a knowledge or technology gives one power. Whether that technology assumes, by association, a positive or negative power depends, I believe, on the user or consumer. As Joerges points out, “The power of things does not lie in themselves. It lies in their associations; it is the product of the way they are put together and distributed.” The being that Frankenstein created was a technology itself; as such, those in possession of that technology can possess, rather than embody, “specific forms of power and authority” (Winner, 1986). Which polarized power a technology assumes by association is absolutely determined by human intervention: humans create the technology, humans choose how to use the technology, and humans ultimately decide when a technology becomes obsolete.
Through neglect, Frankenstein’s “being” sought its own knowledge. I argue that the being sought out and became armed with knowledge, which here shall serve as a form of “technology.” Whereas Frankenstein had the opportunity to use this reanimating power for a positive leap in science, and whereas the reanimated being had the opportunity to use his newfound knowledge for good, Frankenstein chose to ignore the power of his created technology, and the being chose to use his technology to harm his maker in order to gain what he desired. Thus, the technologies, in the hands of Frankenstein and the being, became a negative power. A consumer’s misuse, or a writer’s negative spin on words, can make any technology seem inherently bad, as Joerges demonstrates when he notes it is possible to argue that “Moses’ bridges discriminate…against luxury buses.” Thus, all technologies, once in the hands of users, carry an assumed power, but users must choose to handle and support that technology responsibly.
Finally, I argue that without proper foresight and checkpoints, acceptance of responsibility, and awareness of potential consequences, technology can have disastrous or minimally effective results. Frankenstein is markedly excited at the start of his project: “So much has been done…far more, will I achieve…I will pioneer a new way, explore unknown powers, and unfold to the world the deepest mysteries of creation” (Shelley, 1818). With much the same well-meaning gusto, it seems Negroponte dove into his OLPC project, proclaiming it as “probably the only hope…to eliminate poverty, to create peace, and work on the environment” (Warschauer et al., 2012). Negroponte, like Frankenstein, decided to thrust his technology upon the public in full force, without much setup in the way of technical support. Langford, in a Birmingham, Alabama school system, was more than happy to oblige Negroponte’s desire to see his technology in students’ hands; however, Langford in particular exhibited a lack of acceptance of responsibility and of awareness of consequences. For instance, “Langford did not consult with the school system to see whether they wanted computers” in the first place. Additionally, he allowed students, rather than the school system, to own the laptops (Warschauer et al., 2012). Langford, Negroponte, and Frankenstein all seem set on seeing technology put into place, rather than considering potential pitfalls and necessary supports before doing so. Had Frankenstein paused to consider consequences, he could have discovered the dangers of his experiment earlier. Langford first stated of the XO laptops, “We need to put a laptop in each child’s hands and step back and let them learn about the world” (Warschauer et al., 2012). While this statement is not inherently wrong, he could have added that students would also need strategies, technical and teacher support, and lessons on how to use the technology appropriately and safely.
I believe Langford’s (and the school system’s) failure to accept responsibility and foresee consequences, in addition to other unknown factors, led to the collapse of Birmingham’s OLPC program, through factors like “low levels of interest and use by teachers,” “inadequate social and technical infrastructure,” “child ownership,” and issues with the XO laptop itself (Warschauer et al., 2012).
OLPC is a concrete example of the benefits that foresight and supportive checkpoints may hold before creating a new technology and placing it in the hands of the public. Perhaps, though, Frankenstein will serve as just another “parable,” in Joerges’ words, to remind us that acceptance of responsibility and awareness of consequences are paramount to the successful possession of power, and to the possession of technology.
Winner, Langdon. “Do Artifacts Have Politics?”
Joerges, Bernward. “Do Politics Have Artifacts?”
Shelley, Mary. Frankenstein.
Warschauer, Mark, et al. “One Laptop Per Child Birmingham.”