Ray Kurzweil, AI visionary and futurist, didn't like it:
The Matrix Reloaded is crippled by senseless fighting and chase scenes, weak plot and character development, tepid acting, and sophomoric dialogues. It shares the dystopian, Luddite perspective of the original movie, but loses the elegance, style, originality, and evocative philosophical musings of the original.
Tell us what you really think, Ray.
I'm especially interested in the charge of Luddism here. Kurzweil says:
The dystopian, Luddite perspective of the Wachowski brothers can be seen in its view of the birth of artificial intelligence as the source of all evil. In one of Morpheus' "sermons," he tells Neo (Keanu Reeves) that "in the early 21st century, all of mankind united and marveled at our magnificence as we gave birth to AI [artificial intelligence], a singular construction that spawned an entire race of machines." Morpheus goes on to explain how this singular construction became a runaway phenomenon as it reproduced itself and ultimately enslaved humankind.
The movie celebrates those humans who choose to be completely unaltered by technology, even spurning the bioport. Incidentally, in my book The Age of Spiritual Machines, I refer to such people as MOSHs (Mostly Original Substrate Humans). The movie's position reflects a growing sentiment in today's world to maintain a distinct separation of the natural- and human-created worlds. The reality, however, is that these worlds are rapidly merging. We already have a variety of neural implants that are repairing human brains afflicted by disease or disability, for example, an FDA-approved neural implant that replaces the region of neurons destroyed by Parkinson's Disease, cochlear implants for the deaf, and emerging retinal implants for the blind.
My view is that the prospect of "strong AI" (AI at or beyond human intelligence) will serve to amplify human civilization much the same way that our technology does today. As a society, we routinely accomplish intellectual achievements that would be impossible without the level of computer intelligence we already have. Ultimately, we will merge our own biological intelligence with our own creations as a way of continuing the exponential expansion of human knowledge and creative potential.
However, I do not completely reject the specter of AI turning on its creators, as portrayed in the Matrix. It is a possible downside scenario, what Nick Bostrom calls an "existential risk."
Unfortunately there is nothing we can do today to assure that AI will be friendly. Based on this, some observers such as Bill Joy call for us to relinquish the pursuit of these technologies. The reality, however, is that such relinquishment is not possible without instituting a totalitarian government that bans all of technology.
While I mostly agree with Kurzweil's assessment, I don't necessarily see anything wrong with using the worst-case scenario as a fictional element. I do have a problem with it being the primary mode of written and media SF, though, from Jurassic Park to The Terminator.
Our culture has a strange love/hate relationship with technology. People constantly talk about its dehumanizing effects, its dangers, etc., all while consuming more and more of it.
It's hypocritical, but then, that's very human too, isn't it?