Thinking as a Hobby


Tech Celeb Views on the Singularity
Still talking about The Singularity, from the special feature over at IEEE Spectrum. This section shows views on The Singularity from various people in the computer and cognitive sciences.

Here's what Jeff Hawkins, author of the excellent On Intelligence has to say about when The Singularity will occur:


If you define the singularity as a point in time when intelligent machines are designing intelligent machines in such a way that machines get extremely intelligent in a short period of time—an exponential increase in intelligence—then it will never happen. Intelligence is largely defined by experience and training, not just by brain size or algorithms. It isn't a matter of writing software. Intelligent machines, like humans, will need to be trained in particular domains of expertise. This takes time and deliberate attention to the kind of knowledge you want the machine to have.


I'm disappointed in this answer. I would draw at least a fuzzy boundary between the knowledge and skills acquired through experience, on the one hand, and intelligence as the capacity to acquire and use them, on the other. Certainly intelligence is a function not just of what a system knows, but of how rapidly and accurately it can encode new skills and knowledge.

Besides, humans have invented several ways to transfer knowledge more efficiently from one generation to the next: language (written and spoken), recording of information in photographic and digital form, schools, etc. Inorganic machines will likely find ways to streamline the process of information transfer even further, e.g. direct mind-to-mind communication. It's not difficult to envision a next generation of minds that are both quantitatively and qualitatively improved on ours, that can learn much more, much more rapidly, making inferences and predictions much more quickly. And there's no theoretical impediment to grafting a particular knowledge base into an existing system, à la the uploading of knowledge about how to fly a helicopter or perform surgery.

In such cases, future minds wouldn't be limited in the acquisition of skills and knowledge to reading, training, and practice, the way we are.

Meanwhile, T.J. Rodgers, founder and CEO of Cypress Semiconductor Corp. in San Jose, California, says The Singularity will never occur:


I don't believe in technological singularities. It's like extraterrestrial life—if it were there, we would have seen it by now (there are actually rigorous papers on that point of view).


Huh? I'd like to see these "rigorous papers". This is the biggest gob of bullshit in the entire piece.

Then there's Steven Pinker, who doesn't just say The Singularity will never happen. He says "never, ever." Gotcha.


There is not the slightest reason to believe in a coming singularity. The fact that you can visualize a future in your imagination is not evidence that it is likely or even possible. Look at domed cities, jet-pack commuting, underwater cities, mile-high buildings, and nuclear-powered automobiles—all staples of futuristic fantasies when I was a child that have never arrived. Sheer processing power is not a pixie dust that magically solves all your problems.


It's true that your ability to imagine something happening in the future doesn't make it true, but it also doesn't negate its possibility. Look at human flight, visiting the moon, putting robots on Mars, satellites, cloning, and damn near every other modern technology that was once just a speculative fantasy. Sheer processing power isn't pixie dust, but it's probably a necessary component of creating advanced intellects.

Like I said yesterday, making categorical statements one way or the other about a future event predicated on advances in technology is pretty silly unless you have some very good reasons to justify them, and none of these reasons is very good.

Finally, a view from writer Warren Ellis, from a post over at Sentient Developments that I saw a couple of days before the IEEE Spectrum issue.


The Singularity is the last trench of the religious impulse in the technocratic community. The Singularity has been denigrated as "The Rapture For Nerds," and not without cause. It's pretty much indivisible from the religious faith in describing the desire to be saved by something that isn't there (or even the desire to be destroyed by something that isn't there) and throws off no evidence of its ever intending to exist. It's a new faith for people who think they're otherwise much too evolved to believe in the Flying Spaghetti Monster or any other idiot back-brain cult you care to suggest.


Again, the comparison to religion... Ellis knows exactly which buttons to push to make techno-geeks froth at the mouth.

If someone is asserting that The Singularity will definitely happen and that it will definitely be positive, then yes, they're being daft (speaking of which, I haven't yet read the Rodney Brooks piece in the IEEE Spectrum issue, which may very well take this line). But saying there's no evidence for the eventual emergence of either intelligent machines or enhanced humans is also daft. It's not just wishful thinking to extrapolate current scientific and technological trends toward a future where there are agents smarter than ourselves. It is wishful thinking to assert such a claim with absolute certainty and also claim that such a world will be idyllic.

But I guess a magazine feature in which everyone says "Who knows?" wouldn't be very interesting.
