> Yes. His concern seems to revolve around the issue of self-replication, and the "evolution" of computer intelligence as a superior species, in effect (not his exact words). The idea seems crazy, except that many bright minds seem to be thinking about it.

Maria,

As I pointed out, the smaller the artifact, the more difficult it is to make it capable of self-replication. Designing the artifact so that it can self-replicate outside a carefully controlled environment makes the problem vastly more difficult still. A lot of thought has also been given to the idea of these devices "evolving" over generations of replication. The short answer is that it probably won't happen, and there are straightforward steps to take that turn "probably" into "certainly".

To elaborate: living creatures are the end products of giga-years of evolution. In effect, we are evolved to be capable of evolving. Human-designed systems have no such legacy. Evolution is facilitated when the thing that is replicating has the characteristic of graceful degradation. If a cosmic ray zaps the chromosome of a dividing cell, the offspring cells are usually still viable, perhaps at some reduced degree (although, very occasionally, you get an accident that helps, which is the engine of evolution). In designed systems, the ability to degrade gracefully is usually desirable and is difficult to implement. For self-replicating systems, you want very "ungraceful" degradation (which is usually what you get by default anyway). Should this not provide enough certainty, Ralph Merkle pointed out that the software of a replicating system can be encoded in such a way that any change to any of its instructions will render the whole thing impossible to execute.
Since Merkle, prior to his MNT days, was one of the seminal figures in the development of public-key cryptography, he knows whereof he speaks.

To make the idea seem less crazy, perhaps, I would recommend the book "A Fire Upon the Deep" by Vernor Vinge (actually, I would recommend it anyway; it has been my favorite for the better part of a decade).

I guess my take on the concept of our designing something that will displace us is that it reminds me of a car I once owned. The driver's side door could not be locked while it was open, but the passenger's side could, to prevent you from locking your keys inside. It occurred to me that one could still manage to lock the keys in, but only by working hard at being stupid enough to do it. So, yes, we could design our way into extinction, but we would have to be truly dumb, deserving of a collective "Darwin Award", to manage it.

Thanks for the elucidation,
Greg
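As an aside, Merkle's trick is easy to sketch in modern terms. The sketch below is my own illustration, not his actual scheme: pair the replicator's program with a cryptographic digest and refuse to execute unless they match bit-for-bit. A single flipped bit anywhere in the program changes the digest, so a mutated copy never runs at all; that is "ungraceful degradation" by construction, with no room for gradual drift between generations.

```python
import hashlib

def seal(program: bytes) -> tuple[bytes, bytes]:
    """Pair the replicator's 'genome' with its SHA-256 digest.

    Only this (program, digest) pair is considered runnable.
    """
    return program, hashlib.sha256(program).digest()

def run_if_intact(program: bytes, digest: bytes) -> bool:
    """Execute only if the program is bit-for-bit unmodified."""
    if hashlib.sha256(program).digest() != digest:
        # Any mutation, however small: refuse to run at all.
        return False
    # ... execute the program here ...
    return True

# A toy "genome" for illustration.
program, digest = seal(b"copy-self; build-widget; halt")
assert run_if_intact(program, digest)

# Flip one bit (a "cosmic ray" mutation): total, ungraceful failure.
mutated = bytes([program[0] ^ 0x01]) + program[1:]
assert not run_if_intact(mutated, digest)
```

The point of the sketch is that the failure mode is all-or-nothing: there is no "slightly damaged but still viable" offspring for selection to act on, which is exactly the property that blocks evolution.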