I'm not convinced that the singularity isn't going to happen. It's just that I am deathly tired of the cheerleader squad approaching me and demanding to know precisely how many femtoseconds it's going to be until they can upload into AI heaven and leave the meatsack behind...
"The capability to serialise and store a human mind-state is unprecedented. We believe that the only reason we can do this today, without legal obstruction, is that the law of man does not know that it is possible. If it was known to be possible, there is an excellent chance that it would be illegal. This is despite the facts that you are a consenting adult and neither of us have raised significant ethical objections.
"A mind-state is not a legal human being. It does not hold rights, including the legal right to exist. It is copyrighted binary data, and it would be protected by copyright law. You would be the copyright owner. Copyright law varies in severity depending on geographical location, but it extends little beyond fines and jail time, whereas the destruction of a mind-state could be seriously construed as murder. While you have a contract with Mr. Hunt to protect and ensure the integrity of the binary data you'll be storing at his data centre, once the serialisation procedure is published, he may find that he has no legal course of action but to destroy it.
"And finally, much like cryogenic storage, the technology to reactivate a stored mind-state, either in a computer simulation or in a real human body, does not yet exist. For all we know, it may never exist.
"These are not the risks. These are the knowns. We can put numbers to all of these possibilities. They are the safe outcomes; the eventualities in which your mind-state is lost forever, and you continue with your life as normal, and all we have wasted is time.
"The simplest way to put it is this: once digitised, your mind could be sent anywhere, anytime. As you've mentioned yourself, it's thought that within a few decades it will become possible to store an arbitrary amount of data in a single fundamental particle, itself stored in a device as small as a basketball... or a thumb... or a fingernail. You will be copied and copied and copied all over the world. Copies of your mind-state - the first digitised human mind-state in history, remember - could survive until the end of human civilisation. After you go to sleep this afternoon, one of you will wake up tomorrow morning. There is, let us say, a one in a million chance that you will wake up tomorrow morning. The rest of you are embarking on a subjectively instantaneous one-way journey into the uttermost unknown, where, beyond a few decades into the future, your single physical self will not be able to protect you. You will be completely without support or protection or preparation.
"We can't put a mind-state back into a body. But the hope is that one day it will become possible. Somebody could steal your mind and insert it into another body on the other side of the world. Under their terms. And do anything they liked to you. They could kill you. Then they could find another body, insert your mind-state again, and continue to kill you. For ever.
"You could wake up in a digital world. Any of countless possible digital worlds. They won't be real, but you'll feel them for real. Imagine a virtual heaven. But now imagine a virtual hell. In a simulated environment, a malfeasor would have absolute, eternal, unbreakable control over you.
ERROL MORRIS: Van Vleck said that computer programs are like hieroglyphs. If you lose the people who have actually created them, they become difficult — maybe even impossible — to understand.
JERRY SALTZER: The programs were not terribly well-documented. A program operates in an environment. If you write a program today that’s supposed to run on your desktop computer, it depends on the existence of perhaps Windows or the Macintosh operating system. And if you want to try to get that program running 20 years later, you have to run it in the same environment, or you have to reconstruct the environment.
ERROL MORRIS: How difficult is that to do?
JERRY SALTZER: The environments are very complex. And most programs have errors in them, simply because people aren’t perfect programmers. But the errors interact with the environment in so many different ways: sometimes they are benign, other times they cause trouble. The ways that cause trouble usually show up as problems on your screen, and they get fixed. But the things that are benign don’t show up. And if you then attempt to run the same program in a reproduction of the original environment, the errors may no longer be benign, because you have not actually reconstructed the original environment perfectly. And so, somebody trying to get CTSS running again, for example, will be operating on an emulation of an IBM 7094, not on an actual IBM 7094. And the emulation isn’t perfect. And that imperfection may interact with an imperfection in the software to produce a completely unexpected result. Now, the question is: Well, what did you intend to happen in this case? And the answer is, “Well, go talk to the programmer.” “Yeah, but he died 30 years ago.”
ERROL MORRIS: Are you saying the meaning of a computer program is not recoverable?
JERRY SALTZER: It’s difficult. Without getting into the head of the original programmer, it can be challenging. On the other hand, a smart person can work around a problem or figure out what must have been going on in the mind of the original programmer. Actually, it’s a little bit easier to deal with a program from the 1960s than it is to deal with Aramaic.
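Saltzer's point about benign errors can be made concrete with a toy sketch (my own hypothetical example, not anything from the interview): a program with a latent bug that is masked by an accident of its original environment, and which only misbehaves once that environment is imperfectly reconstructed.

```python
# Hypothetical illustration of an environment-masked bug.
def last_record(records):
    # Latent bug: the programmer assumed the original loader always
    # appended a trailing sentinel "" entry, so the "real" last record
    # sits at index -2. Nobody documented this assumption.
    return records[-2]

# Original environment: the loader always appends a sentinel,
# so the bug is benign and never shows up on anyone's screen.
original_env = ["alpha", "beta", ""]
print(last_record(original_env))       # prints "beta" (correct)

# Reconstructed environment, decades later: the re-implemented loader
# omits the sentinel. The same untouched code now silently misbehaves.
reconstructed_env = ["alpha", "beta"]
print(last_record(reconstructed_env))  # prints "alpha" (wrong)
```

The code never changed; only the environment did. And with the original programmer gone, there is no one left to tell you that the sentinel was ever part of the contract.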
This passage seems to me to highlight a number of practical problems with the Singularity, or at least with the mind-uploading aspect of it. Just about every metaphor we've used in this discussion presumes a perfect copy. But can that truly be done? Does a brain work outside the skull? Outside the world? Who would be the "smart person" who would work around the problem and figure out what was going on in the mind of the original programmer? The computer within which the simulation was housed? How would it know how to do that? On what basis would it decide? Can we even comprehend such a being well enough to tell it it's wrong, that's not how my mind works, put me back? And to follow the analogy of brain-as-emulated-program: you can't emulate something on a machine less sophisticated than itself. If consciousness is an emergent property, wouldn't anything smart enough to contain and emulate a human brain also be conscious? If so, how do we know it would be willing to just, you know, let us hang out in storage, dusting the shelves and feeding us electricity for all time?