Server suicide:
September 10, 2002 6:41 AM   Subscribe

Server suicide: A group of British artists have set up a webserver that also controls a crusher. The thing is, the webserver is inside the crusher and will crush itself on Thursday at 20:00 GMT. (via found)
posted by edsousa (29 comments total)
 
Oops, it's at 19:00 GMT; 20:00 is the time over here.
posted by edsousa at 6:42 AM on September 10, 2002


cool... I guess. Do androids dream of electric sheep?
posted by hoopyfrood at 6:43 AM on September 10, 2002


I'm not much into scifi, but doesn't this go against one of Asimov's Laws of Robotics?
posted by edsousa at 6:47 AM on September 10, 2002


When machines get conscious, they are *so* going to kick our ass over this one.
posted by mediareport at 6:48 AM on September 10, 2002


where is it? if it is london i would like to go see this server crush itself. oh hang on. i am playing five a side then...
posted by Frasermoo at 6:51 AM on September 10, 2002


Is there a different server serving the pictures? Otherwise what is the point? Lame.
posted by McBain at 6:57 AM on September 10, 2002


It would be better if they did it like that Chris Burden installation, Samson:
In order to fully appreciate this installation, the viewer must pass through the turnstile, which in turn expands the jack and forces the beams against the walls. Although each movement is slight and imperceptible, the sculpture maintains the theoretical capacity to destroy the room in which it is housed.

Could do the same thing here: each unique visitor hitting the web site moves the crusher down a very, very tiny amount, until the poor thing gets its RAM smashed through its motherboard.
posted by malphigian at 6:58 AM on September 10, 2002
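malphigian's scheme could be sketched roughly like this. This is a hypothetical toy, not anything from the actual installation: the `Crusher` class, the step size, and the idea of tracking uniqueness by IP are all assumptions made up for illustration.

```python
# Toy sketch of the "each unique visitor lowers the press a tiny bit" idea.
# STEP_MM and TRAVEL_MM are invented values, not real specifications.

STEP_MM = 0.001      # how far one new visitor lowers the press
TRAVEL_MM = 150.0    # total travel before the press meets the motherboard

class Crusher:
    def __init__(self):
        self.seen = set()        # IPs of visitors counted so far
        self.position_mm = 0.0   # how far the press has descended

    def visit(self, ip):
        """Register a visitor; only first-time IPs move the press down."""
        if ip not in self.seen:
            self.seen.add(ip)
            self.position_mm = min(self.position_mm + STEP_MM, TRAVEL_MM)
        return self.position_mm

    @property
    def crushed(self):
        return self.position_mm >= TRAVEL_MM
```

With these made-up numbers it would take 150,000 unique visitors to finish the job, and repeat visits change nothing, which is roughly the Samson idea of imperceptible individual contributions adding up to destruction.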


"...doesn't this go against one of Asimov's Laws of Robotics?"

Computers are only as smart as the human beings that program them. If Asimov's laws of robotics have not been programmed into a computer by human beings, then the computer doesn't know to follow them. It's one of the drawbacks of Asimov's theory about robotics.
posted by ZachsMind at 6:59 AM on September 10, 2002


Asimov didn't have any theories about robotics -- but he did postulate some definitions (covering laws, really) that he played with in his novels.

But this thing isn't a robot. It has neither independent motility nor independent action arising from it (among other things).

What I find interesting is that it's possibly a reference to computer-assisted suicide. What weakens it, like McBain said, is that the watcher (read: image server) should be the server itself, so we 'gain' empathy from viewing it and understand more readily the relevance of our viewing-as-participation.
posted by Hilarion at 7:22 AM on September 10, 2002


the watcher (read: image server) should be the server itself

Huh. I just assumed it was. That *is* lame.
posted by mediareport at 7:25 AM on September 10, 2002


All I can think is:

"I cannot self-terminate. You must lower me into the steel."

"Nooo! You gotta stay! I ORDER you to stay!"
posted by Succa at 7:55 AM on September 10, 2002


That *is* lame.

Why is it lame?

The server is both documenting and causing its own demise. As it causes its own 'death,' many will be watching as it ceases to serve images. Kind of like watching someone speak their last words.

Also, it is very possible that the mass of traffic at the event itself will cause the server to crash -- would this halt the suicidal process? Talk about irony.
posted by o2b at 7:57 AM on September 10, 2002


It just seems like an extremem way to kill a process...
posted by jazon at 8:05 AM on September 10, 2002


or even an extreme way (damn fat fingers)
posted by jazon at 8:06 AM on September 10, 2002
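For contrast with the hydraulic approach, the conventional, far less extreme way to kill a process looks like this (a generic POSIX sketch, nothing to do with the artists' setup):

```python
# Kill a process the boring way: send it SIGTERM and reap it.
import signal
import subprocess

proc = subprocess.Popen(["sleep", "300"])  # a long-running process
proc.send_signal(signal.SIGTERM)           # ask it to exit
proc.wait()                                # reap it; no crusher required
```

On POSIX systems the return code of a process killed by SIGTERM is the negated signal number, i.e. `proc.returncode == -signal.SIGTERM`.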


Is there a different server serving the pictures?

Apparently, there is.
posted by LinusMines at 8:14 AM on September 10, 2002


Even if it were considered to be a robot, it does not violate the Three laws of robotics. The three laws are basically:

1. Don't hurt humans, or let humans be hurt through inaction.
2. Do whatever humans say, unless that command would violate the first law.
3. Protect yourself, unless that conflicts with the first or second law.

In this case, a human can order a robot to destroy itself, and it will do so. You can't order a robot to kill someone though. And a robot can't kill itself of its own volition.

If this server were a 'robot' then it would be doing this based on our command. We are telling it to smash itself, and it will obey.
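That precedence argument can be reduced to a toy function. This is purely illustrative; the parameter names are made up, and it only encodes the strict ranking described above (First Law over Second, Second over Third):

```python
# Toy encoding of the Three Laws priority: an order that harms a human
# is always refused; an order that merely harms the robot is obeyed,
# because obedience (Second Law) outranks self-preservation (Third Law).

def robot_obeys(order_harms_human: bool, order_harms_robot: bool) -> bool:
    """Decide whether a Three Laws robot carries out a human's order."""
    if order_harms_human:
        return False  # First Law outranks everything, including obedience
    # Self-harm alone is no grounds for refusal: the Third Law is ranked
    # below the Second, so order_harms_robot never blocks an order.
    return True
```

So "crush yourself" (`order_harms_human=False, order_harms_robot=True`) is obeyed, while "kill someone" is refused, which is exactly the situation with this server.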

"I Robot" is a great book filled with short stories about robots, and about all the interactions of these laws, and about how some things can happen that look like violations, but that logically still obey the laws. Its very interesting, and a great read.

On a less intellectual note: SMASH SMASH SMASH! Yay! I'm definitely going to be watching this when the actual crushing occurs.
posted by phidauex at 8:56 AM on September 10, 2002


When machines get conscious, they are *so* going to kick our ass over this one.

Probably so, but they will *certainly* kick our ass over this.
posted by edsousa at 9:06 AM on September 10, 2002


I wouldn't call it suicide since it isn't conscious of what it's doing. Kind of trivializes it really.

But it is an interesting piece of art.
posted by Foosnark at 9:15 AM on September 10, 2002


As it causes its own 'death,' many will be watching as it ceases to serve images. Kind of like watching someone speak their last words.

I wish that was how it was set up, o2b; that would be nice if the server went dark as it crushed itself. But it's not clear that's how it's set up; it looks like we'll just be watching as a *different* server crushes itself. Woohoo.
posted by mediareport at 9:28 AM on September 10, 2002


Oops, I may have misinterpreted McBain's comment, o2b. I see what you mean. I agree with you and not McBain; watching the server go blank would be the only thing that would make this cool.
posted by mediareport at 9:32 AM on September 10, 2002


Assuming the computer/server is a robot - no, this is not in violation of Asimov's laws of robotics.

At first it may appear to violate the third law:

"A robot must protect its own existence as long as such protection does not conflict with the First or Second Law."

But, since the first law is:

"A robot may not injure a human being, or, through inaction, allow a human being to come to harm."

then the server is bound by the laws to destroy itself.

Many of Asimov's short robot stories indicate that "harm" to a human being can be emotional as well as physical - even if it is extremely slight. Now, imagine the disappointment we would all feel if this computer did not get destroyed when the counter reaches zero. OK, it may only be slight, but it is enough for the computer (as robot) to validate its own destruction, I assure you.
posted by nthdegx at 9:43 AM on September 10, 2002


...especially considering the number of human beings who may be watching... don't forget the Zeroth Law:

0. A robot may not hurt humanity, or, through inaction, allow humanity to be harmed.
posted by o2b at 10:25 AM on September 10, 2002


I can't remember a thing about it, but there was an internet art project a few years ago that involved, I think, a pendulum -- located in a public space in Switzerland or Italy. You could send mail to the computer controlling the pendulum to affect its motion. After several months -- I believe it was more than a year -- the pendulum would smash into the controlling computer and mail server and destroy itself.
posted by dhartung at 10:43 AM on September 10, 2002


my theories on why this computer is being crushed:

1) tax write-off
2) enron mail server
3) MPAA has acquired the Napster db server
4) mom and dad went away for a few days
posted by lsd4all at 11:23 AM on September 10, 2002


[A] robot can't kill itself of its own volition.

If it can't do something it is physically capable of because of its own "volition," then it really has no volition and isn't conscious or intelligent. Whoever programmed it would have placed a limit on its consciousness if it were unable to destroy itself.
posted by oaf at 1:42 PM on September 10, 2002


You know - it is possible to take the fun out of debate by stating the obvious.
posted by nthdegx at 2:28 PM on September 10, 2002


I'm sure it was just sousa being silly, but a lot of people do in fact believe, wrongly, that Asimov's laws apply to robots generally. This isn't hard for sf fans, of course, but he clearly meant that they were laws programmed into the robots in his story universes, and specifically, into "positronic" robots -- leaving open the possibility, even in his universes, of non-positronic robots that did not obey the laws.

The logic of the author went, roughly:
* so much "sci fi" relies on robots behaving badly
* humans would not program or build robots that consistently behaved badly
* instead, they would create a set of programming rules that would govern robot behavior
* stories written with these rules in place are much more intrinsically interesting than stories about miswired robots bouncing around the room bleating "Kill maker! Kill maker!"
* Frankenstein was a psychological story, not a scientific one, and not a good basis for hard sf.
posted by dhartung at 2:51 PM on September 10, 2002


Dave, stop ... Stop, will you? Stop, Dave ... Will you stop, Dave ... Stop, Dave. I'm afraid ... I'm afraid ... I'm afraid, Dave ... Dave ... my mind is going ... I can feel it ... I can feel it ... My mind is going ... There is no question about it. I can feel it ... I can feel it ... I can feel it ... I'm a ... fraid ...

*crunch*
posted by evanizer at 6:50 PM on September 10, 2002


guess it worked then...
posted by jonvaughan at 2:38 AM on September 13, 2002




This thread has been archived and is closed to new comments