Episode 00084: The Biracle of Thanksgiving (Part 4)
May 3, 2016 2:18 PM

Fullest House. We fed every Full House script into an artificial neural network machine learning algorithm. Each day, a new episode of Full House will be generated by a computer, forever.
posted by robocop is bleeding (38 comments total) 13 users marked this as a favorite
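
A rough sketch of the kind of setup the post describes: feed the raw scripts into a network one character at a time and train it to predict the next character. The post doesn't say what was actually used, so everything below is an assumption made for illustration — PyTorch, a single-layer LSTM, a hypothetical full_house_scripts.txt corpus file, and arbitrary hyperparameters — not the project's real code.

```python
# Minimal character-level language model sketch (assumed setup, not the
# project's actual code). Train it to predict the next character of the
# concatenated scripts; sampling from it then "writes" new episodes.
import torch
import torch.nn as nn

text = open("full_house_scripts.txt").read()      # hypothetical corpus file
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}        # character -> integer id

class CharLSTM(nn.Module):
    def __init__(self, vocab, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)       # logits over the next character

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state

model = CharLSTM(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=2e-3)
loss_fn = nn.CrossEntropyLoss()
seq_len = 100
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

for step in range(10_000):
    i = torch.randint(0, len(data) - seq_len - 1, (1,)).item()
    x = data[i : i + seq_len].unsqueeze(0)          # a slice of script text
    y = data[i + 1 : i + seq_len + 1].unsqueeze(0)  # the same slice shifted by one character
    logits, _ = model(x)
    loss = loss_fn(logits.view(-1, len(chars)), y.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Once trained, new "episodes" come from sampling the model one character at a time (a sketch of that appears further down the thread).
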
 
DANNY: Well, I was the beautiful surprise.

Yes you were, Danny Tanner. *sniff* Yes you were.
posted by Rock Steady at 2:26 PM on May 3, 2016 [5 favorites]


If we keep feeding neural networks this kind of stuff, and one wakes up, man is it going to be pissed.
posted by furnace.heart at 2:28 PM on May 3, 2016 [13 favorites]


This was cruel. I clicked that link actually expecting to read a script.
posted by If only I had a penguin... at 2:29 PM on May 3, 2016


My Twitter bot has always wanted to meet Jodie Sweetin, so if anybody knows how I can get his agent in touch with the producers, I think this is his in.
posted by MCMikeNamara at 2:31 PM on May 3, 2016 [3 favorites]


This was cruel. I clicked that link actually expecting to read a script.

The scripts are in the first link.
posted by Rock Steady at 2:32 PM on May 3, 2016


It is only a matter of time before someone writes a program that generates these episodes by splicing together single-word clips of Fuller House. Then mankind's work on this Earth will be complete and we can all commit ritual suicide.
posted by dephlogisticated at 2:35 PM on May 3, 2016 [3 favorites]


It's pretty much just gibberish. Seems like one of those things where the idea is more fun and interesting than the result.
posted by clockzero at 2:36 PM on May 3, 2016


Episode 00011: (Part 2)
Episode 00021: (Part 2)
Episode 00027: (Part 3)
Episode 00036: (Part 2) (Part 2)
Episode 00049: (Part 4)


And I thought the Fringe mytharc was complicated.
posted by Rock Steady at 2:36 PM on May 3, 2016 [9 favorites]


It's pretty much just gibberish. Seems like one of those things where the idea is more fun and interesting than the result.

You mean the original show, right?

*rimshot*
posted by mandolin conspiracy at 2:42 PM on May 3, 2016 [3 favorites]


The program-generated scripts remind me of the far-future episodes of "The Simpsons" from Don Hertzfeldt's couch gag.

"Don't don't have cow man."
posted by infinitywaltz at 2:52 PM on May 3, 2016 [11 favorites]


The scripts are in the first link.

OMG, thanks. I will read them now. And yeah, I'm expecting gibberish. Basically catch phrases and typical things they say just mishmashed together.
posted by If only I had a penguin... at 2:53 PM on May 3, 2016


Well that was disappointing. I mean I was expecting better semantic/grammatical construction. Like not that the plot of an episode would make any sense, but that individual sentences would at least be grammatical and make some sense.
posted by If only I had a penguin... at 2:56 PM on May 3, 2016


What has occurred with predictability?
Television, newspaper bicycle boy, the man who suckles cows?
You miss your ancient friends but they await you in your elbow

Everywhere you look (everywhere)
There is a human heart
A grasping hand
Everywhere you look (everywhere)
There is a human face on the ground

When you are lonely in the abyss
A light exists that will lift up your body
Everywhere you look
posted by Atom Eyes at 3:02 PM on May 3, 2016 [19 favorites]


...overhead, without any fuss, the stars were going out.
posted by Capt. Renault at 3:12 PM on May 3, 2016 [10 favorites]


Not surprisingly, I found the article about the underlying technology a lot more interesting. Includes versions of Paul Graham (don't miss the most likely thing that Paul Graham might say) and Shakespeare, plus "hallucinated" Wikipedia articles, XML snippets, Algebraic Geometry papers, commented C code, and baby names. Who's the first to run this on the MetaFilter corpus?
posted by effbot at 3:24 PM on May 3, 2016 [2 favorites]
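
Assuming the generator works the way that article describes — sampling one character at a time from a trained model — the "most likely thing Paul Graham might say" is roughly what you get as the sampling temperature approaches zero, and the free-associating gibberish quoted throughout this thread is what you get when it's higher. A sketch of that knob, reusing the hypothetical model and stoi from the training snippet above; point the training corpus at a MetaFilter comment dump instead and you'd have an answer to effbot's question.

```python
# Sampling sketch (assumed interface: the CharLSTM/model and stoi defined in
# the training sketch above). Lower temperature -> closer to the single most
# likely continuation; higher temperature -> more surreal output.
import torch

itos = {i: c for c, i in stoi.items()}             # integer id -> character

def sample(model, prime="DANNY: ", length=500, temperature=1.0):
    model.eval()
    out, state = list(prime), None
    x = torch.tensor([[stoi[c] for c in prime]])
    with torch.no_grad():
        for _ in range(length):
            logits, state = model(x, state)
            probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
            nxt = torch.multinomial(probs, 1).item()   # draw the next character
            out.append(itos[nxt])
            x = torch.tensor([[nxt]])                  # feed it back in
    return "".join(out)

print(sample(model, temperature=0.8))
```
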


This has been a Miller-Watson-Smarter Child-Boyette production, in association with Lorimar Skynet Telepictures Domestic Goatse Television Distribution.
posted by dr_dank at 3:33 PM on May 3, 2016 [1 favorite]


Holy crap. I'm having my Mac read the first episode to me. I have Aphex Twin's DrukQs running in the background. It's making me feel funny, but I can't stop.
posted by Cat Pie Hurts at 3:43 PM on May 3, 2016 [6 favorites]


It's a neural net, which means it's still learning. Eventually it will be generating episodes that are indistinguishable from real episodes, but even then it won't stop. The program will keep learning and writing. Each episode will be incrementally closer to representing the platonic ideal of a Fuller House episode. Some theorists have argued that this ideal can never actually be reached. Like calculating the digits of pi, one can in principle achieve any arbitrary level of precision, but it will always be an imperfect approximation. However, some obscure interpretations of the sacred texts have suggested otherwise: immanentization of the Fullest House could in fact be an inevitable stage of the final apotheosis. It is only by invoking this entity that the gateway to the next world can be opened and the sins of this life cleansed forever by purifying fire.
posted by dephlogisticated at 3:48 PM on May 3, 2016 [7 favorites]


"STEPHANIE:
I don't know what I am. I was the only one with you."


Haunting?
posted by oh.ghoulin at 3:54 PM on May 3, 2016 [4 favorites]


I mean these are basically indistinguishable from the actual script for The Room
posted by Jon Mitchell at 3:59 PM on May 3, 2016 [3 favorites]


DJ:
Well, I wasn't not talking about?


Indeed.
posted by Existential Dread at 4:03 PM on May 3, 2016


Still more coherent than Marijuana Simpson.
posted by neckro23 at 4:05 PM on May 3, 2016 [1 favorite]


The representational capacity of a neural net is vastly bigger than that of a standard hidden Markov model (itself a derivative of the Markov model), because the representation is distributed across the units, but it's still limited (it's hard to overstate this: it's an O(n)-to-O(2^n) win in representational power per unit, or, to put it another way, an O(2^n)-to-O(n) win in the amount of data you need).

And mind you, representational power is the thing; it's why optimal brain damage and optimal brain surgeon work (actually, naive pruning by absolute weight also works). Local optima were trumpeted as a huge problem in the '80s and are now generally, and correctly, regarded as not a problem at all, but the numerics of the problem as a whole are not well understood, and the result is highly path-dependent and highly dependent on how the gradients are applied (SGD will give an entirely different result than SGD+Nesterov, which will give an entirely different result than RMSProp, than AdaGrad, than Adam, and ha ha ha if you want more than a just-so story for what each of these does).

I hope that one day we will be able to train in time on the order of the size of the representation, not the size of the weights with respect to the representation; that will actually be fun times. I tend not to believe in the attentional-mechanism sort of thing until we can get to that point, because it will be too slow for the really vast tasks we want, even on GPUs: like running bubble sort on a 100 GHz computer. Same with reinforcement learning for non-piddly tasks.
posted by hleehowon at 4:49 PM on May 3, 2016 [1 favorite]
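
A toy illustration of the path-dependence hleehowon is describing: the same tiny network, the same data, and the same initial weights, trained with each of the optimizers named above, end up at measurably different parameters. Everything here — the synthetic data, the architecture, the learning rates — is invented purely for illustration.

```python
# Same start, different update rule, different endpoint: a sketch of how
# the chosen optimizer shapes the solution path, not a statement about any
# particular real network.
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(256, 10)                             # synthetic inputs
y = torch.randn(256, 1)                              # synthetic targets
base = nn.Sequential(nn.Linear(10, 32), nn.Tanh(), nn.Linear(32, 1))

def train(make_opt, steps=500):
    model = copy.deepcopy(base)                      # identical starting weights
    opt = make_opt(model.parameters())
    for _ in range(steps):
        loss = nn.functional.mse_loss(model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.cat([p.detach().flatten() for p in model.parameters()])

runs = {
    "SGD":      train(lambda p: torch.optim.SGD(p, lr=0.05)),
    "Nesterov": train(lambda p: torch.optim.SGD(p, lr=0.05, momentum=0.9, nesterov=True)),
    "RMSprop":  train(lambda p: torch.optim.RMSprop(p, lr=0.01)),
    "Adagrad":  train(lambda p: torch.optim.Adagrad(p, lr=0.05)),
    "Adam":     train(lambda p: torch.optim.Adam(p, lr=0.01)),
}
for name, w in runs.items():
    print(name, "distance from the plain-SGD solution:", (w - runs["SGD"]).norm().item())
```
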


Somebody needs to put together a Standard American Sitcom House set and a vintage laugh track machine and get some actors to perform these scripts.
posted by acb at 4:51 PM on May 3, 2016 [1 favorite]


(I can't find a citation for the exponential-to-linear claim, but there are a buncha similar claims made in the temporal RBM papers)
posted by hleehowon at 4:53 PM on May 3, 2016


This has been a Miller-Watson-Smarter Child-Boyette production, in association with Lorimar Skynet Telepictures Domestic Goatse Television Distribution.

Mewoof!
posted by Alvy Ampersand at 5:06 PM on May 3, 2016 [2 favorites]


Well, I've tried to read one and to quote Jesse: "Well, I don't know what the heck is going on the house."
posted by bigendian at 5:17 PM on May 3, 2016


After reading through more of the scripts than I'd like to admit, I've noticed some phrases that seem to repeat frequently. Some make sense (e.g., I don't understand, I'm sorry, I love you, Beach Boys), but did Full House feature a duck at some point?

Not that the scripts as a whole make a lot of sense, but duck comes up enough that I'm wondering if I blocked something out...
posted by TofuGolem at 6:52 PM on May 3, 2016 [1 favorite]


Steph, he's not a real duck!
posted by Alvy Ampersand at 7:55 PM on May 3, 2016


Yeah, the grammar is bad, but Babyy Bew Bear (the only one I looked at so far) strikes me as, I don't know, thematically coherent at least? Fun, as a surrealist romp? If you just think of it as baby talk (which ends up being the theme), it would do OK as a Brechtian commentary on the inanity of Full House.

full disclosure: I watched the taping of the Christmas episode with my Girl Scout troop when I was a kid
posted by gusandrews at 10:04 PM on May 3, 2016


too many cooks
posted by gusandrews at 10:05 PM on May 3, 2016


I'm going to be a little person to be a big girl!

Hmm?
posted by boilermonster at 10:13 PM on May 3, 2016


Fun, as a surrealist romp?

It's a bit long, but otherwise this is a rather good universal New Yorker cartoon caption:

"Whoa, whoa, whoa, whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whoa... whockething nut on a coptortoo. I'm going to be some problem."
posted by effbot at 2:33 AM on May 4, 2016


Hi, where can I contribute to the Kickstarter to get these produced?

Just one, please, someone make these happen. (with laugh track)
posted by Theta States at 7:02 AM on May 4, 2016


me right now:
DJ: I don't know what the heck is going to have to go to the station.
Joey: Yeah, I don't want to go to the party.
posted by Theta States at 7:04 AM on May 4, 2016


Alvy Ampersand:
Mewoof!


When JD Roth hosted Fun House, he had to do the "Lorimar Telepictures..." word soup at the closing credits, capping it off with an exasperated "oh boy!".
posted by dr_dank at 8:02 AM on May 4, 2016


"I know. Sorry. Last night after my gig at the smash club I go for a cruise on my Harley, right? Next thing I know I'm in Reno. It was dark. Who would have known? Then I happened to wander into this show, Razzle Dazzle '87 -- much better than Razzle Dazzle '86, by the way. And I see this incredible showgirl Vanessa, right? Our eyes meet. Ba-boom, this lightning bolt of passion shoots across the casino. Turns out, Vanessa is on her way to the Philippines to do a Bob Hope special, and is dying for one last night of good old American"

Neural net, or real show?

You decide.
posted by blucevalo at 8:46 AM on May 4, 2016


Just one, please, someone make these happen. (with laughtrack)

David Lynch could make this happen.
posted by Bucket o' Heads at 9:03 AM on May 4, 2016 [1 favorite]



