Calvin and Markov
July 6, 2015 2:54 PM   Subscribe

 


So how does this work?
1. The Markov chain code

This is a custom implementation of a Markov chain process that I wrote in Perl a few years back because (a) I’m finicky about how my Markoving works and (b) it seemed like a fun thing to write. It’s a content-neutral set of functions — nothing about it is specific to Calvin, or to comic strips. It’s just a bunch of code that will digest an arbitrary collection of text and then burp out new weird sentences when you ask it to.

I’ve made a few small improvements to this code as I’ve revisited it the last few days, but it was feature-complete already.
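For readers curious what that digest-and-burp process looks like, here's a minimal sketch of a word-level Markov chain generator in Python. (Cortex's actual implementation is in Perl and surely differs; every name below is illustrative, not his API.)

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each tuple of `order` consecutive words to the list of words
    observed immediately after that tuple in the corpus."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20):
    """Walk the chain from a random starting state, emitting words until
    the target length is reached or a dead-end state is hit."""
    state = random.choice(list(chain))
    out = list(state)
    while len(out) < length:
        followers = chain.get(state)
        if not followers:
            break  # this state never appeared mid-corpus; stop here
        out.append(random.choice(followers))
        state = tuple(out[-len(state):])
    return " ".join(out)
```

With a real corpus of strip dialogue, an order of 2 tends to strike the balance this project wants: coherent enough to parse as a sentence, broken enough to be weird.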
posted by the man of twists and turns at 2:57 PM on July 6, 2015




I'm not quite sure what is going on here....
posted by Roger Dodger at 3:12 PM on July 6, 2015 [7 favorites]


I'm waiting for the deep learning neural network update. Get crackin' cortex.
posted by GuyZero at 3:13 PM on July 6, 2015 [2 favorites]


Hey hey! It's been fun watching folks play with this today. I really wasn't sure whether it was gonna come together once I got it functioning—I talk a little bit about that at the tail end of the blog post the man of twists and turns just linked, about what a different cultural place C&H occupies compared to an easy target like Garfield—but it turns out it's a decent vein of surreality in its own right.

I'm waiting for the deep learning neural network update.

Oh geez, that could be some righteous nightmare fuel. Pair it with Zalgo text for the word balloons, I suppose.
posted by cortex at 3:16 PM on July 6, 2015 [2 favorites]


You are absolutely correct that This is Fun. Nice job cortex. Do you have any ideas for keeping chains more focused on a particular subject matter or theme throughout the strip?
posted by Roger Dodger at 3:22 PM on July 6, 2015 [1 favorite]




Do you have any ideas for keeping chains more focused on a particular subject matter or theme throughout the strip?

The Markov engine I wrote actually supports a rudimentary keyword seeding process—you can in principle say, "hey, give me a sentence that has this word in it" and it'll do so if possible—that I could in theory use to start each next balloon based on some weird that appeared in the previous one.

But that's not active in the current state of the project, largely because I just don't have a big enough corpus yet to make it likely you'd get anything other than, e.g., basically the same comment from Hobbes three panels in a row, or similarly sub-optimal output.

It's something I should revisit sometime, though, because I think it could be made to work with a little nudging.
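The keyword seeding described here could be sketched roughly like this, assuming the chain is a dict mapping tuples of consecutive words to lists of follower words. (All names are hypothetical; this is not cortex's code.)

```python
import random

def seeded_sentence(chain, keyword, length=20):
    """Generate text starting from a chain state that contains the keyword.

    Returns None when no state mentions the keyword, so the caller can
    fall back to ordinary unseeded generation.
    """
    candidates = [state for state in chain if keyword in state]
    if not candidates:
        return None
    state = random.choice(candidates)
    out = list(state)
    while len(out) < length:
        followers = chain.get(state)
        if not followers:
            break  # dead-end state: nothing ever followed it in the corpus
        out.append(random.choice(followers))
        state = tuple(out[-len(state):])
    return " ".join(out)
```

The corpus-size problem cortex mentions shows up in `candidates`: with a small corpus, only one or two states contain any given keyword, so seeded balloons keep replaying the same few lines.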
posted by cortex at 3:32 PM on July 6, 2015




Flight to Orlando ruined.
posted by Kabanos at 3:38 PM on July 6, 2015 [2 favorites]


Dark times.

posted by Kabanos at 3:39 PM on July 6, 2015 [2 favorites]




Welp, I guess it works, then. (Not Calvin and Markov but the related Previously, on the X-Files. Also, thank you for these, cortex.)
posted by The Great Big Mulp at 3:53 PM on July 6, 2015


I wonder which would seem less incongruous: Calvin and Hobbes with Garkov-generated text, or Garfield with C&M-generated text.
posted by ardgedee at 3:59 PM on July 6, 2015




Bill can't be happy about this.
posted by shockingbluamp at 4:44 PM on July 6, 2015








Every single one of these is better than 97% of current newspaper comics.
posted by mbrubeck at 5:21 PM on July 6, 2015 [10 favorites]


Hobbes is Dad, part two
posted by wanderingmind at 5:33 PM on July 6, 2015


Leave your tiger in the basement for the rest of his life. Everyone else lived happily ever after.

The other panels are nonsense, but that's dark, Calvin's Dad.
posted by vibratory manner of working at 5:33 PM on July 6, 2015 [3 favorites]




MeFi's Own Josh 'Tex' Millard is truly the Markov master. In addition to Garkov, he has made 'Previously, on the X-Files...', 'The Big Markovski', the proudly sacrilegious 'Jesus Markoving Christ', and the apparently still-in-beta 'Previously, on Next Generation...' (ah, the joy of exploring open subdirectories). So what's next? Markovbert? Markov of Thrones? Mad Markov? Markov Avengers? Markovtor Who?

It also explains some of his moderation decisions here. JK! I KEED! HAMBUGUR!
posted by oneswellfoop at 5:41 PM on July 6, 2015 [1 favorite]




Okay, one more. I can't stop laughing at these. Calvin goes a bit meta.
posted by wanderingmind at 5:44 PM on July 6, 2015




I'm pretty sure this could be used as a kind of psychological test, like a question on a souped-up Voight-Kampff test or something.
posted by some loser at 6:06 PM on July 6, 2015


I love the ones that could alllllmost pass for a real strip.
posted by drinkyclown at 6:20 PM on July 6, 2015 [2 favorites]


also that 'previously on the next generation' link is probably better than the one in the fpp. Those were in my first five. I also had one where O'Brien called Garon "Gillespie" which I thought was hilarious. "sit down Gillespie!"... tha best
posted by some loser at 6:22 PM on July 6, 2015


He's not smart but he's streetwise.
posted by mai at 6:37 PM on July 6, 2015


It would be hard but cool to train a model to do speech bubble detection. The trick would be getting it to correctly determine which person each bubble is coming from. Then you could automate most of your tagging process.
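The hard part named here—deciding which character each bubble belongs to—could be approximated with a nearest-anchor heuristic once bubbles are isolated as connected regions of a binary mask. A rough, hypothetical sketch (no trained model; the mask and the speaker anchor points are assumed inputs):

```python
from collections import deque

def find_bubbles(mask):
    """Label connected regions of True cells (candidate speech bubbles)
    in a 2D boolean mask, using breadth-first flood fill."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                region, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                regions.append(region)
    return regions

def nearest_speaker(region, speakers):
    """Assign a bubble to the closest character anchor point
    (speakers maps name -> (y, x)); a crude stand-in for tail detection."""
    cy = sum(y for y, _ in region) / len(region)
    cx = sum(x for _, x in region) / len(region)
    return min(speakers,
               key=lambda s: (speakers[s][0] - cy) ** 2
                           + (speakers[s][1] - cx) ** 2)
```

In real strips the bubble's tail, not its centroid, points at the speaker, which is exactly why this is the tricky part.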
posted by vogon_poet at 6:40 PM on July 6, 2015


The gospel of Jesus of Markov: Dropped my pants; fucking you, then your mother.
posted by lathrop at 6:55 PM on July 6, 2015 [1 favorite]


Why yes, I have had this exact day with my son.
posted by dejah420 at 6:59 PM on July 6, 2015 [2 favorites]




Omg. "Some time passed." As it is wont to do...
posted by chainsofreedom at 7:09 PM on July 6, 2015 [1 favorite]


Let's change the subject.

Yes, let's.
posted by xorry at 7:16 PM on July 6, 2015


I like this one as a horse-ebooks-y meditation.
posted by churl at 7:20 PM on July 6, 2015 [1 favorite]


Ewww, Calvin.
posted by Metroid Baby at 7:30 PM on July 6, 2015


...[S]tart each next balloon based on some weird...

Wait, it can get weirder?!? Yes please! :)
posted by riverlife at 7:32 PM on July 6, 2015


lmao
posted by Rustic Etruscan at 8:14 PM on July 6, 2015 [2 favorites]


I love the ones that could alllllmost pass for a real strip.

Big plans.
posted by Kabanos at 8:19 PM on July 6, 2015 [2 favorites]


Designer clothes.
posted by Kabanos at 8:44 PM on July 6, 2015 [2 favorites]






Humidity.
posted by eruonna at 9:04 PM on July 6, 2015


This is great, but what's most surprising is how quickly I can recall the original strip for most pieces of text. C&H obviously imprinted on me far more than I realized.
posted by vanar sena at 9:12 PM on July 6, 2015


I love the ones that could alllllmost pass for a real strip.

I'll be right...
posted by eruonna at 9:15 PM on July 6, 2015 [1 favorite]




Data doesn't think Picard listens to him. (via Previously, on Next Generation)
posted by The Great Big Mulp at 9:46 PM on July 6, 2015


Poetry
posted by The Whelk at 10:39 PM on July 6, 2015


A perfect story prompt
posted by The Whelk at 10:43 PM on July 6, 2015 [1 favorite]


speak in me O Muse
posted by The Whelk at 10:47 PM on July 6, 2015 [1 favorite]








C: I did something bad?
H: You don’t know yet, i can’t decide.

Whoa, man, that is double-rainbows deep.
posted by D.C. at 3:50 AM on July 7, 2015






What was that about actual comic strips?
posted by tocts at 5:43 AM on July 7, 2015 [2 favorites]


Boredom
posted by Mr.Encyclopedia at 5:47 AM on July 7, 2015 [1 favorite]








I'm a little scared right now
posted by Mayor West at 9:25 AM on July 7, 2015




this one is pretty much my childhood.
posted by Foosnark at 11:47 AM on July 7, 2015




I'm a little scared part 2?
posted by some loser at 6:30 PM on July 7, 2015




This thread has been archived and is closed to new comments