Cheap [generators] done [adjective]
May 24, 2021 1:09 PM

GalaxyKate has a very nice long blog post on different approaches to generative content, ranging from distributions to grammars to constraint solving. She also developed Tracery, a grammar-based generative text tool, which has a nice web demo and a how-to guide for integrating with video games, and is apparently also useful for making Twitter bots.
posted by kaibutsu (7 comments total) 28 users marked this as a favorite
 
Tracery feels a lot like Perchance, although I prefer the simple syntax of the latter.

Great blog post, though.
posted by avapoet at 2:48 PM on May 24, 2021


Love Kate’s work; her stuff at the Roguelike Celebration is always great.
posted by astrospective at 8:06 PM on May 24, 2021 [2 favorites]


And Tracery and Perchance are both similar to Improv.

There's also RandomGen, by the same guy who made Cookie Clicker.

I wonder how many procgen libraries there are out there?
posted by Nossidge at 1:31 AM on May 25, 2021


Context-Free Grammars are well-studied both in computer science and as a model of human language, so it's not surprising that there are many libraries available for randomly sampling from a CFG. Some of the things that set Tracery apart are:
* Its comprehensive and interactive tutorial.
* Its integration into casual tools such as Cheap Bots, Done Quick and Cheap Bots, Toot Sweet which do not require any programming besides writing the Tracery grammar, but still allow automatic posting to active social media sites.
* The culture of open and remixable code fostered by those tools.
* Its "variables" feature (part 5 in the tutorial) allowing users to go beyond CFGs into Context-Sensitive Grammars, allowing a wider range of creative expression from learning a single, optional, piece of syntax.
* Some relatively famous applications.
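To illustrate that "variables" point, here's my own rough sketch in Python of what the save-action syntax buys you (this is not how Tracery itself is implemented, and it glosses over modifiers and escaping): a grammar is just rules mapping symbols to expansions, and an action like [hero:#name#] pins one expansion so every later #hero# in the same story stays consistent.

    import random
    import re

    # Minimal, hypothetical Tracery-style expander -- a sketch, not the real library.
    grammar = {
        "name": ["Ava", "Bram", "Chiyo"],
        "job": ["ranger", "alchemist", "bard"],
        # [hero:#name#] saves one expansion of #name# under the symbol "hero",
        # so later #hero# references reuse the same value.
        "origin": ["#[hero:#name#][role:#job#]story#"],
        "story": ["#hero# the #role# set out at dawn. By night, #hero# regretted it."],
    }

    TAG = re.compile(r"#(.+?)#")
    ACTION = re.compile(r"\[(\w+):(.+?)\]")

    def expand(text, rules):
        # First run save actions like [hero:#name#]; they produce no text themselves.
        def do_action(m):
            rules[m.group(1)] = [expand(m.group(2), rules)]
            return ""
        text = ACTION.sub(do_action, text)
        # Then replace every #symbol# with a random expansion of that rule.
        return TAG.sub(lambda m: expand(random.choice(rules[m.group(1)]), rules), text)

    print(expand("#origin#", dict(grammar)))
    # e.g. "Bram the bard set out at dawn. By night, Bram regretted it."

Run it a few times and the hero changes, but it's always the same hero twice within one story, which is the bit a plain CFG can't give you.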

(self-promo) Personally I like using it to generate abstract art in SVG format. Also Context-Free Grammars include Regular Grammars, which include Markov chains. So, here's an online tool to automatically generate a CBDQ-friendly Markov chain Tracery grammar from an input text. (end self-promo)
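The gist of the Markov-chain trick, if you want to roll it by hand (this is just my sketch of the idea, not the actual tool linked above), is to give every word its own rule and make each expansion emit the word followed by a reference to one observed successor, so picking expansions uniformly at random reproduces the chain's transitions:

    import json
    from collections import defaultdict

    # Sketch: encode a first-order Markov chain as a Tracery grammar, one rule per word.
    def markov_to_tracery(text):
        words = text.split()                      # naive tokenisation; no escaping of '#'
        successors = defaultdict(list)
        for a, b in zip(words, words[1:]):
            successors[a].append(b)
        grammar = {"origin": [f"#w_{words[0]}#"]}
        for word in set(words):
            nexts = successors.get(word)
            # Words with no observed successor just terminate the walk.
            grammar[f"w_{word}"] = [f"{word} #w_{n}#" for n in nexts] if nexts else [word]
        return grammar

    print(json.dumps(markov_to_tracery("the cat sat on the mat so the cat napped"), indent=2))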

What Tracery could really do with is a cross-platform/in-browser editor that understands its syntax's use of hashes, and a linter/static analyser that flags tags that are used without being defined.
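The undefined-tag check at least is small enough to sketch here (assuming the grammar is plain JSON, and ignoring save actions and other exotic syntax that a real linter would need to handle):

    import json
    import re

    # Collect every #tag# referenced in any rule and report ones not defined as keys.
    # Modifiers like #tag.capitalize# are stripped; save actions are ignored.
    def undefined_tags(grammar):
        defined = set(grammar)
        referenced = set()
        for expansions in grammar.values():
            for rule in expansions:
                for tag in re.findall(r"#(.+?)#", rule):
                    referenced.add(tag.split(".")[0])
        return sorted(referenced - defined)

    grammar = json.loads('{"origin": ["#greeting#, #planet#!"], "greeting": ["hello"]}')
    print(undefined_tags(grammar))   # ['planet']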
posted by polytope subirb enby-of-piano-dice at 3:19 AM on May 25, 2021 [3 favorites]


Those Truchet patterns are great! (polytope's SVG links, for those passing by.)

I've been daydreaming lately about ways to get better long-term dependency modeling in neural network generators... We have really, really good 'local' generators now, in lots of contexts, which still have some tendency to wander or get weird in longer passages. The obvious example is text generation (GPT-3), but text-to-speech is also a good example. TTS works fantastically well on single utterances these days, but in long passages generally fails at emoting.

So, I find myself thinking about NPCs in role playing games... There's a lot of interesting structure and assumptions you want to encode that make up the character (character class, alignment, a few personality trait tags, some overall life story, and some history of their week and day), and you'd like the text (or speech) generator to take these things into account. The GPT-3 approach is to prefix the generated text with some paragraph of facts+style, which is OK, I guess, but it would be nice to be able to plug in some sequence of discrete variables/tags and get something interesting out. And then take the generated text and use it to get new tags that can be added to the character description. (Again, GPT-3 handles this by just keeping track of some recent generated sentences... A more compact embedding would be great.)

In my day job, we would usually call these tags 'conditioning,' which comes from the notion of conditional probability. The next word generated is picked from a conditional probability distribution, informed by the conditioning inputs. The trick, then, is how to go about training the neural generator to interact sensibly with some hand-designed collection of tags, which I feel like I don't have a great answer for off the top of my head. But, so it goes for daydreaming.
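To make "conditioning" concrete, here's a toy sketch (made-up sizes, PyTorch, not a claim about how any production system does it): embed the discrete tags, squash them into one conditioning vector, and concatenate that onto every token embedding, so the next-word distribution is p(word | history, tags).

    import torch
    import torch.nn as nn

    # Toy conditioned generator: tag embeddings are summed into one vector and
    # concatenated to every token embedding before the recurrent core.
    class ConditionedGenerator(nn.Module):
        def __init__(self, vocab_size=10_000, num_tags=64, emb=256, cond=32, hidden=512):
            super().__init__()
            self.tok_emb = nn.Embedding(vocab_size, emb)
            self.tag_emb = nn.Embedding(num_tags, cond)
            self.rnn = nn.GRU(emb + cond, hidden, batch_first=True)
            self.out = nn.Linear(hidden, vocab_size)

        def forward(self, tokens, tags):
            # tokens: (batch, seq_len) word ids; tags: (batch, n_tags) tag ids
            cond = self.tag_emb(tags).sum(dim=1)                  # (batch, cond)
            cond = cond.unsqueeze(1).expand(-1, tokens.size(1), -1)
            x = torch.cat([self.tok_emb(tokens), cond], dim=-1)   # (batch, seq, emb+cond)
            h, _ = self.rnn(x)
            return self.out(h)                                    # per-step next-word logits

    model = ConditionedGenerator()
    logits = model(torch.randint(0, 10_000, (2, 12)), torch.randint(0, 64, (2, 4)))
    print(logits.shape)  # torch.Size([2, 12, 10000])

The hard part I was waving at is exactly what this sketch says nothing about: where those tag embeddings should come from, and how to train the model so it actually respects them over a long passage.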
posted by kaibutsu at 10:05 AM on May 25, 2021 [1 favorite]


I wrote a grammar-based generator once, but alas its last update was 6 years ago and says "update to modern python2", so I doubt it'll run today without heroics. Oh well.

Of course, you can also just code them in a language you know, whether it's one you love or hate, like this one I created once based on some tables in an RPG manual.

In another foray into RPGs, I made an NPC generator. This is more a matter of throwing things together from lists, but run it a handful of times and you'll find some character that "hangs together".
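For anyone curious what "throwing things together from lists" amounts to, it's really just this (made-up lists here, not the actual generator):

    import random

    # Tiny list-based NPC generator: pick one entry from each list and glue them together.
    OCCUPATIONS = ["fishmonger", "disgraced scribe", "retired adventurer", "beekeeper"]
    QUIRKS = ["collects teeth", "speaks only in questions", "owes the guild money"]
    GOALS = ["wants to leave town quietly", "is looking for an old rival",
             "needs someone to deliver a package"]

    def npc():
        return (f"A {random.choice(OCCUPATIONS)} who {random.choice(QUIRKS)} "
                f"and {random.choice(GOALS)}.")

    for _ in range(3):
        print(npc())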
posted by the antecedent of that pronoun at 2:58 PM on May 25, 2021


... and so I went and updated novelwriting to run on modern python3. Quite a lot easier than I expected.
posted by the antecedent of that pronoun at 11:30 AM on May 26, 2021 [1 favorite]




This thread has been archived and is closed to new comments