How to extract the game data (the hard way)
September 8, 2016 8:49 AM   Subscribe

"When we decided to remake The Dragon’s Trap, one of the primary goals we set for ourselves was to make a game that is faithful and respectful to the original in terms of gameplay [...] We want the timings, physics, and reactions to feel just right and not like a cheap remake [...] We don’t want to replicate all of it by guesswork and lose some of the original magic due to translation. So I went and started digging into the original code… "
posted by griphus (27 comments total) 17 users marked this as a favorite
 
So we’ve extracted the data from the original game! The data fits in a 256-kilobyte file, i.e. 262,144 bytes.
My wife got sick of her machine and brought home a new "pretty fast" one last night. 8 gig of RAM. It's not even another world any more.
posted by hawthorne at 9:03 AM on September 8, 2016 [1 favorite]


MY OLD FRIEND!!!!!!! (eyes sparkle) I loved this game!
posted by Dressed to Kill at 9:06 AM on September 8, 2016 [1 favorite]


My wife got sick of her machine and brought home a new "pretty fast" one last night. 8 gig of RAM. It's not even another world any more.

That's pretty minimal these days. My work laptop has 16GB.
posted by octothorpe at 9:10 AM on September 8, 2016


I'm quite nostalgic for this game, as well as the ones in the series that came right before and right after. They were Sega's action-RPG response to Zelda, and they were very good.
posted by painquale at 9:18 AM on September 8, 2016


There are actually two remakes of The Dragon's Trap in the works. The first is the straight remake, and the other is Monster Boy, which also looks absolutely gorgeous.
posted by Talez at 9:19 AM on September 8, 2016 [1 favorite]


This is a fairly brief post that (despite its warning) didn't get nearly as technical as I'd have liked, but I really love reading about this kind of spelunking into old code. As with this post, one of the main takeaways is usually "man, people used to go to extraordinary lengths to work around the hardware limitations of the time".

These days, we just throw more RAM and more CPU power at the problem. Back then, programmers (not "developers", yet) were forced to be ingenious.

I'm not saying I want to go back to that, but I do find myself nodding and muttering "respect, brother" across the decades when I read this stuff.
posted by escape from the potato planet at 9:19 AM on September 8, 2016 [11 favorites]


The example of the graphing tool plotting the memory at run time is so satisfying and I could watch an extended version of it for an hour.
posted by MCMikeNamara at 9:42 AM on September 8, 2016 [5 favorites]


This was a great read. Thanks! I really liked the gif with the custom emulator.

The hardware hackery is cool, too. On a sorta related note, I actually made use of that unused-by-anything-official expansion port on the back of the Sega Master System to install one of these for a friend. Really sounds great!
posted by destructive cactus at 10:10 AM on September 8, 2016 [1 favorite]


There are so many cool visualizations and debugging tools that could be added to modern emulators. MAME has a simple debugger but it's not very pretty.

Sadly, modern software is so complicated that reverse-engineering an old Sega game seems to me a lot easier (and more fun) than building and extending the MAME hairball.
posted by RobotVoodooPower at 10:16 AM on September 8, 2016


After listening to the recent Retronauts podcast about the Wonder Boy/Adventure Island/Monster World series I can't even think about these games without going cross-eyed. Still, I'm glad Monster Boy is continuing the tradition of completely inexplicable naming conventions.
posted by Mr.Encyclopedia at 10:26 AM on September 8, 2016 [1 favorite]


That's pretty minimal these days. My work laptop has 16GB.

Hahaha...I read a while back (around the time CPUs started switching from 32- to 64-bit architectures) that a 32-bit CPU was capable of addressing up to 8GB of ram...a 64-bit CPU is capable of addressing something like 16 billion GB of ram. So, yeah...you ain't seen nuthing yet. Reality itself is just gonna start to melt.
posted by sexyrobot at 10:30 AM on September 8, 2016 [1 favorite]


32-bit CPUs can address 4 GiB of RAM, because 2^32 B = 4 GiB.
posted by NoxAeternum at 10:48 AM on September 8, 2016


32-bit CPU was capable of addressing up to 8GB of ram

A 32-bit pointer represents 4GB of address space. However, it was common for the CPU to allow multiple 32-bit values to be combined to address more than 4GB of physical memory; Intel chips did this as early as the Pentium Pro. This is somewhat similar to how 16-bit CPUs could address more than 64K of memory.
posted by aubilenon at 10:54 AM on September 8, 2016 [5 favorites]
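
The arithmetic behind aubilenon's comment can be sketched in a few lines. This is my own illustration, not from the thread: the segment-shift-by-4 scheme is standard 8086 real-mode behaviour, which is how 16-bit chips reached past a 64K pointer, and the last line is just the flat 32-bit limit NoxAeternum states.

```python
# Sketch: how a 16-bit 8086-style CPU combines two 16-bit values
# (segment and offset) to address more than 64K of physical memory.
def real_mode_linear(segment: int, offset: int) -> int:
    """8086 real mode: linear address = segment * 16 + offset."""
    return ((segment << 4) + offset) & 0xFFFFF  # masked to the 20-bit address bus

# A flat 16-bit pointer tops out at 64K...
assert 0xFFFF == 64 * 1024 - 1
# ...but segment:offset reaches the full 1 MB (20-bit) address space.
assert real_mode_linear(0xF000, 0xFFFF) == 0xFFFFF

# Similarly, a flat 32-bit pointer covers exactly 4 GiB,
# which is why going past that needed PAE-style tricks.
assert 2**32 == 4 * 2**30
```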


Reminds me of when I hacked the data file for Space Empire on my CNet Amiga BBS using a hex editor in order to allow me to play multiple times per day.

I ruined it for everyone. :(
posted by grumpybear69 at 12:33 PM on September 8, 2016 [1 favorite]
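
For anyone curious, the kind of byte-patch grumpybear69 describes can be done today with a few lines of Python instead of a hex editor. This is a generic sketch: the offset name and value are hypothetical, not anything from Space Empire's actual data format.

```python
# Patch a single byte in a binary data file, hex-editor style.
# The offset here is hypothetical; a real hack means first finding
# where the game stores the value you care about.
import os
import tempfile

PLAYS_LEFT_OFFSET = 0x10  # hypothetical location of "plays remaining"

def patch_byte(path: str, offset: int, value: int) -> None:
    with open(path, "r+b") as f:   # read/write binary, in place
        f.seek(offset)
        f.write(bytes([value]))

# Demo against a throwaway file standing in for the game data:
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.write(bytes(32))                       # 32 zero bytes of fake save data
patch_byte(path, PLAYS_LEFT_OFFSET, 0xFF)    # max out plays per day
with open(path, "rb") as f:
    assert f.read()[PLAYS_LEFT_OFFSET] == 0xFF
os.remove(path)
```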


Space Empire or Space Dynasty ? (I was a WWIV BBSer, and yeah, ran my own BBS just for the doors.. )
posted by k5.user at 1:01 PM on September 8, 2016


After listening to the recent Retronauts podcast about the Wonder Boy/Adventure Island/Monster World series...

As a devoted Retronauts listener who got maybe 25 mins into it, I commend your endurance.
posted by griphus at 1:08 PM on September 8, 2016


Funnily enough, I've been messing around with z80 code and tweaking a favourite game recently (to a much lesser extent than on show here). I love, love, love reading or thinking about this sort of thing, revisiting a classic (or better, favourite) game to tease out its subtleties, either by disassembly or obsessive play. Great stuff! Fiction has fanfic, games have remakes/ROMhacks. We own our experiences and interpret them as we choose.

On a sorta related note, I actually made use of that unused-by-anything-official expansion port on the back of the Sega Master System to install one of these for a friend. Really sounds great!
Cannot emphasise how cool that is. My platform of choice (Sinclair Spectrum) came with detailed instructions about what the various external connections did and how they could be employed, but the jump from software to hardware was so beyond me. I don't think half the potential of the '80s micro-computers (and apparently consoles too!) was ever used.
posted by comealongpole at 2:10 PM on September 8, 2016 [2 favorites]


I don't think half the potential of the '80s micro-computers (and apparently consoles too!) was ever used.

I totally agree. As you mentioned Z80 - I've been trying to justify getting one of these kits to try to apply my adult brain to the stuff I couldn't handle when I was a kid, when these computers were around. The really ironic part is, there's even a module for this thing that lets you plop a Raspberry Pi Zero into it, to handle 'modern' console I/O (usb keyboard, hdmi monitor).

Sorry to veer so far off-topic! Anyhow, I'm really glad that the Wonder/Monster/Adventure Boy/World/Island series is in such good hands!
posted by destructive cactus at 3:24 PM on September 8, 2016 [2 favorites]


That looks so cool, cheers for linking or it would have flown right under my radar. Reifying the 8-bit computing experience tickles me so much. Dread to think how long that Mandelbrot image took to render, I remember waiting "forever" for my PS2 to produce a comparable image in YABASIC.

...try to apply my adult brain to the stuff I couldn't handle when I was a kid...
So much this! I learned so much from early computing which surprised me when it proved to be generally applicable, but equally so much eluded me too. Ironically, I think I get more and grok less these days. Might be a time/commitment/focus factor.
posted by comealongpole at 3:53 PM on September 8, 2016 [1 favorite]


don't think half the potential of the '80s micro-computers (and apparently consoles too!) was ever used.

You've seen Elite on a BBC Micro, right? A 32KB 6502?

And I can tell you for sure that every gate, every transistor and every clock cycle on the ZX Spectrum got fully utilised in that machine's history. That thing was a bare-bones Z80 platform with no hardware support for anything - the keyboard, display overscan, sound and tape IO were directly mapped to the CPU IO ports, there was one (1) display mode at 256x192 with a single fixed buffer and 1:1 pixel/bit mapping on luminance and 3:3:2 bit to 8x8 pixel square colour/effect attribute, 16K of the address space was the BASIC ROM, 48K was RAM, the Z80A ran at 3.5 MHz (less than that if it was accessing the 16K of RAM contended with the video hardware) and that. was. it.

24,000 titles were written for that platform.

By the end of its life as a primary target, there was absolutely nothing left unused or unabused in that thing.

Nothing.
posted by Devonian at 4:52 PM on September 8, 2016 [4 favorites]
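
Devonian's display figures check out on the back of an envelope. This is just my arithmetic on the numbers given above (256x192 at 1 bit per pixel, one colour/effect attribute byte per 8x8 square), not Spectrum-specific code.

```python
# Sanity-check the ZX Spectrum frame buffer size from the specs above.
WIDTH, HEIGHT = 256, 192

bitmap_bytes = WIDTH * HEIGHT // 8           # 1:1 pixel/bit luminance plane
attr_bytes = (WIDTH // 8) * (HEIGHT // 8)    # one attribute byte per 8x8 cell

assert bitmap_bytes == 6144
assert attr_bytes == 768
# The entire fixed display buffer is under 7K of the machine's 48K of RAM:
assert bitmap_bytes + attr_bytes == 6912
```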


(and don't get me started on the ZX81. Knowing how to frob the I and R registers on that was just the start, and most Z80 programmers don't even know it has I and R registers)
posted by Devonian at 5:02 PM on September 8, 2016 [1 favorite]


Most programmers these days don't know what a register is.
posted by octothorpe at 5:35 PM on September 8, 2016 [1 favorite]


Well, depends if they're getting a CS/CE degree or learning somewhere else. At least as recently as 2013, a microprocessor course was a graduation requirement for my CS program. I suspect these classes are single-handedly keeping Motorola's RISC-based chip foundries alive.

Now, whether any CS grads remember anything from their required microprocessor course is another matter entirely.
posted by tobascodagama at 6:17 PM on September 8, 2016


None of the young programmers with CS degrees that I've worked with lately seem to have taken any hardware or machine language classes at all. They don't seem to have worked with anything lower level than Java.
posted by octothorpe at 7:48 PM on September 8, 2016 [1 favorite]


I was talking to a CS professor a couple of years back at an Intel processor event for one of their experimental non-GPU massively multicore devices - he was head of a department which had been working with Intel R&D and providing academic input on architectural design for scientific HPC.

He said that when he was interviewing potential students for PhD and post-doc roles, he had always made a point of being very favourable for those who had been on the demo scene, which he found a better indicator of potential than the more traditional academic pathways, and that he was now having trouble finding good people as that scene had largely gone. It was that lack of true low-level understanding and experience he was bemoaning.

I had the same message from someone else in industry, who used to have a standard interview question where they got a candidate to write a small routine in a high-level language (didn't matter which; even pseudo-code was fine) and then hand-compile it down to assembler - again, it didn't have to be a real assembler, as long as it showed knowledge. The real stars he wanted to hire would go beyond basic ideas and talk about how compiler technology interacted with processor architecture. Nobody could do that these days.

So when you kvetch about modern CS types not having a reality-based internal model of what actually happens when computers run code, you're right - but the problem isn't just academia or lack of insight in industry, it's that the truly talented who can gain and use such knowledge aren't teaching themselves, because the technological culture which made it the most interesting thing to do has gone.

And it's still important. You can still bring a powerful system to its knees by having it run stuff that breaks its pipelines or caching, no matter how artsy-fartsy the overlying virtualised containerised massively distributed platformed powerpoint in the brochure. It's the reason that the only place you'll get people still arguing about x86 architectural quirks is at Def Con et al, because security is one of the few places where you can't gloss over inadequacies by bigger, shinier, fuzzier.

This is where the bits live. This is how computer do.
posted by Devonian at 5:56 AM on September 9, 2016 [5 favorites]
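
A toy version of the interview exercise Devonian describes might look like the following. This is my own sketch, not the interviewer's actual question: a trivial high-level routine, hand-compiled for a made-up accumulator machine, with a tiny interpreter to show the two agree.

```python
def sum_to(n):            # the "high-level" routine
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

# Hand-compiled equivalent for an imaginary two-register machine.
# Each instruction is (opcode, operand); registers: ACC, CNT.
PROGRAM = [
    ("LDI", 0),     # ACC <- 0            (total = 0)
    ("LDC", 1),     # CNT <- 1            (i = 1)
    ("ADD", None),  # loop: ACC <- ACC + CNT
    ("INC", None),  #       CNT <- CNT + 1
    ("BLE", 2),     #       if CNT <= n, branch back to the loop
    ("RET", None),  # return ACC
]

def run(program, n):
    acc = cnt = pc = 0
    while True:
        op, arg = program[pc]
        if op == "LDI":
            acc = arg
        elif op == "LDC":
            cnt = arg
        elif op == "ADD":
            acc += cnt
        elif op == "INC":
            cnt += 1
        elif op == "BLE":
            if cnt <= n:
                pc = arg
                continue
        elif op == "RET":
            return acc
        pc += 1

assert run(PROGRAM, 10) == sum_to(10) == 55
```

The interesting follow-up (the part the "real stars" got to) is noticing things like the branch-back at the bottom of the loop and asking how a pipelined CPU would predict it.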


I remember tinkering with a disassembler on the Amiga 1200 back in the day to crack shareware and create keygens. It was purely for fun and learning, although I do feel a bit bad about it now.

But what I can't fathom today is how I managed to do all that without having access to the internet. I actually found out everything by myself, and I just can't imagine being able to do that anymore.

It makes me feel stupid.
posted by Captain Fetid at 7:28 AM on September 9, 2016 [1 favorite]


I still often use my ti-99/4a, so hang out in the AtariAge ti-99/4a forums. There is a very active community making new programs and hardware for those machines, programming in assembly, pushing the machines to their absolute limits. I am in awe of what some of those guys are capable of getting that ancient hardware to do. I played around with basic programming back in the day, but I still can't get my head around machine-language programming.
posted by fimbulvetr at 8:23 AM on September 9, 2016 [1 favorite]




This thread has been archived and is closed to new comments