It doesn't track IP addresses
January 4, 2014 7:39 AM

Source Code in TV and Films reveals what the code for that GUI interface in Visual Basic is really for.
posted by griphus (89 comments total) 30 users marked this as a favorite
 
the girl just said she is creating an interface in VB to track IP addresses.

What else do you need to know?
posted by Colonel Panic at 7:47 AM on January 4, 2014 [3 favorites]


I bet it's using an Access database too.
posted by blue_beetle at 7:49 AM on January 4, 2014 [3 favorites]


In the TV series Arrow some C source code is shown for calculating the position of Jupiter’s Galilean moons.

That's kinda cool. So is Arrow a good series or what?
posted by Foci for Analysis at 7:52 AM on January 4, 2014 [1 favorite]


Assuming you can appreciate the YOU HAVE FAILED THIS CITY melodrama for what it is and sit through the network television-enforced Emotional Scenes, then Arrow is a pretty good series, yeah.
posted by griphus at 7:54 AM on January 4, 2014 [3 favorites]


(Also it gets a lot better when you realize that everyone working on it actually wants to do a Batman show and knows this is as close as they're going to get.)
posted by griphus at 7:55 AM on January 4, 2014 [5 favorites]


The Terminator's 6502 origins are shown, so this list is complete.

I had noticed the x86 assembly language in Elysium but had no idea where it was cribbed from. Seeing the sources tracked down is quite cool.
posted by localroger at 7:55 AM on January 4, 2014 [2 favorites]


It's kind of sad that they don't know to pick any really interesting languages. Like APL.
ackermann←{
     0=1⊃⍵:1+2⊃⍵
     0=2⊃⍵:∇(¯1+1⊃⍵)1
     ∇(¯1+1⊃⍵),∇(1⊃⍵),¯1+2⊃⍵
 }
Or Brainfuck.
++++++++++>>+<<[->[->+>+<<]>[-<+>]>[-<+>]<<<]
Or Chef.
Stir-Fried Fibonacci Sequence.
 
An unobfuscated iterative implementation.
It prints the first N + 1 Fibonacci numbers,
where N is taken from standard input.
 
Ingredients.
0 g last
1 g this
0 g new
0 g input
 
Method.
Take input from refrigerator.
Put this into 4th mixing bowl.
Loop the input.
Clean the 3rd mixing bowl.
Put last into 3rd mixing bowl.
Add this into 3rd mixing bowl.
Fold new into 3rd mixing bowl.
Clean the 1st mixing bowl.
Put this into 1st mixing bowl.
Fold last into 1st mixing bowl.
Clean the 2nd mixing bowl.
Put new into 2nd mixing bowl.
Fold this into 2nd mixing bowl.
Put new into 4th mixing bowl.
Endloop input until looped.
Pour contents of the 4th mixing bowl into baking dish.
 
Serves 1.
posted by sonic meat machine at 7:58 AM on January 4, 2014 [30 favorites]


any really interesting languages. Like APL.

Has anyone checked out the alien wrist display in Predator?
posted by localroger at 8:07 AM on January 4, 2014 [3 favorites]


Or Chef

What did I just read?
posted by klausman at 8:17 AM on January 4, 2014 [3 favorites]


I wonder what OS they use to enable Double Hacking.
posted by Pogo_Fuzzybutt at 8:20 AM on January 4, 2014 [2 favorites]


The problem with Brainfuck and Chef is that they are Turing-complete programming languages designed not to look like programming languages. Movie makers want props that do look like programming languages.

On reflection, I am convinced that the CSI VB-IP clip is not down to ignorance but deliberate fuckery on the part of the writer, who probably spent at least an hour brainstorming the most ridiculous thing he could slip past the rest of the production crew.
posted by localroger at 8:21 AM on January 4, 2014 [15 favorites]


I just watched GI Joe: Retaliation last night, and someone mentioned sending out a "cyber blast," so, considering they used the phrase "GUI interface" and pronounced "GUI" correctly, I would agree with the deliberate fuckery part.
posted by griphus at 8:25 AM on January 4, 2014


On reflection, I am convinced that the CSI VB-IP clip is not down to ignorance but deliberate fuckery on the part of the writer…

I disagree. I figure that one's probably just ignorance. I think the "double hacking" video (that Pogo_Fuzzybutt posted) is more likely to be intentional fuckery.
posted by sonic meat machine at 8:25 AM on January 4, 2014


There were some terrible metaphors for code/cryptography in The Fifth Estate. I wish I could recall them exactly, but they were so bad I decided to look up how Rubber Hose and Wikileaks roughly worked on my own, so mission accomplished, I guess.
posted by mccarty.tim at 8:30 AM on January 4, 2014 [1 favorite]


Sometimes I turn on the local TV for 5 minutes before turning it back off.

The TV said: “SnapChat was hacked by exploits in their server.” I was thus enlightened.
posted by saber_taylor at 8:42 AM on January 4, 2014 [1 favorite]


VB-IP clip is not down to ignorance but deliberate fuckery on the part of the writer, who probably spent at least an hour brainstorming the most ridiculous thing he could slip past the rest of the production crew.

Or had some produce placement directive dumped on them
posted by fearfulsymmetry at 8:48 AM on January 4, 2014 [1 favorite]


(That sort of shit was really irritating in Homeland)
posted by fearfulsymmetry at 8:49 AM on January 4, 2014


For those not familiar with media production terminology, "produce placement" is when a commercial grocery chain pays a film-maker a sum of money in order to have e.g. a head of lettuce featured prominently in an establishing shot.
posted by cortex at 9:02 AM on January 4, 2014 [32 favorites]


Don't mess with Big Rutabaga...
posted by chavenet at 9:04 AM on January 4, 2014 [3 favorites]


I irritated my wide by making her pause Grey's Anatomy once so I could read the code they used as part of the title sequence, which was the same meaningless Java they show there.
posted by Artw at 9:29 AM on January 4, 2014 [1 favorite]


Or had some produce placement directive dumped on them

Nah, that's more likely to be visible as "Here, let me use my Microsoft Surface for that!".
posted by Artw at 9:31 AM on January 4, 2014 [1 favorite]


I irritated my wide

You haven't seen irritated yet.
posted by Horace Rumpole at 9:39 AM on January 4, 2014 [15 favorites]


Hey, that's no broad, she's my wide.
posted by Strange Interlude at 9:43 AM on January 4, 2014 [19 favorites]


Funny, Whelk made the same typo a couple of hours ago.
posted by octothorpe at 9:57 AM on January 4, 2014


Grmph.
posted by Artw at 9:57 AM on January 4, 2014


What sort of monster prints out source code in a non-monospace font? At least it's not Comic-Sans.
posted by MrBobaFett at 10:10 AM on January 4, 2014 [4 favorites]


There should be one standard prop source code that all movies/TV could use like the standard prop newspaper.
posted by octothorpe at 10:30 AM on January 4, 2014 [3 favorites]


wide is the new teh.
posted by mondo dentro at 10:35 AM on January 4, 2014 [1 favorite]


> The Terminator's 6502 origins are shown, so this list is complete.

A peak moment in movies for me. Hey, that looks just like an App][ system ROM dump! That Jobs, I knew he was a wrong'un.
posted by jfuller at 10:36 AM on January 4, 2014 [2 favorites]


A plot point in a first season episode of Elementary revolved around a snippet of code written in "Malbolge," a programming language named after a circle of Dante's hell and allegedly designed to be impossible to read.

My girlfriend and I were cracking up at how ridiculous that was, how it was clearly created by clueless TV writers who don't understand the first thing about programming, when I googled it on a whim and okay television you win this round.
posted by Ian A.T. at 10:43 AM on January 4, 2014 [7 favorites]


My favorite on-screen code was that in Attachments, a British Channel 4 drama centering on a web development company during the Dot Com boom. Anytime they needed to show anyone coding, which was fairly frequent, they'd always be editing the same CSS file, over and over...
posted by Artw at 10:46 AM on January 4, 2014 [5 favorites]


They probably have to support Internet Explorer, Artw.
posted by wachhundfisch at 10:53 AM on January 4, 2014 [14 favorites]


Thanks to sonic meat machine's comment, I've discovered that not only is there a Wikipedia page on esoteric programming languages, but a whole wiki devoted to the same, which makes me so very happy. Time to go hone some skills of questionable use!
posted by kilo hertz at 11:03 AM on January 4, 2014 [1 favorite]


A lot of these obviously fall into the category, "no one will ever notice, so have the art department do the best they can in ten seconds." (e.g. HTML from Engadget, an SVG file from Wikipedia.) Understandable, but still a wasted opportunity.

But, it's impossible not to wonder about the ones that seem to have been carefully placed as an obscure tribute or inside joke, such as the AwesomeWM source. And I love that the Superman clips show the source for the (goofy) program that actually gets run in the film. Would love to hear the conversation between the filmmakers and the person who was handed the script and had to write that code.

On the whole, kind of disappointing that so few of them are original or do anything interesting. I've had a few friends work on films who've spent way more time than it was worth carefully inserting obscure personal references into the props of movies that wind up being four pixels wide and illegible. Hard to imagine the people responsible for these wouldn't take the opportunity to have more fun with it.
posted by eotvos at 11:04 AM on January 4, 2014 [1 favorite]


I worked in the art department of a procedural which commonly used computer screens as a plot device. Directors and producers have zero interest in verisimilitude when it comes to computer screens in film/TV. We were frequently asked to do highly unrealistic things with the imagery in order to sell a certain plot point or make some piece of information more visually available to viewers.

Beyond that, the idea that anyone in film/TV actually gives a shit what the code really means is laughable. There is literally less than zero oversight for this sort of thing. When I was doing it I tried to at least start from a place of the right kind of thing (so for example if the script calls for web source code, I would try to find the right sort of website at least). And that was a lot more "realistic" than most people bother with, and wildly more legit than what people up the ladder wanted it to look like.

Also, we recycled screens and screen elements all the time. Why spend 20-40 hours building a "someone is coding something" screen from scratch when you could just use the same "someone is coding something" screen as before with a few necessary modifications?

It's actually EXTREMELY FUCKING DIFFICULT to make the screen graphics, and it was pretty much a full-time job, even two full-time jobs, on the show where I did this.
posted by Sara C. at 11:05 AM on January 4, 2014 [6 favorites]


Someone needs to get these folks to use Hacker Typer. I love to pull this up in a full screen window and tell someone I need to reformat their overcoils and defrag the modem. The bewildered look of slack-jawed amazement is priceless.
posted by Enron Hubbard at 11:06 AM on January 4, 2014 [15 favorites]


Also, you guys know that these websites and programs and things aren't really real, right?

What you see on the screens is made from scratch in Photoshop.

There is no "the actual code from the actual program that really did the thing" in a Superman movie. They wouldn't have hired a programmer to actually make any working software, unless it was done as a marketing thing to create content they could release as an app or something.

The closest we ever came to "real code for a real program" was an episode about violent video games where we hired a VFX company to design a fictional first person shooter for us. Even then, it wasn't a real playable game, and it wasn't made by a video game designer. We just farmed out the animation because it was beyond our skill set, and then created everything else based on the graphics the VFX company came up with.
posted by Sara C. at 11:12 AM on January 4, 2014


A caveat about the specific Superman II graphic: I actually don't know how this sort of thing was done in the 80s. With unsophisticated computers that couldn't display high-res images the way computers can today, you clearly couldn't build the screens in Photoshop.

So, I don't know, maybe they paid someone to just write the program they needed?
posted by Sara C. at 11:26 AM on January 4, 2014


they'd always be editing the same CSS file, over and over...

To be fair, that fairly accurately encapsulates the actual process of web development since the advent of CSS (or replace CSS with XML, Java, JavaScript, Ruby, XHTML, etc.), which is to say basically since the advent of web development itself.

So, I don't know, maybe they paid someone to just write the program they needed?

I'm assuming you guys are talking about Richard Pryor's "hacking" scenes in Superman III where he's sitting at a big old dumb terminal in the machine room of a mainframe.

If so, that program would take seconds to simulate and mock-up in BASIC or nearly any scripting language to be displayed on a dumb terminal. Hell, it could have been recorded as plain TTY, VT100 or ASCII terminal signals and played from a compact cassette data tape as 300 baud modem noises hooked up to a terminal.

I used to make BASIC and MS-DOS .BAT files, often recursive, that liked to spit out a lot of random or patterned text that looked very uber k-rad scrolling by. It took seconds to write a small BASIC script or BAT file that could call up some text file or random number generator, then write another script that called the first script a random number of times, and so on.

In the space of about 5-20 lines you can write a text bomb that will basically endlessly output significant looking text and numbers scrolling as fast as your poor old computer's video buffer can handle.
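The same trick today takes about as long. Here's a rough sketch of the idea in TypeScript rather than BASIC (not one of the actual old scripts; the status words and format are invented for the example):

const WORDS = ["SYNC", "XFER", "ACK", "SEGV", "NOP", "TRAP", "DMA", "IRQ"];

// Random hex of a given width, zero-padded so it looks like addresses/registers.
function hex(max: number, width: number): string {
  return Math.floor(Math.random() * max).toString(16).toUpperCase().padStart(width, "0");
}

// One line of official-looking nonsense.
function line(): string {
  const word = WORDS[Math.floor(Math.random() * WORDS.length)];
  return `0x${hex(0xffffffff, 8)}  ${word.padEnd(4)}  ${hex(0xffff, 4)} -> ${hex(0xffff, 4)}`;
}

// Scroll significant-looking text as fast as the terminal will take it, until Ctrl-C.
setInterval(() => console.log(line()), 5);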
posted by loquacious at 11:45 AM on January 4, 2014 [2 favorites]


Also, you guys know that these websites and programs and things aren't really real, right?

are you saying that for the matrix they did not actually invent machines that the actors plugged into their brains

because idk it looked pretty real so i think maybe you are wrong
posted by elizardbits at 12:04 PM on January 4, 2014 [18 favorites]


What sort of monster prints out source code in a non-monospace font? At least it's not Comic-Sans.

What cracks me up is when web forums using the same old phpBB base as everyone else style it such that the [code] tag presentation is a proportional font. It doesn't really matter when it's the Ford F-150 fans forum or whatever, but it's still kind of hilarious.
posted by George_Spiggott at 12:15 PM on January 4, 2014


You gotta remember that lots of film and TV show workers don't have easy access to techie-types. They'll rely on almost anyone who seems the least bit convincing to create their tech props.
posted by benito.strauss at 12:23 PM on January 4, 2014 [1 favorite]


I've always assumed that most of the time when they have a whizzy interactive UI onscreen it's done in Flash. That's not to suggest they bother to make it genuinely interactive -- you can, but that's pointless here -- but it's very quick to develop a plausible looking scripted UI animation and fine-tune the timing so stuff zooms in and whisks out of the way and so forth.
posted by George_Spiggott at 12:30 PM on January 4, 2014


are you saying that for the matrix they did not actually invent machines that the actors plugged into their brains

As far as hacking tools, The Matrix used nmap correctly.
posted by eyeballkid at 12:32 PM on January 4, 2014 [2 favorites]


Given that it was shot on film in 1983, it sure seems like it would have been easier to film actual computer screens printing text on a screen than to do anything else in Superman 3. (I haven't seen the film, and am basing that only on the link from the original article, so there may be other more complicated displays shown elsewhere.)

Out of idle curiosity, I tried looking for the earliest instances of composited video displays in film. . . and convinced myself that I don't know nearly enough about the industry or its terminology to search for that information. I'd be curious to know when, in general, it became cheaper to add a computer display in post than to put a physical display on set and then have to mess with sync and lighting. I'd naively guess late-90s for non-futuristic interfaces, but I could well be off by decades.

But, even if everything is painted onto the scene afterward, someone still had to decide on the specific elements to use. I'm not at all surprised that those in charge don't care about the details, or disappointed that it usually doesn't make sense. But, when someone does go the extra mile and waste their time on details, such Easter eggs are delightful.

Also, this discussion prompted me to follow back links to the meatfighter website, which looks to be full of nifty things.
posted by eotvos at 12:34 PM on January 4, 2014 [1 favorite]


Hell, it could have been recorded as plain TTY, VT100 or ASCII terminal signals and played from a compact cassette data tape as 300 baud modem noises hooked up to a terminal.

This is much closer to how computers are depicted on screen nowadays, so this is pretty compelling to me. But I have no idea how it was actually done aside from the fact that the reflections on the screen in the screen shots imply that it wasn't designed separately and burned into the screen in post-production, which is another way this has been done in the past.
posted by Sara C. at 12:39 PM on January 4, 2014


You gotta remember that lots of film and TV show workers don't have easy access to techie-types.

This isn't entirely true; it's more that it's just not really worthwhile in terms of time or money to be ultra-realistic about this stuff. So you tend to get things that are realistic if the people making them know something about that particular aspect of tech, but once you get into what it looks like to work with Ruby on Rails, who even knows? Just use the thing we used last time there was a hacker or a programmer.

On the other hand, as audiences become more tech savvy and stories revolve more around techy stuff, I'm curious about exactly how realistic this will need to get going forward.
posted by Sara C. at 12:46 PM on January 4, 2014


I realize that in consecutive comments I just referred to the same thing as "hard to imagine" and "not at all surpris[ing]." I probably ought to stop talking now before I come up with yet another vague and incompatible opinion to express. *sigh*
posted by eotvos at 12:47 PM on January 4, 2014 [3 favorites]


I've always assumed that most of the time when they have a whizzy interactive UI onscreen it's done in Flash. That's not to suggest they bother to make it genuinely interactive -- you can, but that's pointless here -- but it's very quick to develop a plausible looking scripted UI animation and fine-tune the timing so stuff zooms in and whisks out of the way and so forth.

This is true when the screen in question needs to have a lot of movement, but generally not true if it's not strictly required. And in my experience of it, doing Flash animations required infinitely more work than just having one static screen image, or a slideshow of static screen images that could be advanced by pressing any key. Which is what we mostly did unless it was written into the script that some more complicated animation needed to happen.

Like if you see an insert of a cursor moving over a link and clicking it, that would be Flash. If you see CSI Bro Du Jour looking at some kind of ballistics report on a monitor, it's much more likely to be one static image, or a slide show of static images.

I'm curious how this is being done for iPads, as they were really just coming into mundane use when I left the art department and stopped having to design these things.
posted by Sara C. at 12:53 PM on January 4, 2014


The other thing that is a factor in this, aside from time/money, is that the people most likely to know a little of the thing we're trying to create for the screen are the people least likely to have our opinions respected in a meeting with higher ups.

Not to say that directors and producers are all jerks, but the very nature of filmmaking is that they are in charge of telling the story in the way they want to tell it. It's not about the graphic designer and the research they did into what the SnapChat backend actually looks like. So we don't get a vote. If the director wants it bigger, or in a different typeface, or wants the words "DOES NOT COMPUTE. RETRY? Y/N" in there, you have to make those changes.
posted by Sara C. at 1:08 PM on January 4, 2014


Like if you see an insert of a cursor moving over a link and clicking it, that would be Flash.

That surprises me. I'd expect a prop assistant to just move the mouse and click, kinda like how Star Trek had them opening the sliding doors so Kirk doesn't break his nose walking into them. As you say, I was thinking of when the show calls (and is budgeted) for a sophisticated looking UI with zoomy sweepy animation and translucence and such. Dead easy to do in Flash with little or no actual coding but way overkill for a static website or console interface.
posted by George_Spiggott at 1:43 PM on January 4, 2014


There's a scene in Iron Man 2 where Black Widow is hacking into the bad guy's something or another (I barely remember that movie) and it's just plain HTML.
posted by brundlefly at 1:46 PM on January 4, 2014 [1 favorite]


I'd expect a prop assistant to just move the mouse and click, kinda like how Star Trek had them opening the sliding doors so Kirk doesn't break his nose walking into them.

The problem is that the program or website in question isn't real. You can't click on a link that's just a static image.

There is just no way to hire a team of web developers to create a functional website full of real content for a thirty second scene in a movie or TV show. On a TV production schedule, especially, it would be completely absurd.
posted by Sara C. at 1:51 PM on January 4, 2014


Well, whomping up a couple of HTML pages with links between them is pretty quick, but that does explain why web pages look so odd on TV shows, with great big username fields and a big jumbo unrealistic mouse pointer that you can't really miss even on an SD TV screen.
posted by George_Spiggott at 1:54 PM on January 4, 2014


I caught a few minutes of Stealth a couple years back, just at the part where they were going through the source code for the rogue AI. I shit you not, it was written in LaTeX.

No wonder the thing went crazy.
posted by aw_yiss at 1:58 PM on January 4, 2014 [5 favorites]


but that does explain why web pages look so odd on TV shows, with great big username fields and a big jumbo unrealistic mouse pointer that you can't really miss even on an SD TV screen.

No, that has nothing to do with it. We can make static images that look identical to anything you've ever seen on your computer screen in real life.

The problem is that the director will say "The cursor is hard to see. Can we make it a bright color?", or "I can't see where the username is. We need to make it bigger. No bigger. Really. BIGGER. Like twice as big as this." Not to mention that half the time the script is asking for something that would never exist in real life, or some external factor (the actor's comfort level, the blocking for the scene, etc) forcing constraints that wouldn't otherwise exist.

That's another angle on the reason we use static images rather than actually designing a fully functional website, though. That way we have full control over every aspect of the way the finished product looks and behaves. If we make a real website, what happens when the director wants the browser tabs to be a different color?

We also ran most non-laptops on Mac Minis, because they were small and easy to work with, but didn't always want to indicate that the person was using the Mac OS. We can't exactly create our own proprietary operating system from scratch. And if we did, well, again, what happens when this week's script calls for something specific, or next week's director would prefer something different?

Creating everything from scratch as a static image or Flash animation is just a hell of a lot easier than either giving up all aesthetic control or trying to make your own real functional software for $10,000 with eight days of prep time.
posted by Sara C. at 2:06 PM on January 4, 2014 [2 favorites]


Oh, I've just realized another reason to use Flash for mundane pages when HTML would be easier: web pages full of results usually render all at once, but directors seem to think it's more dramatic to have the data appear, with beepity-beepity sound effects, in a kind of trickle.
posted by George_Spiggott at 2:06 PM on January 4, 2014


(On simultaneous posting, kinda what you said in your fourth paragraph.)
posted by George_Spiggott at 2:07 PM on January 4, 2014


"Computer, compose note of thanks to Sara C. For her insights. Website: MeFi, subsite: BLUE."

+++INITIATING COMMENT+++

30%
50%
90%

+++ERROR ERROR ERROR. DELETING INTERNET.+++

"Whoops."
posted by Artw at 2:17 PM on January 4, 2014 [2 favorites]


Message To: max@job 3:14
Message From: Job

Max,
     Goods tainted. Consider extremely 
hazardous. DO NOT USE.
Komputerizing is seeree-us bidnez.
posted by blue_beetle at 2:17 PM on January 4, 2014 [1 favorite]


FWIW the bleep-bloop and lots of screen movement as things gradually render is mostly going away at this point. I think that worked 10-15 years ago when computers in general were less powerful and people used them less in their everyday lives.

It's still fully unrealistic, of course. It's just not usually that bad or that arbitrary in terms of directors and producers having magical ideas about how computers work. Nowadays you will get a writer who does their research and writes in references to programming languages that actually exist, even if the director still wants the cursor to be 300% larger than any actual cursor has ever been.
posted by Sara C. at 2:17 PM on January 4, 2014


May have accidentally created a robot army that is coming after you. Sorry.
posted by Artw at 2:23 PM on January 4, 2014 [1 favorite]


Hmmm, I'm tempted to adopt a new code review standard for my devs: "What type of medium might this code appear in?"
  • Mainstream sci-fi: stop showing off and simplify it.
  • Niche sci-fi: rewrite it as if you'd learned to program sometime after 1973.
  • Mainstream cable TV or niche network TV: improve code comments and rethink design, separate concerns instead of calculating and displaying in the same four-line code snippet.
  • Mainstream network TV: obfuscate it a bit, we don't want people realizing how easy this shit really is.
  • YouTube tech demo: delete everything you have and stop giving people hope.
  • Any Dilbert-related franchise: you're fired. Go get a highly-paid job as a speaker, WSJ commentator, and all-around masturdebator, then give me $5 and tell me how great you are.
posted by Riki tiki at 2:27 PM on January 4, 2014 [1 favorite]


One thing I thought was funny when flipping past whatever that show was with Linda Hunt and Chris O'Donnell whose purpose was to be flipped past: nonexistent Windows laptops that were obviously Macbooks spraypainted red with a glowing minimalist Windows logo masked into the lid in place of the glowing Apple logo. Way to not fool anyone, there.
posted by George_Spiggott at 2:32 PM on January 4, 2014 [1 favorite]


Google-fu failed me, but there was an interview a while back with the guy who did all the computery stuff for The Wire... it's obvious they went a bit above and beyond what would normally be accepted to try to get it as accurate as possible (but then again, The Wire)
posted by fearfulsymmetry at 2:55 PM on January 4, 2014


> May have accidentally created a robot army that is coming after you. Sorry.

If it was a zombie nazi skeleton robot army you'd tell us, wouldn't you? Devil's in the details.
posted by jfuller at 2:56 PM on January 4, 2014 [1 favorite]


Basically all the best parts of Terminator and Terminator 2.
posted by Artw at 3:05 PM on January 4, 2014


Oh, be still my beating heart... BBC BASIC was used in Aliens! (and loads of Doctor Who and The Adventure Game). Code definitely appears on a monitor on the Davison-era TARDIS, because I remember a website that analysed it (one of those times when you readjust your nerdiness levels)
posted by fearfulsymmetry at 3:20 PM on January 4, 2014


In the pipe, 5 x 5...

Meanwhile, all the cool graphics and animations for The Hitchhiker's Guide to the Galaxy were hand drawn.
posted by Artw at 3:22 PM on January 4, 2014


Oh, I've just realized another reason to use Flash for mundane pages when HTML would be easier: web pages full of results usually render all at once, but directors seem to think it's more dramatic to have the data appear, with beepity-beepity sound effects, in a kind of trickle.

Well, you could do it with Javascript easily.
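Something along these lines, say (a toy TypeScript/DOM sketch; the #readout element and the canned result strings are invented for the example, not taken from any real show graphic):

const results = [
  "SEARCHING DATABASE...",
  "MATCH FOUND: 89.4%",
  "SUBJECT: J. DOE",
  "LAST KNOWN LOCATION: TERMINAL 4",
];

// Reveal the canned results one row at a time instead of rendering them all at once.
function trickle(lines: string[], target: HTMLElement, delayMs = 400): void {
  lines.forEach((line, i) => {
    setTimeout(() => {
      const row = document.createElement("div");
      row.textContent = line;
      target.appendChild(row);
      // a real prop version would fire the beepity-beep sound effect here too
    }, i * delayMs);
  });
}

trickle(results, document.getElementById("readout") as HTMLElement);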
posted by sonic meat machine at 5:25 PM on January 4, 2014


/idea forms for bullshit.js, a lightweight framework for movie UIs...
posted by Artw at 5:27 PM on January 4, 2014 [5 favorites]


(That sort of shit was really irritating in Homeland)

The better to fit in with all of the other sorts of shit in Homeland.
posted by HillbillyInBC at 6:04 PM on January 4, 2014 [1 favorite]




What sort of monster prints out source code in a non-monospace font? At least it's not Comic-Sans.

Those of us who use ProFontWin scoff at monospacing.
posted by Canageek at 12:44 AM on January 5, 2014


Jurassic Park was on tonight--that was a serious offender.

Though the Swedish Girl With Dragon Tattoo movies were good on this front.
posted by professor plum with a rope at 2:14 AM on January 5, 2014


It's not source code, but in Terminator 3 when Arnie reboots we see a listing that appears to have been taken from the Windows Control Panel, and which includes "QuickTime Player".
posted by alby at 7:53 AM on January 5, 2014


> in Terminator 3 when Arnie reboots

If ever there was a place for a Blue Screen of Death. They could even have made it red if they just hadta. Missed opportunity.
posted by jfuller at 12:40 PM on January 5, 2014


Actually, professor plum with a rope, Jurassic Park was in fact using a Unix file system navigator called fsn. Obscure, and no one really ever used it, but it was, in fact, a usable program that existed independent of Jurassic Park.
posted by Canageek at 9:23 PM on January 5, 2014 [1 favorite]


For suitable values of "usable"; it was an SGI tech experiment. I remember playing briefly with fsn on an Indigo back in college; it was an extraordinarily slow and clunky way of navigating a filesystem.

The other bit of real-world tech leakage in Jurassic Park (other than the looks-good-on-camera Connection Machines): in the screen showing Nedry's live camera on the dock they forgot to cover up the obvious QuickTime progress bar at the bottom.
posted by We had a deal, Kyle at 10:28 PM on January 5, 2014 [1 favorite]


I guess the issue I had was with things like the claim that he'd have to go through 2 million lines of code to figure out what Newman was up to, and some of the other dialogue surrounding the computer system.
posted by professor plum with a rope at 12:13 AM on January 6, 2014


Half of it is flowerboxed comments.
posted by Artw at 12:16 PM on January 6, 2014


WarGames hooked it up a lot like Hacker Typer.

Tron Legacy had Cillian Murphy using Emacs eshell to pipe ps into grep to find out what process to issue a kill -9 on.

You're on a sad-assed system when you don't have pkill or pgrep.
posted by Ogre Lawless at 3:36 PM on January 6, 2014


You're on a sad-assed system when you don't have pkill or pgrep.

It was the lame, boring corporate suits' network, so you'd expect it to be sad-assed.
posted by radwolf76 at 5:15 PM on January 6, 2014


I've never understood pkill: what if you mess up the search string and kill the wrong process? pgrep makes sense, but by the time it was common in Linux distributions I already had ps aux|grep xyz wired into my finger memory pretty deeply. (And yeah, eat it SysV ps options, SunOS 4 lyfe)
posted by whir at 9:59 PM on January 6, 2014 [1 favorite]


You're on a sad-assed system when you don't have pkill or pgrep.

He was on someone else's system; he probably went with what he knew would work rather than taking the time to see if pgrep and pkill were installed. Also, wasn't that on his Dad's old computer in the basement? pgrep and pkill both come from Solaris 7, which came out in 1998. His Dad went missing long before then, so it would make sense that the computer didn't have either of those.
posted by Canageek at 10:18 AM on January 7, 2014


I've been doing *nix stuff for two decades and have never heard of pgrep or pkill. They look pretty handy but not essential. Unix folks tend to be creatures of habit and use the utils and options that they're used to. Also, I've usually been in situations where I didn't know what kind of system my script would end up running on so I've tended to write to the lowest common denominator of utilities or versions of utilities so that they would work on Linux/*BSD/AIX/HPUX/Digital Unix/Solaris/Cygwin/etc.
posted by octothorpe at 11:09 AM on January 7, 2014 [2 favorites]


It's reached the Guardian...

Computer code in films: hidden meanings or irrelevant nonsense
posted by fearfulsymmetry at 9:41 AM on January 10, 2014


> You're on a sad-assed system when you don't have pkill or pgrep.

You do any kind of support, you're going to wind up having to fix something on a sad-assed system someday, and will dislocate your shoulder patting yourself on the back if you can function using only what they had in 1987. E.g. chmod followed by a string of digits instead of rwx. vi.
posted by jfuller at 7:35 AM on January 11, 2014


In fact, you'll thank the Ghods you've got vi, and not just ed.
posted by jfuller at 7:39 AM on January 11, 2014 [3 favorites]



