Just wait until I port Fortnite onto Apple 2e
April 12, 2018 5:26 PM

This is an interesting analysis, though I guess given all of the upsides of things like process memory-space isolation I’m not so inclined to think of it as a significant loss in any sense (although like anyone who appreciates retro-computing I do lament that the rate at which software’s appetite for system resources grows seems to exceed the rate at which hardware resources improve). I also think that latency is multi-dimensional; I’d be curious to see how audio latency has fared across the decades, with the proviso that comparing results in that dimension is harder given the difference in computational burden between e.g. 8-bit wavetable synthesis on a SID chip and spectral synthesis in Iris running as a VST.
posted by invitapriore at 5:54 PM on April 12, 2018 [3 favorites]


I've long held the opinion that OS and application developers would much rather work on new, whiz-bang, resume-building features for the next release than on dull, tedious, unglamorous things like speed and robustness improvements.

That's why my 2016 HP Win7 laptop is slower (for ordinary, everyday tasks) and less reliable than my 1986 80386 MS-DOS PC was.
posted by ZenMasterThis at 6:00 PM on April 12, 2018 [3 favorites]


I don’t think it’s developers driving this entirely. Slack, for instance, is a resource-hogging Electron app probably less because of JavaScript diehards pushing that approach than because excessive resource consumption and input latency are not a significant detriment to its adoption and so aren’t prioritized as problems.
posted by invitapriore at 6:09 PM on April 12, 2018 [9 favorites]


"I've long held to the opinion that OS and application developers would much rather work on new, whiz-bang, resume-building features for the next release than work on dull, tedious, unglamourous things like speed and robustness improvements. "

OS and application developers don't build what they want to build; by and large, they build what they are told to build. If you are a developer working on an OS at Apple or Microsoft, your resume isn't going to be better or worse depending on whether you worked on portion y vs. portion z; your resume is already looking pretty good.
posted by el io at 6:11 PM on April 12, 2018 [7 favorites]


I have a minor issue with taking the latency measurement in the OS’s terminal. At least on every Mac I’ve used, it’s always seemed that Terminal was noticeably more laggy than, say, typing in TextEdit. I dunno why, but that’s been my experience. YMMV, of course.
posted by Thorzdad at 6:19 PM on April 12, 2018


I got my first gaming rig recently... and I noticed it had a port I hadn't seen in some time: a PS/2 port. I was pretty confused at first (I am not a gamer; I bought the machine for Lightroom/Photoshop). Then I had an 'ah-hah' moment, where I realized a mouse or keyboard on a PS/2 port might use hardware interrupts, wouldn't have to go through a USB driver stack, and might have significantly less latency. When I opened this article I did a control-F for 'usb' and did not find any results. So I'm taking the entire article with a grain of salt in its exploration of these issues. That being said, I hope this gets more attention, as latency is super-important (and not just for things like gaming; hell, typing a message on MetaFilter is harder with more latency).
posted by el io at 6:44 PM on April 12, 2018 [4 favorites]


PS/2 has some other advantages, like allowing n-key rollover (being able to, in principle, press every key on the keyboard at the same time and have them all register) without special drivers.
posted by Pyry at 6:47 PM on April 12, 2018 [5 favorites]


I'm not sure the author really has a coherent set of thoughts, here.

On the methodology side, it looks like a human is still pressing the keys...? Which means that 10ms resolution looks, well, bogus, since the human variability is going to totally blow it away, barring a TON of repetitions to get the distribution of one person's finger press. (For the record, I'd rather see a well-characterized pinball plunger tied to a voltage trigger signal, and a decent light sensor for the screen, all tied to a measurement clock and a data acquisition box.)
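
Back-of-the-envelope, just to show the scale of "a TON" (the 25ms human-variability figure is my assumption, not a measurement):

    /* How many keypress trials before the standard error of the mean
       drops below a target? sigma is an assumed figure, not measured. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double sigma = 25.0;    /* assumed std dev of human press timing, ms */
        double target_se = 2.5; /* std error needed to trust 10ms bins, ms */
        /* SE = sigma / sqrt(n)  =>  n = (sigma / SE)^2 */
        double n = pow(sigma / target_se, 2.0);
        printf("~%.0f repetitions per condition\n", n); /* ~100 */
        return 0;
    }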

On the "Why do we even care about this?" side, the papers the author cited talk about the users ability to distinguish between latencies only on drag and tap tasks. In fact, the second paper summarized as "it causes users to execute simple tasks less accurately" doesn't actually say that at all. Nothing seems to look at actual productivity measures associated with latency.

And on the personal side: 1. I've used teletypes productively. 2. I've written React web pages that make a mere 200ms seem gloriously fast. Sorry.
posted by cowcowgrasstree at 6:58 PM on April 12, 2018 [4 favorites]


I was sure this was a double, and it sort of is:
https://www.metafilter.com/170035/performance-claims-without-benchmarks-probably-arent-true

Mostly because I started to make the same remark I did last time.
posted by bongo_x at 7:24 PM on April 12, 2018 [3 favorites]


Yeah. Give me like, a couple of days with an 8051 (or hey, an Arduino) and I'll build you a system that shows a keystroke in single-digit microseconds. Well, plus frame latency. Of course it'll do fuck all else...
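
The skeleton is barely even a day's work; a sketch in Arduino-flavored C for an Uno (the pin choices are arbitrary, and the "display" is just an LED):

    /* Raise an output pin within a few microseconds of a key closing.
       The "display" is the LED on pin 13; pins are arbitrary choices. */
    #include <Arduino.h>

    void onKey(void) {
        PORTB |= _BV(5);  /* pin 13 is PB5 on an Uno; a direct port write
                             is faster than digitalWrite() in the ISR */
    }

    void setup(void) {
        pinMode(13, OUTPUT);
        pinMode(2, INPUT_PULLUP);  /* key switch wired from pin 2 to ground */
        attachInterrupt(digitalPinToInterrupt(2), onKey, FALLING);
    }

    void loop(void) { /* nothing: the interrupt handles the "display" */ }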
posted by kleinsteradikaleminderheit at 7:46 PM on April 12, 2018 [4 favorites]


The minimum latency for the Apple 2 is measured in microseconds, since the CRT can start displaying different data as soon as the keypress is detected, if the beam is in the right place. (And I think the keyboard scan rate is in the kHz; modern USB keyboards add tens of milliseconds of latency.)
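
The software side is about as short as it can get, too: the keyboard is two memory-mapped locations. Roughly, as an emulator would model it in C (mem[] is a stand-in for the address space):

    /* The Apple II keyboard as software sees it: $C000 holds the last
       key with bit 7 set while a press is pending; any access to $C010
       clears the strobe. mem[] is a stand-in emulator address space. */
    #include <stdint.h>
    #include <stdio.h>

    static uint8_t mem[0x10000];

    static int read_key(void) {
        uint8_t kbd = mem[0xC000];
        if (kbd & 0x80) {       /* bit 7 set: a keypress is waiting */
            mem[0xC010] = 0;    /* touch KBDSTRB to clear the strobe */
            return kbd & 0x7F;  /* 7-bit ASCII of the key */
        }
        return -1;              /* no key pending */
    }

    int main(void) {
        mem[0xC000] = 0x80 | 'A';   /* pretend the hardware latched 'A' */
        printf("%c\n", read_key()); /* prints A */
        return 0;
    }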
posted by RobotVoodooPower at 8:08 PM on April 12, 2018 [2 favorites]


> Give me like, a couple of days with an 8051 (or hey, an Arduino) and I'll build you a system that shows a key stroke in single-digit microseconds.

Now that you can choose from a variety of open-source keyboard firmware projects like TMK and QMK, you can probably do it in less than a day with less than ten dollars' worth of hardware.
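
A whole QMK keymap is just a C array, something like this (the LAYOUT() macro and matrix dimensions come from a specific board's definition in the QMK tree, so this is the shape, not a drop-in file):

    /* Minimal QMK-style keymap sketch. QMK_KEYBOARD_H, LAYOUT(), and
       the matrix dimensions are supplied per-board by the QMK build
       system; the keycodes are real, but this targets no actual board. */
    #include QMK_KEYBOARD_H

    const uint16_t PROGMEM keymaps[][MATRIX_ROWS][MATRIX_COLS] = {
        [0] = LAYOUT(
            KC_ESC,  KC_A,   KC_B,
            KC_LCTL, KC_SPC, KC_ENT
        ),
    };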
posted by ardgedee at 3:47 AM on April 13, 2018


One of the reasons Slack's poor performance hasn't been a bigger issue is that, for all of Slack's history, it has been possible to interact with it via the IRC protocol. The people who care about the memory load could open up the big app when they needed to play games with some Google Doc plugin, and deal with everything else via irssi or something else low-fat.

But this is due to change at the end of the month. Slack is ending support for IRC, which will force all of these people to spend more time in the memory-hungry version. We're starting to see pushback now that the news is out, and I think it could shift priorities for them.

I'm curious about other similar systems, though. Discord seems to be a bit lighter and focused on games and voice chat. It's better at keeping multiple sites open and tracking alerts from all of them (though the interface makes muting policy slightly harder to get right).
posted by rum-soaked space hobo at 4:12 AM on April 13, 2018


Salient: Walter Doherty wrote a watershed paper on the subject of latency for IBM back in 1982:
In addition, an average, experienced engineer working with sub-second response was as productive as an expert with slower response. A novice's performance became as good as the experienced professional and the productivity of the expert was dramatically enhanced.
What happens to my users' productivity when the app I'm maintaining for them has to read and write everything to a cloud database API with a 1-2 second latency?
posted by clawsoon at 6:12 AM on April 13, 2018 [2 favorites]


I would think modern issues of UI latency and perceived responsiveness are mostly concerned with much longer timescales.

You might be able to perceive the difference between, say, 30ms and 200ms of latency in UI response, in reaction to a mouse click or a touchscreen tap, for example. But both are likely to be perfectly acceptable. Most people aren't going to notice the difference most of the time.

UI latency becomes a problem when you click, swipe, or tap something and... nothing happens for a few seconds. You can't tell if your input was in the wrong spot, if it just didn't register, if something is stuck, or if the reaction is just slow. Your real task has been interrupted by a new, administrative task: decide whether to try again or to wait.

Those kinds of issues don't have the same root causes as the keystroke latency. There's nothing wrong with improving the keyboard scan rate, the display system latency, etc., but those improvements aren't likely to fix real, perfectly justified complaints so many of us have about our ludicrously complicated devices.
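
The remedy for that failure mode is UI-side, not input-pipeline-side: register the input visibly on the very next frame, even though the work takes seconds. A toy sketch of the idea (the printf calls stand in for a real toolkit's drawing):

    /* Acknowledge input immediately; let the slow work finish later.
       The printf calls stand in for a real toolkit's drawing. */
    #include <stdio.h>

    static int work_done = 0;

    static void on_click(void) {
        printf("frame 0: button drawn pressed\n"); /* instant feedback */
        /* the multi-second work would be kicked off a worker thread here */
    }

    static void on_frame(int frame) {
        if (work_done)
            printf("frame %d: result shown\n", frame);
        else
            printf("frame %d: spinner\n", frame);  /* still visibly alive */
    }

    int main(void) {
        on_click();
        for (int f = 1; f <= 3; f++) {
            if (f == 3) work_done = 1;  /* pretend the work finished */
            on_frame(f);
        }
        return 0;
    }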
posted by Western Infidels at 8:25 AM on April 13, 2018 [2 favorites]


Latency is a hobby topic of mine, mostly related to videogames. (Lots of links here.) A really tight game like Super Meat Boy just feels so much better than, say, a JavaScript .io game. This article is talking about responsiveness just on a local computer, and it's remarkable to me how many modern systems accept 100 or 200ms latency as if that's OK. 16ms is a magic number to remember; that's the frame time at 60Hz. But even 2ms of latency can be noticeable.

Online games have huge problems dealing with network latency and hide it in various clever ways. The difference in feel between League of Legends and Heroes of the Storm, nominally the same kind of game, is particularly telling. The simple way to think of it is that LoL is built like a fighting game (a la Street Fighter) whereas Heroes of the Storm is built like an RTS (a la Starcraft). Overwatch had significant lag early on because they used a low update rate on the server; thankfully, they changed that. Their netcode is complicated too.
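
One of the clever ways, sketched out: render remote players slightly in the past, interpolating between the last two server snapshots, so network jitter never shows up as stutter (all the numbers here are illustrative, roughly Source-engine-style defaults):

    /* Snapshot interpolation: draw remote entities ~100ms in the past,
       lerping between the two server updates that bracket the render
       time. All the figures here are illustrative. */
    #include <stdio.h>

    typedef struct { double t, x; } Snapshot;  /* time (s), position */

    static double lerp(double a, double b, double f) {
        return a + (b - a) * f;
    }

    /* Position to render at wall-clock `now`, given the latest pair. */
    static double render_pos(Snapshot prev, Snapshot next, double now) {
        double delay = 0.1;        /* interpolation buffer: 100 ms */
        double t = now - delay;    /* render slightly in the past */
        double f = (t - prev.t) / (next.t - prev.t);
        if (f < 0) f = 0;
        if (f > 1) f = 1;          /* clamp rather than extrapolate */
        return lerp(prev.x, next.x, f);
    }

    int main(void) {
        Snapshot a = {1.00, 10.0}, b = {1.05, 12.0};  /* 20Hz updates */
        printf("%.2f\n", render_pos(a, b, 1.125));    /* halfway: 11.00 */
        return 0;
    }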

Also, I wanted to shout out that this article comes from the Recurse Center, the nifty New York programmer educational retreat. A lot of interesting things come out of it.
posted by Nelson at 8:54 AM on April 13, 2018 [2 favorites]


It depends on what you consider latency. Loading latency surely has vastly improved. I never had an Apple IIe, but on my ZX Spectrum loading a game from tape took several minutes. Now I can load a game in a browser-based Spectrum emulator and it boots in 100 milliseconds or so.
A modern computer with a CRT display sounds like an interesting idea, though. Apple IIe vertical retrace takes about 25% of the frame time. So if you don't do double buffering on your modern Apple IIe emulator and draw everything during vblank instead, it's only a 75% performance hit.
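
In emulator terms, the scheme is a toy model like this (NTSC line counts; a real emulator would key off the host display's actual vblank signal):

    /* Toy model of drawing only during vertical blanking: writes to
       the framebuffer are tear-free while the beam isn't displaying. */
    #include <stdio.h>

    #define LINES_PER_FRAME 262  /* NTSC scanlines per frame */
    #define VISIBLE_LINES   192  /* Apple II display region */

    int main(void) {
        int safe = 0;
        for (int beam = 0; beam < LINES_PER_FRAME; beam++) {
            if (beam >= VISIBLE_LINES)
                safe++;  /* framebuffer writes go here, during blanking */
        }
        printf("%d of %d scanlines are blanking (~%d%% of the frame)\n",
               safe, LINES_PER_FRAME, 100 * safe / LINES_PER_FRAME);
        return 0;
    }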
posted by ikalliom at 2:41 PM on April 13, 2018 [1 favorite]


The client apps for both Slack and Discord are web browsers. Use your favorite web browser "reload" hotkey and see what happens. So it's effectively an app running in a virtual machine runtime environment in an app in your OS.

If a modern web browser makes your computer struggle, so will these chat clients. They have an immense amount of overhead. Of course, they're exploiting that overhead in various ways; if your personal perspective on Slack is that it's basically a proprietary counterpart to IRC, it's kind of reasonable to begrudge it. But if you're thinking of it more like a robust communication environment that lives in your computer or phone, all those audio, video, image, and file shares, embedded applications, and scriptable tools are arguably only deployable at their current economic scale because Slack's and Discord's dev teams only have to write for one environment and can treat any platform-specific issues as minor feature variations. In a way, it's the dream that Java once had, only this time it works.
posted by ardgedee at 3:17 PM on April 13, 2018 [1 favorite]


and treat any platform-specific issues as minor feature variations

And just ignore minor things like OS accessibility features because VCs don't care about that sort of thing.
posted by Space Coyote at 3:24 PM on April 15, 2018

