"The accuracy of the sea came at the cost of the land."
March 5, 2015 5:25 AM   Subscribe

That’s how I feel about the web these days. We have a map, but it’s not for me. So I am distanced. It feels like things are distorted. I am consistently confused. — Frank Chimero, on What Screens Want
posted by iamkimiam (31 comments total) 12 users marked this as a favorite
 
Companion post to What Screens Want: The Web's Grain.
posted by iamkimiam at 5:32 AM on March 5, 2015 [2 favorites]


I was initially put off by the formatting, but it is a pretty decent read (and scores points with me by referencing Connections). Also, this:
Increasingly, it feels like we decided to pave the wilderness, turn it into a suburb, and build a mall. And I hate this map of the web, because it only describes a fraction of what it is and what’s possible. We’ve taken an opportunity for connection and distorted it to commodify attention. That’s one of the sleaziest things you can do.
is a pretty good distillation of something that I've been feeling increasingly disconsolate about. And I get especially disconsolate about the fact that so many links on MeFi lead straight into the heart of the "sleazy" part, instead of... filtering out that stuff. Oh, well.
posted by Wolfdog at 5:33 AM on March 5, 2015 [7 favorites]


What screens want is not anything like how this sucker laid out his essay.

Petty snark?

No, if you're talking about design and your own is kack, your opinions are likely dreck too.
posted by MartinWisse at 5:39 AM on March 5, 2015 [3 favorites]


I interpreted 'screens' in the sense he uses it to be a metaphor for 'people'. And that the screens of today are the logical extension of the images of the past.
posted by iamkimiam at 5:46 AM on March 5, 2015


That was... weird, and I'm getting deja vu. It closes with the familiar "we need a more open, decentralized, distributed web!" call but the whole piece is about design. I mean, by all means, great, they're a designer, talk about what you know, but it's strange how that whole piece just avoids the subject of content altogether. I don't really understand what the author is agitating for/raging against. "We paved the wilderness"-- what, you want Geocities back?

I've been having a debate lately with some folks about IRC vs. Slack, and what it comes down to is that modern, friendly web design is just so much more inclusive than the "old" internet. If we've turned the web into a suburb, at least it's a suburb everyone can move to. There's still old, strange, rough web out there if that's your thing. But as for what's on the web: it's way wider and more inclusive than it used to be. Let's talk about that, too.
posted by phooky at 6:18 AM on March 5, 2015 [2 favorites]


I was initially put off by the formatting

Yep. This. If this guy is a designer he hides it well.
posted by doctor_negative at 6:19 AM on March 5, 2015 [1 favorite]


Looked pretty good on my smartphone, I thought. It's from 2013, too, so maybe he'd have made different design choices today. Or the credited designer would have.

I just read Joshua Ferris' story "The Breeze." It's a series of vignettes of many of the possible ways a woman's spring fling with her husband of several years can be disappointing, and not like the spring breeze that inspired her to seek it out. It's a pretty good story.

Later I learned that Ferris had composed it on a smartphone, over a couple of weeks, a process he described as frustrating and painful. That explained his choice of vignettes, I thought. His screen narrowed his attention, and enforced break times. And maybe that explained the woman's frustration and disappointment, too: Ferris' struggle with his screen.

Our tools shape our creations and our imaginations, and I think it's smart to give our tools some thought once in a while, just to be sure they're still working the way we want them to.
posted by notyou at 6:50 AM on March 5, 2015 [3 favorites]


What screens want is not anything like how this sucker laid out his essay.

It's easy to snark about designers who try to nudge the web in new directions, but Chimero knows what he is doing. I found the piece quite readable and engaging, and thought it was a good solution to translating a talk into a visual essay. Perhaps you'd prefer a slideshow, or a hokey TED video?
posted by oulipian at 6:53 AM on March 5, 2015 [5 favorites]


Petty snark?

No, if you're talking about design and your own is kack, your opinions are likely dreck too.


Switch to 'reader' view. Read it on a phone. Use your page down key instead of arrow or mousewheel. Open it with Lynx. It all still works. It all still communicates.

Go to his homepage: that's plain ole HTML, and it's actually still beautiful without images, floats, or weird 'slide' effects with neon backgrounds.

It all still works, because it's good design. Chimero has taste. Discounting the entirety of his work because you disagree with some stylistic choices on a single instance of it seems harsh.

His book, "The Shape of Design" is also excellent.
posted by DigDoug at 7:07 AM on March 5, 2015 [5 favorites]


Well, the end of every fifth line or so is missi the last coupl of characte so I'm not terribly impresse with his web desig skills!
posted by alasdair at 7:20 AM on March 5, 2015 [1 favorite]


But as for what's on the web: it's way wider and more inclusive than it used to be. Let's talk about that, too.

Meh. Today's inclusive web is as "inclusive" as a suburban HOA. I'm going to put on my cranky geek woman hat here.

Twenty years ago, I was telnetting into a Unix account at the local Freenet (anyone remember those?) and my university account, also Unix. My parents, being drafter-designers, always had the latest shiny x86 running Windows; my schools, university included, all had Apples. No matter: I could telnet from anywhere, pine for email, gopher to check library books, irc to practice French with geeks in France using my own, custom, streamlined irc Perl script to avoid the watchful bandwidth-hawk-eyes of our grad student sysadmins, finger my best friend to see if he was still at the School of Music or if he was at his dorm, write my html pages directly in my www directory, edit GIFs (once the img tag came around) on the Apples and put them in my img directory...

Last year, Gargantuan Client I was working with decided to massively roll out what is now newfangly called "virtualization", which concretely meant no longer having standalone PCs/laptops for employees, but having mobile workstations: you get a laptop with minimal storage, ThinPC OS (minimal operating system), no CD-ROM drive, that sort of thing, and you use it to log in to your own virtual session, from anywhere.

In old-school words: telnetting into an account.

With one big, BIG difference. The one that ties this story into the Internet as it was then, and how it is now. The reason it's suburbia and not wilderness. Just as with real-life suburbia and real-life wilderness, to each their own... but on the web, suburbia is less open than real-life suburbia. On the web, you are not always an employee. There is no reason for your personal usages to be constrained, other than your own knowledge and comfort levels, and obviously, legality and morality. Except... you are constrained far beyond all of that.

Sure, it's easier for people in general to use Corporate-Developed-OS than it is to learn *nix systems. And in general, that's fine. For a corporate system, it can be quite handy indeed. It's also easier to set up a Facebook account and post to that, than it is to reserve a domain name and set up a blogging system. Except that Facebook is for personal use (caveats for being aware of employers etc.).

The issue is, then, web suburbia is almost never on your own terms. Using the best-known example, though this goes for any social site, with the notable exception of MeFi, where each of us commenters "owns" our comments: You do not own your Facebook account. You can never own your "web apartment", so to speak. You do not even own any of the tools; you can only develop an extremely limited, expressly-permitted, teensy bit of them, and those can be shut down at any time. Heck, you can't even control who sees what – and that's not even getting into privacy; it straight-out means that if you post something, you have no idea which of your friends will see it. Facebook's algorithms are a mystery. Sure, it's easy to upload photos... but they're converted to low-quality JPGs automatically. Et cetera and so forth.

Everywhere you go on the web nowadays, sure, it's inclusive... on someone else's, usually corporate terms, applied to personal life.

It's an intrusive, exclusive HOA. Unless you have your own domain and run it with a careful eye to advertising influence so it doesn't go too far (much like MeFi, for instance). Even then, there are blogs written by real people where you can hardly tell whether what they write is actually that person's opinion, because of all the advertising agreements they have.

As a geek, I was fascinated by how easy HTML is to learn. Really, anyone can learn basic HTML. Pretty much anyone can buy (rent...) a domain name, write an index.html, upload it and see their creation. Links make it possible to point to other people doing the same thing. Wondrous! I just wish more people would. I wish we all knew how easy it all is; that you don't need to be a design genius or even use CSS. MeFi can be read and enjoyed as text with basic formatting. Twenty years ago, I too dreamt of a web where anyone who wanted to share, would. It's inclusive at its very root. We just need to be aware of it, and use it.
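For instance, a complete personal page really can be this small; a hypothetical index.html sketch, no CSS at all:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>My corner of the web</title>
</head>
<body>
  <h1>Hello, web</h1>
  <p>No design genius, no CSS.
     <a href="https://example.com/">A link</a> is all it takes
     to point at someone else doing the same thing.</p>
</body>
</html>
```

Upload that to any host as index.html and every browser since the nineties will render it fine.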
posted by fraula at 7:31 AM on March 5, 2015 [18 favorites]


I found the whole thing extremely readable, hella more readable than Metafilter. Very pleasant to read. Big friendly text with wide margins is important to me right now because my prescription is bad (this will get to be less of a concern in a week when my new glasses are ready). If that's bad design give me more bad design; I love bad design!

It was kind of a weird essay because it started being very much about one thing and then unexpectedly became about a few other things along the way, right up to a big change in what it was about at the very end. There wasn't time to get too deeply into the stuff that it changed to at the end, but I was OK with that too. "Things suck, they don't have to suck, suckiness is not inherent in the thing, it's an assumption in the way we think about that thing that we bring to it."
posted by edheil at 7:40 AM on March 5, 2015 [2 favorites]


Big friendly text with wide margins is important to me

man, me too. Any site that doesn't let me bump up text size gets crossed off my list. Being in my 40s is hard.
posted by DigDoug at 8:05 AM on March 5, 2015 [1 favorite]


I love my heavy metaphors!

The web windbag is a feature of virtual terrain, among several others; together they constitute a new set of "virtual archetypes." This is all just a little luxuriously lengthy. I hope his emerging plan creates the more personal, organic web he seems interested in facilitating.

I read creative work one way, that is, every letter and punctuation mark. I read "information" differently, eliminating most personal, chummy content and unnecessary background: just the facts, please. I feel my mood does not have to be toned in order to take in information, to evaluate what presents as info. Design attempts convincing communication by every visually and psychologically cloying means imaginable (the big sell).

The piece was detailed, and nicely viewable, but long. He revealed he is at sabbatical's end, hence a long stretch on screen, as he gets up and goes back to work.
posted by Oyéah at 8:34 AM on March 5, 2015 [1 favorite]


The issue is, then, web suburbia is almost never on your own terms.

I think this is where I really disagree, and where a huge assumption is being made. Web "suburbia" is a compromise, but the people participating in it have made a choice, and you can't just rob them of their agency like that. People who just want to use Facebook or Twitter aren't somehow stupid, or lazy, or unaware of the tradeoffs involved. They've just chosen to use that interface model because it's simple and serves their needs. The people who want more will find more. All the services you described still exist.

I'm from a similar internet stratum to you, and as much as I loved and miss that particular era, I'm pretty glad it's gone, because it was incredibly insular. Remember the process for creating a newsgroup? Remember the scorn that greeted the influx of AOL users? It was an unsearchable nest of shibboleths, and while I had a blast, I've learned far more about the world from the modern web than I ever did from the early internet. YMMV.
posted by phooky at 8:45 AM on March 5, 2015 [3 favorites]


Yeah, I spend too much time reading Metafilter, and I keep meaning to cut back. But I think I have such a hard time giving it up because it's such a great living remnant of the Old Web. I miss having a list of blogs and sites to check. At the time everyone knew the killer app was a site that would push data. That turned out to be Facebook. Now it seems like no one remembers how to pull data, or if there's data worth pulling.
posted by rikschell at 8:55 AM on March 5, 2015


My god people read the fucking article, enough with the stupid kneejerk design comments on every. single. god-damn. thing. written by a designer. It is completely worthless commentary. Rarrrrgh.
posted by Potomac Avenue at 9:01 AM on March 5, 2015 [1 favorite]


That was a fantastic essay from the always brilliant mind of Frank Chimero. I really loved the slow pacing and being brought around to the final point after a long journey and period of discovery. I wouldn't really have gotten why his point about borders & viewports & limits was important if he hadn't taken us all the way there along the path.

This is a great framework for how to do design in the future. People talk about designing "mobile first!" but usually that just means smaller boxes first, before sketching out bigger boxes. The point of this essay is that we should look at the content in an infinite space, then figure out what to show and where to stack it without a specific viewport limit in mind, building from collage-style mockups as appropriate.

I loved this, especially the presentation of it.
posted by mathowie at 9:16 AM on March 5, 2015 [3 favorites]


There weren't any cat pictures.
posted by Chitownfats at 10:02 AM on March 5, 2015 [1 favorite]


My view is that today’s computer world is based on techie misunderstandings of human thought and human life. And the imposition of inappropriate structures throughout the computer is the imposition of inappropriate structures on the things we want to do in the human world.

This is something I've been saying for the longest time to anyone who will listen, and I say it as someone who has been in IT professionally for over two decades, on top of a decade as a tinkering young'un before that. We are all using tools programmed by people who have a distinctive way of viewing the world. Programmers have to, because they have to translate the world into something a computer can understand. This necessarily involves what I think of as a form of brain damage: oversimplification and narrowing of scope, abstractions that don't necessarily mesh well with reality, expectations that directions will be followed along one of a limited set of paths. And so, everyone in the world is being molded by these tools, which have a sparse subset of possible expressiveness because of the nature of the tools and the people doing the creating.
posted by kokaku at 10:43 AM on March 5, 2015 [3 favorites]


"Screens are aesthetically neutral"
vs.
"Abstractions always distort and omit, because they have to. The trick is to be mindful it is happening."

cameras aren't neutral

E-ink doesn't look like an LCD, which != CRT != OLED != HDR 4K screens. There are limitations in screens that don't match human vision. There might be more differences between his horses than between one horse representation on different screens, but those small differences do matter.
posted by morganw at 11:27 AM on March 5, 2015 [2 favorites]


I'm still going through this but one thing sticks out: James Burke is still way ahead of his time.
posted by charlie don't surf at 11:28 AM on March 5, 2015


kokaku: right, at heart it's about simplifying the world into something that can be represented by a Turing machine, a tape passing a read/write head.
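A minimal sketch of that picture (toy rules, Python purely for illustration) shows how sparse the representation is: a state table, a tape of symbols, and a read/write head.

```python
# A minimal Turing machine sketch: everything the machine "knows"
# is a state table; the world is just symbols on a tape.
def run(tape, rules, state="start", head=0):
    tape = dict(enumerate(tape))          # sparse tape of symbols
    while state != "halt":
        symbol = tape.get(head, "_")      # "_" is the blank symbol
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += {"R": 1, "L": -1}[move]
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Rules for a toy machine that flips every bit, then halts at the blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("1011", flip))  # prints 0100
```

Anything the state table can't express simply doesn't exist for the machine, which is kokaku's point in miniature.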
posted by grubby at 11:30 AM on March 5, 2015


I found the aspirin analogy early in the article distracting -- leading from an observation that aspirin pills are smaller than they used to be, he claims:
It seems pharmaceutical companies have been able to make the active drug in aspirin more effective in the past few decades. The tiny aspirin pills are hardly aspirin at all, and the drug’s current version is so potent and physically minuscule that it must be padded with a filler substance to make the pill large enough to pick up and put in your mouth. Literally, you couldn’t grasp it without the padding.
That's surely not true, is it? Pills have always been mostly filler with a small quantity of the active ingredient. And I don't believe that acetylsalicylic acid has magically become more potent or that the dose has shrunk. The only thing that's changed is the size of the pill, and maybe that's more about marketing than anything else: that big pills used to be perceived as more effective.
posted by We had a deal, Kyle at 12:14 PM on March 5, 2015 [1 favorite]


I like the design, and I liked it just as much when Douglas Coupland's books started using it 10 years ago. Seeing it online made me want to go read Life After God again.

That's not really snark, it's enjoyable on the screen, and all the best ideas are the ones you steal anyway. Designers are doing interesting things with scrolling these days.
posted by emjaybee at 12:43 PM on March 5, 2015 [1 favorite]


We are all using tools programmed by people who have a distinctive way of viewing the world. Programmers have to, because they have to translate the world into something a computer can understand. This necessarily involves what I think of as a form of brain damage: oversimplification and narrowing of scope, abstractions that don't necessarily mesh well with reality, expectations that directions will be followed along one of a limited set of paths.

I usually divide computer tools into two basic paradigms, you can call it Microsoft vs. Apple if you like. The Microsoft method has always been to teach people abstractions of computer processes. The computer designers create tools to assist people to perform their tasks in the way the computer is most efficient. The Apple approach (Mac approach anyway) has always been Real World Metaphors. The computer designers create the tools to most closely emulate how users do their existing real world processes, and translate them to computer operations. For example, MacPaint had a paintbrush and a pencil. People already knew how to use paintbrushes and pencils.

It is very efficient, in terms of computer power, to work in a computer-efficient way. This was important when computers were expensive. But now computers are cheap and you can use them inefficiently, in a human way. People do things inefficiently sometimes.
posted by charlie don't surf at 1:01 PM on March 5, 2015


That's a fair point, charlie_don't_surf. I was actually thinking more of programmatic concerns like names and time. These are assumptions and rules that get baked into applications regardless of whether they are abstractions or metaphors, and they have real-world effects. For example, one can tune out (it's too complicated...I must be stupid, old, incompetent) or feel forced to adopt a different representation just to be part of a community or to be able to use a tool (e.g. social networks that require real names).
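As a toy illustration of how such an assumption gets baked in (the naive_split helper here is hypothetical, not from any real system):

```python
# A naive "name parser" bakes in the assumption that everyone has
# exactly one given name followed by one family name.
def naive_split(full_name):
    first, last = full_name.split(" ")   # ValueError for many real names
    return first, last

print(naive_split("Ada Lovelace"))       # works: ('Ada', 'Lovelace')

# Fails for mononyms, multi-part names, other naming conventions:
for name in ["Björk", "Gabriel García Márquez"]:
    try:
        naive_split(name)
    except ValueError:
        print(f"cannot represent: {name}")
```

The people the schema can't represent are the ones forced to "adopt a different representation just to be part of a community."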
posted by kokaku at 2:12 PM on March 5, 2015


I was actually thinking more of programmatic concerns like names and time. These are assumptions and rules that get baked into applications regardless of whether they are abstractions or metaphors, and they have real-world effects.

Until I read your links, I was wondering if you were referring to variable names. I have seen minor wars about variables, like whether you should use InterCaps for variables like CounterIncrement or other formats like Counter_Increment or even counter_increment. This probably has way more effect than intended. I'll tell you a stupid little story.

Way back in the olden days, I took a FORTRAN class. I was a wisenheimer high school sophomore taking a CompSci entry-level class at the local university, so I took things way less seriously than the teacher liked. One of the FORTRAN standards was that integer variables started with I, J, K, L, M, or N to distinguish them from floating point variables, which could start with any other letter. The professor insisted we use descriptive variable names, even though there was a long tradition of using simple variables like an increment named "I." I liked using single letters because I hated keypunching; long names meant more chances to make a typo. And our class assignments were usually short, so the code was obvious.

At one point, the instructor told me specifically to use full variable names, redo this assignment and resubmit it with full names. So I rewrote the code so it all looked something like this:

IF (JANE .LT. KATE) THEN
  ALICE = ALICE + 1
ENDIF

Nope, the teacher did not like that at all. Now of course this is a completely frivolous use of the FORTRAN standards, but that was my point. You can name variables after people, if it helps you remember what the variable does. You can override the standards so I is a floating point number. It's just an arbitrary choice. But it persists to this day: many programmers still use I, J, or K as increment variables in a loop. It has no meaning today since nobody really uses FORTRAN anymore, but these meaningless anachronisms still persist. How many of these anachronisms are structuring the internal design of apps today?

I was really interested by a link from John Gruber yesterday. Someone noticed an oddity of the iOS interface: it appears that in the original iOS on the oldest iPhones, you could give touch input while the screen was still being drawn, but in iOS 8, you can't do anything until the screen finishes drawing. There's a video of this interaction being demonstrated, and there's some dispute about what is actually going on here. But my point is, there are small coding decisions inside any OS that have widespread and unintended consequences. And I wonder how many of these assumptions are from the olden days when programmers verified their application results with their slide rules, where you added logarithms from the C scale and the D scale: C + D = x * y.
posted by charlie don't surf at 4:00 PM on March 5, 2015


many programmers still use I, J, or K as increment variables in a loop. It has no meaning today since nobody really uses FORTRAN anymore

They're still used similarly in sigma notation in math texts, aren't they? That probably contributes to the usage longevity.
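The parallel is direct; a sketch (in Python, purely for illustration) of how sigma notation's index becomes the loop counter:

```python
# The i of sigma notation is the i of the loop: sum of i for i = 1..10.
# FORTRAN's implicit typing (names I through N are integers by default)
# just codified a convention mathematicians already used for indices.
n = 10
total = 0
for i in range(1, n + 1):
    total += i

print(total)  # prints 55
```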
posted by Chitownfats at 5:33 PM on March 5, 2015


Hmm.. I never thought of that. Obviously the conventions of sigma notation came way before FORTRAN. That language was always for math geeks, in my day all programming classes were taught by the Math Department.
posted by charlie don't surf at 6:01 PM on March 5, 2015


I am just catching some of the earlier parts of this thread..

I just read Joshua Ferris' story "The Breeze." It's a series of vignettes of many of the possible ways a woman's spring fling with her husband of several years can be disappointing, and not like the spring breeze that inspired her to seek it out. It's a pretty good story.

Later I learned that Ferris had composed it on a smartphone, over a couple of weeks, a process he described as frustrating and painful.


Holy crap, Ferris and I are classmates; we graduated the same year, Class of '96.

By the time The Breeze came out, this format was already an established literary form. It originated in Japan, where it was called a keitai shousetsu, a "cell phone novel." I remember attending a reading by a famous Japanese writer who came to the Workshop for a year; he was known for his keitai novel. I remember asking him some literary questions after the reading; he totally blew me off and ran out to the bars with some cosplay-ish groupies. I probably would have too, if I were him. I have totally forgotten his name, and probably so has everyone else.

Anyway, I'll return to the FPP by giving my initial reaction to the question "What do screens want?" I immediately thought of an old quote by Duchamp, "the audience can never please the painter, it can only please the painting." I remember hearing an hour-long lecture from some idiot postmodernist art history professor on this topic, which I totally disagreed with, but I came out with a firm interpretation of that quote: artists and audiences never meet. When designers create a graphical user interface, they think they are interacting with users. They are wrong. They are interacting with a computer. People look at websites like the FPP and think they are interfacing with the writer. They are wrong. They are interacting with a computer.

This is the ultimate fallacy of artists and designers. They think they are creating an experience for people. No. The people create their own experience of whatever junk you made, and then they toss it aside.
posted by charlie don't surf at 7:03 PM on March 5, 2015 [1 favorite]




This thread has been archived and is closed to new comments