Broadband vs Dialup
August 23, 2004 5:08 AM   Subscribe

There are now more home internet users using broadband than dialup in the U.S. - Does this mean that web designers will continue down the same path as some programmers and create bloated code? Are the days of trying to be efficient and keeping pages less than 70k a thing of the past?
posted by tomplus2 (29 comments total)
 
When I had dialup, it was common practice for me to type in a web address and leave the computer for minutes while it loaded. (And seriously, I'm not just talking about porn here.) I think the emphasis has been on graphically heavy content for a while now.
posted by sexymofo at 5:31 AM on August 23, 2004


Are the days of trying to be efficient and keeping pages less than 70k a thing of the past?

Only if you want to flush your Google rankings down the toilet.
posted by gimonca at 5:43 AM on August 23, 2004


Are the days of trying to be efficient and keeping pages less than 70k a thing of the past?

Bandwidth costs $ for the person(s) hosting the website. Keeping your pages optimized is still important. Better to serve up 500K of optimized code / images / media, than the equivalent of 70K bloated to 500K.
posted by Stuart_R at 5:45 AM on August 23, 2004


Oh come on... I don't know how many of you guys are web developers, but those who are will tell you that keeping below any size limit is a nice goal that we all fight for, but one which managers and graphic design people keep destroying. IMHO, the goal of a 9-second download was almost never achieved on anything other than pure-text sites.
posted by twine42 at 5:51 AM on August 23, 2004


I am more concerned with inconsistencies in the "code" standard - meaning some things don't work unless you have browser Z, or vice versa.

I see the argument with size, and the places which are necessities (banks, clinics, etc.) should have a modem version and a broadband version (and a flash version?) but until someone whacks us with a user operability stick, the notion of big equating to good is stuck with us. IMHO there are too many sites which just don't work - lousy interaction. It has nothing to do with size.

The browser incompatibilities, though, are just plain stupid on the industry's part.
posted by fluffycreature at 6:06 AM on August 23, 2004


we spent several months redesigning our site, which involved shortening path names to graphics (trying to squeeze every last byte out of our code), only to be told that for accessibility purposes we had to go through the same code later and add alt tags to each image and java applet - defeating the purpose of shortening the paths.

it's hard work trying to keep a page down below a certain weight. guys from ad companies don't care; if they want bells and whistles on their banner ads there isn't much we can do unless we don't want the money.

i'd say it's an evolutionary thing, as with everything. cars... operating systems... doubt it can be prevented.
posted by monkeyJuice at 6:21 AM on August 23, 2004


The browser incompatibilities, though, are just plain stupid on the industry's part.

If that inconsistency means that you can lock users into using your browser, it is quite smart -- at least, for the producer of that browser.
posted by eriko at 6:22 AM on August 23, 2004


I don't think the file size of graphics makes much of a difference these days, even on 56k. Most modern browsers do a good job of laying out the page before loading the graphics anyway.

What does make a difference is the complexity of the HTML and flash/java. I've never seen a browser or OS that doesn't lock up for a while when it's trying to initialise a java applet.

the newly redesigned mtv.com is a bad offender. View the source of that site, jeez. Nothing wrong with 1, maybe 2 flash movies embedded in a page, but that site must have, like, 10.
posted by derbs at 7:25 AM on August 23, 2004


The surfacing of low-bandwidth devices (i.e., cellphones) would suggest a return to low-bandwidth pages.
posted by fletcher at 7:45 AM on August 23, 2004


On the other hand, if you have ALT tags for all images, people can browse your site without images on. Which is what I do over my 28.8 link at home. And blind users can browse your site with a screen reader. The internet is an awesome resource for those with limited vision.
posted by Mitheral at 7:55 AM on August 23, 2004


Speaking from a personal perspective, even though I am a user of low-bandwidth portable devices, I make no real effort to limit the size of my page. I use an HTML-to-text tool if need be.
posted by ed\26h at 8:06 AM on August 23, 2004


Converting to something like (X)HTML Strict and separating display from content (via CSS) has been a huge bandwidth saver on a lot of the large websites I design/maintain. That, along with things like mod_gzip, make a big difference in the "feel" of a site, even on broadband. Not to mention the pocketbook (bandwidth costs).
posted by afx114 at 8:08 AM on August 23, 2004
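The kind of saving afx114 describes is easy to demonstrate. Here's a rough sketch (the markup strings are invented for illustration, not taken from any real site) comparing the per-page cost of old-style presentational markup against class-based markup with the styling moved into a stylesheet:

```python
# Hypothetical before/after markup: the presentational version repeats its
# styling on every headline, the semantic version ships it once in the CSS.
presentational = (
    '<font face="Verdana" size="2" color="#333333">'
    '<b>Headline</b></font><br>'
) * 50  # styling repeated for each of 50 headlines

semantic = '<h2 class="hl">Headline</h2>' * 50
css = 'h2.hl { font: bold 13px Verdana; color: #333; }'  # sent once, then cached

print(len(presentational))       # bytes per page, old way
print(len(semantic) + len(css))  # bytes per page, new way
```

And since the stylesheet is cached after the first page view, the gap only widens across a whole site.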


monkeyJuice: Shortening repetitive paths isn't going to buy you much; gzip should be largely removing redundancy like that for you. You are gzipping content, right?

If bandwidth's an issue, you're better off worrying about things like handling If-Modified-Since/If-None-Match in your dynamic code and helping out proxies with proper Cache-Control headers.. and of course doing as much as possible in CSS.
posted by Freaky at 8:08 AM on August 23, 2004
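For anyone wondering what handling those headers looks like in practice, here's a minimal sketch in Python (the function name and the five-minute cache policy are my own invention, not anything Freaky specified): the server derives an ETag from the body, and when the client sends it back in If-None-Match, answers 304 with no body at all.

```python
import hashlib

def respond(body: bytes, request_headers: dict):
    """Sketch of conditional-GET handling: returns (status, headers, body)."""
    etag = '"%s"' % hashlib.md5(body).hexdigest()
    headers = {
        "ETag": etag,
        # Tell browsers and proxies they may reuse this copy for 5 minutes.
        "Cache-Control": "public, max-age=300",
    }
    # Client already holds this exact version? Skip the body entirely.
    if request_headers.get("If-None-Match") == etag:
        return 304, headers, b""
    return 200, headers, body

page = b"<html><body>hello</body></html>"
status, hdrs, _ = respond(page, {})                                 # first visit
status2, _, _ = respond(page, {"If-None-Match": hdrs["ETag"]})      # revalidation
print(status, status2)  # 200 304
```

The 304 response is what saves the bandwidth: the client re-uses its cached copy, and only a few header bytes cross the wire.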


derbs - i count 3 flash animations on mtv.com. (flashblock only has 3 icons to click on.) they just made them all huge, and that makes it look like more. but it is an insane amount of flash... there's big blocks of text that could have been easily displayed as, oh, text, maybe.

flash for navigation menus or page content pisses me off. so do designers who use pictures of text rather than actually using text. i understand the desire to make everything pixel-level perfect, but we're working with the web here, not with print layout. i've gone from relatively rigid tables to as elastic a design as i can with the sites i maintain, and i always make sure i can read them with an old browser before going live. if i can't access the content with netscape 1, i didn't do a good job of putting content and navigation into the page. with css-positioning i can actually put content first and nav menus second in the code, but have the order reversed in a newer browser. google pagerank jumped up quite a few notches after doing so. is that enough incentive to do better design? it was for me.

anyway, broadband, while nice, isn't an excuse for bloated pages. i'm not an expert (self-taught) but if i can see that a corporate website is horrific, why can't the (probably better informed and better funded) person who built it see that? is it really all about convincing the boss, or are some of the corporate designers just not as picky about standards as the zeldmans of the world?

(i mean, you use css and still drop in friggin' font tags? what was the point of the css then, you dope?)
posted by caution live frogs at 8:40 AM on August 23, 2004


From an interview with Bradley Grosh (Gmunk.com)...

You are just about one of the only 'design' sites that seems not to care whether they have huge 900k+ files on the site. Are you trying to predict the broadband future before it arrives:

Absolutely.. why compromise your work when you know that in about one year it will be considered small by the current bandwidth standards?.. Naw, I know I get a lot of shit for that, but I've seen some pieces that were compressed to the point that the piece was totally ruined...

'Hey, check out my new splashpage, it's a mosaic of artifacts and muddy, desaturated color...'

It's like a really bad first impression.. You know, dandruff, spinach in the teeth.. bleeding gums... you want your shit to look good when you publish or nobody will respect you, even if they have to wait an extra minute or two... but you could argue that 'you create for your medium and accept the compromises..' Naw, I disagree because our medium is forever dynamic and improving upon itself... are you going to want to repurpose all your graphics in 6 months because the connections all got faster? why not design for the future? and that's what I do....

Do you care about us poor saps who still have a 56k modem:

Sorry bro, but I can't consider the 56K posse as 'dedicated' webSurfers.. anyone who makes surfing the web a priority in their lives (sadness indeed) would be smart enough to buy a Cable modem or DSL... plus, a majority of my audience all work within the confines of a T1 network, so to them, 900K files are sissy lil' PussDroplets..


posted by bluedaniel at 9:42 AM on August 23, 2004


This assumes people who browse the web on portable devices aren't "dedicated". It seems like CSS + XHTML (+ mod_gzip if you've got it) solves most of these problems. And splash pages on a normal site can't be ruined. They already suck.
posted by yerfatma at 10:01 AM on August 23, 2004


Tossing in my vote for smaller page sizes. Though I recently joined the broadband club, I still see the value in serving optimized code. All sites I design adhere to separation of content from presentation, and the benefit of improved page rank is one of my primary selling points when I meet with prospective clients. Like CLFrogs above, non-text presentation of navigation and text content shits me to no end. I still do a sizable percentage of my browsing using Lynx--it's easier to mask my slack-time at work when I've got terminal windows chock-a-block with text all over my desktop.

Oh, and Bradley Grosh can suck it.
posted by Fezboy! at 10:06 AM on August 23, 2004


Freaky
i think it was just a case of making the code a bit tighter..

rather than having to type /core/images/transparent.gif we now just code /core/i/t.gif, though granted, it's never going to make a noticeable difference to the page download speed.

content is xml rendered against xslt so there's no need for gzipping, i think -
posted by monkeyJuice at 10:07 AM on August 23, 2004


People like Bradley Grosh are the reason we need Webmonkey Licenses issued before people can take a dump on the web.
posted by influx at 10:28 AM on August 23, 2004


The bigger change that's going to come about from near-universal broadband, I think, is that more and more sites are going to move away from a page-based metaphor for their structure.

When you think about it (or at least when I think about it), issues like "page weight" aren't the only constraints that survive from a low-bandwidth world. The whole premise of pages being tossed back and forth between the browser and a server is really the only model possible under a low-bandwidth model, but the page-based model of "request...load...post...re-load" makes for a crappy experience when you're doing anything more interactive than reading pages. Who here actually _prefers_ to deal with a DB through a browser, for the sake of the experience?

I know MS has tried to address this, badly, by cramming ActiveX objects into everything, and I know that a lot of sites are trying to approach this using Flash (and not always doing a great job of it). I'm definitely not arguing for web pages to get overloaded with crap. Experiences that are about reading should continue to be clean, efficient, and quick. On the other hand, I think that many applications and other online experiences that require more interactive complexity are going to (hopefully) move to a more responsive and fluid model, that doesn't still feel like a slo-motion tennis match with a server.
posted by LairBob at 10:39 AM on August 23, 2004


Stop worrying about things like "ALT" tags and the length of URLs and get mod_gzip installed. :-)

Using fewer graphics is *really* the way to trim down a webpage. If you're using mod_gzip, a bunch of URLs that have "/this/is/my/path/to/graphics" instead of "/graphics/" won't make a page any bigger at all (well, technically...)
posted by shepd at 10:57 AM on August 23, 2004
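shepd's point can be checked directly: compress a page full of long repeated paths and the same page with abbreviated ones, and the gzipped sizes come out nearly identical. A small sketch (the paths here are made up for illustration):

```python
import gzip

# Same page twice, differing only in the image paths: verbose vs. abbreviated.
long_paths = b'<img src="/this/is/my/path/to/graphics/pic.gif">' * 40
short_paths = b'<img src="/g/pic.gif">' * 40

raw_saving = len(long_paths) - len(short_paths)
gz_saving = len(gzip.compress(long_paths)) - len(gzip.compress(short_paths))

print(raw_saving)  # a sizeable saving before compression
print(gz_saving)   # tiny by comparison: gzip already collapsed the repeats
```

Which is why hand-shortening paths mostly buys you less readable markup, not faster pages.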


i work on a newspaper site and the RNIB is asking for commercial sites to be made more accessible - it's six of one, half a dozen of the other
posted by monkeyJuice at 3:19 PM on August 23, 2004


There's a place for high bandwidth media, but it's not in otherwise functional information pages. Generally, something that requires high bandwidth is going to require a user's full attention. Moreover it's pretty clear that high bandwidth background effects (music, animations, etc.) drive people away like the plague. So hopefully, this will enable a great deal more high bandwidth content, WITHOUT overloading existing pages. And, what everyone else basically said: Good web design practices tend to reinforce low page weight anyway.

I think a critical point was reached when clients stopped thinking of their websites with a "place" metaphor. Maybe the word "site" threw them off at first, but the idea of creating ambience is no longer in great vogue. Google helps too. Clients are beginning to recognize the need to move from Scarlett to Melanie: familiarity and ease over excitement and challenge. (And this is historically true of most media: once the initial excitement is over, they begin to move towards transparency of the apparatus)
posted by condour75 at 4:04 PM on August 23, 2004


flash for navigation menus or page content pisses me off.

Then, whatever you do, I suggest you avoid looking at corporate or shopping or portal or pretty much any Korean websites. I'm not sure if I've ever seen one that doesn't use Flash in utterly gratuitous ways, at the very least for the requisite spattering of ads, but usually for the navigation and content. It'll give you an aneurysm!

But then Korea has the highest penetration of broadband in the world (mostly because of their recent and omnipresent urge to live in vast human beehives), so, once again, this might be your future.
posted by stavrosthewonderchicken at 4:27 PM on August 23, 2004


Only people with a punishment fetish would browse porn at 56k. Oh it hurts so good...
posted by Keyser Soze at 4:46 PM on August 23, 2004


Define broadband. We live in Seattle, one of the most "wired" cities, supposedly, and our so-called DSL from Qwest is only 256 Kbps maximum, and half the time it trains down to 128 or just goes completely dead for 20 minutes at a time. Qwest claims they can't improve our service, and if it ever gets shut off they won't let us have it turned back on because they have retroactively decided we don't qualify anymore, and we "cost them too much" (!). (This after two years of great high speed service before the speed and uptime suddenly went to crap when Qwest made changes.) And we can't get a cable modem because the cable company that serves us doesn't offer a plan that fits our requirements (a business web server without a problematic bandwidth limit). If we lived across the street we'd have a different cable provider, but nooooo... competition? WTF is that? Never seen it.

So people may technically "have broadband," but that doesn't mean that efficient pages aren't still needed, even by many of the broadband subscribers.
posted by litlnemo at 5:00 PM on August 23, 2004


more and more sites are going to move away from a page-based metaphor for their structure

IMO, if it doesn't work in lynx, it ain't the web. It's just some binary hoo-hah that happens to be served on port 80.
posted by George_Spiggott at 5:41 PM on August 23, 2004


IMO, if it doesn't work in lynx, it ain't the web. It's just some binary hoo-hah that happens to be served on port 80.

Fucken Luddite.
posted by sharpener at 10:46 PM on August 23, 2004


The web is global anyway. Just because most Americans are using broadband doesn't mean the rest of the world is.

I do suspect, though, that people who do not yet have web access will get broadband first rather than dialup.
posted by etoile at 9:28 PM on August 25, 2004




This thread has been archived and is closed to new comments