

Collaborative Insanity
November 20, 2010 2:55 AM   Subscribe

A provocative short essay on design education by Andy Rutledge: If you emerge from university today with a web design degree, chances are rather slim that you’re employable as a user experience (UX) or web designer. Maybe you learned a lot of stuff; it’s just probably the wrong stuff. Congratulations, you’ve been defrauded. Hope it didn’t cost you or your parents too much.
posted by parmanparman (57 comments total) 24 users marked this as a favorite

 
That's an interesting way to open an opinion piece. It's an opinion I tend to agree with, but I'd have liked to see him back it up with facts and figures, or name names: both the institutions he believes are ripping people off, and the ones he thinks are doing things the right way.

My workplace likes to hire people straight out of study, give them some on-the-job training and valuable experience, then give them a good reference as they move on to more interesting jobs. We know these students won't know much about real-world web design, we know that we don't have the most exciting work to offer, but we try to be a decent first employer for them.

But yeah, most of these people are only trained in how to use Dreamweaver, and have been told not to use more than 2 fonts on a page or too many bright colours. They've got no idea how to use the cascading part of cascading stylesheets, they don't understand why inline scripting ain't a great idea, and they're a bit fuzzy on the idea of fluid or elastic layouts. Some of them aren't even sure when it's better to save a graphic as a GIF rather than a JPG.

There's definitely a lack of decent education in the basics of making a website.
posted by harriet vane at 3:31 AM on November 20, 2010 [4 favorites]


And I didn't mention any user-experience stuff because given that they barely know how to construct a site, UX is just beyond them at that stage. I'd love to get some training in UX but have been reluctant to part with my hard-earned cash given that some of the courses haven't been updated in a decade or more.
posted by harriet vane at 3:35 AM on November 20, 2010


This is more of a rant than an essay. His topic is interesting, but he doesn't really back up his allegations, just asserts them, and even though I was sympathetic to the idea that academia moves too slowly to properly educate someone in this field, I walk away from reading this unconvinced.
posted by Diablevert at 3:45 AM on November 20, 2010 [3 favorites]


Similarly, when I used to hire people to work in a TV production company (as researchers, editors, producers etc) the 'media studies' candidates were a nightmare. They thought they knew how to make TV shows, but they'd been taught by people who couldn't maintain a career in the industry and hadn't done so for decades. In the end it was much easier and more cost effective to hire people with a different skill (history, art, science) which we didn't have, and teach them to make TV shows, which we knew how to do.
posted by unSane at 4:12 AM on November 20, 2010 [8 favorites]


To some degree isn't this true for most professions?
posted by R. Mutt at 4:56 AM on November 20, 2010 [5 favorites]


I think he's right on a lot of counts, but wrong on one. There are several interrelated problems.

The faculty/administrative overhead involved in curriculum redesign in many colleges means that curricula do not get redesigned very often. So yes, students are still learning Dreamweaver.

Also, many courses do not require students to think like designers within a user-centered design or user experience design framework. What they do teach is the mechanics of how to get whatever is in their heads onto the screen (in Dreamweaver, etc.), even if whatever is in their heads is crap from a UCD/UX perspective.

Teaching students to think like designers requires more than one course, and many colleges are unwilling to offer suites of integrated courses that do this. Another option would be industry internships where students get real-world design experience.

So it's a problem. However I think the real problem is not that students don't have the latest technical skills, but that they have not acquired higher order design skills, as a result of practicing design, during their time in college. But this goes for a lot of other professions as well.
posted by carter at 5:02 AM on November 20, 2010 [2 favorites]


He's talking about the problem of teaching the tools rather than the foundational arts of communication. Cascading style sheets and inline scripting are also tools, and in a few years are likely to be replaced by something else. What needs to be taught are subjects like art, the aesthetics of design, communication theory, psychology and literature.

And the problem is hardly limited to web design, nor is it limited to 4-year colleges - the whole trend toward "vocational education" in high schools and "job-ready skills" in any field in university and community colleges is full of this flawed, short-term thinking.

All that old-school "well-rounded liberal education" that's been rejected in recent decades as too slow and irrelevant really does have value after all. The corporations and business interests pushing for our educational system to become their training department may find they are being short sighted.
posted by tommyD at 5:05 AM on November 20, 2010 [23 favorites]


To paraphrase carter's observation and add my own two bits' worth: students need to be taught how to think through the problem and solution in the context of a user-centered approach. In addition, and particularly for web design, it becomes even more critical that they be taught a holistic approach to solution finding: not simply the visual aesthetics alone, but also the branding, communication, marketing and corporate-strategy elements as drivers which influence the final "design".
posted by The Lady is a designer at 5:05 AM on November 20, 2010


What is the purpose of an academic course? Is it education - teaching students about the principles and history of a particular academic discipline? Or is it training - teaching students how to do a job so their future employers don't have to?
It seems that the author of the article thinks the second, wanting so-called academic institutions to churn out workforce-ready UX designers.
On preview, I have deleted my own opinions because tommyD said the same thing better. Educate people in universities, then train them in the workplace.
posted by nowonmai at 5:14 AM on November 20, 2010 [1 favorite]


I'm launching a web project on Monday that my organization has been working on intensively for the last 4 months, and I realised the other day that this project has been worth more to me than any course I could have taken in web development. It has required daily research on my part on a whole range of subjects. University and college courses wouldn't teach me any of that because they aren't real-world situations.

Courses might have been able to teach me how to learn those things and what to keep in mind. Of course, what do I know? I'm a Web Producer with a BA in Linguistics.
posted by heatherann at 6:22 AM on November 20, 2010 [9 favorites]


To some degree isn't this true for most professions?

Bingo. I learned a lot in school, but there are always going to be things you have to learn through experience. No degree program can ever be perfect.

For instance, we keep hiring people with CS degrees to write C code, even though most CS programs these days seem to be all about Java. That's a shame, because unlike Java, there is no automatic memory management in C. I find that programmers raised on Java haven't gotten into the habit of thinking about cleaning up after doing something. Garbage collection dramatically changes the way you write code: you don't have to keep track of all your data all the time, because when you no longer need a datum, it will just evaporate on its own. So that doesn't enter into your thinking. These graduates visualize programming and the design process a lot differently from those of us who majored in engineering and think in C. They produce a lot of memory leaks, and have a hard time learning not to, because it requires a whole new skill set.
posted by Xezlec at 6:25 AM on November 20, 2010


This:
To be fair, a primary reason university and college programs cannot change to remain relevant is because the technologies, standards, and practices one must understand in order to remain employably-relevant are changing on an annual or even monthly basis.
But then I think in a lot of cases the point is to get a degree in something that sounds relevant. Whether it's actually relevant or not is pretty much beside the point; you can figure out how to do stuff once you've actually got the job. Having a degree is about getting a job, not doing a job (at least, from what I've seen of the modern workplace).
posted by memebake at 6:29 AM on November 20, 2010


Having been involved and around universities for the past 15 years I totally agree with the rant -- curriculum changes at a snail's pace while the industry changes drastically month to month.

I'm also reminded of a friend who finished a MFA in art and one of the last courses he took was "web design" and they learned how to export flash files and FTP them to a server. I felt this was a vital subject in this day and age but he got 1/100th of the info necessary to understand web publishing. This was two years ago.
posted by mathowie at 6:52 AM on November 20, 2010 [1 favorite]


Xezlec nails it. I get the point of the OP but he's only half right; yes it's a problem that you hire a carpenter who knows how to use a hammer and a saw but not how to start with raw materials and build a chair that won't fall down. Diagonal bracing? Do they still teach that in shop class?

But you can't get to the part where you build good chairs until you know how to use the hammer and saw. And if you apprenticed with someone who was fond of nail guns, you will be helpless when confronted with a regular hammer even if you have the strong chair thing down pat. It's all important.

And this is a particular problem with anything related to computers, because we have adopted the habit of thinking of computerized tools as black box appliances like a washing machine that just do what we say reliably. But computers are actually vastly complicated and full of more traps than a Saw movie, and so we get "programmers" who can't do memory management and don't know the limitations of floating point math. The computer manages to do it automatically just often enough that a lot of people think the details have become an irrelevant distraction. But that's not so; if you are designing websites you really are programming a computer, and if you don't know how the computer works to convert the directions you write into real world activity, you will eventually screw that up.

If I were hiring someone to do my job (and my company has flirted with the idea of getting me some help at times) I'd take out an ad in MAKE: magazine and look for someone who does hobby robotics. They will be cheaper than a college graduate and have more useful skills on the day we hire them.
posted by localroger at 7:02 AM on November 20, 2010 [6 favorites]


Localroger: You articulate it well. I'd say that over the years the abstraction and increasing "user friendliness" of the front end has meant that we now have people at the machine who have no grounding in the basics of what makes the machines work. I don't mean technologically but logically - GIGO comes to mind as the perfect example of encapsulating what you're saying in your second paragraph.
posted by The Lady is a designer at 7:08 AM on November 20, 2010


Having been involved and around universities for the past 15 years I totally agree with the rant -- curriculum changes at a snail's pace while the industry changes drastically month to month.

There's truth to this. But I think the forces acting on the curriculum are more complex.

When I was in graduate school, there was pretty much an uprising by the masters students. The faculty wanted to move the curriculum further towards first principles, the theoretical underpinnings of the field -- towards (if it had been a design department) what tommyD described above as:

He's talking about the problem of teaching the tools rather than the foundational arts of communication. Cascading style sheets and inline scripting are also tools, and in a few years are likely to be replaced by something else. What needs to be taught are subjects like art, the aesthetics of design, communication theory, psychology and literature.

The masters students revolted. They knew that what they needed was practical tools for the job. Period. They wanted the laundry list of acronyms, software packages, and key skills. They didn't want first principles, at all. And -- given that they were paying the tuition -- they won. The curriculum was adapted to their demands, and they got those tools. Whether it's the right choice or not, I'm not sure. On the one hand, I doubt it, because I think that having the foundational arts carries you a lot further than the tools. On the other hand, having the tools can get you that first job, which having the foundational arts may not be much help with.

So the point here is that odd curricular choices can be as much student-driven as they are faculty- or inertia-driven; there are very heavy market pressures at play that structure a lot of the pre-professional programs.
posted by Forktine at 8:06 AM on November 20, 2010 [1 favorite]


Not a 'degree' (which I still need to fix), but I did one of those 'every day, all day, for months' Chubb courses on web development about a decade ago. While I pretty much never used any of it (it was already outdated when I graduated), having it, plus my MOS from US Army Signal School, on my very thin and otherwise no-tech-history resume was probably the only reason I was able to get my first industry job at the time. Which led to another, which led to another, and, while I'm still early-career, it kickstarted me up the line a bit. Even though I've never written a line of Java since, I consider it to have been a good thing for me.

Or, to put it another way:

I think in a lot of cases the point is to get a degree in something that sounds relevant. Whether it's actually relevant or not is pretty much beside the point; you can figure out how to do stuff once you've actually got the job.
posted by John Kenneth Fisher at 8:08 AM on November 20, 2010 [2 favorites]


I've talked to at least a dozen people who have come out of various HCI Master's programs and who find their previous excitement for UX work all but quashed when they realize that they can't get a job, despite the market clamoring for UX people. And they can't get a job because these programs don't teach them anything useful: they've been taught to do wireframes in PowerPoint, not Visio or OmniGraffle or Axure; they've been assigned group projects but not taught how to work with project managers or account people or designers or developers; they don't know how to estimate their work products; they don't know how to present to a client or an internal team; in short, they can--at best--be hired as interns until an agency or company can train them.

These programs might--and that's a big 'might'--be useful for someone with at least a year of background in web design or development. But too many people enter these programs as career changers and they are, every single time, totally screwed.

With more HCI/IxD/UX programs being carted out with each passing term, Andy's essay is both timely and a badly needed "Enter At Your Own Risk" sign for anyone contemplating a move into User Experience via these antiquated, money-gouging programs.
posted by gsh at 8:17 AM on November 20, 2010 [2 favorites]


Heh. My degree mostly taught us the skills required to make interactive CD-ROMs. That said, the "soft" UI and psych stuff I studied has, over the years, given me a remarkably solid foundation for my work.
posted by Artw at 8:52 AM on November 20, 2010


On the other side is the unfortunate stereotype that someone who has "just" a degree necessarily doesn't know anything. It's the same as any other stereotype and totally maddening. It cuts you off at the knees.
posted by amethysts at 9:09 AM on November 20, 2010


My point is that people should be judged on their own merits and what they actually know, and not by what someone decided in their own head that they were or weren't taught.
posted by amethysts at 9:13 AM on November 20, 2010


I teach web design and development while freelancing, and, like others here, I broadly agree with Andy's points. Curriculum development is achingly slow: at my institution there is an official two year turnaround period for any major change to course content. I'm currently writing "fast track" material that won't be taught for at least another year, portions of which will be outdated by the time they arrive in the classroom.

My solution to this has been to ignore the rules. Officially, teaching staff are meant to use a WebCT CMS (currently transitioning to a Desire2Learn platform) to present online course content. I took a quick look at both platforms, scoped the limitations of each, and said "screw it" - I write all my course material (with the exception of actual assignments) on my blog, under Creative Commons, outside the institution, with entries constantly modified and updated. Officially, we're meant to re-apply for course development if more than 20% of an outline changes, which is, as I said, a two-year process. I ignore that - next year I'm teaching 2nd year students HTML5, which appears nowhere on the current curriculum. Finally, I have a fairly high failure rate: around 10% of the class body falls by the wayside each year, a rate that is considerably higher than my colleagues'. If the students can't do the work, or consistently fail to demonstrate fundamental skills, they get cut.

I would also bring in the point that the toolmakers of the industry - Adobe in particular - have created the consumer expectation that web design and development should be easy: just the clicking of a few buttons. This comes through loud and clear from students, who wander into a DreamWeaver class (yes, I do teach it) with the belief that the tool is somehow full of magic beans that can read their minds and make a website for them. I've insisted that XHTML and CSS classes (taught by me or one of my graduate students, using a basic text editor) become a prerequisite to taking any class that uses a "WYSIWYG" tool.

Finally, on the subject of teaching design fundamentals: I do have a great deal of sympathy for that point. If I had more time, I would make the semester completely free of computers, and have the students draw, build and critique their work by hand. While extremely valuable, that "wax on, wax off" approach takes time that I do not have. Instead, I try to slip in fundamental lessons while teaching, and attempt to break the students out of "dialoging with the computer": having them "build" valid documents by wearing T-shirts with XHTML tags printed on them while playing dodgeball outside; touring the campus and critiquing UI and UX design choices in the architecture and design; discussing how objects in the real world could be made better, and translating that to their experiences online.
posted by Bora Horza Gobuchul at 9:37 AM on November 20, 2010 [7 favorites]


Definitely something about the tone of this article that makes me want to argue with the author even though I agree largely with many of his points.

I have a unique perspective on this in that in 2005 I was given the opportunity to design the curriculum for, and subsequently become program director of, a 9-month interactive design program. The organization I worked for was associated with a university but was an independent school, which meant I had free rein in designing the curriculum and didn't have to worry about too much meddling.

Designing the technical side was in some ways easy (especially since I had been in the design/education industry for a while). I emphasized hand-coding and critical thinking over reliance on programs like Dreamweaver, and I built the program based on the skills needed in the industry first. The school I worked with had a framework that relied on using teachers who were currently working in the industry, so it became more of an apprentice model. Finding those instructors, however, was difficult: I interviewed many prospective teachers who brought what I considered an academic model with them, which often meant a weak or flawed model of what interactive design was.

Folding in "real world" lessons is actually very hard to do. There is a tension, in particular, between teaching the aesthetics of design, technical skills and working with feedback. It's frustrating for students to not be able to execute their vision because they haven't learned Photoshop yet. At the same time, motivated students are looking for honest critique of their work. This collision can result in bruised egos, bitter students, etc. Managing this requires paying close attention to the "mood" of a class. This management takes a lot of energy, believe me, and I don't really know if "traditional" academic settings allow for this.

I had students who successfully transitioned from the program into design jobs, many of my graduates started small design firms and are doing very well. While these success cases made me proud of my work, there were some students who were just not cut out for it.

I do wonder if 4 year undergraduate programs in design are simply a mistake. My program wasn't a graduate program but the students were adult learners and most came into the program highly motivated to succeed. So while I agree with the author that self-direction can work, it cannot hold a candle to motivation + a focused program.
posted by jeremias at 9:49 AM on November 20, 2010


Instead, I try to slip in fundamental lessons while teaching, and attempt to break the students out of "dialoging with the computer": having them "build" valid documents by wearing T-shirts with XHTML tags printed on them while playing dodgeball outside;

Bora, your approach sounds dead on. If I ever get a couple of million bucks to go make a new interactive design school you should join me!
posted by jeremias at 9:53 AM on November 20, 2010


Honestly, I've seen this same rant in a lot of industries. There's a second half to it, sometimes, where one grumbles about investing in teaching people how to Do Things Right, only to have them get wooed away by other companies for higher salaries than you can pay.
posted by egypturnash at 9:58 AM on November 20, 2010


TBH If I'm working with someone who is a "UX Designer" (the technical term is "Photoshop monkey") the less they think they know about programming the better.
posted by Artw at 9:58 AM on November 20, 2010


Bora Horza Gobuchul: This comes through loud and clear from students, who wander into a DreamWeaver class (yes, I do teach it) with the belief that the tool is somehow full of magic beans that can read their minds and make a website for them.

This isn't just students, either. I can't tell you how many work environments I've been in where the manager/co-worker/whoever seems to believe that using Dreamweaver is the only way to build a webpage or a website, or that doing something without Dreamweaver is tantamount to driving on the wrong side of the road.

jeremias: It's frustrating for students to not be able to execute their vision because they haven't learned Photoshop yet. At the same time, motivated students are looking for honest critique of their work. This collision can result in bruised egos, bitter students, etc. Managing this requires paying close attention to the "mood" of a class. This management takes a lot of energy, believe me, and I don't really know if "traditional" academic settings allow for this.

Yep. If you aren't ready for a bruised ego, you shouldn't pursue web design. My ego is bruised almost every time I wrestle with a new project. I've learned to accept how little it is I really know, and to be grateful when I do actually learn a new skill, but I also have learned to accept that there's a lot that I will try that will just be wrong.

I never took a web design course. There are many things I probably missed by not doing so. But I think the skills I obtained by project-managing and working on websites in the real world are skills that I never could have gotten in any course, no matter how well-constructed.

I'm no expert by any means, and I know how much there is that I still have to learn. That's one of the reasons I like mefi -- there are a lot of dumbfoundingly brilliant CS and web people here, and I know that if I lurk, I have much to learn from them in the right context. (And the same is true of other fields as well -- lots of hellishly smart people here, period.)
posted by blucevalo at 10:00 AM on November 20, 2010


The more problematic issue has to do with deliberate institutional recalcitrance at the idea of modifying or evolving degree program curricula, even in the face of ongoing expert, professional advice.
posted by The Lady is a designer at 10:08 AM on November 20, 2010


Should have added how much it costs the students to try and find a job afterwards, especially when these days a private university education costs almost as much as a small mortgage. Design is one of the few fields that is fundamentally based on practice, and good practice at that.
posted by The Lady is a designer at 10:10 AM on November 20, 2010


The more problematic issue has to do with deliberate institutional recalcitrance at the idea of modifying or evolving degree program curricula, even in the face of ongoing expert, professional advice.

The most problematic issue is that once upon a time businesses expected to train (via apprenticeship or whatever) their staff; now they expect to warp the whole education system around the objective of shitting out little cogs for whatever their machine du jour happens to be, and throw squealing hissy fits if the cog in question hasn't learned exactly the tool or methodology they want.
posted by rodgerd at 10:42 AM on November 20, 2010 [7 favorites]


Yeah, I'm really rather of the opinion that a degree course should not be a certification in Dreamweaver CS 5 or what the fuck ever, and the use of such tools should only be taught in as much as it helps the learning of deeper principles.
posted by Artw at 11:01 AM on November 20, 2010 [3 favorites]


There is at least one academic discipline that handles this whole problem properly; it identifies talented students at an early age-- late grade school, typically-- and at every level gives them the skills they need to succeed at the next level, yet does not close the door against talented newcomers at any stage; it has no trouble finding the resources and skilled teachers needed to let students make the most of their capacities, and when students graduate they are motivated and have all the skills their industry requires.

Yes, I'm talking about football.

It's no accident Gates and Allen came out of a tiny computer club at an elite private school.
posted by jamjam at 11:24 AM on November 20, 2010 [4 favorites]


if you are designing websites you really are programming a computer...

Seems like faulty logic to me. No one smugly says "In my PDFs, I hand-code all my bezier curves directly in PostScript, not like all those Illustrator monkeys who don't know what goes on behind the scenes!" You're saying that a computer's surface appearance doesn't really matter; what matters is what goes on behind the scenes; the surface is an illusion that conceals the truth. This suspicion of surface illusions is exactly the same kind of thinking that concludes that design itself is superficial, since it deals only with meaningless outer layers that are disconnected from reductionist "reality", and why programmers so often create unusable software. Users of the software are also expected to ignore the surface illusions and comprehend the vast, complex internals that really count in order to use the software.

In other words, the moment you say "Yes, but you really need to understand how things really work!" you're on the way to dismissing the entire field of UX as unnecessary. This insistence on a single, neutral, unambiguous, objective reality that really matters, which elite programmers understand best, is a modern idea. UX represents a postmodern perspective that says that surfaces matter and truth is multiple: people perceive reality differently depending on where they're coming from, and they aren't wrong for thinking that way. When a geologist, a farmer and a poet stand in a meadow, they perceive things differently. In the programmer mindset, the farmer and the poet are simply wrong because their perception isn't of how things really are behind the appearances.
posted by AlsoMike at 11:39 AM on November 20, 2010 [5 favorites]


When I need web designers, I try to find recent graduates in Architecture, preferably from Columbia or Princeton. They know how to set up multimedia presentations better than design-school students, because they have 3D geometry down, and they know that every little choice has to be made for a reason.

They are not thrown by projects that change course once underway, because they are clear that they are not the client and never will be.

Their work ethic is gargantuan: Architecture schools cannot close their studios pretty much ever, or they will lose accreditation.
posted by StickyCarpet at 11:54 AM on November 20, 2010 [3 favorites]


This suspicion of surface illusions is exactly the same kind of thinking that concludes that design itself is superficial, since it deals only with meaningless outer layers that are disconnected from reductionist "reality", and why programmers so often create unusable software. Users of the software are also expected to ignore the surface illusions and comprehend the vast, complex internals that really count in order to use the software.

In other words, the moment you say "Yes, but you really need to understand how things really work!" you're on the way to dismissing the entire field of UX as unnecessary.


While the basic contention in this comment is reasonable and makes sense, the articulation itself is in an absolutist tone. Just sayin...
posted by The Lady is a designer at 12:00 PM on November 20, 2010


Isn't that what internships are for?
posted by Brocktoon at 12:25 PM on November 20, 2010 [1 favorite]



what most 'parents' don't seem to have gotten thus far is that no matter what creative-field higher education you get you're never done learning. the scenery constantly changes and you need to adapt. that is true for web design as much as almost any other creative field. I was never a fan of web-specific programs, by which I mean programs that focus nearly exclusively on the web. I am a believer in the idea of giving a designer a holistic design background before enabling them to specialize later. this is about teaching basics and patterns of thought, not programs. there will always be enough wrists that can move a mouse in a program but those who can imagine will always lead them.

my beef with higher education is a different one: the earnings potential is vastly out of balance with the tuition charged. this is especially problematic since colleges make it so ridiculously easy for students to sign up for bankruptcy-surviving student loan debt.

allow me to present you with an example: my alma mater, art center college of design in pasadena, currently charges undergrads $16,296 tuition, per term. they run a trimester system, so that's $48,888 per year. the average student stays there for eight trimesters plus one (cheaper) academic term, so it's reasonable to expect $130,368 plus perhaps $5,000 for the academic term and various fees. this is a not-for-profit school, mind you! now students are getting a really good education here, I have very few qualms about that, but let's say you graduated $150k in debt after having a few living expenses chalked up on your accounts, too. I know of no illustrator or fine artist who did not have massive problems making that back. the photographers are often in extremely tight spots since not only do they have to cover additional costs for cameras and other production costs but are also most often starting their own businesses after graduation. your average graphic designer is going to start out making perhaps $40k the first year out.

I graduated from the advertising program there and I am roughly six years out. I would have no qualms recommending art center to students in order to learn the skills they need to get into this industry but the costs keep me from following through on that feeling. a junior art director graduating and getting a job can probably expect to make $40k in their first year. fast-forward two years or so and they'll make $50-55k. an acd after five years of college should make $120k but not everyone even gets there. the ones who make it to cd at a decently big shop (read: madison avenue) don't have to worry about their student loans anymore but not everyone makes it that far. I graduated with perhaps twenty kids in my class. about the same number dropped out during those terms. perhaps five are in positions where I would suspect they really made it. let's be charitable and say it's ten and I'm just not looking enough. that's still 50% of people who are not pulling in enough money to avoid deferring those expensive loans they took out with sallie mae.

in my time there I observed a frighteningly casual attitude among administrators to raising tuition well beyond the rate of inflation. this was usually followed by them congratulating themselves publicly on their restraint. I've seen kids who were vastly more talented than me drop out because they couldn't wing it financially anymore. the most amazing graphic designer you never knew dropped out in my fourth term when he didn't get a scholarship, a system one can describe as a joke at best, and went back into construction. he said he was just going to take the summer off to make some money so he could continue. he never did.

I agree with andy in that there is a real problem with higher education, especially when it comes to fine arts. I am close to agreeing that it's a ripoff. the college in my example probably doesn't intend to harm their students but an exceptional college education enabling you to be a leader in your chosen field in the united states has become something that is prohibitively expensive and very close to being not worth the cost.

if I were in any position to suggest legislation I'd propose the following: continue allowing colleges to charge whatever they want but tie loan repayments to actual earnings. after ten years, all your earnings listed in your tax returns are added up. the maximum permissible loan repayment is 15% of that sum. anything more the college charged is to be forgiven and/or refunded. so if you made $50k for ten years, your total earnings would be $500k and 15% of that would be $75k. that's not a low number and colleges should be able to make do as long as their programs provide actual value. I can instantly think of a few areas where this would not work but it's a decent start for a debate. it also takes care of outfits like the university of phoenix or kaplan and similar, (imho) sketchy companies.
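to make the arithmetic concrete, here's a tiny python sketch of the cap I'm describing. the function name and numbers are just my own illustration of the idea, not any actual policy:

```python
def capped_repayment(annual_earnings, loan_charged, cap_rate=0.15):
    """Cap total loan repayment at cap_rate of ten-year earnings.

    Anything the college charged beyond the cap is treated as
    forgiven and/or refunded.
    """
    total_earnings = sum(annual_earnings)   # sum of ten years of tax returns
    cap = cap_rate * total_earnings         # maximum permissible repayment
    owed = min(loan_charged, cap)
    forgiven = max(0.0, loan_charged - cap)
    return owed, forgiven

# ten years at $50k: total earnings $500k, cap is 15% of that = $75k.
# on a $150k loan, $75k is repaid and $75k is forgiven.
owed, forgiven = capped_repayment([50_000] * 10, loan_charged=150_000)
```

a cheaper program that only charged $60k would sit under the cap and be repaid in full, so colleges whose programs provide actual value aren't touched.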
posted by krautland at 1:29 PM on November 20, 2010 [4 favorites]


Reading the article itself:
Academic institutions have proven that they’re usually incapable of keeping up even with decade-to-decade industry evolution...
When I read that, I immediately thought about how multitouch interfaces have been knocking around HCI research departments in universities for decades, the internet was a publicly-funded academic research project, etc. I thought, "Wait a minute, private free market dynamism vs. public universities, is this guy a right-winger?" so I checked his twitter and it was amazing:
Tyranny must be met w/violence.

it's not "the Bush tax cuts" we're debating, it's the Obama tax increases!

You won't find conservative designers/artists creating images like http://bit.ly/b3UaHK because conservatives are decent human beings

I notice that the liberal idiom is to parrot what leaders have said/reasoned while the conservative idiom is to reflect on core values.
These political beliefs aren't irrelevant. The advice in his followup article is practically a conservative paean to personal responsibility against the tyranny of Big Academia. How does that set of beliefs play out when you're serving on a curriculum development panel for a college? This anti-institutional bias is a self-fulfilling prophecy: he stigmatizes academic design education in general, valorizing individual effort and achievement in the market, which makes it less likely that people will go through academic programs to become competent college-level design educators. That's the idiotic short-sightedness of the whole "University programs need to be tailored to the needs of employers!" thing. So where is the next generation of professors going to come from, genius?

This is where these conservative reactionary ideas are unhinged even from capitalism. Employers need educated white collar knowledge workers, and a university system is the best way to do that. It doesn't matter whether the employees got their education through hard work and personal responsibility or were spoon fed by their paternalistic socialist overlords in academia. It's almost like this guy would prefer an "only the strong survive" system where people pulled themselves up by their bootstraps, which produces fewer good employees and less economic activity because what matters is his "values" and that people who reap society's rewards embody his values, irrespective of the obvious practical effect that the rewards would be less, even for the winners. At least we can be sure that the logic of capitalism will drive his views into irrelevancy.

The Lady is a designer, on my last comment: the articulation itself is in an absolutist tone

A fair point.
posted by AlsoMike at 1:59 PM on November 20, 2010 [2 favorites]


If I had more time, I would make the semester completely free of computers, and have the students draw, build and critique their work by hand. While extremely valuable, that "wax on, wax off" approach takes time that I do not have. Instead, I try to slip in fundamental lessons while teaching, and attempt to break the students out of "dialoging with the computer": having them "build" valid documents by wearing T-shirts with XHTML tags printed on them while playing dodgeball outside; touring the campus and critiquing UI and UX design choices in the architecture and design; discussing how objects in the real world could be made better, and translating that to their experiences online.

Now, THAT is a course I would take and benefit from. I have just subscribed to your newsletter blog.
posted by heatherann at 2:01 PM on November 20, 2010 [1 favorite]


I taught web development at a large university this year. And it was a big, bad lesson in how terrible web curriculums are in universities.

The course had a web design class as a pre-req. I reviewed the syllabus and was gobsmacked at how much was NOT taught in the class. It was a lot of "look here's a neat thing you can do with (CSS||jQuery)!" as well as a required paper about something web related. At no point were the students ever asked to design a website, other than a repository page where all their "projects" would be linked from.

I expected all the students to know enough about web design to know how to build a passable site. I learned that was a bad assumption. I ended up having to jettison some of the development stuff I wanted to do just to teach these kids how to use CSS. I made them build websites. I put them in teams where I could hook up the good designers with the good coders.

The crazy thing is, I got near universal acclaim from the students -- even the ones I mercilessly graded down -- because, in the words of one student, "This is the first time I've ever gotten to build anything."

All this has led me to believe that web design and development shouldn't be in the hands of cloistered faculty; it should be in the hands of on-the-ground practitioners who design and build for a living. And yet, not only are universities unwilling to bring in outside designers and developers to teach, outside designers and developers are unwilling to teach. And that's disappointing.
posted by dw at 2:09 PM on November 20, 2010 [2 favorites]


Heh. Though I mock it for being all CD-ROMy when the web was arriving, the best thing about my back-in-the-day degree course is that they got us all building stuff in Hypercard straight away.
posted by Artw at 2:34 PM on November 20, 2010


To some degree isn't this true for most professions?

Yes, and to rant about it as though it's an original thought -- rather than something that has come up again and again in most fields since at least the '60s -- makes me question both the author's background and his follow-up advice (which I haven't read).

Frankly -- as a content guy who's made a good living in UX for the past 8 years and is totally retarded graphically -- I don't even see UX as the designer's problem. The designers I work with design mainly to my specification, or to specs we've hashed out together. UX design doesn't (or shouldn't, at this point) exist in a vacuum. UX isn't about graphics any more than architecture is about placing pretty sculptures in the atrium.

If I'm working with someone who is a "UX Designer" (the technical term is "Photoshop monkey") the less they think they know about programming the better.

Agreed, and I'll add HTML, CSS, and Dreamweaver to the end of that (even though I know at least one designer -- with only a Philosophy degree! -- who does all of those things extremely well. Of course, he didn't learn any of it in school...)
posted by coolguymichael at 2:51 PM on November 20, 2010


he stigmatizes academic design education in general, valorizing individual effort and achievement in the market, which makes it less likely that people will go through academic programs to become competent college-level design educators. That's the idiotic short-sightedness of the whole "University programs need to be tailored to the needs of employers!" thing. So where is the next generation of professors going to come from, genius?

Now here is where it gets interesting. We're talking user centered design and higher education. Who are the users that the university system is designed for? And if there are multiple stakeholders, then how are their needs prioritized or given weight? Do students matter at all or will they continue to be called FTEs?

Next thought is that programs shouldn't be tailored to the needs of employers per se, unless it's Hamburger University in Oakbrook IL, but to the needs of the design industry and its sustainable future - that is, to summarize from many of the comments above, or rather, to synthesize - current and future designers, design educators, researchers, practitioners first and foremost, and then and only then "employers". Imho it's the difference between giving a man a fishing rod or a fish and a map to the nearest fishmongers.
posted by The Lady is a designer at 2:58 PM on November 20, 2010


Artw: You weren't at AIS, were you?

When I got called in to teach "Basic HTML" there in 1997, my students were all in their last term, and they told me that they had previously been made to learn HyperCard. But then I think the school realized they needed HTML instead and switched the curriculum around a bit.

Digression: I wish Apple hadn't abandoned HyperCard the way they did, though. What I wouldn't give for a modern version of HyperCard to come free with every Mac today. I learned so much with that program back in the day. And some of you might have even used a stack that I wrote -- me, a non-programmer! I released the stack as postcard-ware, and for years I got postcards, until I finally closed my old PO Box. It's a tragedy that Apple lost interest in HyperCard.
posted by litlnemo at 3:04 PM on November 20, 2010 [2 favorites]


Sorry, kids, but no: a UX designer is most decidedly not a 'Photoshop monkey'. User Experience designers are not visual or graphic designers. We're the nerds who crank out the user flows and sitemaps and wireframes and conduct the usability test and write research reports and conduct ethnographic interviews and figure out what happens when you click that button over there.
posted by gsh at 4:05 PM on November 20, 2010 [2 favorites]


Xezlec nails it. I get the point of the OP but he's only half right; yes it's a problem that you hire a carpenter who knows how to use a hammer and a saw but not how to start with raw materials and build a chair that won't fall down. Diagonal bracing? Do they still teach that in shop class?

It's not even that. Not only do you not get taught to use a hammer, or how to build chairs that don't fall over; you learn how to use a screwdriver, but ONLY with a hex bit.
posted by thsmchnekllsfascists at 4:49 PM on November 20, 2010


TBH If I'm working with someone who is a "UX Designer" (the technical term is "Photoshop monkey") the less they think they know about programming the better.

Because I LOVE using software designed by engineers.
posted by thsmchnekllsfascists at 5:22 PM on November 20, 2010


StickyCarpet: I'm glad to hear that some architecture graduates are being picked up by Web design firms, because it's likely that the profession will shed whole swaths of recent graduates for the exact opposite problem described in this essay. The balance between professional training and general education is just that: a balance, and it's interesting to read about professions on the other end of the spectrum, such as architecture.

Architectural education has gotten increasingly speculative and generalist over the last 20 years, to the point now where firms are beginning to prefer to hire students based on their practical skills alone (especially in a recession economy). This is the danger of teaching only a general education: if you ignore practical skills, companies will start hiring based only on them, resulting in great short-term gain for them (less training needed) but terrible long-term growth and innovation (from lack of foundational knowledge). Unfortunately companies rarely care about the latter (even many design companies, in my opinion, who often care little about the ideas of their younger staff) and so it will always be in the university's hands to strike that balance between professional skills and general education. I used to be very much in favor of purely general educations in design fields, but it didn't take seeing too many people forced to give up their professional dreams because no one would hire them for me to give up on that idea.

And another thing! It's easy for students to close their eyes to what may be seen as esoteric or general learning. Just because you teach it, it doesn't mean they will learn. There were people in my school who after 4 years of education still had not grasped general principles and basic histories of design. At the end of it, it was hard for me not to wonder if they should stick to practical stuff and let the people who were interested in going deeper take it up by themselves, rather than waste time preaching to people who did not want to listen. I still firmly believe in balance, but these experiences have made me deeply skeptical towards generalist pushes in education now. This is the problem in all fields now, from law to art (except ones with excessively deep pockets for training, like finance) and it's not going to get simpler if professions keep getting more and more complex.
posted by tmthyrss at 6:09 PM on November 20, 2010 [1 favorite]


AlsoMike: You're saying that a computer's surface appearance doesn't really matter, what matters is what goes on behind the scenes, the surface is an illusion that conceals the truth.

You completely misunderstood what I said. Of course the surface appearance of an application matters, and much software fails because programmers aren't taught to design user interfaces.

What I am saying is that if you do not have at least a cursory understanding of the processes that go on behind the surface illusion, your surface illusion will fail. You will ask the computer to do something it can't do, or that's very difficult or that is unstable, and you won't even know why it doesn't do what you expect, just as a carpenter who isn't educated about the strength of wood and joinery might make a chair that is very beautiful and comfortable but falls apart after some use.

This is not an either/or dichotomy where you understand either the nuts and bolts or the overall user experience; you need to have an understanding of both in order to make a successful application. I have frequently voiced the complaint here that Design with a capital D is frequently practiced by people who lack the nuts and bolts underpinnings, but I'm just as bothered by the reverse phenomenon of software based on solid algorithms but presenting a messy, confusing, hard to navigate and error-encouraging user interface.
posted by localroger at 6:29 AM on November 21, 2010


The web has been steadily moving away from the "design in Photoshop, slice into imagemaps/Flash movies, post" model for the last five years or so. A successful web designer these days needs a very broad range of knowledge... not least, if he intends to freelance, scripting and programming. The field is maturing and it's becoming too big for one person to do all of the work. That's not necessarily a bad thing; it's just a thing, and it's not going to stop.
posted by sonic meat machine at 2:21 PM on November 21, 2010


This is not an either/or dichotomy where you understand either the nuts and bolts or the overall user experience

I've interviewed people for UX design positions, and there are LOTS of candidates who are good at visual design who've chosen to focus on the nuts and bolts of HTML/CSS and front-end development work at the expense of solid UX skills, and this disqualifies them from the position. That's an either/or right there. Qualified UX designers are harder to find because so many have tried to become closer to engineers, giving up what's unique about the design perspective.

I agree that an understanding of both nuts-and-bolts and the user's perspective are important, but think about the fact that a designer-developer ratio of 1:7 is usually considered barely adequate, and 1:15 or more is closer to the norm. This leads to a common scenario: the designer's perspective is often overwhelmed and ignored, their expertise contradicted by people who know less than them (something that developers are no stranger to), the organization is heavily biased towards the nuts-and-bolts perspective, it goes without saying that there are serious usability problems with the project, and when designers try to fix this, developers say "Yes, but shouldn't there be a balance?" Quite frankly, it's a bizarre thing to say when things are already so unbalanced.

This may be a provocative thing to say, but sometimes knowing less is better than knowing a lot, because it's closer to the mindset of most users. If this was a powerpoint, I'd have some Zen rocks and a quote about beginner's mind right here.
posted by AlsoMike at 12:03 AM on November 22, 2010 [1 favorite]


Well AlsoMike, I see your point but I guess my problem is that in my industry it's very uncommon to have as many as eight people working on a single design, which makes even a 1:7 ratio (much less 1:15) kind of hard to come by. What this means is that we can't afford that kind of specialization; at least one person needs to have both the design sense and the nuts and bolts, or you will either get something that works but is ugly as sin (the usual result) or (rarer, but worse) something pretty that doesn't work.

I do agree that good design often gets buried under implementation considerations and that's especially likely to happen if you have one UX specialist trying to prevail over seven engineers. What we should be doing is teaching engineers to make things that don't suck. This has lots of aspects, not just making the most commonly used stuff easiest to get to in the UI instead of burying it under three menu levels but also not putting bolts in places where you can't get at them with a spanner.
posted by localroger at 6:07 AM on November 22, 2010


You mean, in other words, that user friendliness and common sense approach to designing/building needs to be taught to everyone/anyone relevant/involved ?
posted by The Lady is a designer at 7:16 AM on November 22, 2010


Artw: You weren't at AIS, were you?

Bournemouth university, on the Human and Computer Sciences course that they ran for 2 years before merging it back into Applied Psychology and Computing.

The "Computer Sciences" bit is a bit misleading - we didn't spend much time studying big O notation or building linked lists or any of that, it was very much a course focused on multimedia interaction.

And, like I say, they were very big on getting us started straight away with Hypercard, which we moaned about a little at the time because it wasn't colour (unless you jumped through some hoops), and we moaned about afterwards because it was becoming clear that it wasn't what we were going to be using when we went into the "real world"*. But for getting the maximum number of us building things and thinking about how to build things straight away it was great. And that's the experience that counts. Learning the ins and outs of a particular tool is secondary to that.

* We talked about "the real world" a lot, often with the assumption that it would be a well ordered place that made sense. Ha ha.
posted by Artw at 10:21 AM on November 22, 2010


And, yes, it is a great shame that there is nothing that quite matches Hypercard these days in terms of getting a quick start but still having access to considerable power.
posted by Artw at 10:22 AM on November 22, 2010


What this means is that we can't afford that kind of specialization; at least one person needs to have both the design sense and the nuts and bolts

I think it's fair to interpret the affordability problem in a different way: whoever sets the budget for those projects doesn't value UX. So why would they be OK with developers spending time interviewing users, exploring the problem space, making wireframes, conducting usability tests, etc., instead of writing code? Even if the developers did have those skills, project management would think it's a waste of time. I'm sure they would be happy if developers read a few UX books in their free time, because then they get something for free, which is the ideal price for something you don't value. The hope is that developers are already implementing the interface, and if you just teach them the rules of interface design, they can just apply that knowledge while they do it. This has some value, but it's limited, because good design takes time; and it could easily lead to a situation where PM says "The developers are all UX experts now, so we expect no usability problems for the next project! Oh and the deadline hasn't changed." Making everyone on the project/company responsible for it has a similar problem: since everyone is a "UX designer", no-one is responsible for it specifically, so it's the first thing to go out the window when deadlines slip. And they always do. That's the sort of thing company mission statements are made of, which are of course meaningless - "We put the customer first!" etc.

Mission statements are practically free, and I think UX is an investment, you have to put up some money to get the return. I don't like strategies that try to achieve good design for free.
posted by AlsoMike at 6:14 PM on November 22, 2010


Well AlsoMike, that's a good point. In the very small projects I work on, where I am pretty much the entire team, I have made it clear for nearly 20 years that part of the cost of the project is me personally visiting the site and talking to the operators who will use the system, and seeing any existing system that we might be replacing in use. It's amazing how bad you realize some specifications are when you see with your own eyes how they have to fit in. And after I build an application I spend time operating it, looking for bottlenecks and annoyances and I'll tear it apart and rebuild it if I'm not happy with it. None of our competitors do these things and it has made the basis of a considerably good reputation in our little corner of industry.

Sometimes these costs are non-trivial, especially when we're doing the site visit on spec and it's 250 miles away from the office where I normally work. Fortunately our management is now hip to the necessity of doing it.

So yes, design takes time, whether you're hiring a specialist or expecting the developers to do it all, and I say that as one of those developers.

I suppose I should be glad I don't work for one of those companies where people like me are regarded as interchangeable cogs who have to sweat deadlines. I get pretty much whatever I say I need in order to make the final product work. If I didn't, I probably wouldn't be looking at 25 years at the same company.
posted by localroger at 5:21 AM on November 23, 2010



