The digital humanities...plural...
May 12, 2011 9:22 PM   Subscribe

How to define digital humanities? "the humanities done digitallys"? Should we expand the definition of the field to include, as I've heard it said several times, "every medievalist with a Web site"? Undoubtedly not. Rather, the particular contribution of the digital humanities lies in its exploration of the difference that the digital can make to the kinds of work that we do, as well as to the ways that we communicate with one another.

Some other links to DH resources. [Full disclosure: last link is to my institution but I have nothing in the least to do with it.]
posted by Mngo (38 comments total) 23 users marked this as a favorite
 
And of course, I failed to get into the fairy edit window to correct the first link to "the humanities done digitally". My bad. If we did the humanities digitally, surely this kind of mistake wouldn't happen, right?
posted by Mngo at 9:23 PM on May 12, 2011


Digital Humanities is a rapidly evolving new field dedicated to arguing about what is meant by Digital Humanities.
posted by LarryC at 9:31 PM on May 12, 2011 [5 favorites]


a rapidly evolving new field dedicated to arguing about what is meant by Digital Humanities

That sounds depressingly plausible, and sums up much of what's wrong with the analog humanities. I have this hideous vision of Eternal September redux, in which eager new students of the social sciences announce Exciting New Research projects that involve studying how Ordinary People use the internet to answer Important Questions about The Internet's relationship to Society.
posted by anigbrowl at 10:09 PM on May 12, 2011


The last link is to the Digital Humanities at UW-Madison, where I work: this post by my colleague Michael Witmore (about to leave us to take over the Folger Shakespeare Library, apparently) is a nice example of the form, and of collaboration between English professors and computer scientists.
posted by escabeche at 10:22 PM on May 12, 2011 [1 favorite]


It's called the "digital" humanities because you type with your fingers. In the old days we had the manual humanities, because when you write with a pen, your whole hand is involved in the writing of each letter.
posted by kenko at 11:07 PM on May 12, 2011 [3 favorites]




Why not just 'humanities', you might ask? One of the answers is to do with career development. There now exists a cohort of very smart people fresh from PhDs or postdocs in which computational methods were absolutely crucial. But anyone in the humanities who puts their methods in front of their materials is unlikely to go far. In literary studies, that was as true of bibliography and philology as it was of poststructuralist or other theory - almost no-one got a job as a pure theorist, but some did as new-historicist Shakespeareans or feminist Victorianists. The emphasis, as in digital humanities, was on the second part.

Now, that cohort I mentioned wants tenured jobs; graduate students in diverse fields want training in these skills; and the projects themselves keep on coming - and you really do want people with a deep understanding of the materials as well as software engineering skills. Absent the science funding model (which is admittedly somewhat present in the digital humanities) you need to agitate for the importance of the thing you do in order to get those tenured positions, around which other resources are still structured in the academy.

For the time being, this staves off answering the question of what your methods are in contact with. Is a digital mediaevalist the same as a digital Victorianist? (Answer: no, because the materials are different and do not exhibit the same properties of consistency and self-similarity as do natural-world objects, as in biology.) So the talking about What We Do is and will continue to be interminable, which is where theory gets back in and existing scholars are relieved to find something to talk about with these alarmingly smart people carrying what appear to be methodological black-boxes capable of transforming how we think about our most familiar and cherished objects. You can also teach something that is not purely programmatic but also has the kind of intellectual open-endedness that advanced students need. You can extend that conversation into book form, which is how you will get hired and promoted - not by producing software tools or companions, but monographs. I'd love it if that were true of electronic archives as well, but it hasn't been true of printed critical editions for some time, so that seems fair.

But we are also seeing the rapid re-engineering of the academy along neoliberal lines, and if we all end up being casualized by it then maybe the conversation will no longer be necessary (nor will anyone have time for it). I'm not sure that would be an improvement.
posted by GeorgeBickham at 11:10 PM on May 12, 2011 [5 favorites]


I'm with George: it's time for action. If only the hard-toiling new-historicist Shakespeareans, feminist Victorianists and digital mediaevalists could put aside their differences for a few days and down quills, pens and keyboards together, I'm sure the neoliberal state would soon be on its knees and begging for forgiveness while tossing $100 bills at all these extremely smart people doing this incredibly significant work.
posted by joannemullen at 12:04 AM on May 13, 2011 [1 favorite]


Joanne - sorry if my putting 'our' before the phrase 'most familiar and cherished objects' struck a nerve. I should have realized that many, many people on Metafilter are not remotely interested in language, tradition, culture and history, even their own. Silly old stick-in-the-mud me...
posted by GeorgeBickham at 12:25 AM on May 13, 2011 [1 favorite]


George: Why not just "humanities?" While for some people, the answer is related to career development, I don't think it's the most revealing answer. Here are a few alternatives:

* There is genuine expertise to be shared

* Here in the UK, humanities students stay within their faculty, and general courses in software engineering are rarely offered outside computer science faculties. Digital Humanities gives humanities faculties an avenue for developing digital methodologies

* It's an area we're still learning about, so we should reflect on the implications of our use of technology. We should also naturally expect to see many earnest but embarrassingly poor-quality writings on the topic, as people come to terms with these new practices. That's OK so long as things improve over time.

* Administrators see digital humanities as a connection to the world of business and a possible avenue for making the humanities "relevant" and profitable

* I think digital humanities is an option, not a requirement. In Classics, lexicographers, historians, archaeologists, and political theorists work and bicker together pretty effectively. Digital humanities is another approach in this list.
posted by honest knave at 1:37 AM on May 13, 2011


I should mention that my personal interest in digital humanities involves trying to improve research writing practices and formats. I like things like links, stretchtext, and rhetorical structures (see also Whitney's thesis and Euclid's Elements).
posted by honest knave at 1:44 AM on May 13, 2011 [3 favorites]


George: Why not just "humanities?"

honest knave, I agree with all these points, including the necessity for critical reflection to take place, which is why DH is more than just training. I do worry about the compartmentalization of approaches within the humanities. I think, for example, that literary critics and historians should edit texts, or at least have an appreciation of textual issues; and that editors should at least have experience of transcribing sources. There should be no black boxes. Administrators like compartmentalization, unfortunately, but DH can hopefully address that, not least by offering career pathways for scholars who are sometimes treated like hired help.
posted by GeorgeBickham at 2:49 AM on May 13, 2011


I work as a (the) developer for a very small, not-very-prominent digital humanities research group in a very prominent university. We are constantly struggling with this question, and I think it is really a proxy for our true struggle, which is getting people (most importantly humanities scholars) to engage with the possibilities of computation in the humanities. As the author of this piece aptly puts it, "Digital humanities thus grows specifically out of an attempt to make 'humanities computing,' which sounded as though the emphasis lay on the technology, more palatable to humanists in general." This is dead-on.

The real problem is a sort of catch-22: the old-school humanities scholars don't seem to be interested in harnessing the power of computation, because "we've always done it this way and we don't understand computers anyways, they are not relevant to us," and so this "field" has been created to try to at least allow those scholars who "get it" to push their work forward in the digital realm (in my opinion, the goal asserted by the author in the quote above has long since gone by the wayside), using computational techniques and, more importantly, building the sort of infrastructure required to support research projects in the humanities that use computational techniques.

Unlike the sciences, where the benefits of using computers to crunch numbers were immediate and obvious (and in fact drove computational technology, and continue to do so), it's hard for many scholars in the humanities to think past their familiar uses of computers: web sites, printing, IT connecting your computer to the network, and the like. I've even had conversations here on Metafilter just trying to get people to consider something past this, and it's like pulling teeth; people think they know what digital humanities is (it's like teaching classes online and stuff), and that's that.

But the irony is really that computational methods are just another way of working, and while profound, they will only become truly profound once they become an integral, taken-for-granted part of the humanities. "Of course I ran those digitized 16th-century documents through the named entity recognition analyzer, and then did lexical analysis on them before loading them into the db and setting permissions in the web app which lets me do shared analysis with other experts in the field. That's just what you do!" ...it'll be a bit of time before we get there. But once humanities really gets it as a field, the possibilities will become staggering.
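To make that concrete, here is a rough sketch of the kind of pipeline I mean, in Python with NLTK and SQLite. The file names, table layout, and corpus are invented for illustration, and NLTK needs its tokenizer, tagger, and chunker data downloaded first (via nltk.download); this is a toy, not our actual setup.

    import glob
    import sqlite3
    import nltk

    # Toy corpus: plain-text transcriptions of digitized documents.
    con = sqlite3.connect("corpus.db")
    con.execute("""CREATE TABLE IF NOT EXISTS entities
                   (doc TEXT, entity TEXT, label TEXT, count INTEGER)""")

    for path in glob.glob("transcriptions/*.txt"):
        with open(path, encoding="utf-8") as f:
            text = f.read()
        # Tokenize, part-of-speech tag, then chunk named entities.
        tree = nltk.ne_chunk(nltk.pos_tag(nltk.word_tokenize(text)))
        counts = {}
        for subtree in tree:
            if hasattr(subtree, "label"):  # named-entity chunks are subtrees
                name = " ".join(word for word, tag in subtree.leaves())
                key = (name, subtree.label())
                counts[key] = counts.get(key, 0) + 1
        con.executemany("INSERT INTO entities VALUES (?, ?, ?, ?)",
                        [(path, name, label, n) for (name, label), n in counts.items()])
    con.commit()

    # The "shared analysis" part is then just queries anyone on the team can run.
    query = """SELECT entity, SUM(count) FROM entities
               WHERE label = 'PERSON' GROUP BY entity
               ORDER BY SUM(count) DESC LIMIT 10"""
    for entity, total in con.execute(query):
        print(entity, total)

None of this is exotic; the shift is in treating it as an ordinary humanities workflow rather than a special event.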

This also goes hand-in-hand with the changing nature of how humanities projects are done: the size and composition of teams is changing, the physical resources necessary are changing, and, of course, the funding is going to have to change to support all of this. More targeted sub-disciplines—like we have in the sciences, with physical simulations or biological computing—are springing up or are in their nascent stages, and all of this is further incentive to let go of the "digital" in front of humanities. At some point, it will be well known how you do this type of analysis versus that, in terms of what computational specialists are required, and that's the level where the computation-as-a-specialty labels will settle (like "computational linguistics," which is actually where a lot of earlier efforts in so-called digital humanities were focused).

In the meantime we're stuck with this sort of frustrating existence of a field that ought not to exist, and won't, once the clued-in scholars and students outnumber the clueless ones.
posted by dubitable at 3:16 AM on May 13, 2011 [7 favorites]


...and you really do want people with a deep understanding of the materials as well as software engineering skills.

GeorgeBickham, I agree that's nice when you get it; but having worked with only a few folks like this, folks who are tremendously talented (I am but a humble programmer, without the humanities expertise), what it really boils down to, practically speaking, is that we are going to have larger teams of specialists. I am not disagreeing with you guys who are arguing for trans-disciplinarianism (is that a word?), but what I've found in practice is that you want some scholars with deep knowledge of their field who have a basic grasp of what computers can do—but aren't necessarily programmers—combined with programmers who can appreciate what sorts of questions the scholars want to ask, and why, and can help them extend their basic understanding of computational techniques to a broader and/or deeper range of possibilities.

Maybe we're arguing for the same thing, I don't know; but it strikes me that these are two different types of specialists, while both perhaps can and should fall under the big umbrella of the "humanities."
posted by dubitable at 3:25 AM on May 13, 2011


dubitable, while there are good reasons for the rebranding of humanities computing as digital humanities (although not everyone would agree that they are the same thing), one unwelcome consequence has been to present DH as the new hotness, whereas (as you will know) it has a history practically as long as computation. Some methods are new, like the visualizations which are, tautologically, the most visible. But computational linguistics, computer-aided author-attribution studies and text-collation have been around for half a century or more. I think that some sense of the already-embeddedness of some practices within mainstream humanism, marginal though some of them may be, would be no bad thing.
posted by GeorgeBickham at 3:29 AM on May 13, 2011


dubitable, while there are good reasons for the rebranding of humanities computing as digital humanities (although not everyone would agree that they are the same thing), one unwelcome consequence has been to present DH as the new hotness, whereas (as you will know) it has a history practically as long as computation.

Yeah, just to be clear, I'm not shooting for changing the label of digital humanities to anything—I want it gone, as its own field, altogether. Fundamentally I think it's ridiculous that it exists. But I'm tilting at windmills a bit here, and more to the point, it's likely that this is just what had to happen to get us to the next level. There are always growing pains.

As you say:
I think that some sense of the already-embeddedness of some practices within mainstream humanism, marginal though some of them may be, would be no bad thing.

Exactly...all I'm really trying to say is that this is going to happen more and more, inevitably. Right now there is a lot of consternation and gnashing of teeth but I suspect it will be more or less forgotten in fifty years when all humanities scholars, to a greater or lesser degree, are using computational techniques.
posted by dubitable at 3:37 AM on May 13, 2011


Dubitable, I think that we are arguing for much the same thing, although maybe from different ends. As a humanist with some limited technical familiarity, I'm lucky enough to have worked with people who are both technically proficient and have advanced degrees in the humanities. One of them, however, was only able to do his humanities work at more or less pro-bono rates because he made a healthy living from commercial software development. Another, dependent on project work, saw his income halve last year as work dried up. Boo-hoo, some might say; that's simply what life is like for freelance developers outside of large institutions. But there aren't many people with both the skills and the ability to at least relate to the humanities at the wages that are generally on offer. As you say, the same is true going the other way, and I totally agree that teams with a spectrum of specialisms are the way to go. My point is also that an intersection of skills in one person, or that spectrum within a group, is what people are looking for, but that it's hard to specify exactly what it is.
posted by GeorgeBickham at 3:43 AM on May 13, 2011 [1 favorite]


And furthermore, on preview, what you said above.
posted by GeorgeBickham at 3:44 AM on May 13, 2011


Dubitable, I think that we are arguing for much the same thing, although maybe from different ends.

Haha...yes, exactly!
posted by dubitable at 3:57 AM on May 13, 2011


It's called the "digital" humanities because you type with your fingers. In the old days we had the manual humanities, because when you write with a pen, your whole hand is involved in the writing of each letter.

I think it's exciting regardless of whether it's done digitally, manually, or even orally.
posted by Kabanos at 4:36 AM on May 13, 2011


the old-school humanities scholars don't seem to be interested in harnessing the power of computation, because "we've always done it this way and we don't understand computers anyways, they are not relevant to us"

I've come in contact with this attitude before and I always find it absolutely incomprehensible, because these same scholars are always more than happy to use databases (concordances, archives, the OED) provided they were assembled manually. The tendency towards large-scale data analysis has always been a part of the humanities.
posted by Pickman's Next Top Model at 5:15 AM on May 13, 2011


I know someone who uses OCR and a computer program to help him do qualitative analysis of historical sources. I often use spreadsheets for my own qualitative notes, because it's much easier to sort them into a crude database than it would be if they were in a text file. This isn't so different from the long habit among historians of using index cards, but I'm enough a child of the 2000s to think of index cards as a primitive analog database. I also use GIS in my historical analysis (for qualitative as well as quantitative data). Computers are already essential to humanities research.
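(If it helps, here's a minimal sketch of what I mean by a crude database, in Python; the notes.csv file, its column names, and the example theme are all invented for illustration.)

    import csv
    import sqlite3

    # Pull the exported spreadsheet of qualitative notes into an in-memory
    # SQLite table, so it can be sorted and filtered like a (crude) database.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE notes (source TEXT, date TEXT, theme TEXT, note TEXT)")

    with open("notes.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            con.execute("INSERT INTO notes VALUES (?, ?, ?, ?)",
                        (row["source"], row["date"], row["theme"], row["note"]))

    # The index-card move, but instant: every note filed under one theme, in date order.
    rows = con.execute("SELECT date, source, note FROM notes "
                       "WHERE theme = 'enclosure' ORDER BY date")
    for date, source, note in rows:
        print(date, source, note, sep="  ")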

But the huge barrier to more digital humanities is the widespread computer illiteracy among humanists (even young ones) and the almost-complete lack of institutional support. A sociologist I knew had places she could go to hire computer programmers; a historian I know couldn't even get a little database support -- the IT people in the history faculty weren't there to help people with research, that would be crazy. My own institution was good in opening support up regardless of department, but they offered no databasing workshops for humanists (I did that elsewhere), and I was the only humanist in my GIS classes. And there was nothing at my uni to help you set up qualitative analysis software, which should frankly be taught in every first-year English/lit class because it's wonderful for coding and organising qualitative data.
posted by jb at 5:21 AM on May 13, 2011 [1 favorite]


Digital Humanities is a rapidly evolving new field dedicated to arguing about what is meant by Digital Humanities.
posted by LarryC at 12:31 AM on 5/13


That means the field has entered its adolescence. Don't worry, in a few more years it will mature and become more comfortable with itself, and then it will just go around telling all the other humanities fields that they are dinosaurs and that it's clearly the only style/topic of research worth pursuing. After 50 years, it may settle quietly into middle age.
posted by jb at 5:26 AM on May 13, 2011


escabeche: "a nice example of the form"
Yeah, so sad for UW to lose him, but it's a great gig. That blog has several nice examples of the possibilities for DH to be much more than an argument about itself.

Kabanos: "even orally"
I'm in performance, myself, so maybe I'll start describing my work as "physical humanities..."
posted by Mngo at 5:44 AM on May 13, 2011


But the huge barrier to more digital humanities is the widespread computer illiteracy among humanists (even young ones) and the almost-complete lack of institutional support.

Of course these two things go hand in hand, and if the humanists were clamoring for it more generally things would move forward faster, but you've made such an important point in your comment I think: there is such a lack of these sorts of resources in most academic institutions that it forces those scholars who are interested to go elsewhere, where the resources are available—or just drop the ideas. "Oh well." There are in fact plenty of humanists who do get it, but are not able to push forward with their ideas because of the lack of support you're talking about. And, it keeps these sorts of projects invisible for those scholars who might have a light-bulb go off if they were to see some better examples of people doing the sort of work that has relevance to their own work, but with a computer. It's very frustrating.

In fact, I think that "digital humanities" as a discipline is much more relevant in terms of its effect on helping create these sorts of resources at the institutional level than as any sort of "new way of doing humanities" concept.
posted by dubitable at 5:46 AM on May 13, 2011


the IT people in the history faculty weren't there to help people with research

This is true in every department, not just the humanities. In most universities it's very difficult to find IT staff anywhere who will assist with research projects beyond setting up a computer and making sure you can print. Programming staff often have to be hired specifically on short-term grants, which hurts stability. I'm in visualization, and since my current projects involve medicine, grants are available. I'm sure it's much harder to find a granting agency that would fund an interactive visualization related to power structures in ancient Rome, for example.

Like medical visualization, digital humanities (in the sense of computational investigation of the humanities, rather than the humanities investigating computer media) is not really a sub-field, but an inherently cross-disciplinary area of research. I see the same kind of thing that dubitable is talking about: biologists or doctors look at some of the projects that we've already done and it makes them think about how visualization can be applied to their own work. They start asking questions about what can be done. Sometimes we tell them, "Yeah, wait 20 years for that one," and sometimes we tell them, "Sure, we have a program that does that already," but more often than not, we can come up with some interesting research. The domain specialists often don't even have a clue about what they can do with computers.
posted by demiurge at 7:52 AM on May 13, 2011 [1 favorite]


I wonder if the neoliberal state throws out $100 bills for the best threadshitters?
posted by sinnesloeschen at 7:53 AM on May 13, 2011


Programming staff often have to be hired specifically on short-term grants, which hurts stability.

+1

I had the privilege of working on a well-funded humanities computing project (NINES at UVa), and one of the main goals we achieved was securing institutional support for the project and tools after the grant period expired. Unless a programmer is building a tool for other programmers, they are always learning an alien subject domain. The intersection of computer science and the humanities is uncommon, so projects spend a lot of their budget negotiating translation issues between the software and humanities domains.
posted by dgran at 8:25 AM on May 13, 2011


The domain specialists often don't even have a clue about what they can do with computers.

It's true. I'm a digital humanities PhD student in London as well as a former software developer, and I'm constantly amazed by how many practitioners in my little sub-niche are completely unaware that there are better ways to do what they're doing.

I'm not going to name names, but I'm thinking specifically of one local research assistant who was spending days laboriously cutting and pasting text into rows in an Access table and then manually manipulating his data. I introduced him to some very basic SQL and scripting and it was as if he'd had a road-to-Damascus moment.
posted by Mr. Bad Example at 9:20 AM on May 13, 2011 [1 favorite]


the huge barrier to more digital humanities is the widespread computer illiteracy among humanists (even young ones) and the almost-complete lack of institutional support

Really? It seems very odd to me that so much of this thread is about obstacles to the growth of digital humanities, when in fact the field is a massive fad at the moment, a trendy and fast-growing area in an institutional climate where almost all other subfields are shrinking and being defunded. And the computer-illiteracy (and statistical illiteracy) of most humanities scholars is really responsible for the growth of the fad, with its attractive veneer of empiricism and quantification — a lot of statistically dubious work that wouldn't pass muster among (e.g.) computational linguists is getting trumpeted as major methodological innovation in (e.g.) English departments.
posted by RogerB at 9:24 AM on May 13, 2011 [1 favorite]


Are there jobs for Digital Humanities grads? I am tremendously interested in it as a subject matter, but as with all graduate education I am terrified to take on debt without the possibility of recompense, or worse, to be pigeonholed into limited career options.
posted by 2bucksplus at 9:32 AM on May 13, 2011


a lot of statistically dubious work that wouldn't pass muster

Well, you don't want the English professors to do the coding and statistical analysis; get the computational linguists to do it. But they have to work together to get something done. If there's trash work being published, it's the fault of journals that don't get the appropriate experts to read the submitted manuscripts.

Are there jobs for Digital Humanities grads?

There are jobs for people who can do statistics, computational linguistics and visualization. Jobs for English and history PhDs... not so much. That's what I mean about Digital Humanities not being a sub-field.
posted by demiurge at 9:36 AM on May 13, 2011 [1 favorite]


when in fact the field is a massive fad at the moment, a trendy and fast-growing area in an institutional climate where almost all other subfields are shrinking and being defunded.

Please elaborate: what school(s) are you talking about, and what programs or "fads" are you talking about? The NEH just dropped a grant we applied for because they were low on funds. If digital humanities is a hot fad then perhaps that's only in relationship to specific humanities programs in specific universities; it's certainly not the case where I'm working, where we have to struggle to survive, and from everything I've heard all the major grant-givers—the NEH, the Mellon foundation, etc.—are pulling back. Funds are only being given out on an institutional basis via specific private grants, as far as I can tell. Seriously, you have to be more specific than that if you are going to be so contentious.

...a lot of statistically dubious work that wouldn't pass muster among (e.g.) computational linguists is getting trumpeted as major methodological innovation in (e.g.) English departments.

...like?
posted by dubitable at 10:25 AM on May 13, 2011


Please elaborate: what school(s) are you talking about, and what programs or "fads" are you talking about?

When I say it's a fad right now, I'm talking primarily about hiring and about the kind of attention the research gets, not grant funding. Digital humanities is a very fast-growing hiring field in a market where many core humanities fields are seeing annual double-digit percentage drops in tenure-track openings. And there are many more well-attended panels on digital-humanities topics every year at big humanities conferences like AHA and MLA, where digital-humanities work is regarded as one of the hottest fad topics in the back-channel conversation. I can easily appreciate that it could still be hard to find grants for collaborative research projects — orgs like the NEH are cutting everywhere, and digital humanities projects that involve things like hiring programmers are quite expensive by one-person pen-and-paper research-project standards — but in the broader institutional environment of the humanities, where programs like classics and foreign languages are simply being cut and their scholars left unemployed, digital-humanities people and projects are doing very well indeed. It's bizarre to me that anyone in the humanities would find this a "contentious" idea.
posted by RogerB at 10:48 AM on May 13, 2011


Hey, didn't realise there were so many of us on here! Cool.

I agree that in the future we aren't going to see anything called 'digital humanities' (any more than there's a 'digital physics' now), but maybe that's already changing today. We have an undergraduate class on computing for literature and language students, which always does pretty well enrollment-wise; the language students in particular are often already confident with the idea of corpora and quantitative analysis in general. Baby steps.

That said, there's definitely a growing perception out there that DH, and 'digital things' in general, are some kind of massively popular, exponentially-growing, trendy fad that's sucking money and life out of Real Serious Research. Which isn't really my experience in the field. I was at a seminar a couple of months ago where a medievalist scholar argued passionately that one of the problems with digitising medieval texts is that it means there's no money to fund libraries or train PhD students in palaeography (eh?) - after which I went back to my desk, with its solitary network point (which I'd had to argue for: 'you can pick up the WiFi signal from there, can't you?') and its struggling, slow, five-year-old Dell that I couldn't get replaced because hey, it can run Word and PowerPoint, what more do you need?

If we're getting all the money and kudos, I can only assume one of us is hoarding it like Smaug.
posted by Catseye at 10:56 AM on May 13, 2011


I'm not going to name names, but I'm thinking specifically of one local research assistant who was spending days laboriously cutting and pasting text into rows in an Access table and then manually manipulating his data. I introduced him to some very basic SQL and scripting and it was as if he'd had a road-to-Damascus moment.

Access is horrible; it's why I always do my data entry in a spreadsheet and import it for queries/building relationships. But how could SQL and/or scripting help with the data entry itself?
posted by jb at 11:07 AM on May 13, 2011


But how could SQL and/or scripting help with the data entry itself?

It'd save him from manually cutting and pasting, for one. The much less painful way to do what he was doing is to pattern-match the things you want to store, extract them from your source(s), and then insert them into your database for further manipulation.

And it wasn't just the data entry, it was the actual analysis he was doing manually. He was literally eyeballing thousands of records looking for the patterns he was trying to extract from the data. I showed him how to do a relatively simple select with a join, and he said I'd saved him at least a day of work.
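For the curious, here's a rough sketch of the kind of thing I showed him, done with Python and SQLite rather than his actual Access setup; the source format, table names, and columns are all invented for illustration, not his real data.

    import re
    import sqlite3

    con = sqlite3.connect("research.db")
    con.executescript("""
        CREATE TABLE IF NOT EXISTS people  (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE IF NOT EXISTS letters (id INTEGER PRIMARY KEY, author_id INTEGER,
                                            year INTEGER, body TEXT);
    """)

    # Instead of cutting and pasting rows by hand: pattern-match what you want
    # from the source text and insert it in one pass. The invented source format
    # here is one record per line, e.g. "1723 | John Smith | text of the letter".
    pattern = re.compile(r"^(\d{4}) \| (.+?) \| (.+)$")
    with open("transcribed_letters.txt", encoding="utf-8") as f:
        for line in f:
            m = pattern.match(line.strip())
            if not m:
                continue
            year, name, body = m.groups()
            row = con.execute("SELECT id FROM people WHERE name = ?", (name,)).fetchone()
            author_id = row[0] if row else con.execute(
                "INSERT INTO people (name) VALUES (?)", (name,)).lastrowid
            con.execute("INSERT INTO letters (author_id, year, body) VALUES (?, ?, ?)",
                        (author_id, int(year), body))
    con.commit()

    # And the "relatively simple select with a join" that replaces eyeballing
    # thousands of records: letters per author per year, busiest first.
    for name, year, n in con.execute("""
            SELECT p.name, l.year, COUNT(*)
            FROM letters AS l JOIN people AS p ON p.id = l.author_id
            GROUP BY p.name, l.year
            ORDER BY COUNT(*) DESC"""):
        print(name, year, n)

Ten minutes of setup, and the days of cutting and pasting disappear.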
posted by Mr. Bad Example at 11:41 AM on May 13, 2011


DH is definitely receiving (more?) funding in Europe. I'm about to start a new MA/MSc in DH at UCL; CRASSH in Cambridge has been going for about ten years and seems (judging by the number of events they hold, and the fact that they have postdoc grants available on an ongoing basis) fairly well funded. In addition, TCD is running an MPhil in DH for the first time this year, has a brand-new DH building, and is running a fully-funded PhD as of this September.
posted by urschrei at 2:27 PM on May 13, 2011 [1 favorite]

