Useful: [FreeCite][]: “FreeCite is an open-source application that parses document citations into fielded data. You can use it as a web application or a service. You can also download the source and run FreeCite on your own server. FreeCite is distributed under the MIT license.”
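FreeCite exposed its parser as a simple HTTP service you could POST a citation string to. As a rough sketch only — the endpoint URL, the form field name `citation`, and the `Accept` header below are assumptions about how the service was commonly invoked, and the hosted instance may no longer be live — a request could be assembled like this:

```python
import urllib.parse

# Sketch of preparing a request to FreeCite's web service.
# NOTE: the endpoint, form field ("citation"), and headers are
# assumptions; verify them against FreeCite's own documentation
# (or a locally run instance of the open-source code).
FREECITE_URL = "http://freecite.library.brown.edu/citations/create"

def build_freecite_request(citation: str):
    """Return (url, urlencoded body, headers) for a POST asking
    FreeCite to parse one citation string into fielded data."""
    body = urllib.parse.urlencode({"citation": citation})
    headers = {
        "Accept": "text/xml",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    return FREECITE_URL, body, headers
```

Sending the request with any HTTP client should return, per the project's own description, the citation parsed into fielded data; running the downloaded source on your own server would use the same interface.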


Visual Correspondence

In addition to the Map of Metaphor in English, there is also [Visual Correspondence][1], a project to make available not only the contents of correspondence but also its metadata — who received the letter? who else was a correspondent? — and to open all of this data and metadata to visualization.


European Summer University in Digital Humanities

The European Summer University in Digital Humanities takes place across 11 days. The intensive
programme consists of workshops, public lectures, regular project
presentations, a poster session and a panel discussion. The *workshop
programme* is composed of the following thematic strands:

* XML-TEI encoding, structuring and rendering
* Methods and Tools for the Corpus Annotation of Historical and
Contemporary Written Texts
* Comparing Corpora
* Spoken Language and Multimodal Corpora
* Python
* Basic Statistics and Visualization with R
* Stylometry
* Open Greek and Latin
* Digital Editions and Editorial Theory: Historical Texts and Documents
* Spatial Analysis in the Humanities
* Building Thematic Research Collections with Drupal
* Introduction to Project Management

Each workshop consists of a total of 16 sessions or 32 week-hours. The
number of participants in each workshop is limited to 10. Workshops are
structured in such a way that participants can take either the two
blocks of one workshop or two blocks from different workshops.

The descriptions of all workshops can be found [here]() in at least two languages. Short bios, also in at least two languages, are available for most [workshop leaders]().

Bath Spa University’s Creative Computing Centre

From a recent _Humanist_:

> Date: Sat, 28 Feb 2015 13:52:01 +0000
> Subject: digital humanities at Bath Spa
> Recently I visited a small liberal arts university to the west of London, Bath Spa, and was so impressed with what I found there that I invited my host, Professor Andrew Hugill, to write a brief description of his activities in digital humanities there. He sent the following.
>> Professor Andrew Hugill was appointed in April 2013 with a mission to develop the university’s digital portfolio. Hugill is a transdisciplinary academic, a composer who works in Music and Computer Science, as well as some of the wilder shores of French literature. He has a track record of creating cross-disciplinary entities, having established the Institute of Creative Technologies at De Montfort University, which generated £7 million in external income under his direction.
>> In 2014, Hugill established the university-wide Centre for Creative Computing (CCC) at Bath Spa University, based at Corsham Court. This has already secured significant funding from NESTA (£125,000 for predictive analysis for museums) and an undisclosed industry partner, who is funding a postdoctoral research fellow. Working with his colleague Professor Hongji Yang, a software engineer, and Dr Jerry Fishenden, a new media researcher and developer, Hugill has established a network of 30 academics from all areas of the university, who are working on a range of projects. These include digital heritage, artistic creation, software development, and some secret projects, all of which aim to increase understanding of creative computing.
>> The CCC has already attracted 10 PhD students (all except one self-funding) and edits the International Journal of Creative Computing (Interscience). It has a programme of seminars and visiting lectures. Recent speakers include Prof Jim Hendler (lead scientist of the semantic web) and Prof Willard McCarty (Professor of Humanities Computing). It also runs a Masters course and in 2015 is launching an undergraduate programme that includes specialist pathways in Animation, Gaming and Software Development alongside a major/minor combination with a range of subjects from all the university Schools.
>> One of the challenges for Bath Spa University is to integrate computing more effectively into its teaching and research. To this end, the School of Humanities and Cultural Industries has been examining Digital Humanities as an area for development. There is a significant opportunity to embed digital humanities thinking and practices across a range of subject areas, from Music, Visual and Performing Arts to Creative Writing, Literature and History. The CCC is committed to trying to achieve a thoroughly developed digital humanities throughout the university.
> Even if only by implication, the term “creative computing” gives digital humanities a push in the direction of synthesis to complement its long-standing analytic emphasis. This is happening via efforts in simulation, but as the turn to simulation develops, the experience and wisdom of practitioners in the arts will, I’d think, be of great benefit. The discovery of a new teacher is a cause for celebration.

Only the Shadow Knows

I hate the title of the special issue of _Differences_ on the digital humanities, [“In the Shadows of the Digital Humanities”][d], but I am intrigued by some of the articles, especially the ones by Adeline Koh:

> This essay explores the “social contract” of the digital humanities community. I argue that the social contract of the digital humanities is composed of two rules: 1) the notion of niceness or civility; and 2) the possession of technical knowledge, defined as knowledge of coding or computer programming. These rules are repeatedly raised within the public sphere of the digital humanities and are simultaneously contested and criticized. I claim that these rules and the social contract come from humanities computing, a field commonly described as the digital humanities’ sole predecessor. Humanities computing has historically differentiated itself from media and cultural studies, defining itself as a field that uses computational methods to address humanities research questions rather than exploring the impact of computation on culture and the humanities. I call for a movement that would go beyond this social contract by creating multiple genealogies for the digital humanities; by arguing that current conceptualizations of the digital humanities have not only developed from humanities computing but also include additional fields such as new media studies, postcolonial science and technology studies, and digital research on race, gender, class, and disability and their impact on cultures around the world.

I am curious about her claim that humanities computing is often positioned “as the digital humanities’ sole predecessor.” I was under the impression that digital humanities was an umbrella term invented — and this history intrigues me — to accommodate both the old humanities computing and the newer digital/media studies. But while I have apparently been flirting with the digital humanities for quite some time, I am a latecomer to its intellectual history.

I’m also interested in *code studies* and in the idea of a *critical technical practice*, which Michael Dieter explores:

> This article reflects theoretically on the conditions of possibility for critical work to be conducted in the context of the digital humanities and aims to provide a broad conceptual vocabulary suitable for supporting and expanding this rapidly changing subdiscipline. It does so by elaborating on the framework of critical technical practice (CTP) first proposed by Philip Agre, suggesting how this notion might be connected productively with philosophical lineages of antipositivist epistemology, but as such traditions are reimagined and retooled for today’s informational contexts. Here, CTP is considered through the work of sociotechnical problematization, especially by the various techniques that differentiate existing infrastructural solutions on the basis of the purported material problems and difficulties they claim to address. The origin of Agre’s notion of CTP is linked back to its inspiration in the specific methodologies and concepts in the work of Michel Foucault. It is also suggested that other important connections to the thought of Henri Bergson, Gaston Bachelard, Georges Canguilhem, and Gilles Deleuze can be made. While presenting a rich set of resources for the consideration of sociotechnical problems, the argument is made that these resources might be productively placed in dialogue with existing digital methods and techniques through a reflection on media aesthetics. The article concludes by illustrating the relevance of this general framework with reference to a number of projects by media practitioners relevant to digital humanities, including the work of Rosa Menkman, YoHa, Julian Oliver, Dmytri Kleiner, and Esther Polak.

My chief problem? I can’t seem to access the current issue of _Differences_ — or, if I can, my university’s infrastructure makes it very difficult to figure out how.


What is it about the digital humanities?

What is it about the digital humanities that attracts so much, so much … angst, anxiety, and/or vituperation? (Some of it well intended, some of it not.) As I’ve noted before, the digital humanities is an *ex post facto* label brazenly applied to a wide variety of activities: computational analyses of texts and other data normally the purview of the humanities, sometimes individually or sometimes on a scale not previously possible; the creation of new kinds of archives of such materials, paving the way either for traditional forms of analysis or for the new kinds of analysis just mentioned; or the creation of new kinds of texts, sometimes called the digital arts (or digital media). And even with this list I am surely leaving something, okay a lot, out.

Given the variety of activity, the situation that results is best likened to the fabled six blind men who encounter an elephant, wherein each can only know the part that they touch. (The fable has always bugged me, because of the lack of communication among the blind interlocutors, but let’s leave it at its original task: to remind its listeners/readers that knowledge is almost always partial.)

The current tempest in the popular teapot is Adam Kirsch’s book review essay, “Technology Is Taking Over English Departments: The False Promise of the Digital Humanities,” appearing in the _New Republic_. ([Link][]. Note the telling addition of “limits” in the essay’s URL: the NR’s editors are laying it on thickly.)

Ignoring obvious false starts, like the fact that two mathematicians, Erez Aiden and Jean-Baptiste Michel, are featured early in the essay, or that it’s too easy to give prolegomena and provocations too much weight in such considerations, Kirsch does grasp at least that “the field has no common essence: it is not a species but at best a genus, comprising a wide range of activities that have little relationship with one another.” He also foregrounds some of the essential difficulties that the digital humanities face, bringing to the fore problems that the humanities themselves have long ignored.

One of those difficulties is, as many already know, the demise of the scholarly monograph, whose history is a lot more complicated than most realize, its roots being in the change in the way publishing companies were taxed on inventory (which is to say they were) in the eighties and then the way libraries were funded (which is to say *not*) in the nineties. The internet offered a place to publish, to communicate, and some humanists experimented with the new medium not only as a place to publish conventional materials but also as a place to try out new kinds of genres, genres previously impossible in a communicative infrastructure based solely on codices. This desire to make things for ourselves has run in parallel with a host of other interests in “making” that have arisen in an era where devices and machines are increasingly sealed “for protection” and/or roped off by vague claims to IP entitlements. As Kirsch notes, “Like many questions in digital humanities, this one remains open. But the basic emphasis on teamwork and building, as opposed to solitary intellection, is common to all stripes of digital humanists.”

It is really when it comes to what new kinds of analyses the computational turn in the humanities might make possible that Kirsch reveals a real blindness, assuming that you think, as I do, that some of the above is insightful in its own fashion. Taking Moretti’s “Style, Inc.” as indicative of the larger field of computation, Kirsch notes: “It is striking that digital tools, no matter how powerful, are themselves incapable of generating significant new ideas about the subject matter of humanistic study. They aggregate data, and they reveal patterns in the data, but to know what kinds of questions to ask about the data and its patterns requires a reader who is already well-versed in literature.”

A reader could substitute any domain expertise for *literature*: history, folkloristics, rhetoric, linguistics, etc. And so the question really becomes: what exactly is Kirsch’s complaint? That the digital humanities still think domain expertise is important? Central? Critical to the application of computational technologies and techniques? He returns to Aiden and Michel for his discussion, which only proves the point: both are mathematicians with little to no domain expertise in the humanities. Of course many of their grander claims are rather thin. (I watched several audience members at the Texas Digital Humanities Conference try to get Aiden to think about his impoverished understanding of human history and language use, but he just doesn’t get it.)

Where does all of this take Kirsch? Well, he muses, quantification is what has gotten the humanities *into* trouble — the corporatization of the American university, wherein corporatization refers to the bureaucratic impulse to quantify things, like education — and so it should be the role of the humanities to resist quantification. *Really?* Is this the best answer? The only answer? Isn’t it also the job (and the switch from *role* to *job* here is purposeful) of the humanities to critique, to lay bare the apparatus by which certain phenomena appear and forces work? Prior to this moment, and currently in the so-called “traditional” humanities, the humanities have largely responded to the quantification of everything with simply “not everything can be quantified.” Which is rather like the childhood response “Is not!” That is, in a world where a new field called *social physics* cranks out social network analyses of myths, the opportunity arises to respond by being better at it than the physicists.

And that’s what some in the digital humanities aspire to do. In the process, some are also banking on the notion that there may actually arise refinements, if not wholesale revisions, of methodologies that can only come not only from treating the kinds of materials that have long been the purview of the humanities but also from incorporating humanistic theories and forms of theorizing.

**See also**: [Alan Jacobs’ response]() — I’m very jealous of his site name, *text patterns*. So good. [Ted Underwood]() says “you can’t govern reception.” [Gary Hall]() argues that the humanities have long been undertaking computation. And, speaking of the corporatization of the university, aka *scientific management*, [Jill Lepore][] has a nice review of Matthew Stewart’s _The Management Myth: Why the Experts Keep Getting It Wrong_.

*Revised* 19:30: because English is a stable language and deserves to be treated with more respect than the first draft showed. Also, there were some redundancies and excesses that needed trimming.

[Jill Lepore]: