If you are interested in the digital humanities and don’t know where to start, you could do far worse than to browse the possibilities on TAPoR (Text Analysis Portal for Research), which offers 912 possible resources.
Occasionally, a colleague or student wants to learn more about the digital humanities. Here is a list of texts/sites/journals that are worth their consideration.
Literary Studies in the Digital Age: An Evolving Anthology is “published by the Modern Language Association of America. It is the MLA’s first born-digital, publicly available anthology. It launched in 2013 and continues to grow. The editors welcome new submissions that will expand the breadth and depth of the collection, including pieces that offer primers on topics, tools, and techniques pertinent to computational approaches in literary studies as well as essays that deepen or nuance topics already covered in the volume.” Some of these essays are the de facto standard introductions to various dimensions of the digital humanities. They aren’t necessarily my favorites or even the best, but they do fall under the category of “everyone at least claims to have read them.”
Digital Humanities Spotlight: 7 Important Digitization Projects includes Mapping the Republic of Letters, London Lives, Charles Darwin’s Library, the Salem Witch Trials Documentary Archive and Transcription Project, The Newton Project, and Quijote Interactivo. This is an interesting collection of some of the more polished sites that are also publicly accessible.
DHQ: _Digital Humanities Quarterly_.
DSH: _Digital Scholarship in the Humanities_ — the journal formerly known as LLC, _Literary and Linguistic Computing_.
Patrick Juola and Stephen Ramsay announced the publication of their new book, Six Septembers, through Zea Books, the University of Nebraska-Lincoln’s digital imprint. More than ten years in development, the book provides a broad conceptual introduction to the fundamentals of the mathematics that digital humanists are likely to encounter and supports a high-level understanding of a variety of key mathematical ideas. The book is freely available under a Creative Commons CC-BY license and can be downloaded from here.
Kudos to James O’Sullivan for a title so great I want to steal it: Cultural Mechanics is his podcast focusing on a really diverse range of digital humanities and digital arts topics. (Right now I would say it’s more digital arts in nature, but that may not be his overall focus.) Here it is on SoundCloud.
One of the things that interests me is all the ways that “statistical analysis” can be defined, even within the confines of a relatively nascent domain like text analytics. Being nascent, of course, means that many things are not yet defined, and, as a domain, text analytics is emerging at the intersection of a number of fields. The differences in assumptions about which dimensions of statistics, let alone of mathematics, were applicable were quite striking at this year’s Culture Analytics program at UCLA’s Institute for Pure and Applied Mathematics.
Below is a recent request posted on _The Humanist_ that I am capturing here as another entry in this area:
> The work will involve investigating the temporal relationships between spoken and gesture events, so experience with methods for conducting statistical analysis (correlation, t-test, ANOVA, hypothesis testing) is expected.

In addition, the preferred workflow is as follows:

> Ideally, the work will be done in Python (ideally using pandas), but if people prefer using R, I’d be happy to hear from them.
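The posting’s shopping list maps onto only a few lines of Python. The sketch below is purely illustrative (the onset times are invented) and computes a Pearson correlation between paired speech and gesture onsets plus the paired t statistic for the speech–gesture lag; a library routine such as scipy.stats.ttest_rel would give the same statistic along with a p-value.

```python
import math
import pandas as pd

# Invented onset times (seconds) for paired speech and gesture events.
events = pd.DataFrame({
    "speech_onset":  [0.52, 1.10, 1.87, 2.40, 3.05, 3.66],
    "gesture_onset": [0.45, 1.02, 1.80, 2.31, 2.98, 3.60],
})

# Pearson correlation between the two onset series.
r = events["speech_onset"].corr(events["gesture_onset"])

# Paired t statistic on the speech-gesture lag, computed by hand:
# mean difference divided by the standard error of the differences.
lags = events["speech_onset"] - events["gesture_onset"]
t_stat = lags.mean() / (lags.std(ddof=1) / math.sqrt(len(lags)))

print(f"r = {r:.3f}, t = {t_stat:.2f}")
```

With data like these, where gestures consistently lead speech by a roughly constant lag, the correlation is near 1 and the t statistic is large.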
Frustratingly, I can’t now find the tweet that brought this to my attention, but the Sherman Centre at McMaster University has a nice collection of workflows that look really useful.
If you’re interested in the digital humanities, then you should know that a whole lot of it is available for free, including the stalwart A Companion to the Digital Humanities.
The University of Guelph, in Ontario, Canada, is hosting a collection of workshops May 9-12. A lot happens in those four days:
- Getting Going with Omeka with Lisa Cox, Adam Doan, Melissa McAfee, and Catharine Wilson.
- You’ve Got Data!: Introduction to Data Wrangling for Digital Humanities Projects with Paige Morgan.
- Text Encoding Fundamentals and Their Application with Jason Boyd.
- Minimal Computing for Digital Humanists with Kim Martin and John Fink.
- 3D Modelling for the Digital Humanities and Social Sciences.
- Spatial Humanities: Exploring Opportunities in the Humanities with Jennifer Marvin and Quin Shirk-Luckett.
- Online Collaborative Scholarship: Principles and Practices (A CWRCshop) with Susan Brown, Mihaela Ilovan, and Leslie Allin.
Full details are here.
A recent posting from _The Humanist_ noted the following:
> The MPhil Linguistics at the VU University Amsterdam now offers a two-year specialization in Linguistic Engineering. Linguistic Engineering is a young research field that holds a unique position between linguistics and computer science. The program is offered by the Computational Lexicology and Terminology Lab (CLTL), a leading research group in computational linguistics.
> Graduates with a bachelor’s degree in linguistics, computer science, artificial intelligence, or a comparable programme are encouraged to apply. Programming skills are not required, but candidates do need a clear motivation and a firm linguistic background.
> Take a look at the website of the CLTL for information about the program and the CLTL research group: http://www.cltl.nl/le
> For more information on the MPhil Linguistics, admission and application, visit the VU University at: http://www.vu.nl/en/programmes/international-masters/programmes/l-m/linguistics-research/index.asp
Somewhere some part of me wants to respond “I do not think that means what you think it means” but another part of me recognizes that I am just fascinated by how these things are playing out.
Useful: [FreeCite]: “FreeCite is an open-source application that parses document citations into fielded data. You can use it as a web application or a service. You can also download the source and run FreeCite on your own server. FreeCite is distributed under the MIT license.”
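Fielded output of this kind is straightforward to consume downstream. The sketch below parses one citation record into a dictionary; note that the XML shape here is invented for illustration — the real service documents its own response format.

```python
import xml.etree.ElementTree as ET

# A hypothetical fielded-citation record, of the general kind a citation
# parser returns; this is NOT FreeCite's actual response schema.
response = """<citation valid="true">
  <authors><author>Busa, R.</author></authors>
  <title>The Annals of Humanities Computing: The Index Thomisticus</title>
  <journal>Computers and the Humanities</journal>
  <year>1980</year>
</citation>"""

root = ET.fromstring(response)
fields = {
    "title": root.findtext("title"),
    "journal": root.findtext("journal"),
    "year": root.findtext("year"),
    "authors": [a.text for a in root.iter("author")],
}
print(fields["year"])  # → 1980
```

Once citations are in fielded form like this, they can be deduplicated, counted, or loaded into a reference manager rather than re-keyed by hand.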
In addition to the Map of Metaphor in English, there is also [Visual Correspondence], a project to make available not only the contents of correspondence but also the metadata — who received the letter? who else was a correspondent? — and to make all of this data and metadata available for visualization.
The European Summer University in Digital Humanities takes place across 11 days. The intensive programme consists of workshops, public lectures, regular project presentations, a poster session, and a panel discussion. The *workshop programme* is composed of the following thematic strands:
* XML-TEI encoding, structuring and rendering
* Methods and Tools for the Corpus Annotation of Historical and Contemporary Written Texts
* Comparing Corpora
* Spoken Language and Multimodal Corpora
* Basic Statistics and Visualization with R
* Open Greek and Latin
* Digital Editions and Editorial Theory: Historical Texts and Documents
* Spatial Analysis in the Humanities
* Building Thematic Research Collections with Drupal
* Introduction to Project Management
Each workshop consists of a total of 16 sessions, or 32 week-hours. The number of participants in each workshop is limited to 10. Workshops are structured in such a way that participants can either take the two blocks of one workshop or two blocks from different workshops.
The description of all workshops can be found [here](http://www.culingtec.uni-leipzig.de/ESU_C_T/node/481) in at least two languages. Short bios, also in at least two languages, are available for most [workshop leaders](http://www.culingtec.uni-leipzig.de/ESU_C_T/node/488).
From a recent _Humanist_:
> Date: Sat, 28 Feb 2015 13:52:01 +0000
> Subject: digital humanities at Bath Spa
> Recently I visited a small liberal arts university to the west of London, Bath Spa (http://www.bathspa.ac.uk), and was so impressed with what I found there that I invited my host, Professor Andrew Hugill, to write a brief description of his activities in digital humanities there. He sent the following.
>> Professor Andrew Hugill was appointed in April 2013 with a mission to develop the university’s digital portfolio. Hugill is a transdisciplinary academic, a composer who works in Music and Computer Science, as well as some of the wilder shores of French literature. He has a track record of creating cross-disciplinary entities, having established the Institute Of Creative Technologies at De Montfort University, which generated £7 million in external income under his direction.
>> In 2014, Hugill established the university-wide Centre for Creative Computing (CCC) at Bath Spa University, based at Corsham Court. This has already secured significant funding from NESTA (£125,000 for predictive analysis for museums) and an undisclosed industry partner, which is funding a postdoctoral research fellow. Working with his colleague Professor Hongji Yang, a software engineer, and Dr Jerry Fishenden, a new media researcher and developer, Hugill has established a network of 30 academics from all areas of the university, who are working on a range of projects. These include digital heritage, artistic creation, software development, and some secret projects, all of which aim to increase understanding of creative computing.
>> The CCC has already attracted 10 PhD students (all except one self-funding) and edits the International Journal of Creative Computing (Interscience). It has a programme of seminars and visiting lectures. Recent speakers include Prof Jim Hendler (lead scientist of the semantic web) and Prof Willard McCarty (Professor of Humanities Computing). It also runs a Masters course and in 2015 is launching an undergraduate programme that includes specialist pathways in Animation, Gaming and Software Development alongside a major/minor combination with a range of subjects from all the university Schools.
>> One of the challenges for Bath Spa University is to integrate computing more effectively into its teaching and research. To this end, the School of Humanities and Cultural Industries has been examining Digital Humanities as an area for development. There is a significant opportunity to embed digital humanities thinking and practices across a range of subject areas, from Music, Visual and Performing Arts to Creative Writing, Literature and History. The CCC is committed to trying to achieve a thoroughly developed digital humanities throughout the university.
> Even if only by implication, the term “creative computing” gives to digital humanities a push in the direction of synthesis to complement its long-standing analytic emphasis. This is happening via efforts in simulation, but as the turn to simulation develops, the experience and wisdom of practitioners in the arts will, I’d think, be of great benefit. The discovery of a new teacher is a cause for celebration.
I hate the title of the special issue of _Differences_ on the digital humanities, [“In the Shadows of the Digital Humanities”][d], but I am intrigued by some of the articles, especially the ones by Adeline Koh:
> This essay explores the “social contract” of the digital humanities community. I argue that the social contract of the digital humanities is composed of two rules: 1) the notion of niceness or civility; and 2) the possession of technical knowledge, defined as knowledge of coding or computer programming. These rules are repeatedly raised within the public sphere of the digital humanities and are simultaneously contested and criticized. I claim that these rules and the social contract come from humanities computing, a field commonly described as the digital humanities’ sole predecessor. Humanities computing has historically differentiated itself from media and cultural studies, defining itself as a field that uses computational methods to address humanities research questions rather than exploring the impact of computation on culture and the humanities. I call for a movement that would go beyond this social contract by creating multiple genealogies for the digital humanities; by arguing that current conceptualizations of the digital humanities have not only developed from humanities computing but also include additional fields such as new media studies, postcolonial science and technology studies, and digital research on race, gender, class, and disability and their impact on cultures around the world.
I am curious about her claim that humanities computing is often positioned “as the digital humanities’ sole predecessor.” I was under the impression that digital humanities was an umbrella term invented — and the history here intrigues me — to accommodate both the older humanities computing and the newer digital/media studies. But while I have apparently been flirting with the digital humanities for quite some time, I am a latecomer to its intellectual history.
I’m also interested in *code studies* and in the idea of a *critical technical practice*, which Michael Dieter explores:
> This article reflects theoretically on the conditions of possibility for critical work to be conducted in the context of the digital humanities and aims to provide a broad conceptual vocabulary suitable for supporting and expanding this rapidly changing subdiscipline. It does so by elaborating on the framework of critical technical practice (CTP) first proposed by Philip Agre, suggesting how this notion might be connected productively with philosophical lineages of antipositivist epistemology, but as such traditions are reimagined and retooled for today’s informational contexts. Here, CTP is considered through the work of sociotechnical problematization, especially by the various techniques that differentiate existing infrastructural solutions on the basis of the purported material problems and difficulties they claim to address. The origin of Agre’s notion of CTP is linked back to its inspiration in the specific methodologies and concepts in the work of Michel Foucault. It is also suggested that other important connections to the thought of Henri Bergson, Gaston Bachelard, Georges Canguilhem, and Gilles Deleuze can be made. While presenting a rich set of resources for the consideration of sociotechnical problems, the argument is made that these resources might be productively placed in dialogue with existing digital methods and techniques through a reflection on media aesthetics. The article concludes by illustrating the relevance of this general framework with reference to a number of projects by media practitioners relevant to digital humanities, including the work of Rosa Menkman, YoHa, Julian Oliver, Dmytri Kleiner, and Esther Polak.
My chief problem? I can’t seem to access the current issue of _Differences_ — and if I can, my university’s infrastructure makes it very difficult to figure out how.
In case you haven’t been keeping up, the [Library of Congress hosts a number of blogs][blogs]. While some of them only infrequently publish, the overall amount of material available is really impressive.