
Keiran O’Halloran’s talk at the Digital Humanities Conference addresses an issue that is less technical, yet more intriguing to the human mind. His focus is on electronic supplements on the web (e.g. discussion forums attached to magazines and articles, and the possibility of commenting on web pages) and on how such external arguments can eventually lead to the deconstruction of the author’s main argument.
Although there seems to be a certain common sense behind his research (one could argue that reading a forum in parallel with an original text leads to a more critical reading), O’Halloran uses technical methods to actually prove the existence of valid criticisms and to reinsert them into the original argument.
To start with, he employs a simple keyword analysis against a reference corpus (keywords being words that appear with unusual frequency in a text when compared with the larger norm of the reference corpus). These keywords reveal the basic concepts and standard terms of reference of a topic. His trick, however, lies in extracting keywords from the supplementary discussions and comparing them with the original author’s conception. If certain keywords found in the supplement are missing from the main article, the author is either avoiding something or simply ignorant of certain obvious aspects of the topic under discussion.
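The keyword-comparison step described above can be sketched in a few lines of Python. This is a minimal illustration, not O’Halloran’s actual pipeline: it computes keyness with Dunning’s log-likelihood statistic (the measure commonly used in corpus tools such as Wmatrix), and all function and variable names here are hypothetical.

```python
from collections import Counter
import math

def log_likelihood(a, b, c, d):
    """Dunning's log-likelihood keyness statistic.
    a: word frequency in the target text, b: frequency in the reference corpus,
    c: target text size (tokens), d: reference corpus size (tokens)."""
    e1 = c * (a + b) / (c + d)  # expected frequency in target
    e2 = d * (a + b) / (c + d)  # expected frequency in reference
    term1 = a * math.log(a / e1) if a else 0.0
    term2 = b * math.log(b / e2) if b else 0.0
    return 2 * (term1 + term2)

def keywords(target_tokens, reference_tokens, threshold=6.63):
    """Words unusually frequent in the target relative to the reference.
    The default threshold (6.63) corresponds to p < 0.01 for one degree of freedom."""
    tf, rf = Counter(target_tokens), Counter(reference_tokens)
    c, d = sum(tf.values()), sum(rf.values())
    result = {}
    for word, a in tf.items():
        b = rf.get(word, 0)
        ll = log_likelihood(a, b, c, d)
        if ll >= threshold and a / c > b / d:  # keep only overused words
            result[word] = ll
    return result

# The comparison O'Halloran makes: which keywords of the forum supplement
# are absent from the original article?
def missing_from_article(article_tokens, supplement_tokens, reference_tokens):
    article_kw = set(keywords(article_tokens, reference_tokens))
    supplement_kw = set(keywords(supplement_tokens, reference_tokens))
    return supplement_kw - article_kw
```

Words returned by `missing_from_article` are the candidate “absences” in the author’s argument: concepts the discussion community treats as salient but the article never names.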
To complement the keyword analysis, O’Halloran adapts the theory of Jacques Derrida into a more pragmatic application. Derrida considers a supplement to be both an outside object (something missing from an original composition) and an inside element (once absorbed into the original object, it comes to fill in certain deficiencies). Applied to electronic supplements, this approach implies bringing a new argument into the original story, since “there is a deficiency of normal conceptual usage in how an argument treats the topic”.
Finally, he puts these techniques together in an attempt to rebuild the original argument and complete it with the supplementary elements. If adding new elements to address eventual “deficiencies” makes the argument non-cohesive, one can easily argue that there was a certain instability in the starting argument, and thus a very high probability of deconstructing it.
His presentation becomes even more interesting when he gives a practical example of a newspaper article, employs a Wmatrix analysis to pull out the keywords of the forum discussion, and tries, in vain, to rebuild the initial argument on the basis of the new elements brought to light. This is where we see these techniques in action and better understand the importance of being able to regard an article with a more objective and complete perception.
In conclusion, we could ask ourselves why such research may be useful. A first argument would be faster familiarization with certain topics and the ability to follow an argument without being arbitrary, since, as we have seen, certain keywords and concepts are (perhaps deliberately) left out by the author.
Of course, from the perspective of the digital humanities, we could be cynical and consider that such methods constrain us to “follow the trends of the forums”, or even disregard arguments that are original and bring new keywords and concepts to the way everyone sees, or publicly discusses, a certain topic. I believe it is all about being well informed, making legitimate interventions in such discussions, and being able to accept generic trends in certain (popular?) beliefs.
In the end, if we have no valid counter-facts with which to deconstruct an argument, we might as well attack it from such a supplementary point of view, don’t you think?