Ron Brooks pointed me to a great example of the kind of authenticity I described in my first post.
The pithily titled “Enticing words printed on bags of potato chips have a lot to say about social class, Stanford researchers find” argues that “whether you crunch an ordinary chip or the priciest-exotic-root-vegetable chip, consumers of all social classes value the product that they think is most authentic.” The rundown of this study points to differences between claims to authenticity: the packaging of the least expensive chips frequently refers to tradition, nation or region, old family recipes, etc., while pricier chips evoke natural, hand-made, small-batch sea-saltiness.
What I like about this is the assumption that authenticity is manufactured as much as it is evoked. Or, rather, its summoning is also its creation. So, some chips bank on heritage and patriotism while others conjure a pure nature and craftsmanship. Really, both depend on closing the distance via nostalgia for values lost to the past and/or a nature unavailable in the city. And so there would be nothing “authentic” in gathering your family in the kitchen to slice and fry up your own potato chips. That’s work.
Which brings me very late to the fascinating discussion of theory and the Digital Humanities that occurred a few weeks back. While it’s taking a while to catch up and this isn’t exactly my field, it seems to me that a great deal of the discussion focuses on what counts as theory. It isn’t difficult to see the “more hack, less yack” mantra as an argument that hack is more authentic (in this sense, practical and useful and human) than the yack that is theory. Many others point out that practice is always predicated on theory and that theory is a kind of hacking, too. Fair enough. But while there is a lot of discussion about what counts as theory and its value, there is (with a few exceptions) not a lot of attention paid to what counts as hacking. By and large, at least in the discussions and posts I read, it seems that hacking means coding, data management, etc.
This is certainly a broader definition than I offered before, and I wonder if it’s too broad. If any kind of tool production counts as a hack, then is there much difference between coding and theory? I mean, I suppose making a hammer means hacking a tree and a mountain (for wood and metal) in order to discover some new potential in them. So hacking is defined by a goal, and its means are exploitation. (Exploitation can be good; it means spelunking the gap.) Theory – and especially the philosophical kind – does pretty much the same thing to the library, etc., right? I don’t know.
Regardless, in the phrasing “more hack, less yack,” folks responded to the dismissive yack pretty strongly. It might not have been meant so aggressively, though. It’s entirely possible that it was meant to just be fun and colloquial. Yack also rhymes with hack, and so was maybe chosen for that rhyme? If so, hack counts first. There’s a mystique to hacking – movies, books, all that – that can certainly be capitalized on, and that mystique (myth) of the hacker is everywhere these days. Hack is rebellious and young. Yacking is what old people do. And hacking seems more American. Or less European (theory is European). Hack is definitely more punk rock. And nothing matters more to punk than authenticity.
Though I certainly fall more on the hack side of things, I think the yack is nevertheless important, and I engage in it daily with my classes. I say this with one caveat: as you suggest, code is a theoretical construct in itself. Anyone doing work in the digital humanities should be willing to understand and be involved in the coding side of things rather than just focusing on the surface of the tech being examined.
Good point. I should code more. Used to have some mad BASIC skills in the 80s.