Taking Care of Digital Dementia

From this difference in minds a question has arisen: whether those who are going to deliver a speech should learn it by heart word for word, or whether it be sufficient to master merely the substance and order of particulars. This is a point on which certainly no general decision can be given. For my own part, if my memory is sufficiently strong and time is not wanting, I should wish not a single syllable to escape me, else it would be to no purpose to write. Such exactness we should acquire in childhood, and the memory should be brought to such a condition by exercise that we may never learn to excuse its failures. To be prompted, therefore, and to refer to one’s writing is pernicious, as it grants indulgence to carelessness.

– Quintilian, Institutio Oratoria, Book 11, Chapter 2

And that was the nature of my game, because I’d spent most of my life as a blind receptacle to be filled with other people’s knowledge and then drained, spouting synthetic languages I’d never understand. A very technical boy, sure.

– William Gibson, “Johnny Mnemonic”

What Is Digital Dementia?

There is no conclusive empirical evidence that the Internet and other media technologies undermine cognitive skills such as memory and attention. And yet, arguments based on anecdotal evidence and technological determinism continue to persuade readers, many of whom, like best-selling author Nicholas Carr, have a vague sense that they are not thinking the way they used to think, and that technology is to blame. From the surveys of the mysterious market research group Embrain to Carr’s Pulitzer Prize-nominated book The Shallows to Bernard Stiegler’s anxieties about tertiary memory in the Technics and Time trilogy, there is a growing body of literature about digital media and cognitive disorders. One of the most recent manifestations of what might be called speculative cognitive science is the coining of the term “digital dementia.” Like other types of dementia, a diagnosis of this new disease is based primarily on anecdotal evidence in the form of self-reporting. While for some, this questionable cognitive condition may be cause for skepticism and dismissal, I would argue that, at the very least, this new disease should be embraced as a thought experiment, one that requires a practice of care and attention in the face of a situation that might best be approached speculatively.

To better understand Quintilian’s reproach of those orators who “refer to [their] writing,” it is useful to trace his prejudice back to Plato. As several scholars of Quintilian have pointed out, he was well-versed in the works of Plato, and he was, as Richard Marback suggests in Plato’s Dream of Sophistry, aware of Socrates’s critique of “the extreme abuses of oratory.” [1] Although Quintilian focused primarily on Plato’s sophistry-busting Gorgias, his comments about writing and memory are more closely aligned with Phaedrus. The Egyptian myth of Theuth and Thamus recounted in Phaedrus is well-known–and perhaps well-worn–by media scholars, but it is not always recalled with accuracy. In The Shallows, Nicholas Carr suggests that Socrates’s tale, in which Thamus rejects Theuth’s gift of writing, is evidence that Socrates “shares Thamus’s view.” [2] As Carr suggests, Socrates

argues that a dependence on the technology of the alphabet will alter a person’s mind, and not for the better. By substituting outer symbols for inner memories, writing threatens to make us shallower thinkers, he says, preventing us from achieving the intellectual depth that leads to wisdom and true happiness. [3]

One might turn to Jacques Derrida’s reading of Phaedrus at this point and suggest that Carr leaves no room here for irony in Plato’s text, nor does he address the technological intricacies that distinguish dialectics from sophistics. According to Derrida,

What Plato is attacking in sophistics, therefore, is not simply recourse to memory but, within such recourse, the substitution of the mnemonic device for live memory, of the prosthesis for the organ; the perversion that consists of replacing a limb by a thing, here, substituting the passive, mechanical “by-heart” for the active reanimation of knowledge, for its reproduction into the present. [4]

What Plato is critiquing through the mouthpiece of Socrates is not writing itself, but, like Quintilian, sophistic practices that draw on writing technologies to mechanize philosophical discourse, thus producing speakers who “will appear to be omniscient and will generally know nothing.” [5] As Mary Carruthers puts it, “Plato is specifically responding to the use of textbooks with a ‘cookbook’ approach as a substitute for live teaching.” [6] I will return to this very important point later on, paying special attention to prostheses and organs. For the moment, it is important to note that: a) Thamus bases his judgment on nothing more than anecdotal evidence; and b) what Thamus fears, ultimately, is that writing will result in a form of dementia–a “forgetfulness in the learners’ souls.” [7]

More than two millennia after the death of Plato, a 2007 study conducted at Trinity College Dublin found that 25% of survey participants under the age of 30 couldn’t remember their own home phone number without consulting their handheld device. Only 40% of those under 30 could remember family birthdays, compared to 87% of participants over the age of 50. [8] The primary researcher, Ian Robertson, suggests that the results are due to “Technology-Induced Memory Atrophy.” [9] Similarly, ongoing research by South Korean psychiatrist Yoon Se-chang suggests that “as people are more dependent on digital devices for searching information than memorizing, the brain function for searching improves whereas an ability to remember decreases.” [10] The result of this dependency, suggests Yoon, is a form of “digital dementia” that manifests itself in decreased memory performance. His observations were further corroborated by the South Korean marketing research group Embrain, which conducted a study in 2007 of the relationship between forgetfulness and the use of digital devices. In a survey of 2,030 salaried men, Embrain noted that 63% reported suffering from forgetfulness, and, of these, 20.4% blamed the digital devices that relieve them from the need to memorize information. [11] An article in the Korea Times sums up the problem in appropriately anecdotal terms:

Unlike before, people these days are not required to make much effort to remember things as they are just a button away from all the necessary information which is stored in cell phones, PDAs or navigators. All they have to do is just search through them. Easy access to the Internet also weakens memory capacity. Whenever people ask others about something, you will very likely hear: “Check the Internet.” [12]

The image conjured up by these studies is that of a gadget-dependent race of humans who “will appear to be omniscient and will generally know nothing”–that is, they will know nothing without access to their digital prostheses.

Besides the fact that this evidence of “digital dementia” seems to be rooted in anecdotal observation (primarily self-reporting), another problem with the concept is that it appropriates the term “dementia,” and simplifies it in ways that would be condemned by the scientific community. As Ian Robertson confides in a personal interview, “I don’t think it’s a term a scientist should use.” [13] First of all, it is essential to note that there are various types of dementia with identifiable symptoms and causes, including vascular dementia, Creutzfeldt-Jakob disease, Alzheimer’s disease, and dementia with Lewy bodies, among others. Dementia, as any neuroscientist would suggest, is not exclusively about memory loss. The word itself simply denotes impaired mental function, and it points to a broad range of symptoms that impact not only memory but also communication and language, attention, reasoning and judgment, and visual perception, among other cognitive functions. For this reason, the term digital dementia, which emerged in response to perceived memory loss, is much too broad.

More recent research out of South Korea seems to address this problem by suggesting that digital dementia impacts not only memory, but also attention and emotional development. Dr. Byun Gi-Won of the Balance Brain Center in Seoul suggests that digital dementia is “characterized by memory deficits, attention disorders and emotional flattening among young people who spend too much time using a gaming device, web searching, texting, and multimedia on smartphones.” [14] A concept of digital dementia that accommodates memory, attention, and emotion would be much more promising as a research model. Unfortunately, Byun’s theory is difficult to digest since it is based on a concept of brain lateralization (right brain versus left brain) that grew out of Roger Sperry’s split-brain research, recognized with a Nobel Prize in 1981, and has since been debunked numerous times by neuroscientists, including Manfred Spitzer, author of The Mind Within the Net: Models of Learning, Thinking, and Acting (2000). While Spitzer’s book title is a reference to neural nets rather than the Internet, his more recent monograph, Digitale Demenz (2012), may give some weight to this seemingly questionable concept.

Digitale Demenz provides an unforgiving and unforgetting condemnation of digital media. Drawing on his own research and on the findings of dozens of published neuroscience experiments, Spitzer references fMRI scans, neural network modeling, and statistics to assemble a sobering report on the cognitive effects of exposure to digital media. Like Nicholas Carr and Maryanne Wolf, Spitzer relies heavily on the concept of brain plasticity. His primary argument, despite being misrepresented in a number of online book reviews, is not that digital media are a universal cause of dementia, but that small children who use digital media are more likely to experience early-onset dementia than those who interact with more complex tactile objects. In Spitzer’s words, “digital environments deprive you from the experience you need in early age to build up your entire brain.” [15] The results of his research do little to inspire confidence in the cognitive future of the so-called digital natives. “Dementia is not a diagnosis,” suggests Spitzer,

it’s a description of the process of cognitive decline–dē- down, mēns mind–so it’s downward with your mind. And with any descent, one thing is clear: the higher you start, the longer it takes to get down. I mean, it’s as simple as that. If you start at a sand dune, your way to the ocean is short. If you start at the top of Mount Everest, you can go down for a long time and you’ll still be high up. And it’s exactly the same thing with mental decline, we know that. [16]

The problem, for Spitzer, is not that we are all liquidating our cognitive capital into Google, but that we are giving small children digital devices that are limiting their cognitive development, and this could leave them, as adults, on the sand dune rather than on Mount Everest.

Unlike the diagnoses of digital dementia cited above, Spitzer’s work is rooted in sound neuroscience, and his conception of dementia is certainly more well-rounded than those based primarily on memory atrophy. Where Spitzer’s work may fall short, however, is in his suggestion that the changes wrought in the brain by exposure to digital media are permanent. The title of a review of Digitale Demenz sums it up as follows: “Does The Internet Make You Dumb? Top German Neuroscientist Says Yes–And Forever.” [17] Spitzer seems to associate digital dementia with permanent physiological changes, such as those wrought by Alzheimer’s disease. But if the brain is as plastic as he and other neuroscientists have led us to believe, then why not imagine digital dementia as a sort of reversible dementia, such as that caused by depression, alcohol and drug abuse, and nutritional deficiencies? [18] Spitzer himself compares Internet usage to alcohol consumption, likening technological competency training to “alcohol competency training in kindergarten by giving a little schnapps every day.” [19]

But perhaps we are missing the point by focusing on the reversibility of digital media’s cognitive effects. Who is to say that we will want to reverse these effects at all? Like the human brain, digital media are also plastic in nature. As media and brains co-evolve, the effects of digital media observed by Spitzer today may not be consistent with effects observed in a decade or more. What’s more, who is to say that a digitally demented brain is impaired at all, rather than digitally enhanced? This would be the transhumanist argument, as I will discuss in greater detail below. As science, then, the concept of digital dementia may not hold water, and it obfuscates and possibly belittles forms of dementia that have been scientifically validated. What’s more, by labeling cognitive effects wrought by technology as a disability, techno-naysayers engage in a questionable form of ableist rhetoric while discounting the possibility that new media might actually enhance attention, memory, and affect, especially as our brains continue adapting to new media technologies. Still, in spite of the inflammatory rhetoric and questionable science surrounding digital dementia, it continues to be a compelling concept. At the risk of adding speculation to anecdote, I would suggest that our willingness to believe in such a disease may be provoked by a desire to find relief from the digital devices to which we are constantly tethered. Especially for those who have not been co-evolving with digital media since their birth, a diagnosis of digital dementia would be a great excuse to take a vacation from the tyranny of e-mail, texting, Facebook, and so on. The existence of such a disease also provides the baby boomer generation, which is facing the very real prospect of age-related dementia, with evidence that the digital natives who follow them are possibly less intelligent than their elders. Digital dementia, then, will be viewed by many as a pseudoscience fuelled by curmudgeonly jealousy and fear. And yet, Spitzer’s arguments remain especially compelling.

Theories about technological impacts on cognition are indeed difficult to prove, especially since the central concept that fuels the work of digital naysayers–brain plasticity–might also provide counterarguments, as noted in my comments above. But such speculations may serve as useful thought experiments–not for the sake of propagating pseudoscientific medical terms that seem designed for mass media consumption, but for the sake of remaining mindful and vigilant in the face of our co-evolution with technology. As David Wills suggests in Dorsality, we are helpless when it comes to understanding the impacts of technology on the human race, and for this reason, we should reserve “the right to hold back, not to presume that every technology is an advance.” [20] Recognizing McLuhan’s maxim that “we see the world through a rear-view mirror,” Wills calls for a conceptual account of how technology “defines and redefines the human,” one that keeps in mind that these definitions take place “downstream from the point at which a given technological creation was brought into effect.” [21] Socrates, for example, could not have predicted that literacy would lead to the invention of topical logic, the mastery of linear narrative, and the blossoming of a global university system. Given our helplessness in such a situation, media theorists should turn not only to scientific proof, but also to inventive speculation, which means working less like scientists and more like digital artists or science fiction writers. While digital dementia is suspect as far as science goes, and while it may not be an excuse for a mental vacation or a reason to dismiss the technological savvy of digital natives, it can serve well as an evocative concept to provoke critical thinking about how the human brain co-evolves with technology.

What I propose, then, is not to abandon the concept of digital dementia, but to treat it as intelligent science fiction, teasing out the ways in which this ersatz disease–especially as understood by the South Koreans–can be a learning tool for researchers who are interested in the relationship between technology and cognition. For the sake of drawing a more precise boundary around this thought experiment, I will focus primarily on the types of dementia most commonly associated with memory impairment and Alzheimer’s disease. My goal is not to coin another pseudoscientific term (Digital Alzheimer’s), but to explore the question of memory and technology more carefully by placing it in a very specific context inspired by a scientific understanding of cognitive impairment. The result will not be a scientific taxonomy, but a speculative taxonomy, a generative exercise that may not provide solutions, but will open doors for discussion.

Memory Types: A Fine-Grained Speculation On Digital Dementia

In an article from the Johns Hopkins Memory Bulletin, Peter V. Rabins identifies four different memory systems that are relevant in discussions of Alzheimer’s: episodic memory, semantic memory, procedural memory, and working memory. Rabins’s list is by no means exhaustive; he does not, for example, distinguish between explicit and implicit memory. Yet his simplified outline, which has been replicated numerous times in publications aimed at raising awareness of Alzheimer’s disease, can serve as a useful template for organizing ideas about memory and technology. The first type of memory explored by Rabins is episodic memory.

i. Episodic Memory

The temporal lobe, which contains the hippocampus, and the prefrontal cortex are important to episodic memory, which enables us to learn new information and remember recent events. The hippocampus is one of the first brain structures damaged in Alzheimer’s disease and accounts for one hallmark of early Alzheimer’s: difficulty remembering recent events, without any trouble remembering events from long ago. [22]

Episodic memory, as Rabins suggests, refers to autobiographical events, including those that occurred long ago. A person living with Alzheimer’s may be able to recall childhood events yet be unable to recall an event from the previous day. Digital recording technologies are ideal for storing episodic memory. In fact, social media networks–including Facebook, Twitter, and YouTube–might be collectively described as a massive, ubiquitous apparatus for the storage and recall of episodic memory.

We might even conclude that, in their increasing ubiquity, technologies for episodic recording and transmission have become too comprehensive. As Viktor Mayer-Schönberger suggests in Delete: The Virtue of Forgetting in the Digital Age,

Comprehensive digital memory represents an even more pernicious version of the digital panopticon. As much of what we say and do is stored and accessible through digital memory, our words and deeds may be judged not only by our present peers, but also by all our future ones. [23]

When it comes to episodic memory in the digital age, the problem may not be amnesia at all, but instead a form of hypermnesia, or an excess of episodic memory.

Keeping in mind the slippage that occurs in Mayer-Schönberger’s theory between human memory and computer memory–one must not confuse archiving or storage, for example, with recall or remembering–we should pause here to consider whether the storage of memory in digital databanks can accurately be compared to episodic memory stored in the human brain. An episodic memory is only episodic because it can easily be recalled; we have stored it for safekeeping in our consciousness, and we can call it up at will. In the terminology of neuroscience, episodic memory is explicit. A digital memory, on the other hand, may not be explicit at all. Surveillance images taken in the street, the shopping mall, or the local bank, for example, are more akin to implicit memories. Such surveillance data are a form of unconscious memory to which we do not have easy access. More pertinently, the multitude of digital photos stored on a personal computer may also slip from explicit to implicit memory over time. These images are stored autobiographical memories, but we rarely (if ever) call them up; we may even forget that they exist. As José Van Dijck suggests in Mediated Memories in the Digital Age, thanks to storage space and easy access to recording devices, “recording and collecting” experiences seems to have taken the place of recalling them. [24]

There is nothing new, perhaps, about the mania for information recording that characterizes the digital age. As Ann M. Blair suggests in Too Much to Know, the Renaissance was stricken by a severe case of “infolust,” which manifested itself in the production of numerous reference works, summaries, and compilations. [25] What is different today is the relative ease and cost efficiency of information recording. In Delete, Mayer-Schönberger suggests that with the invention of cheap and accessible digital storage, remembering has become very easy and relatively inexpensive. “Throughout millennia,” Mayer-Schönberger notes, “forgetting has remained just a bit easier and cheaper than remembering.” [26] But, as his argument goes, this has changed in a digital age of cheap recording and storage devices, where remembering seems to be the default. The problem with this argument is that storing something digitally does not necessarily mean we remember it. It might be more accurate to say that storing memories on digital devices is a more sophisticated form of forgetting, one that involves a shift in the boundaries between the explicit and the implicit. That said, the mania for digital recording and collecting may lead to episodic memory loss for several reasons:

1) As Van Dijck suggests, we take chances when we entrust our episodic memory to external storage devices: “Even digital memories can fade–their fate determined by their in silico conception–as the durability of hard drives, compact disks, and memory sticks has yet to be proven.” [27]

2) As already mentioned, the affordability and availability of storage space creates a situation in which virtually all autobiographical events can be stored, thus creating the problem of what Mark Andrejevic has called infoglut. The difficulty of arranging these memories in a manner that is easy to navigate, not to mention the problem of finding time to navigate this infoglut, may lead to the loss of such memories.

3) Finally, as I have argued elsewhere, the mania for recording life events may alter our attunement to experience. The recording of an event has become an integral part of the event itself. The prosthesis has become confused with the organ, and in the process we run the risk of experiencing events not as participants, but as recording devices, digitally capturing moments that we will never revisit. I will elaborate on this concept more directly in the section on Working Memory.

ii. Semantic Memory

Semantic memory governs general knowledge and facts, including the ability to recognize, name, and categorize objects. This system also involves the temporal lobes and, researchers suspect, multiple areas within the cortex. People with Alzheimer’s disease may be unable to name a common object or to list objects in a category, such as farm animals or types of birds. [28]

The drive to categorize information for storage external to human memory is what pushed the print revolution to its apex. As Walter J. Ong notes, the invention of topical logic by Rudolph Agricola and Petrus Ramus moved rhetoric away from internal storage, as advised by Quintilian, and toward a dialectical arrangement of information in an externalized, methodical space. What’s more, the topical method “was to prove itself unexpectedly congenial to printing techniques.” [29] Ramist methodology was designed to bring an end to the random iconographies and associations of the memory palace; it was supposed to ensure that information could be arranged logically, represented consistently, and disseminated reliably from person to person.

As Mary Carruthers suggests, this tidy distinction between topical and iconic mnemotechnics–let alone the distinction between internal and external storage–is somewhat misleading. Verbal and pictorial forms of information are inextricably linked during the processes of storage and recall. [30] Carruthers cites a passage from Hugh of St. Victor, who instructs his students to memorize texts “from the same written source, lest a confusion of images caused by seeing different layouts make it impossible for the brain to impress a single image.” [31] What would Hugh of St. Victor think of texts on the Internet, whose appearance depends entirely on a number of variables, from screen resolution to browser type? More important than the stability of information in its external recorded state, Carruthers suggests, is the stability of internalized categories used for recall, and these structures become increasingly complex as we are exposed to ever more information. “Without a sorting structure,” Carruthers notes, “there is only a useless heap, what is sometimes called silva, a pathless forest of chaotic material.” [32] The very fluidity of the digital landscape leaves it open to endless recategorization, editorializing, and reinterpretation. What we have in digital space is not a consistent dialectical logic as prescribed by Ramus, but a radically dynamic logic of hyperlinking and customization. Any image or word on a screen can be turned into a link, creating infinite possibilities for categorizing and recategorizing topical space. What’s more, anything digital can be cut up, pasted in a new context, and repurposed to suit the needs of individual users. Unlike the printed text, the digital environment is designed for semantic instability and impermanence. The Internet itself, then, suffers from a semantic disorder, or what we might call digital différance. In such a situation, one cannot trust that meanings and categories will remain stable in any way. On a computer screen, the place of information, the categorization of things, takes on new and unpredictable forms.

In addition to the problem of unstable meaning, then, the complex networking of digital discourse raises issues of categorization. The branching trees of Ramist dichotomies may seem to be mirrored in the tree structures of databases; however, as Lev Manovich has shown, the logic that rules database trees is not reductive at all, but rather generative. In form, the database appears to be a tool for logical categorization, but in practice, it serves to facilitate the generation of “endless variations of elements.” [33] These endless variations are a result of the massive storage capacity of databases, which yields novel data sets of increasing complexity. As already noted, recalling this data relies on a commitment to tagging each item in the database in a consistent and predictable manner. Proper tagging, however, is not a skill possessed by every user of digital devices, many of whom don’t even bother to organize files into discrete folders on their personal computers. In addition to the sheer problem of infoglut (as discussed above), we are facing an acute problem of info-disorientation, as users are faced with the increasingly demanding task of establishing meaningful categories for their digital memories.
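To make the stakes of tagging concrete, consider a minimal sketch of the difference between a consistently tagged archive and an untagged heap. The file names and tags below are hypothetical, invented purely for illustration; the point is simply that recall by category presupposes the prior labour of categorization.

```python
# A minimal sketch of why consistent tagging matters for recall.
# File names and tags are invented for illustration only.

from collections import defaultdict

# An untagged "heap": files with opaque, camera-generated names.
heap = ["IMG_4312.jpg", "IMG_4313.jpg", "VID_0021.mp4", "scan_007.png"]

# A tagged archive: each item is indexed under one or more categories.
tag_index = defaultdict(list)

def add_item(filename, tags):
    """Store an item under every tag assigned to it."""
    for tag in tags:
        tag_index[tag].append(filename)

add_item("IMG_4312.jpg", ["birthday", "family", "2013"])
add_item("VID_0021.mp4", ["birthday", "2013"])

# Recall by category is trivial in the tagged archive...
print(tag_index["birthday"])                  # ['IMG_4312.jpg', 'VID_0021.mp4']

# ...while the untagged heap offers nothing to search but opaque names.
print([f for f in heap if "birthday" in f])   # []
```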

Van Dijck, for example, relates the story of her friend who, in the process of digitizing his family’s collection of photos and videotapes, encountered “issues of time and order: time to enjoy and relive personal recorded cultural and personal moments . . . and order to allow the retrieval of specific moments, as the danger of getting lost in his multimedial repository was growing by the day.” [34] Similarly, in an essay written for tech magazine Fast Company, Clive Thompson describes the memory woes experienced by Gordon Bell, who achieved fame as a Microsoft researcher by lifelogging using the MyLifeBits platform. Having interviewed Bell, Thompson catches a glimpse of what might be called semantic digital dementia related to problems of categorization and recall:

Bell often finds himself lost in the forest. He hunts for an email but can’t lay his hands on it. He gropes for a document, but it eludes him. While eating lunch in San Francisco, he tells me about a Paul Krugman column he liked, so I ask him to show it to me. But it’s like pulling teeth: A MyLifeBits search for “Paul Krugman” produces scores of columns, and Bell can’t quite filter out the right one. When I ask him to locate a phone call from one of his colleagues, he hits a bug: He can locate the name of the file, but when he clicks on it the data are AWOL. “Where the hell is this friggin’ phone call?” he mutters to himself, pecking at the keyboard. “I either get nothing or I get too much!” [35]

Vacillation between amnesia and hypermnesia may well be the most apt description of a lifelogger’s cognitive activity.
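Bell’s complaint (“I either get nothing or I get too much!”) can be restated as a problem of precision and recall. The toy sketch below, which uses invented lifelog entries rather than anything from MyLifeBits, shows how a naive keyword search over a large, loosely categorized archive tends to swing between the two extremes.

```python
# A toy illustration of the lifelogger's search dilemma: broad queries
# return scores of hits, narrow ones return nothing. Entries are invented.

lifelog = [
    {"type": "article", "text": "Paul Krugman column on trade"},
    {"type": "article", "text": "Paul Krugman column on inflation"},
    {"type": "email",   "text": "Re: that Paul Krugman piece you mentioned"},
] * 20  # simulate years of near-duplicate captures

def search(query):
    """Naive substring search over the whole archive."""
    return [item for item in lifelog if query.lower() in item["text"].lower()]

print(len(search("Paul Krugman")))                    # 60 hits: too much
print(len(search("Krugman column from that lunch")))  # 0 hits: nothing
```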

In addition to the problems of information disorientation and infoglut, speed itself also leads to pernicious semantic consequences. For example, the velocity of digital discourse, combined with the handheld nature of our prosthetic devices, lends itself to a semantic dementia in which the grammatological building blocks of the written word are radically altered. The language of texting, which is based on a logic of instantaneous communication and on the muscle memory of thumbs, obliterates vowels and transforms single letters and numbers into homophonic devices, as seen in expressions such as “g2g,” “brb,” and “gr8.” These novel semantic shortcuts, fostered in the cradle of handheld texting, have begun to make their way into other forms of discourse, including the term papers of students and regular conversations, in which the expressions “YOLO” and “LOL” have become (admittedly annoying) mainstays.

Working against this semantic creativity, but working still in the name of speed, is the mechanization of language by means of word processing software, which threatens our ability to internalize the categorical rules of language, such as verb conjugation, spelling, and punctuation. As David Wills puts it,

Just as we are rapidly forgetting penmanship, we can easily imagine future generations that will not need to learn how to spell, form plurals, conjugate verbs, obey the sequence of tenses, and so on. And of course, a literacy that relies on digital technologies is trained by icons and a mechanized language production may well not want to or know how to read a Joyce, a Proust, a Beowulf, and so on. [36]

Of course, this sort of speculation can only be tested once we are in a position to slow down, look into the rear-view mirror, and assess what has happened to our semantic intelligence at the hands of digital media. What will we see in that mirror? Will it be an army of artificially intelligent drones chanting spell-corrected mantras and engaging in the hyperspeed production of what Kenneth Goldsmith has called uncreative writing? Will it be a generation of babbling digerati confusedly chirping their narrow folksonomies into increasingly splintered social networks? Or, as Maryanne Wolf asks in Proust and the Squid, “will the demands of our new information technologies–to multitask, and to integrate and prioritize vast amounts of information–help to develop equally if not more valuable new skills that will increase our human intellectual capacities, our quality of life, and our collective wisdom as a species?” [37] The answer to this question can only be found in the rear-view mirror.

iii. Procedural Memory

The cerebellum is one of the structures involved in procedural memory. Procedural memory is what enables people to learn skills that will then become automatic (unconscious), such as typing or skiing. This memory system typically is not damaged in Alzheimer’s disease or is one of the last cognitive domains to deteriorate. [38]

In the hypermnesic milieu of digital culture, procedural memory trumps all other forms of memory. What is crucial in such an environment is not for a person to know the information, but to know how to find the information. And this requires physical procedures, well beyond the internalized mnemotechnics of ancient and medieval orators. To engage in human-computer recall, we must remember how to turn on the device and use a mouse or trackpad to navigate to the desired folder, or how to click to open the web browser and type a query into Google. In a digital milieu, our reliance on procedural memory is what gives us access to vast stores of information; but this reliance can also spur technophobic–possibly even techno-apocalyptic–fears. In the words of Nicholas Carr, “we are evolving from being cultivators of personal knowledge to being hunters and gatherers in the electronic data forest.” [39] The implicit suggestion here is that the procedural search technics used to gather digital data are a form of cognitive devolution from our previous habits of storing information as episodic memory. As I will discuss in the concluding section of this essay, Carr’s suggestion, as well as the work of Maryanne Wolf, is rooted in an essentialist understanding of the human that does not take into account our species’ technical nature. As has been argued by Gilbert Simondon, Bernard Stiegler, and others, we are always already technical beings. The procedural wielding of tools, such as the repeated use of a sharpened rock to break open a coconut, or the shared use of symbols to communicate ideas, is what makes us human. One need not mention the exceptional case of knapped flint. If our digital tools seem exceptional, it is–as I have already argued–because of the dual factors of speed and volume. We have more tools than ever to access a mounting glut of information, and these tools, which are increasingly complex, are being produced at an exponential rate.

As long as we can repeat the procedures for accessing digital memory, our information will be safe. But the mania for technical innovation in all sectors of society, driven forward by a free market, means that we are constantly learning new procedures based on new interfaces and new devices. For those raised on print literacy, procedural access to information–understanding how to open a book, consult an index, fold a page–was always consistent. The intellectual procedures for reading print-based materials have existed in a relatively stable state for several centuries. But for digital natives, the procedures are constantly changing, designed as they are to be learned, unlearned, and relearned, as software and hardware change according to the exponential logic of Moore’s Law. As with digital episodic dementia, digital procedural dementia is rooted not in lack, but in excess. This situation could be intellectually catastrophic for users who cannot or simply will not keep up with the fast pace of digital change, and consequently find themselves lacking the necessary procedures to transfer memories to a new digital device that is not threatened by obsolescence. In short, if users lack the procedural skills required for a necessary device upgrade, they risk losing a large portion of their episodic and working memories. We may need to keep a vast archive of obsolete handheld devices on hand to avoid such catastrophic upgrades.

Another form of digital procedural dementia is not a result of incessant change, but of excessive repetition. This dementia is caused when an individual mechanically enacts digital-native procedures in the absence of any digital device. John Maeda describes this condition anecdotally in his foreword to The Art of Critical Making: “I was in an ink-drawing class where I noticed that whenever I made an error, my hand would reach for command-Z on an invisible keyboard in my mind.” [40] The “CTRL-Z reflex,” as I will call it, is a combined mental and physical undo reflex enacted in response to a regrettable action. CTRL-Z (or command-Z on a Mac) is a keyboard shortcut for undoing a procedure in most software programs. Unfortunately, this shortcut does not work in the non-computer world, but this doesn’t stop the reflex. For example, I might drop a glass from a shelf, and in the split second that it shatters on the floor, my mind is impressed with the thought of “CTRL-Z,” and my left thumb and middle finger enact the rapid keystrokes necessary to undo a computer error. Also in this category are such physiological phenomena as texting thumb (an uncontrollable twitch reflex related to keyboard use on a handheld device) and phantom vibration syndrome (a false impression that a digital device is vibrating against the body even when it is absent). What these procedural glitches ultimately demonstrate is the extent to which humans co-evolve with their environments in complex and unpredictable ways. The CTRL-Z reflex is a palpable reminder of the procedural confusion encountered by beings that have become reliant on a technical complexity of which they are not fully aware.
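Since the CTRL-Z reflex is anchored in a software convention, it is worth recalling how that convention actually works. The following sketch of a generic undo stack is purely illustrative (it is not the implementation of any particular program); it simply shows the procedure that the reflex attempts, and fails, to enact in the physical world.

```python
# A generic undo stack of the kind bound to CTRL-Z in most software.
# This is an illustrative sketch, not any particular program's code.

class UndoStack:
    def __init__(self):
        self.history = []          # undo routines for actions already performed

    def do(self, action, undo_action):
        action()                   # perform the action
        self.history.append(undo_action)

    def ctrl_z(self):
        """Undo the most recent action, if there is one."""
        if self.history:
            self.history.pop()()   # call the stored undo routine
        else:
            print("Nothing to undo.")

shelf = ["glass"]
stack = UndoStack()
stack.do(lambda: shelf.remove("glass"), lambda: shelf.append("glass"))
stack.ctrl_z()
print(shelf)   # ['glass'] -- inside the software, the glass is restored

# Outside the interface there is no stored undo routine: the dropped glass
# stays shattered, but the fingers still reach for the keys.
```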

iv. Working Memory

Working memory involves primarily the prefrontal cortex. This memory system governs attention, concentration, and the short-term retention of needed information, such as a street address or phone number. Problems with working memory can impair a person’s ability to pay attention or to accomplish multi-step tasks. [41]

Of all the forms of memory discussed here, working memory may be the most affected by digital media. As indicated in the studies mentioned at the beginning of this essay, attention and concentration can be seriously impacted by digital information, and this may in turn lead to the loss of short-term memory. In the words of Nicholas Carr,

We ask the Internet to keep interrupting us, in ever more and different ways. We willingly accept the loss of concentration and focus, the division of our attention and the fragmentation of our thoughts, in return for the wealth of compelling or at least diverting information we receive. [42]

The folk wisdom about children with short attention spans may have emerged with the television generation, but it has reached its zenith with the digital generation. A report from the U.S. Centers for Disease Control and Prevention indicates that in 2011-2012, an estimated 6.4 million children aged 4 through 17 had received an Attention Deficit Hyperactivity Disorder (ADHD) diagnosis at some point in their lives. [43] This is a 16 percent increase since 2007 and a 41 percent increase over the past decade. While the rhetoric surrounding ADHD treats the condition like a medical epidemic, some believe that what we are witnessing is not a disease, but a cognitive shift experienced by a generation weaned on the Internet, video games, and handheld devices.

Such is the conclusion of N. Katherine Hayles, for example, who suggests in her article “Hyper and Deep Attention: The Generational Divide in Cognitive Modes” that “there is little doubt that hyper attention is on the rise and that it correlates with an increasing exposure to and desire for stimulation in general and stimulation by media in particular.” [44] Hayles attributes the rise in ADHD primarily to cognitive demands placed on young individuals exposed to digital media. Like Carr and Wolf, she bemoans the potential waning of cognitive modes suitable for contemplation and deep reading, but she does not demonize hyper attention as a form of devolution or disease. In Hayles’s formulation,

deep attention, the cognitive style traditionally associated with the humanities, is characterized by concentrating on a single object for long periods (say, a novel by Dickens), ignoring outside stimuli while so engaged, preferring a single information stream, and having a high tolerance for long focus times. Hyper attention is characterized by switching focus rapidly among different tasks, preferring multiple information streams, seeking a high level of stimulation, and having a low tolerance for boredom. [45]

Rather than decrying her students’ waning ability to engage in deep reading, Hayles suggests that educators should embrace hyper attention in the classroom, and attempt to combine it with deep attention in novel assignments that will stimulate students.

The potential problem with Hayles’s educational formula is that students today are immersed in a mode of hyperactive cognitive training that is habit-forming, and that may have a serious impact on their ability to think critically and focus their attention. In Carr’s words, “as we reach the limits of our working memory, we become unable to distinguish relevant information from irrelevant information, signal from noise.” [46] We may be asking too much of students by requiring them to engage in deep attention–they simply do not have the cultural training for a cognitive mode that fosters critical reflection.

In addition, the constant state of distraction fostered by ubiquitous digital media leaves very little opportunity for working memory to transfer information into long-term memory. As evidenced by the Irish and Korean surveys cited at the beginning of this essay, the cognitive overloading of working memory is already manifesting itself in digital forgetfulness, and various studies (e.g., Mangen, Walgermo, and Bronnick) have shown that retention rates when reading from a screen, where distractions are merely anticipated, are far inferior to rates when reading from a printed page. In a digital culture, we may be trying to process too much too quickly, and the result may be that we are not processing anything at all. This brings us back to Thamus’s indictment that those who rely too heavily on the externalization of memory will “appear to be omniscient and will generally know nothing.” [47]

The ubiquitous presence of digital media in our lives may not always be a distraction, however. In fact, some devices are designed to disappear altogether–or at least they attempt to provide this illusion. A recent device designed for ubiquitous data collection is Autographer, a lifelogging camera meant to be worn passively, capturing images at intervals of 30 seconds for extended periods. The slogan for the device is simply “Click on. Go out.”–suggesting that the device will unobtrusively document your activities. A competing device called Narrative (formerly known as Memoto) takes a more philosophical approach, suggesting that the device will help salvage potentially forgotten memories. As the promotional video states, “At Narrative, we love those small moments, but we hate forgetting them. So we thought, what if we could capture those moments, and create a true photographic memory?” [48] I have already considered the potential cognitive pitfalls of information overload resulting from the excessive collecting of autobiographical images. My own experiments with the Autographer device have led to what might be called a panic scene, and I have struggled to find useful ways to categorize, tag, and display the thousands of images I have taken. That said, children weaned on digital distraction may not face the same anxieties. But this possibility reminds me too much of an Ender’s Game scenario, in which hyperactive children are recruited as space warriors due to their superior cognitive abilities.
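The scale of the resulting archive is easy to underestimate. A back-of-the-envelope sketch, assuming the 30-second capture interval cited above together with a 16-hour waking day and a full year of use (the latter two figures are my own assumptions, not the manufacturer’s), suggests why the thousands of images pile up so quickly.

```python
# Back-of-the-envelope arithmetic for a passive lifelogging camera.
# The 30-second interval follows the figure cited above; the 16-hour
# waking day and the one-year span are assumptions for illustration.

capture_interval_s = 30
waking_hours_per_day = 16

images_per_day = waking_hours_per_day * 3600 // capture_interval_s
images_per_year = images_per_day * 365

print(images_per_day)    # 1920 images every single day
print(images_per_year)   # 700800 images a year, all awaiting tags and categories
```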

The Politics of Hypomnesis

There remains another potential side effect of ubiquitous externalized memory collection. While the Autographer suggests that the constant recording of experience opens one up to new and exciting adventures, it may also allow individuals to rely so heavily on such devices that their live experience of space and time becomes merely a trivial detail. As I argued in the section on Episodic Memory, the mania for recording alters our attunement to experience. Lifeloggers, for example, are “confusing the prosthesis for the organ,” [49] transforming themselves into recording devices so that they can digitally capture information that they may never recall. Ubiquitous recording technologies may alter our entire relationship to information and experience, offering us the chance to tune out from reality and focus on other distractions, knowing that live experiences can be replayed at any time in the future. This makes the event itself nothing more than an opportunity for archivation. Could access to information by means of digital memory alter our attunement to lived experience more generally?

If information is always available to us externally, there is no need to pay attention, set the working memory in gear, and internalize information. Our digital attunement to the world frees the working memory from its previous task of transferring experience from short-term to long-term memory. We could be facing a culture of individuals for whom long-term memory is only accessible as data. Imagine, for example, that we are recording audio and video of every moment of every day. At the end of the day, we could rewind to any point and review something that we had forgotten–a person’s name, an item in a shop, a detail in a conversation–perhaps because we weren’t paying attention. Why pay attention at all, for that matter, if this supplement is always there, doing the work of working memory, transferring experience into digital episodic memories? Being present in the moment, in this case, might always be deferred. What’s more, the deferred memories may never be revisited. Life itself might be experienced as a series of rerun episodes played out after the fact. One question remains, however: What will the brain be doing in the moment, then, if it’s not paying attention and storing memories?

The question may be ludicrous, rooted as it is in a naïve understanding of both cognition and presence; but as noted already, it is a useful thought experiment that helps reveal what is at stake in discussions of the externalization of memory. As Mary Carruthers suggests, the value of memory is culturally determined, and this value has political consequences. For example, in medieval times, a strong memory was associated not only with intelligence but also with moral fortitude. Current speculation about the digital transformation of the brain reveals a politically inspired cognitive hierarchy rooted in an ableist and ageist understanding of what it means to be human. According to Neil Postman, Nicholas Carr, Manfred Spitzer, and other techno-skeptics, new media technologies are eroding our cognitive capacities–not to mention our moral standards–and transforming us into something less than human. Clearly, for these critics–as for Quintilian and Plato’s Thamus–there is a hierarchy of cognitive modes in which long-term memory and deep attention are at the top. The ability to memorize and recite, according to this cognitive model, is the apex of human intellectual achievement, and any technology that threatens this hierarchy is viewed as destructive and immoral. It is instructive to note the correlation between the fatalistic language used to describe digital dementia and the dehumanizing labels often used to describe Alzheimer’s. As discussed by Sam Fazio in “Rethinking Alzheimer’s disease: The impact of words on thoughts and actions,” characteristics such as short attention span and limited short-term memory might be reframed, respectively, as “spontaneity” and “curiosity.” [50] This exercise of reframing, which challenges cultural perceptions of Alzheimer’s disease, might equally be applied to digital dementia, as a way of exploring the possible benefits that digital media may have on cognition and related behaviours.

What the discourse on digital dementia amounts to is essentially a counter-narrative to the rhetorics of human augmentation via technology, such as that found in transhumanist discourse. The “Transhumanist Declaration,” for example, “envision[s] the possibility of broadening human potential by overcoming aging, cognitive shortcomings, involuntary suffering, and our confinement to planet Earth.” [51] Further, the Declaration insists that individuals should have “wide personal choice over how they enable their lives.” [52] The assumption here is that human life is currently less than able, shackled as it is to a rotting piece of flesh. For the transhumanist, then, technological prostheticization will allow humans to overcome their natural impediments. At the other end of the spectrum, for the digital skeptic, technological prostheticization is a form of impairment that tragically alters human nature.

The problem with these viewpoints is that both fail to understand the human as always already technological. Seeing our species in this way helps temper polemical rhetorics of prostheticization, just as it reminds us that ultimately, we can make choices about which technological prostheses to embrace or to discard, including prostheses for memory.

The concept of digital dementia implies, first of all, that our digital tools are inhuman, if not inhumane. Cell phones and laptops, from this point of view, are not a natural or organic part of human memory, but rather, like the writing instruments of Theuth, serve as an artificial, and potentially poisonous, supplement. But the supplement is deadly only if one imagines the human as possessing some sort of innate or essential presence without prosthesis. In the first volume of Technics and Time, Bernard Stiegler, following André Leroi-Gourhan, suggests that “the human invents himself in the technical by inventing the tool–by becoming exteriorized techno-logically.” [53] There is no human, then, without prosthesis. We are always already technical. Developing Stiegler’s thesis, David Wills suggests that this moment of human invention, of prostheticization, of technologization, is also a “matter of archivation: what is created outside the human remains as a matter of record and increasingly becomes the very record or archive, the artificial or exterior memory itself.” [54] What this suggests is that there is no human without the archive, no human without prosthetic memory. Any conception that views reading or writing as somehow unnatural for the human misses the point that perhaps the only thing natural about the human brain is its ability to adapt rapidly to changing technological implements and environments. With this in mind, contrary to the suggestions of Maryanne Wolf and other cognitive scientists, there is nothing more natural for the human brain than to adapt to the technical demands of reading.

The threat of hypomnesis (externalized memory) lies not in its potential destruction of natural human processes, but in the potential for these processes to be governed by others, without our knowledge or permission. In Institutio Oratoria, Quintilian tells the story of Simonides, the lyric poet, who exits a banquet hall moments before it collapses, and later is able to recall the location of everyone beneath the rubble. This capability gave rise to the rhetorical mnemotechnical method of loci, whereby information is stored spatially: Simonides could recall where each guest had been seated in the banquet hall. According to Cicero, the politician Themistocles was not impressed with Simonides’s “memory palace.” He responded as follows: “I would rather a technique of forgetting, for I remember what I would rather not remember and cannot forget what I would rather forget.” [55] Themistocles’s witty rejoinder reminds us of something important: whether or not we rely on external digital devices to store memory, in the act of archivation, we are always leaving something out. Or in the words of Stiegler, “there is no memorization without forgetting, there is no memorization without selection. The important question is to know who is doing the selecting.” [56]

For Stiegler, remembering and forgetting are central to transindividuation, the process by which we individuate ourselves, defining ourselves in relation to others. By allowing our externalized memory tools to play a crucial part in what we are remembering and forgetting, we are giving these tools a crucial role in our process of individuation. Who gives us these tools and who controls access to them? How do these tools help to sort what is remembered and what is forgotten? We know already that Google has a huge stake in our ability to recall information without really knowing it. And YouTube and Facebook serve as repositories for episodic memories. We should not underestimate the power of these technologies over human memory in all its forms. It is for this reason, perhaps, that the question of hypomnesis, in Stiegler’s terms, “is a political question, and the stakes of a combat: a combat for the politics of memory, and more precisely, for the question of sustainable hypomnesic milieux.” [57]

We might want to take care of digital dementia, then, not as a form of cognitive impairment–after all, cognition must be considered within specific historical and cultural contexts, and what’s more, we cannot predict how the brain will adapt to the increasing cognitive demands emerging from new media–but as a potential political impairment, one that occurs when we lose control over the psychic, biological, and technical aspects of specific cognitive tasks. Digital dementia may not be a disease, but the symptoms it identifies may well be the result of a war on cognition that is being waged by some of the most powerful corporations in history. If certain tools for the externalization of memory are indeed pernicious to the human animal, then we might approach the problem with what Bernard Stiegler has called a therapeutics of care. In Taking Care, Stiegler references Plato’s description of writing as a pharmakon (that which can both cure and kill), suggesting that “newly grammatized symbolic media are a network of pharmaka that have become extremely toxic and whose toxicity is systematically exploited by the merchants of the time of brain-time divested of consciousness.” [58] This indictment of new media technologies is not a reason for Luddite withdrawal, however. As Stiegler suggests, that toxic network is “also the only first-aid kit that can possibly confront this care-less-ness, and it is full of remedies.” [59] The concept of digital dementia, brought to life by news media and propagated as an Internet meme, may well be science fiction, but this toxic concept might be used to generate a more careful discussion about the politics of cognition in an age of rampant hypomnesis. At the very least, such a discussion might inspire scholars, for example, to forgo reading from a page at conferences, and to turn instead to active dialogue, or perhaps to experimental performances that adopt digital media as a first-aid kit in confronting the issue of cognition and technology.

Notes

[1] Richard Marback, Plato’s Dream of Sophistry (Columbia, SC: University of South Carolina Press, 1999), 50.

[2] Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (New York: W.W. Norton & Company, 2010), 55.

[3] Ibid.

[4] Jacques Derrida, “Plato’s Pharmacy,” Dissemination, trans. Barbara Johnson (Chicago: University of Chicago Press, 1981), 108.

[5] Plato, Phaedrus, trans. Benjamin Jowett, Project Gutenberg, October 30, 2008, accessed November 5, 2013, http://www.gutenberg.org/files/1636/1636h/1636-h.htm.

[6] Mary Carruthers, The Book of Memory: A Study of Memory in Medieval Culture (New York: Cambridge University Press, 2008), 35.

[7] Plato.

[8] Ian Robertson, “Technology and Cognitive Function,” unpublished report, Trinity College, Dublin, 2007, n.p.

[9] Ibid.

[10] Park Chung-a, “Digital Dementia Troubles Young Generation,” Korea Times, June 8, 2007, http://www.koreatimes.co.kr/www/news/nation/2008/04/117_4432.html (accessed November 8, 2013).

[11] Ibid.

[12] Ibid.

[13] Ian Robertson, personal interview, Trinity College, Dublin, May 27, 2014.

[14] John Thomas Didymus, “South Korean doctors warn smartphones cause ‘digital dementia,'” Digital Journal, June 24, 2013, http://digitaljournal.com/article/353047 (accessed February 3, 2015).

[15] Manfred Spitzer, personal interview, May 12, 2014.

[16] Ibid.

[17] Claudia Ehrenstein, “Does the Internet Make You Dumb? Top German Neuroscientist Says Yes–And Forever,” Worldcrunch, September 12, 2012, http://www.worldcrunch.com/tech-science/does-the-internet-make-you-dumb-top-german-neuroscientist-says-yes-and-forever/digital-dementia-manfred-spitzer-neuropsychiatry/c4s9550/#.VLfZSmTF8mc (accessed January 14, 2015).

[18] Manjari Tripathi and Deepti Vibha, “Reversible Dementias,” Indian Journal of Psychiatry 51 (January 2009), S52-S55.

[19] Spitzer, personal interview.

[20] David Wills, Dorsality: Thinking Back through Technology and Politics (Minneapolis: University of Minnesota Press, 2008), 6.

[21] Ibid, 8.

[22] Peter V. Rabins, “How Memory Is Affected in Alzheimer’s,” Johns Hopkins Health Alert, 2008, http://www.johnshopkinshealthalerts.com/alerts/memory/JohnsHopkinsMemoryHealthAlert_2847-1.html (accessed November 6, 2013).

[23] Viktor Mayer-Schönberger, Delete: The Virtue of Forgetting in the Digital Age (Princeton, N.J.: Princeton University Press, 2009), 109.

[24] José Van Dijck, Mediated Memories in the Digital Age (Palo Alto, CA: Stanford University Press, 2007), 108.

[25] Ann M. Blair, Too Much to Know: Managing Scholarly Information before the Modern Age (New Haven, CT: Yale University Press, 2010), 11.

[26] Mayer-Schönberger, 49.

[27] Van Dijck, 48.

[28] Rabins.

[29] Walter J. Ong, Ramus, Method, and the Decay of Dialogue: From the Art of Discourse to the Art of Reason (Cambridge, MA: Harvard University Press, 1958), 97.

[30] Carruthers, 21.

[31] Ibid, 10.

[32] Ibid, 39.

[33] Lev Manovich, The Language of New Media (Cambridge, MA: MIT Press, 2002), 236.

[34] Van Dijck, 148.

[35] Clive Thompson, “A Head for Detail,” Fast Company, 2006, http://www.fastcompany.com/58044/head-detail (accessed November 6, 2013).

[36] David Wills, “Techneology or the Discourse of Speed,” The Prosthetic Impulse, ed. Marquard Smith and Joanne Morra (Cambridge, MA: MIT Press, 2009), 258.

[37] Maryanne Wolf, Proust and the Squid: The Story and Science of the Reading Brain (New York: HarperCollins, 2008), 214.

[38] Rabins.

[39] Carr, 138.

[40] John Maeda, Foreword, The Art of Critical Making: Rhode Island School of Design on Creative Practice, ed. Rosanne Somerson and Mara L. Hermano (Hoboken, N.J.: Wiley, 2013), 5.

[41] Rabins.

[42] Carr, 134.

[43] Alan Schwarz and Sarah Cohen, “A.D.H.D. Seen in 11% of U.S. Children as Diagnoses Rise,” New York Times, March 31, 2013, http://www.nytimes.com/2013/04/01/health/more-diagnoses-of-hyperactivity-causing-concern.html (accessed November 6, 2013).

[44] N. Katherine Hayles, “Hyper and Deep Attention: The Generational Divide in Cognitive Modes,” Profession (2007), 191.

[45] Hayles, 187.

[46] Carr, 125.

[47] Plato.

[48] “Narrative,” promotional video, http://getnarrative.com (accessed January 14, 2015).

[49] Derrida, 108.

[50] Sam Fazio, “Rethinking Alzheimer’s disease: The impact of words on thoughts and actions,” American Journal of Alzheimer’s Disease (July/August 1996), 41.

[51] Doug Baily, et al, “Transhumanist Declaration,” Humanity+, 1998, http://humanityplus.org/philosophy/transhumanist-declaration/ (accessed November 6, 2013).

[52] Ibid.

[53] Bernard Stiegler, Technics and Time, 1: The Fault of Epimetheus, trans. Richard Beardsworth and George Collins (Palo Alto, CA: Stanford University Press, 1998), 141.

[54] Wills, 10.

[55] Cicero, cited by David A. Campbell, Greek Lyric III (Cambridge, MA: Harvard University Press, 1991), 351.

[56] Stiegler, Technics, 466.

[57] Bernard Stiegler, “Anamnesis and Hypomnesis: Plato as the First Thinker of the Proletarianization,” Ars Industrialis, http://arsindustrialis.org/anamnesis-and-hypomnesis (accessed November 6, 2013).

[58] Bernard Stiegler, Taking Care of Youth and the Generations, trans. Stephen Barker (Palo Alto, CA: Stanford University Press, 2010), 85.

[59] Ibid.

Marcel O’Gorman is Associate Professor of English at the University of Waterloo, where he teaches in the Experimental Digital Media Program. He is also Director of the Critical Media Lab and a practicing digital artist. His latest book, Necromedia, is forthcoming this spring in the Posthumanities Series at University of Minnesota Press.