Saturday, October 1, 2016

In Defense of Memory


I suppose it is the case that we derive some pleasure from imagining ourselves to be part of a beleaguered but noble minority. This may explain why a techno-enthusiast finds it necessary to attack dystopian science fiction on the grounds that it is making us all fear technology, even as I find that very notion ludicrous.

Likewise, Salma Noreen closes her discussion of the internet's effect on memory with the following counsel: "Rather than worrying about what we have lost, perhaps we need to focus on what we have gained." I find that a curious note on which to close because I tend to think that we are not sufficiently concerned about what we have lost or what we may be losing as we steam full speed ahead into our technological futures. But perhaps I also am not immune to the consolations of belonging to an imagined beleaguered community of my own.

So which is it? Are we a society of techno-skeptics with brave, intrepid techno-enthusiasts on the fringes stiffening our resolve to embrace the happy technological future that can be ours for the taking? Or are we a society of techno-enthusiasts for whom the warnings of the few techno-skeptics are nothing more than a distant echo from an ever-receding past?

I suspect that the latter is closer to the truth, but you can tell me how things look from where you're standing.

My main concern is to look more closely at Noreen's discussion of memory, which is a topic of abiding interest to me. "What anthropologists distinguish as ‘cultures,’" Ivan Illich wrote, "the historian of mental spaces might distinguish as different ‘memories.'" And I rather think he was right. Along similar lines, George Steiner lamented, "The catastrophic decline of memorization in our own modern education and adult resources is one of the crucial, though as yet little understood, symptoms of an afterculture.” We'll come back to more of what Steiner had to say a bit further on, but first let's consider Noreen's article.

She mentions two studies as a foil to her eventual conclusion: the first suggesting that "the internet is leading to 'digital amnesia', where individuals are no longer able to retain information as a result of storing information on a digital device," and the second that "relying on digital devices to remember information is impairing our own memory systems."

"But," Noreen counsels her readers, "before we mourn this apparent loss of memory, more recent studies suggest that we may be adapting." And in what, exactly, does this adaptation consist? Noreen summarizes it this way: "Technology has changed the way we organise information so that we only remember details which are no longer available, and prioritise the location of information over the content itself."

This conclusion seems to me banal, which is not to say that it is incorrect. It amounts to saying that we will not remember what we do not believe we need to remember and that, when we have outsourced our memory, we will take some care to learn how we might access it in the future.

Of course, when the Google myth dominates a society, will we believe that there is anything at all that we ought to commit to memory? The Google myth, in this case, is the belief that every conceivable bit of knowledge we could ever possibly desire is just a Google search away.

The sort of analysis Noreen offers, which is not uncommon, is based on an assumption we should examine more closely and also leaves a critical consideration unaddressed.

The assumption is that there are no distinctions within the category of memory. All memories are assumed to be discrete facts of the sort one would need to know in order to do well on Jeopardy. But this assumption ignores the diversity of what we call memories and the diversity of functions to which memory is put. Here is how I framed the matter some years back:

All of this leads me to ask, What assumptions are at play that make it immediately plausible for so many to believe that we can move from internalized memory to externalized memory without remainder? It would seem, at least, that the ground was prepared by an earlier reduction of knowledge to information or data. Only when we view knowledge as the mere aggregation of discrete bits of data can we then believe that it makes little difference whether that data is stored in the mind or in a database.

We seem to be approaching knowledge as if life were a game of Jeopardy, which is played well merely by being able to access trivial knowledge at random. What is lost is the associational dimension of knowledge, which constructs meaning and understanding by relating one thing to another and not merely by aggregating data. This form of knowledge, which we might call metaphorical or analogical, allows us to experience life with the ability to "understand in light of," to perceive through a rich store of knowledge and experience that lets us see and make connections that richly texture and layer our experience of reality.

But this understanding of memory seems largely absent from the sorts of studies that are frequently cited in discussions of offloaded or outsourced memory.

As for the unaddressed critical consideration: if we grant that we must all outsource or externalize some of our memory, and that it may even be advantageous to do so, how do we make qualitative judgments about which memory we can outsource and which memory we should, on principle, internalize (if we even allow for the latter possibility)?

Here we might take a cue from the religious practices of Jews, Christians, and Muslims, who have long made the memorization of Scripture a central component of their respective forms of piety. Here's a bit more from Steiner commenting on what can be known about early modern literacy:

Scriptural and, in a wider sense, religious literacy ran strong, particularly in Protestant lands. The Authorized Version and Luther’s Bible carried in their wake a rich tradition of symbolic, allusive, and syntactic awareness. Absorbed in childhood, the Book of Common Prayer, the Lutheran hymnal and psalmody cannot but have marked a broad compass of mental life with their exact, stylized articulateness and music of thought. Habits of communication and schooling, moreover, sprang directly from the concentration of memory. So much was learned and known by heart — a term beautifully apposite to the organic, inward presentness of meaning and spoken being within the individual spirit.

Learned by heart: a beautifully apt phrase, indeed. Interestingly, this is an aspect of religious practice that, while remaining relatively consistent across the transition from oral to literate society, appears to be succumbing to the pressures of the Google myth, at least among Protestants. If I have an app that lets me instantly access any passage of my sacred text, in any of a hundred different translations, why would I bother to memorize any of it?

The answer, of course, best and perhaps only learned by personal experience, is that there is a qualitative difference between the "organic, inward presentness of meaning" that Steiner describes and merely knowing that I know how to find a text if I were so inclined. The same is true for how we might come to know a poem. But the Google myth, and the studies that examine it, seem to know nothing of that qualitative difference, or, at least, they choose to bracket it.

I should note in passing that much of what I have recently written about attention is also relevant here. Distraction is the natural state of someone who has no goal that might otherwise command or direct their attention. Likewise, forgetfulness is the natural state of someone who has no compelling reason to commit something to memory. At the heart of both states may be the liberated individual will yielded by modernity. Distraction and forgetfulness seem both to stem from a refusal to acknowledge an order of knowing that is outside of and independent of the solitary self. To discipline our attention and to learn something by heart is, in no small measure, to submit the self to something beyond its own whims and prerogatives.

So, then, we might say that one of the enduring consequences of new forms of externalized memory is not only that they alter the quantity of what is committed to memory but also that they reconfigure the meaning and value we assign both to the work of remembering and to what is remembered. In this way we begin to see why Illich believed that changing memories amounted to changing cultures. This is also why we should consider that Plato's Socrates was on to something more than his critics give him credit for when he criticized writing for how it would affect memory, which was for Plato much more than merely the ability to recall discrete bits of data.

This last point brings me, finally, to an excellent discussion of these matters by John Danaher. Danaher is always clear and meticulous in his writing and I commend his blog, Philosophical Disquisitions, to you. In this post, he explores the externalization of memory via a discussion of a helpful distinction offered by David Krakauer of the Santa Fe Institute. Here is Danaher's summary of the distinction between two different types of cognitive artifacts, or artifacts we think with:

Complementary Cognitive Artifacts: These are artifacts that complement human intelligence in such a way that their use amplifies and improves our ability to perform cognitive tasks and once the user has mastered the physical artifact they can use a virtual/mental equivalent to perform the same cognitive task at a similar level of skill, e.g. an abacus.

Competitive Cognitive Artifacts: These are artifacts that amplify and improve our abilities to perform cognitive tasks when we have use of the artifact but when we take away the artifact we are no better (and possibly worse) at performing the cognitive task than we were before.

Danaher critically interacts with Krakauer's distinction, but finds it useful. It is useful because, like Albert Borgmann's work, it offers us concepts and categories by which we might begin to evaluate the sorts of trade-offs we must make when deciding which technologies we will use and how.

Also of interest is Danaher's discussion of cognitive ecology. Invoking earlier work by Donald Norman, Danaher explains that "competitive cognitive artifacts don’t just replace or undermine one cognitive task. They change the cognitive ecology, i.e. the social and physical environment in which we must perform cognitive tasks." His critical consideration of the concept of cognitive ecology brings him around to the wonderful work Evan Selinger has been doing on the problem of technological outsourcing, work that I've cited here on more than a few occasions. I commend to you Danaher's post for both its content and its method. It will be more useful to you than the vast majority of commentary you might otherwise encounter on this subject.

I'll leave you with the following observation by the filmmaker Luis Buñuel: "Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing." Let us take some care and give some thought, then, to how our tools shape our remembering.

