Bill LeFurgy

Bill LeFurgy has worked for a number of years as an archivist and a librarian with national institutions. All comments are strictly personal and do not represent the views of his employers, past or present.

May 03, 2014
 
Improvement the order of the age, by Boston Public Library, on Flickr

A strain of techno-pessimism, much of it bordering on hysteria, is rampant in our culture. The Internet is Making us Stupid! Gadgets Ruin Relationships and Corrupt Emotions! Technology Is Taking Over English Departments: The false promise of the digital humanities!

At first glance, this kind of thing seems so very important and present-day, reflecting serious analysis about the impact of new tools on what we value about the past (or should value). Plus, it must also be said, some of these doomsters write compellingly, with sincerity and intelligence.

But there are two issues with such Cassandraic pronouncements, one conceptual and one historical. The conceptual issue boils down to basic human nature, which leaves us feeling a little uneasy about big changes in our lives. There’s a little nagging fear in the back of our heads even in the midst of what is generally thought of as progress, both personally and culturally. We may be swept along with innovation, but the more we see (and the more we age), the more nostalgic we tend to feel about tradition. Given how radically everyday life has changed in the West over the last several decades, heightened anxiety toward change leaves us particularly receptive to contrarian arguments questioning the benefits of technology. All those subjective pronouncements about how technology hurts us and erodes human values may just be the manifestation of our collective little nagging fears goosed by lots of change.

The historical perspective makes it clear that techno-pessimism is very old, most particularly in connection with communication itself. Plato, for example, decried writing because it diminished the power of learning through conversation. “If men learn [writing],” he wrote (!), “it will implant forgetfulness… calling things to remembrance no longer from within themselves, but by means of external marks.”

Dennis Baron, in A Better Pencil, notes how critics railed against the printing press when it came on the scene because inked words on paper would last far less time than handwriting on parchment. Peter the Venerable, according to Baron, contrasted the pen, which carved wisdom into parchment, with the press, which merely brushed marks on top of paper.

The typewriter, a favorite of literary nostalgists, was initially viewed with much fear and loathing. While Samuel Clemens claimed he “was the first person in the world to apply the type-machine to literature” in 1874, he bore no love for the device.

That early machine was full of caprices, full of defects–devilish ones. It had as many immoralities as the machine of today has virtues. After a year or two I found that it was degrading my character, so I thought I would give it to Howells. He was reluctant, for he was suspicious of novelties and unfriendly toward them, and he remains so to this day. But I persuaded him. He had great confidence in me, and I got him to believe things about the machine that I did not believe myself. He took it home to Boston, and my morals began to improve, but his have never recovered.

And so on until the recent past, when David Mamet declared his love for pad and pencil and his abhorrence of computers. “The idea of taking everything and cramming it into this little electronic box designed by some nineteen-year-old in Silicon Valley… I can’t imagine it.”

The bottom line here is that yes, we are a little worried about how quickly things are changing with information technology. But we can take some comfort in knowing that our ancestors had the same fears over the past 2500 years and things seem to have turned out reasonably well.

May 01, 2014
 
By outcast104, on Flickr

There is an endless fascination with how writers work. What time they start, where they do it, how much they drink–all of it is grist for literary gawkers. When I peep into writerly habits, I wonder about computers. How much do writers revise on the screen? What kind of digital record do they keep of the drafting process? How do digital drafts correlate with paper drafts?

While others have delved into these questions–see Saving-Over, Over-Saving, and the Future Mess of Writers’ Digital Archives, for example–the issue as a whole remains ill-defined. Even after 30-plus years of cursor-blinking relentlessness, computers still jockey with paper for the affection of authors. At the highest level of abstraction that seems safe, we can say that most writers generate an interwoven, hybrid mix of paper piles and computer files. Which elements of this mix survive to document the creative process is difficult to say.

A clue can be found in the attitude of writers toward the computer. To this end, the interviews in The Paris Review are helpful. The publication has several long-running interview series, including The Art of Fiction, in which well-known authors share insights into their work. Starting about 1980, interviewers began asking about “word processors,” and a few years later about “computers.” There are wonderful vignettes about how this new technology was welcomed–or not–into the creative process. Not much is said about what records are kept, although one can draw some inferences. It is important to note that these are views frozen in time and may have changed for the individuals involved.

My overall impression from the authors interviewed is that most have an attitude toward computers that ranges from hostility to workmanlike acceptance. At one extreme there is the poet Yves Bonnefoy. “No word processors! I use a little typewriter; at the same time, often on the same page, I write by hand. The old typewriter makes the paper more present, still a part of oneself,” he says. “What appears on the screen of a word processor is a mist; it’s somewhere else.” David Mamet also has no use for computers: “The idea of taking everything and cramming it into this little electronic box designed by some nineteen-year-old in Silicon Valley . . . I can’t imagine it.” Ted Hughes goes even further. When the interviewer asked him if “word processing is a new discipline,” Hughes replied that “It’s a new discipline that these particular children haven’t learned…. I think I recognize among some modern novels the supersonic hand of the word processor uncurbed.”

Some accept the computer but feel it is a devil’s bargain. “It has ruined my life! Really on a daily basis it ruins my life,” said Ann Beattie. Why not quit it? “I can’t. I’m hooked…. The computer can more or less keep up with my thoughts. No electric typewriter, not even an IBM Selectric could possibly keep up with my thoughts. So now I’m keeping up with my thoughts. But they’re the thoughts of a deranged, deeply unhappy person because of working on the computer for twenty years.”

Ann Beattie

Utilitarian acquiescence is a common refrain. Tahar Ben Jelloun said, “I write by hand. The word processor is for articles, because newspapers demand that the text be on a diskette.” Beryl Bainbridge said, “I write directly on the word processor, but I use the machine as a posh typewriter and print out every page the moment it’s finished.” A.S. Byatt has a firmly practical view. “I write anything serious by hand still. This isn’t a trivial question,” she said. “On the other hand I do my journalism on the computer with the word count. I love the word count.”

Some writers do profess appreciation. “In the mideighties I was a grateful convert to computers,” says Ian McEwan. “Word processing is more intimate, more like thinking itself. In retrospect, the typewriter seems a gross mechanical obstruction. I like the provisional nature of unprinted material held in the computer’s memory—like an unspoken thought. I like the way sentences or passages can be endlessly reworked, and the way this faithful machine remembers all your little jottings and messages to yourself. Until, of course, it sulks and crashes.”

When the interviewer told John Hersey, “I understand that you are a great fan of the word processor,” it provoked a long reply.

I was introduced to the idea very early. In 1972… a young electrical engineer, Peter Weiner, was developing a program called the Yale Editor on the big university computer, with the hope that there would be terminals in every Yale office. They were curious to see whether it would work for somebody who was doing more or less imaginative work, and they asked me if I’d be interested in trying it. My habit up to that point had been to write first drafts in longhand and then do a great deal of revision on the typewriter. I had just finished the longhand draft of a novel that was eventually called My Petition for More Space, so I thought, well, I’ll try the revision on this machine. And I found it just wonderfully convenient; a huge and very versatile typewriter was what it amounted to…. So when these badly-named machines—processor! God!—came on the market some years later, I was really eager to find one. I think there’s a great deal of nonsense about computers and writers; the machine corrupts the writer, unless you write with a pencil you haven’t chosen the words, and so on. But it has made revision much more inviting to me, because when I revised before on the typewriter, there was a commitment of labor in typing a page; though I might have an urge to change the page, I was reluctant to retype it. But with this machine, there’s no cost of labor in revision at all, so I’ve found that I’ve spent much more time, much more care, in revision since I started using it.

Perhaps the most elegant appreciation for the computer came from Louis Begley. In response to a question about “when he grabbed a pen” and decided to write his first novel, Begley said:

I didn’t grab a pen. I typed Wartime Lies on my laptop. I remember exactly how it happened. In 1989, I decided to take a four-month sabbatical leave from my law firm, and I started the book on August 1, which was the first day of the sabbatical. I did not announce to my wife, Anka, or even to myself that this was what I was going to do. But, just a week before the sabbatical began, in a torrential downpour, I went to a computer store and bought a laptop. Now why would I have bought a laptop if I wasn’t going to use it? So I must have had the book in my mind.

As I noted earlier, the question of what writers actually save rarely comes up in these interviews. But Jonathan Lethem is an exception.

Interviewer:  You work on a computer. Do early drafts get printed out and archived?

Lethem: No, I never print anything out, only endlessly manipulate the words on the screen, carving fiction in ether. I enjoy keeping the book amorphous and fluid until the last possible moment. There’s no paper trail, I destroy the traces of revision by overwriting the same disk every day when I back up my work. In that sense, it occurs to me now, I’m more like the painter I trained to be—my early sketching is buried beneath the finished layer of oil and varnish.

That idea–digital traces of earlier starts, stops, changes and corrections buried under a finished product–is the ideal metaphor for the contemporary computer-aided writer. But where earlier sketching is an integral part of the painting itself, digital drafts live a physically separate, and likely undervalued, life of precarious materiality.

Apr 20, 2014
 

I just finished D.T. Max’s fine new book, Every Love Story Is a Ghost Story: A Life of David Foster Wallace. The book is elegantly written and does a wonderful job of portraying Wallace, who was a polymath genius, stunning literary talent and deeply sincere humanist.

While I enjoyed learning more about all aspects of the man, I was especially interested in what Max had to say about Wallace’s use of computers. Which is to say: not much. I was a bit disappointed because, later in his career, Wallace did use a computer to draft his work and went on to use email for correspondence.

Wallace left a hybrid collection, the vast majority of which seems to have been on paper. He came of age during the late 1970s and early 1980s, before the widespread adoption of personal computers, and began his literary career using the traditional tools of handwritten and typescript drafts. Wallace also corresponded on paper with friends, editors and a variety of literary figures, including Don DeLillo and Jonathan Franzen. Thankfully, a good deal of this material survives in the David Foster Wallace Papers in the Harry Ransom Center at the University of Texas at Austin.

Yet, according to Max, Wallace did come to use a computer around his mid-20s. It seems he began drafting by hand and then transcribed the work into electronic form, perhaps in a way similar to how he had used a typewriter before. What I found striking is how little discussion there is of how Wallace actually used the machine and what insights might be offered by his digital files. How extensively did he rework drafts on the screen, for example? What is revealed through a forensic analysis of the files and their media? How did the machine influence his work?

Part of the issue is that Wallace was no computer nerd. “Thank God,” Max quotes him as saying in reaction to a new piece of computer equipment, “I wasn’t raised in this era.” Another issue is just what kind of born digital material exists for Wallace.

At the least there are files relating to his posthumous novel, The Pale King. Wallace’s editor at Little, Brown turned a “tower of a manuscript and the handwritten journals and notebooks… and stacks of computer disks whose labels indicated the evolution of the novel’s title” into the finished work. The Ransom Center states that this material is slated to be placed with the rest of his archive. It will be interesting to see what kind of use the disks–along with whatever other born digital materials might survive–are put to.

 

Mar 24, 2014
 

Are you looking forward to The Emails of Thomas Pynchon? Or maybe Jonathan Franzen: Tweets and Chats?

Sorry, but the future holds something different for the literary remains of famous authors.

By Frank Boyd, on Flickr

Email and other forms of digital technology represent a sea change for writers. Works are drafted and rewritten on the screen. Authors have a vastly expanded capability to create and to correspond with editors, friends and others, all of whom may be just a few keystrokes away.

But the degree to which any one writer’s digital trail survives is very much an open question. In 2005, for example, Zadie Smith speculated that her email “will go the way of everything else I write on the computer–oblivion.”

Famous writers have long bequeathed their correspondence, drafts and unpublished works to libraries and archives. It’s an arrangement that benefits everyone: the institution builds prestige; the researcher gets revealing material; the public learns about the literary back story; and the writer (or her estate) gets money. Yet the whole system as we know it is built on paper: letters, journals and hand-annotated drafts.

Personal digital content threatens everything. The biggest problem is the “personal” part: authors, like the rest of us, can be poor stewards of their own digital legacy. They don’t back up their hard drives. Their files are a disorganized mess. Their content is scattered among multiple devices and online platforms. And while writers may know that some of this digital material has enduring value, there is as yet no easy way to even think about preserving it. All of us are still working through what digital means in our lives.

People have a natural emotional connection to works on paper–it’s easy to see, to handle and to store. It’s durable and even resists deliberate efforts to destroy it. Even though Samuel Clemens could, for example, write a letter declaring “shove this in the stove… I don’t want any absurd ‘literary remains’ & ‘unpublished letters of Mark Twain’ published after I am planted,” the words live on because they were on paper.

Clemens changed his mind regarding his letters, choosing to “leave it behind and utter it from the grave.” He has plenty of company.  F. Scott Fitzgerald, William Faulkner, Sylvia Plath, Mary McCarthy and Saul Bellow all left paper literary remains. Over the last century, libraries and archives have developed great expertise in acquiring and preserving this material.

But we are at an unusual point in documenting literary lives and works. Authors have had word processing and other forms of personal digital technology available to them for 30 years. Some writers have stubbornly refused to use it, but many have, and they are contemplating their own “absurd literary remains.” What actually remains is a big open question. Are there emails with editors or notable authors? Drafts with track changes? Ribald direct messages?

At this point, there are only a few institutions with literary personal digital materials. The Norman Mailer Papers at the Harry Ransom Center, University of Texas at Austin, include “359 computer disks, 47 electronic files, 40 CDs, 6 mini data cartridges, 3 laptop computers [documenting] correspondence and literary drafts.” The Salman Rushdie Papers at the Emory University Library have “one Macintosh Performa 5400/180, one Macintosh PowerBook 5300c, two Macintosh PowerBook G3 models, and one SmartDisk FWFL60 FireLite 60GB 2.5′ FireWire Portable Hard Drive.” The Susan Sontag Papers at the Charles E. Young Library, University of California Los Angeles, contain “seventeen thousand one hundred and ninety-eight e-mails.” All these are hybrid collections, which is to say that most of the material is on paper.

How much born digital content is still out there, living wild under the good/bad/indifferent care of writers who find themselves to be their own unintended digital archivists? However much there is, I suspect that the proportion of paper to digital is rapidly declining.

What to do about it? Raise awareness about the value of personal digital archives across the board, pure and simple. Everyone has a story to tell and a digital legacy to pass on. The value of email and other content will, I am sure, become more apparent over time.

This is already happening for writers. In 2005, Rick Moody told the New York Times that, when he was considering the sale of his papers, the dealer wanted to know about email. “This sort of brought to mind that there was a policy [for saving it], though it was a very unmethodical policy,” he said. Paying money for email is certainly one way to draw attention to its value. And once writers, agents, publishers, libraries and archives, and all the rest of us understand that personal digital collections warrant careful management from the moment of creation, we will see better tools and methods for personal digital archiving.

In the meantime, we can only speculate how much and what kinds of digital literary remains will find their way into research collections. Or, to paraphrase Sontag, our libraries await the digital archives of longing.

 

Mar 15, 2014
 

Edward Snowden did more than blow the lid off secret government surveillance. He has called into question a fundamental role of government itself: keeping records.

Snowden at SXSW, by Cory Doctorow, on Flickr

Governments have always kept records. Documentation is needed for protecting legal rights and financial obligations, as well as for establishing individual identities and relationships. While there were instances of public outrage in connection with certain overzealous documentation efforts (such as with the East German Stasi), government record keeping is something most people accept as a fact of life.

And when it comes to the idea of “archives”–records kept permanently for their historical or other value–it’s easy to stir the mystic chords of memory. In laying the cornerstone of the U.S. National Archives  in 1933, Herbert Hoover declared “This temple of our history will appropriately be one of the most beautiful buildings in America, an expression of the American soul.”

The concept of government archives as secular temples persists into the digital era. In 2001, Archivist of the U.S. John Carlin described the National Archives and Records Administration’s system for preserving email and other born digital records, the Electronic Records Archives, as follows: “An ERA will allow us at NARA to make a much greater amount of our holdings— these records of democracy, ‘the people’s records’— available to more citizens via the Internet. And that will make our country and our democracy stronger.”

But all this assumes the people are fine with government collecting and keeping digital records about them. The news has long been full of stories about data breaches and digital identity theft. Add to that the specter of government snooping, and it’s no surprise that public anxiety is higher than ever. According to the Electronic Privacy Information Center, “public opinion polls consistently find strong support among Americans for privacy rights in law to protect their personal information from government and commercial entities.” Polls also show that public trust in government is at historically low levels.

All this makes me wonder how supportive the public will be in the years ahead for government efforts to collect and manage any kind of digital information. Given that people tend to paint “the government” with a broad brush, skepticism can easily extend to national cultural heritage institutions. When the Library of Congress announced in 2010 that it would collect the Twitter archive, the agency immediately faced an uproar of privacy concerns that continues to this day. A 2013 About.com article on the project is entitled Does the Government Monitor Your Twitter Account? As absurd as this seems, somebody was worried enough to establish a Twitter application, http://noLOC.org (now apparently inactive), to automatically delete tweets before they fell under the Library’s control.

There is a bit of inside irony about all this. The landmark 1996 report, Preserving Digital Information: Report of the Task Force on Archiving of Digital Information, declared that libraries and archives needed to prove they could excel in collecting and keeping digital content to gain and keep public trust. Trust has indeed proven to be a paramount issue, but it’s not about demonstrating capability through building trustworthy systems as the report called for. Instead, the public worries that the government and other large institutions have too much technological capability–so much so that privacy is compromised. Trust instead is linked to degrading capability through some combination of personal data ownership, data collection restrictions and even calls to scale back cloud computing.

Are we at a point where the public wants government to do less in terms of collecting and keeping electronic records? If so, that’s a major concern, especially since many government archives already are struggling in this area. With regard to NARA, for example, see Record Chaos: The Deplorable State of Electronic Record Keeping in the Federal Government, 2008, and Report on Current Recordkeeping Practices within the Federal Government, 2001. (Sadly, whatever success the National Security Agency has had isn’t transferable).

One thing seems certain: government needs to establish itself as a trustworthy manager of electronic records before government archives become digital temples of history.

Mar 14, 2014
 

All digital storage media–hard drives, flash disks, CD-ROMs, and the like–have a short life.  This is why digital preservation requires active management, including regular migration of content from older storage devices to newer devices.

Do you have a back-up plan? by Images by John ‘K’, on Flickr

Individuals face an especially serious challenge.  Unlike many organizations, people at home typically do not have special services to guard their digital data from loss or corruption.

Another way to put it is that everyone is now their own digital archivist.  If you don’t attend to preserving your own digital photographs, videos, email, social media and so on, there is an excellent chance they will be lost.

And, unlike what some vendors imply, relying solely on the cloud is not foolproof. A vendor can choose to pull the plug–literally–on a cloud service at any time. If you want to keep it, you need to take responsibility for it.

Individual users need to know that the life of storage media is cut short by at least three factors:

  1. Media durability.
  2. Media usage, storage and handling.
  3. Media obsolescence.

Media Durability

Computer storage media vary in how long they last, and the quality and construction of individual media items differ widely. The following estimates of media life are approximate; a specific item can easily last longer–or fail much sooner.

  • Floppy disk: 3-5 years.  Though no longer made, many still exist; examples include 8”, 5.25” and 3.5” disks, along with items such as Zip and Jaz disks.
  • Flash media: 1-10 years.  This category includes USB flash drives (also known as jump drives or thumb drives), SD/SDHC cards and solid-state drives; all generally are less reliable than traditional spinning-disk hard drives.
  • Hard drive: 2-8 years.  The health of a spinning disk hard drive often depends on the environment; excessive heat, for example, can lead to quick failure.
  • CD/DVD/Blu-ray optical disk: 2-10 years.  There is large variation in the quality of optical media; note that “burnable” discs typically have a shorter life than “factory pressed” discs.
  • Magnetic tape: 10-30 years.  Tape is a more expensive storage option for most users–it depends on specialty equipment–but it is the most reliable medium available.

Media use, handling and storage

People have a direct impact on the lives of storage media:

  • The more often media are handled and used, the greater the chance they will fail; careful handling can extend media life, while rough handling has the opposite effect.
  • Stable, moderate temperature and humidity, along with protection from harmful elements (such as sun and salt), help keep media alive.
  • Good-quality readers and other hardware media connections are beneficial; poor connections can kill media quickly.
  • Media that are not labeled or safely stored can be lost or accidentally thrown away.
  • Fires, floods and other disasters are very bad for media!

Media obsolescence

Computer technology changes very quickly.  Commonly used storage media can become obsolete within a few years.  Current and future computers may not:

  • Have drives that can read older media.
  • Have hardware connections that can attach to older media (or media drives).
  • Have device drivers that can recognize older media hardware.
  • Have software that can read older files on media.

What you need to do

Actively manage your important digital content! Steps to consider (a small scripted sketch follows this list):

  • Have at least two separate copies of your content on separate media—more copies are better.
  • Use different kinds of media (DVDs, CDs, portable hard drives, thumb drives or internet cloud storage);  use reputable vendors and products.
  • Store media copies in different locations that are as physically far apart as practical.
  • Label media properly and keep in secure locations (such as with important papers).
  • Create new archival media copies at least every five years to avoid data loss.
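
To make the first two steps concrete, here is a minimal sketch, in Python, of what active management can look like: copy a folder to two backup destinations and record SHA-256 checksums so that a later run can confirm the copies are still intact. The paths and file names are hypothetical placeholders, and the script illustrates the idea rather than serving as a recommended tool.

```python
# backup_sketch.py -- a minimal illustration, not a full preservation tool.
# Copies a source folder to two destinations and records SHA-256 checksums
# so a later run can verify that the copies are still readable and intact.
# All paths here are hypothetical; point them at your own folders and media.
import hashlib
import json
import shutil
from pathlib import Path

SOURCE = Path("~/Documents/family_photos").expanduser()
DESTINATIONS = [Path("/Volumes/BackupDrive1"), Path("/Volumes/BackupDrive2")]
MANIFEST_NAME = "checksums.json"

def sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def back_up() -> None:
    """Copy SOURCE to every destination and store a checksum manifest."""
    manifest = {
        str(p.relative_to(SOURCE)): sha256(p)
        for p in SOURCE.rglob("*") if p.is_file()
    }
    for dest in DESTINATIONS:
        target = dest / SOURCE.name
        shutil.copytree(SOURCE, target, dirs_exist_ok=True)  # Python 3.8+
        (target / MANIFEST_NAME).write_text(json.dumps(manifest, indent=2))
        print(f"Copied {len(manifest)} files to {target}")

def verify(dest: Path) -> None:
    """Re-hash each copied file and compare against the stored manifest."""
    target = dest / SOURCE.name
    manifest = json.loads((target / MANIFEST_NAME).read_text())
    bad = []
    for rel, recorded in manifest.items():
        copy = target / rel
        if not copy.is_file() or sha256(copy) != recorded:
            bad.append(rel)
    if bad:
        print(f"{target}: {len(bad)} missing or damaged files")
    else:
        print(f"{target}: all files verified")

if __name__ == "__main__":
    back_up()
    for d in DESTINATIONS:
        verify(d)
```

Re-running a script like this on a schedule, and again whenever content moves to new media, is the scripted equivalent of the five-year refresh suggested above: the checksums tell you whether a copy has silently decayed before you retire the old device.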

For more information

  1. Care and Handling of CDs and DVDs —A Guide for Librarians and Archivists
  2. Digital Media Life Expectancy and Care
  3. Do Burned CDs Have a Short Life Span?
  4. Mag Tape Life Expectancy 10-30 years
  5. Personal Archiving: Preserving Your Digital Memories (Library of Congress)
  6. Retro Media: Memory (and Memories) Lost; Which of these media will be readable in 10 years?  50 years?  150 years?
  7. Care, Handling and Storage of Removable media (UK National Archives)
  8. Do You Have a Back-up Plan?
  9. Selecting and managing storage media for digital public records guideline (Queensland State Archives)


Note: This is adapted from information developed for digitalpreservation.gov at the Library of Congress; post updated, originally published in Jan. 2011.

Mar 11, 2014
 

In an era when millions of people have personal digital archives, and where communities of all kinds are archiving content of all kinds, what is the role for archivists? This may sound like a trick question but it is surprisingly complicated.

Sébastien Magro, on Flickr

Part of the problem is that “archive” and “archiving” have acquired chic currency. The terms are used ever more fluidly to describe collecting, keeping and accessing information, most particularly that in digital form. Some of us who identify with the archival profession do weigh in from time to time in a variety of interesting, if inconclusive, discussions about our individual concepts (see here, for example).

It’s time to accept that archive/archives/archived/archiving, et al., have fled whatever restricted semantic corral used to exist. There are declared archivists for shoes, sex and furries. The archive has gone wild into the vernacular.

Many professional archivists are, of course, employed by institutions to care for and serve recorded information. The type of information varies among institutions; no one place dares to put a claim on everything (with one possible exception). Institutional archives explicitly care for only a fraction of the most important material. The U.S. National Archives has long claimed, for example, to preserve only 1-3 percent of all federal records. This selectivity is justified on practical grounds (there is too much material to store and manage) as well as on projections of future value (people will only care about a tiny portion of the whole). The limited subset of preserved material is a precious distillate to which professional archivists devote the vast bulk of their time and attention. Notions of evidence, authenticity and trustworthiness for these rare bits are paramount, and are used as justification for spending lots of money to build complex information technology systems to preserve digital archives.

One could say this traditional focus is out of step with the times. Perhaps this is one reason that archivists have lost the semantic battle over their own name. If, as Tim O’Reilly said in 2011, “Digital preservation won’t be just the concern of specialists, it will be the concern of everyone,” how do professional archivists justify what can seem like a monkish fixation on a tiny arcane fraction of the digital whole? I don’t think earnest explanation about reference models or trustworthy systems will cut it with the average person.

Now, there is a real baby in all this bathwater, and she needs careful attention.

I see some interesting possibilities in Terry Cook’s 2013 article, Evidence, memory, identity, and community: four shifting archival paradigms (behind a pay wall). He presents a well-articulated view of how archival practice (or “myth-making”) has evolved over the last 150 years, from “guarding the judicial legacy” to “the historian-archivist selects the archive” to “the mediator-archivist shapes the societal archive” and finally to “the activist-archivist mentors collaborative evidence- and memory-making.” Cook sets the context for this fourth framework in familiar language. “With the Internet, every person can become his own publisher, author… and archivist,” he writes. “Each is building an online archive…. And they are creating records to bind their communities together, foster their group identities, and carry out their business.” The excitement comes when he imagines how professional archivists can interact with this new reality.

Archives as a concept, as practices, as institution, and as profession may be transformed to flourish in our digital era, especially one where citizens have a new agency and a new voice, and where they leave through digital social media all kinds of new and potentially exciting, and potentially archival, traces of human life, of what it means to be human.

But to fully embrace this new possibility, archivists have to reimagine their role.

Professional archivists need to transform themselves from elite experts behind institutional walls to becoming mentors, facilitators, coaches, who work in the community to encourage archiving as a participatory process shared with many in society, rather than necessarily acquiring all the archival products in our established archives. We archivists need to listen as well as speak, becoming ourselves apprentices to learn new ways (and, sometimes, very old ways) that communities have for dealing with creating and authenticating evidence, storytelling, memory-making, documenting relationships that are often very different from our own.

The ultimate aspiration here, according to Cook, is “a virtual, inclusive, ‘total’ archive for a country, province or state, or similar jurisdiction, one held by many archives and libraries, including community archives, but unified in conception and comprehensiveness.” He goes on to describe efforts in Canada to make this concept an operational reality.

To my mind, this is exactly right. It jibes with other stimulating ideas, such as that of Mike Featherstone, who asked “rather than see the archive as a specific place in which we deposit records, documents, photographs, film, video and all the minutiae on which culture is inscribed, should we not seek to extend the walls of the archive to place it around the everyday, the world?” But, even more to the point, it seems to me that the movement toward a more democratic, personalized view of “archives” is already well underway, and that those of us who consider ourselves to be professional preservers of the cultural record must accept and embrace this fact. We need to do so even if it means relaxing certain orthodoxies that we have long gestured toward in an effort to establish a special role (myth-making!) for ourselves. Given the enormous opportunity to help people and communities nurture their memories, this seems only sensible.

Cook does remind us, however, that we have a choice. Professional archivists can defend their “bastions of identity” or they can feel liberated by new social and technological forces. Put more dramatically, archivists can “float in stagnant backwaters of irrelevancy” or they can be “transformed to be relevant actors out in our society’s communities…. listening more to citizens than the state.”

Mar 04, 2014
 

Brace yourself: you probably already have it. I know I’ve had it for years, but lately it seems that everyone I meet is coming down with personal archive fever. The symptoms include bouts of anxiety, obsessive behavior, vivid memory flashbacks and shifts in mood from an urge to protect to a compulsion to forget, perhaps even to destroy.

But let us not begin at the beginning, nor even with the fever brought on by personal archiving, but rather with the personal acts of creating and caring.

Millions of people now generate digital photographs, posts on Twitter or Facebook and other trails of digital content that extend into the past. Most people begin innocent of any desire to “archive.” Technology instead lures us into recording more and more about our lives until, suddenly, we have a collection that documents a period of our history. We have a personal digital archive. And we have surprising and strong emotional connections with and reactions to this odd extension of ourselves.

That we consider such digital collections to be archival is a clue to why they cause fervid reactions. Traditionally, archive is a term associated with important government records that must be kept forever; more broadly, the word is associated with values such as knowledge, memory, nourishment and power. An official archive (or archives) is a cloistered place that most people know little about, apart from the fact that it is responsible for a dauntingly huge amount of important information about the past. What’s changed is that average people are now themselves responsible for a dauntingly huge amount of important information about their own past.

Until recently, people had mementos, photo albums, home movies and paper files, but it was rare for any of this material to be considered “a personal archive.” This began changing as generating large volumes of documentation grew progressively easier, and the trend really took off after the advent of personal computing, as the graph below from Google Ngram illustrates. At this point personal archives are commonplace, from celebrities like Jerry Seinfeld and Beyoncé to average people. Google “I have a personal archive” and you will get over 56,000 hits.

From Google Ngram

This is a fascinating trend that has all kinds of rich implications, from challenging the hegemony of the historical imagination, to making us stupid. But let me get back to the fever–the physical and psychological impacts–of personal digital archives. One affliction is, ironically, an effect that flows from the positive association with our archives. Many of us have a clear awareness about the value of our collection for documenting important life events, most especially those to do with raising children and other family events. The collection is seen as very important–but it is also unseen in a machine, often disorganized and scattered, hard to access and, perhaps worst of all, terribly temporary-seeming. Everyone has some experience with digital loss, and the vulnerability of data to human and machine error is a well-known fact. But for most people, addressing this problem is maddeningly difficult. There are too many files–with more added all the time–in too many places to keep easy track of them. Making copies is a good idea, but one estimate says 81 percent of people don’t have any current personal data backup. The result? The more you care about your digital archive, the more anxious you likely are.

Another malady flowing from personal archives is the opposite of the first: the archive brings back uncomfortable or unwelcome memories. Part of this is the lesson that parents try to drum into teenagers: don’t put anything too personal or too embarrassing out into the digital universe, because it can come back to haunt you. In a larger sense, the problem is the inability to forget: for all their fallibility, computer networks also have the seemingly perverse capability to preserve details that people don’t want kept. This includes personal data that companies (and the government) collect, but it extends to our own personal collections. In Delete: The Virtue of Forgetting in the Digital Age, Viktor Mayer-Schönberger tells a sad story of two long-separated friends who agree to meet at an old cafe haunt to catch up on things. “But Jane can’t quite remember the name of the cafe. So she has a brainwave – she’ll check through her old emails to John. As she looks for the cafe address, she stumbles across an exchange with him that poisons her attitude to him. Instead of forgiving and forgetting, she is overwhelmed with old resentment and, quite possibly, won’t turn up for that coffee.” If we can’t forget, we can’t forgive or change for the better.

Sometimes a personal archive provokes a mild delirium where people are compelled to confess embarrassment about the past, conscious or not of the oddly self-referential element it adds to the collection. There is plenty of this evident on Twitter, as these recent search results below reveal.

Twitter search results

I’d really like to see tweets that revel in the present moment and say words to the effect of “hey future me–screw you and your judgments,” but I guess awareness of the archive only goes so far.

Fever can, of course, refer to “a state of heightened or intense emotion or activity,” and in this regard personal digital archives are certainly capable of helping people feel strong positive emotion, strengthen family ties, reduce alienation and experience a comforting sense of continuity with the past. This is chiefly how I would characterize my own case of personal digital archive fever (although I do wish I had more fevered energy to work with my files). But I also cycle through the less pleasant symptoms noted above. What I hope for is a way to transform fever to fitness: I want more of the affirming enthusiasm with less of the anxiety and bad energy.

While I don’t know if this desire will be fulfilled, things are bound to change. To paraphrase Marshall McLuhan, the history of technology is a continuous pattern of  humanity first inventing a tool and then having the tool change humanity. Right now we’re at the “and then” stage.

Dec 26, 2012
 

Any list obviously reflects the interests of the compiler, as well as the source and scope of the information considered. In this case, I turned to Slideshare and searched on “digital preservation.” Filtering by “this year” yields the following, ranked in order.

  1. Digital Preservation and Social Media Outreach. Presentation given during the 17th Brazilian Conference of Archival Science in Rio de Janeiro, June 21, 2012, by Bill LeFurgy. Seems vain, I know, but see above.
  2. Digital Preservation Perspective. How far have we come, and what’s next? by Jeff Rothenberg from FuturePerfect. Insights from one of the people who originally framed the digital preservation issue.
  3. Digital Preservation: A Wicked Problem. AIIM Ottawa presentation by Ron Surette, DG Digital Preservation and CIO, Library and Archives Canada. Wicked: a problem that is difficult or impossible to solve because of incomplete, contradictory and changing requirements.
  4. Assessing Preservation Readiness Webinar. Presented on February 7, 2012 as part of the DuraSpace Curated Webinar Series, “Knowledge Futures: Digital Preservation Planning” Curated by Liz Bishoff, The Bishoff Group, LLC. Note this is a recording of the webinar and may load a bit slowly.
  5. Workshop 4 audiovisual digital preservation strategy. Now that you have digitised your audio and video, how do you keep the files — forever? by Richard Wright. Choices involved when moving from analog to digital, dealing with born digital and developing cost estimates.
  6. Getting the whole picture. From the National Library of Australia. Finding a common language between digital preservation and conservation.
  7. Bit Level Preservation. Assessing and Mitigating Bit-Level Threats, DigitalPreservation 2012, Washington, DC, from Dr. Micah Altman.  A framework for addressing bit-level preservation risks.
  8. In Search of Simplicity: Redesigning the Digital Bleek and Lloyd.  DESIDOC Journal of Library & Information Technology: Special Issue on Digital Preservation original submission. The Bleek and Lloyd is a collection of digitized historical artifacts on the Bushmen people of Southern Africa.
  9. Digital Preservation: caring for our data to foster knowledge discovery and dissemination. From Claudia Bauzer Medeiros. Given at Institute of Computing, UNICAMP.
  10. Digital Preservation: An Overview. From Amit Kumar Shaw. Given at Indian Statistical Institute, Kolkata Library, India.

The National Library of Australia deserves special consideration in that, in addition to coming in at number 6 in the top 10, they also scored at number 11, with Digital Preservation for Ongoing Access, and at number 12 with What is the Mediapedia.

Dec 17, 2012
 

Thinking about digital recreation of reality has an inevitable association with the “whoa, dude, are we in a video game?” line of reasoning. This is good in some ways because it puts the issue into a context that’s easier to ponder for most of us. It even turns out that at least one NASA scientist likes to compare reality to Grand Theft Auto (more on that in a minute).  Despite the fundamental inability to prove or disprove the hypothesis, there is something compelling about speculating just how close a computer simulation can ever get to fully depicting the reality that we perceive.

Call of Duty – Black Ops – RealTime Screenshots, by shyb, on Flickr

It’s clear that video games are getting more lifelike all the time. Video teasers for games like Call of Duty: Black Ops are, for example, getting hard to distinguish from live-action movie trailers. We’ve come a long way from the crudely pixelated Doom fantasies of the 1990s. Is it possible to imagine a video game that’s authentically real from a human perspective?

It may come down to whether the universe is digital or analog, which is the subject of debate and speculation among physicists. Is reality made up of discrete, if tiny, chunks or is it one big continuum that resists ultimate subdivision?

The rough outlines of quantum theory posit that action at atomic scale happens in discrete—digital, if you will—amounts. The theory had a direct impact on the development of the transistor and microchip, and is closely associated with the development of modern electronics. And even though certain features of quantum mechanics are undeniably weird—with cats half-alive and half-dead at the same time—it seems plausible that a powerful-enough digital computer should be capable of faithfully converting a digital reality into an alternative digital reality.

That NASA scientist mentioned above, for example, says:

The natural world behaves exactly the same way as the environment of Grand Theft Auto IV…. You see exactly what you need to see of Liberty City when you need to see it, abbreviating the entire game universe into the console. The universe behaves in the exact same way. In quantum mechanics, particles do not have a definite state unless they’re being observed. Many theorists have spent a lot of time trying to figure out how you explain this. One explanation is that we’re living within a simulation, seeing what we need to see when we need to see it.

digital/analog wisdom, by doctor paradox, on Flickr

But if the universe is analog, the best we can hope for is a sampling of the real thing. Computers might create lossy MP3 versions of reality, and with enough processing power we might even get products that go beyond human ability to tell the difference. Even in such a situation we’re still dealing with a discrete representation of the real thing, which would elevate the debate between digital and analog media aficionados to the ultimate level.

There’s a spiritual dimension involved here, too. Independent of any particular belief system, I agree with John Horgan, author of Rational Mysticism, when he says that “our minds have untapped depths that conventional science cannot comprehend.” If those depths are analog, is there ever any hope of fully replicating them? The notion of a sampled soul adds a whole new meta-layer to spirituality.

The Foundation Questions Institute sponsored an essay contest in 2011 that asked “is reality digital or analog?” As noted in Scientific American, the organizers expected entrants to come down on the side of digital, which is what I myself would guess, if forced to. But—surprise—“many of the best essays held, however, that the world is analog.” The reasoning in the essays is erudite, but certain points jump out with clarity. David Tong wrote about the fundamental inability of scientists to simulate the Standard Model of physics on a computer, perhaps because reality is made up not of particles but of “ripples of continuous fields, moulded into apparently discrete lumps of energy by the framework of quantum mechanics.” Numbers are, according to Tong, merely emergent from a fundamentally analog universe. His final sentence works to drive the point home: “We are not living inside a computer simulation.”

The winning essay, however, came down on the side of a digital reality. Why? Because Isaac Newton says so. Jarmo Makela, “a specialist in general relativity with an avid interest in the history of science,” purports to report on an interview he conducted with Newton in 1700. The great man confidently declares that reality (or, at least, the odd alternative slice of reality hosting the conversation) is most definitely digital because he has calculated it as such, based in part on an analysis of black hole entropy and speculation about “a still unknown law of nature.” Newton presents Makela with the written details, but they prove to be sadly evasive.

In short, if we don’t know the fundamental basis of reality, it’s pretty hard to imagine faithfully recreating it in all its cryptic glory. It’s probably better to think about video games and other simulations as tools that do a good enough job of engaging awareness and aiding our learning and entertainment within an inescapable, enduring meatspace reality.