The Internet constantly confronts us with evidence of our past. Are we losing the chance to remake ourselves?
By Nausicaa Renner
Last year, I had a strange dream. My father and I were wading in an industrial canal, reminiscent of a subway, as thousands of hatchery-raised fish were being released into it. The fish crowded, slimy, around our legs, and I knew (in the way that one knows in a dream) that they thought, as they hit the water, that they were drowning—that they had to experience death before entering adulthood. The next day, I told my father about the dream. He revealed that, when I was three, when we were living in Pittsburgh, he took me to see a truckful of catfish being pumped into an artificial pond. I was too young to remember this. But somewhere in my mind the vision of fish being spewed into water had lodged itself, resurfacing more than twenty-five years later.
These days, it’s common to find an image emerging, unbeckoned, from the reservoir of the past. We spend hours wading through streams of photos, many of which document, in unprecedented ways, our daily lives. Facebook was invented in 2004. By 2015, Kate Eichhorn writes in “The End of Forgetting: Growing Up with Social Media,” people were sharing thirty million images an hour on Snapchat, and British parents “posted, on average, nearly two hundred photographs of their child online each year.” For those who have grown up with social media—a group that includes pretty much everyone under twenty-five—childhood, an era that was fruitfully mysterious for the rest of us, is surprisingly accessible. According to Eichhorn, a media historian at the New School, this is certain to have some kind of profound effect on the development of identity. What that effect will be we’re not quite sure.
Eichhorn sees both sides of the coin. On the one hand, she says, children and teen-agers have gained a level of control that they didn’t have before. In the past, adults refused to acknowledge children’s agency, or imposed on them an idealized notion of innocence and purity. Adults were the ones writing books, taking photos with expensive cameras, and commissioning paintings, all of which tended to commemorate childhood—to look back at it—rather than participate in it. The arrival of cheaply made instant photos, in the nineteen-sixties, allowed children to seize a means of production, and the arrival of the Internet gave them an unprecedented degree of self-determination. “If childhood was once constructed and recorded by adults and mirrored back to children (e.g., in a carefully curated family photo album or a series of home video clips), this is no longer the case,” Eichhorn writes. “Today, young people create images and put them into circulation without the interference of adults.”
This practice can be hugely beneficial. New technology—especially the smartphone—allows us to produce a narrative of our lives, to choose what to remember and what to contribute to our own mythos. For Eichhorn, this is the latest instance of a long-held, if mysterious, practice. “Long before children were able to create, edit, and curate images of their lives,” she writes, “they were already doing so on a psychic level.” Freud called these images “screen memories”—no pun intended—and he thought that we used them to soften or obscure painful experiences. Humans have always tried to cope with the difficulty of memory, to turn it “from an intolerable horror to something which is reassuringly innocuous and familiar.” Social media just makes us more adept at it.
On the other hand, Eichhorn writes, such media can prevent those who wish to break with their past from doing so cleanly. We’re not the only ones posting; our friends and family chronicle our lives, usually without our consent. Growing up online, Eichhorn worries, might impede our ability to edit memories, cull what needs to be culled, and move on. “The potential danger is no longer childhood’s disappearance, but rather the possibility of a perpetual childhood,” she writes. We may, in short, have traded “screen memories for screens.”
This is of particular import for those who yearn to establish new identities. People who transition, for instance, often rely on having a clean break, visually, with their previous appearances; as Eichhorn points out, one of the early promises of the Internet, when it was just “texts and clip art,” was that it “presented itself as a safe place [for transgender youth] to try on an aspect of their identities they could not explore in their material lives.” Now that the Internet is more permanent, and more pervasive, it’s hard to avoid the relics of past identities. Eichhorn cites one of her students, Kevin, an aspiring film critic from a small town in upstate New York. By his second year of college, Kevin says, his Facebook stream “was getting really weird. I had my new friends from New York posting about queer performance art and these guys from my high school posting about dirt biking in a gravel pit and tagging me in photographs from high school. I needed to move on.” Although he deactivated his social-media accounts and created new ones under a pseudonym, he continued to be tagged in old photos. “I guess that Kevin is out there for good,” he says. “I just have to live with him and all those people he was trying to escape.”
The persistence of certain images is more of a problem for some than for others. There are moments, elevated not by the fact of being recorded but by the impossibility of being erased, that become traumatic. These situations—in which a naked photo or an offensive tweet destroys a person’s public life—are unfortunate, and widely covered (for example, in Jon Ronson’s “So You’ve Been Publicly Shamed”). Eichhorn details the case of Ghyslain Raza, a Canadian teen-ager who, in 2002, recorded himself wielding a golf-ball retriever as though it were a lightsabre. The video, which was found by a classmate, titled “Star Wars Kid,” and uploaded to the Internet, was viewed by millions of people; as Eichhorn points out, this all happened at a time when virality, as a phenomenon, wasn’t really a thing. Raza was bullied at school and ended up in a psychiatric ward. In 2013, still unable to escape the video, despite legal action, he spoke publicly about his experience, describing his contemplation of suicide.
Everyone, Eichhorn writes, benefits from experimentation in adolescence. During that time, we exist in what the psychoanalyst Erik Erikson called a psychosocial “moratorium”—a stage in which we hover “between the morality learned by the child and the ethics to be developed by the adult.” The moratorium is a period of trial and error that society allows adolescents, who are permitted to take risks without fear of consequence, in hopes that doing so will clarify a “core self—a personal sense of what gives life meaning.” The Internet interrupts the privacy of this era; it tends to scale up mistakes to monumental proportions, and to put them on our permanent records. Colleges and employers now look at social-media accounts for evidence of character. Eichhorn spends less time than she might have on how this affects today’s teens. What is it like to live under threat? What are the ramifications when an entire generation never gets the chance to experiment freely or to remake themselves?
Eichhorn does lightly gesture to a kind of universal human right, one that runs counter to the whims of companies that use data. “Forgetting—that once taken-for-granted built-in resource that all humans possessed—is now being pitted against the interests of technology companies,” she writes, implying, with an endearing idealism, that we have a right to forget. (For some, this belief might reflect a distinctly American approach to the rest of the world.) More plausibly, she cites the right to be forgotten, which is the nickname of both data privacy regulations in Europe and movements against naming minors in the media. Either way, the implication is that the ability to detach from one’s past self—to move laterally, as an individual, into a new body or personality—is a democratic ideal. We also have the right to stay as we are. In some cases, retaining our sense of self across chasms that might destroy it is more important than having a rebellious phase. Take, for instance, the case of migrants, which Eichhorn touches on briefly: “Family members left behind can now stay in constant touch with their sons and daughters and even track their footsteps across Europe.” Here, memory is almost a form of political representation, enabled by social media; groups are able to preserve their history as they travel across continents.
Are all photos documentary? In “The Social Photo,” Nathan Jurgenson puts forth the useful proposition that most online photos are about sharing experiences, not creating memories. In one passage, Jurgenson, a founder of Real Life magazine, writes that selfies are “less an accurate picture of me at this time in this place and more . . . a visual depiction of the idea of me.” They’re units of communication, more emojis or hieroglyphics than portraits; they have little context, aren’t discernibly located anywhere, and typically come in the aggregate. For the most part, it wouldn’t really matter whether they still existed in twenty years. This explains the prevalence of disappearing photos, like Instagram stories and Snapchat. (Jurgenson is also a sociologist for Snap Inc., Snapchat’s parent company.) It also explains photos of food, which are rarely artful or worth saving.
For Jurgenson, taking social photos changes the way vision works—a process that began with the advent of cameras and is still evolving today. Teen-agers are cyborgs, and their phones are mechanical eyes that help them interpret their experience. “To document,” Jurgenson writes, “is to be involved with our own experience instead of passively letting it float by.” On this subject, Jurgenson has all the right, if somewhat dutiful, opinions: nostalgia is overrated, but he’s not into “digital austerity.” We shouldn’t hark back to an era in which we were less attached to technology—mostly because that era doesn’t exist. “Our reality has always been already mediated, augmented, documented,” he writes, “and there’s no access to some state of unmediated purity.” We shouldn’t ask whether social photography is good, but how it can be good.
Jurgenson, unlike Eichhorn, isn’t worried about the Internet making it hard to bury past versions of ourselves. If anything, he fears the prevalence of death. Photos, he writes, “embalm” their subjects, encasing them in a “stilling sadness that kills what it attempts to save out of a fear of losing it.” For him, the risk of constant documentation is alienation: a sense that our bodies are generating still moments rather than constant movement. He cites Wolfgang Schivelbusch, a German scholar who wrote about the effect of the railway on human perception. With its speed and glass windows, “the train flattens nature into something smooth and predictable, not something traveled within but something easily seen and consumed,” Jurgenson writes. “As more of life is experienced through camera screens, does it occur at a similar remove, where the messiness of lived experience is made into something merely observable?”
It would, indeed, be stunning if we were able to see painful moments from the past—the ones we meditate on for years—as dead and embalmed. The trouble is, most difficult memories aren’t captured by photos, videos, or tweets. Screens, like screen memories, are avoidant; they turn away from the painful. There are few crying children on Instagram. A friend, whose mother digitized all her family’s old home videos, recently told me about a significant birthday party at the roller rink. What she remembered was the drama beforehand: at the time, she was obsessed with Rollerblades, and when the rink had only roller skates her mother rushed to a sports store to get an inline pair, barely saving the day. None of this, it turned out, was captured in the video. All it showed was the triumph—a redemptive moment after tears, and a happy loop around the rink.