
Internet Histories | 1 October

Facebook's faulty memory, learning from animals, dying alone, desperately unhappy, and a straight-faced Forgotten Silver.

This fortnight:
Facebook's faulty memory | Learning from animals |
Dying alone, desperately unhappy | A straight-faced Forgotten Silver


Joe
Facebook has been at pains to curb it, but the rumour that the latest of the social network’s pretty fluid iterations has made users’ private messages (from 2009 and earlier) available on their walls has taken wing, with more and more people coming forward and claiming their private conversations have gone public.

In what’s possibly a selective sin of omission, a source from the company correctly stated to the BBC that “no mechanism” had ever been created that would allow a private message to be published onto a user’s wall or timeline. This glosses over the fact that Facebook has been an organic patchwork of bolted-on features and functionalities, continually modified or added to since its inception – we forget that our capacity to post photos, statuses, endorse the statements of others, and hold a conversation in the same place (public or private) was once quite limited.

Yet not even the most visible (and therefore controversial) changes to the way the website displays the information you choose to give it ever constituted a full reboot or reissue (think “Facebook 2”, “Facebook 2010”). It’s always been evolving before us. There are undoubtedly some good technical articles detailing these changes and how information could slip through the grates, but I’m more interested in how we’ve acculturated to those developments.

I delved into my own back pages for a look, and felt wrongfooted at every turn by what may or may not be a private message. I can’t tell what was meant for my eyes only and what’s just a friend spooling a stream-of-consciousness update about his travels onto my page. What seems like a one-line fragment of a chat might have been an in-joke. I can’t tell which posts are merely one side of a conversation carried back and forth between two walls, the way we used to use Facebook. I also cringe because I was younger and dumber and so were my friends. Did the site leak my private missives, or was I just stupid enough to narrate my private arrangements (none of which are actually exciting, but they’re noise I wouldn’t make others read) in public? I feel like I have an above-average memory for places, names, details, joy, heartbreak, time – so why can’t I tell what I meant to hide and what I meant to share on Facebook?

[caption id="attachment_4639" align="aligncenter" width="500"] A depiction of a 16th-century memory theater, designed as an array of images, symbols and archetypes that amounted to a macrocosm of the cosmos. Going by Facebook, I have a smaller, pettier conception of the infinite.[/caption]

My gut suspicion is that tracking the social-media counterparts of these memories eludes us. It’s at once mundane and ever-shifting territory – as common as going to the supermarket, yet unrecognizable when you return to old information mispackaged in a layout and format it wasn’t written in – imagine seeing a handwritten note to your 4th form sweetheart reproduced in Arial on your office letterhead, and the contextual confusion and awkwardness that ensues. But since we’re a transitional generation we have been acculturating as we go, so we’re also encountering and second-guessing an old manifestation of ourselves whenever we go back.

(Disclaimer: I can’t speak for people who have grown up entirely on these sites, from the moment they learnt to type and read – but it scares me a little that we have a sense of parallel existences (joy, heartbreak, time) outside of the Internet that they might not.)

Where this is all leading is to two great pieces – older, but timely – on The Frailest Thing, a blog by American PhD student Michael Sacasas on technology and culture. In Social Media, Social Memory: Remembering With Facebook, he posits the site as a sort of externalized memory theatre while warning of the dangers of artificially storing up one’s autobiographical memory the way one stores up poems, songs, and phone numbers. The companion piece, Social Media and the Arts of Memory, also has several choice observations:

“Much of what cell phones are increasingly used for has very little to do with making a phone call, after all. In fact, one could argue that the calling feature of phones is becoming largely irrelevant. Cell phones are more likely to be used to access the Internet, send a text message, take a picture, or film a video. Given these capabilities cell phones have become prosthetic memory devices; to lose a cell phone would be to induce a state of partial amnesia. Or, it may be better to say it might induce a fear of future amnesia since our ability to approach the present as a field of potential memories would be undermined…

…Derrida’s insight suggests that given the way the architecture of an archive already determines what can in fact be archived, the future record of the past is already impinging upon the present. Or, put otherwise, the sorts of memories we are able to generate with social media may already be directing our interactions in the present.”


The whole site is worth bookmarking and following.


The failure of animals to develop sophisticated online social media networks has of course historically been used as a source of mockery and even a justification for killing them, eating them and wearing their skins. But in the New York Times, cardiology professor Barbara Natterson-Horowitz and writer Kathryn Bowers have distilled the best bits of their forthcoming book Zoobiquity: What Animals Can Teach Us About Health and the Science of Healing into a cavalcade of astonishing trivia that shows us just how much our behaviours and malaises are replicated elsewhere in the animal kingdom. Though that’s this synthesis’s main charm (like a good trailer, I guess), it also asks a very good question about why the knowledge and expertise of human doctors are privileged above and siloed off from those of veterinarians. At its most satisfying, it recontextualises us within a weird and wonderful cornucopia of beasts:

“Modern, affluent humans have created a continuous eating cycle, a kind of “uniseason.” Our food is stripped of microbes, and we remove more while scrubbing off dirt and pesticides. Because we control it, the temperature is always a perfect 74 degrees. Because we’re in charge, we can safely dine at tables aglow in light long after the sun goes down. All year round, our days are lovely and long; our nights are short.

As animals, we find this single season an extremely comfortable place to be. But unless we want to remain in a state of continual fattening, with accompanying metabolic diseases, we will have to pry ourselves out of this delicious ease.”


Matt
How could a woman go missing inside her own home? That’s the question Michael Kruse asks in this story, published in 2011, about a woman who disappeared nearly two years earlier without anyone noticing. A short-ish read, it’s written in a spare, affectless style that lets the facts speak for themselves. It’s haunting.

After the diagnosis, she made daily notes on index cards. She ate at Arby's, Wendy's, McDonald's. Sometimes she did sit-ups and rode an exercise bike. She read the paper. She got the mail. She went to sleep at 8 p.m., 1:30 a.m., 6:30 a.m. Her heart raced.

"Dropped fork at lunch," she wrote.

"Felt depressed in evening and cried."

"Noise outside at 4 a.m. sounded like a dog."


By the time I’d finished reading, though, my question wasn’t quite the one posed by the author. Why doesn’t this happen more often?

Who among us hasn’t had the worst week, the worst month, when the thought of completing even the simplest task is enough to spin one into an existential oscillation between panic and ennui? When staying in bed is only slightly less terrifying than getting up, except it’s really just as bad, and you quickly come to resent your own weakness and refusal to do the things you know need to get done, until inaction becomes a reflexive habit and you might as easily stop breathing as go out and meet new people, apply for a new job?

Your heart aches for those who exist within this depressed fugue state frequently, or even constantly, who are lost someplace in the dark interstice between culture and character.

A fascination with the pathology of shut-ins is beginning to hit the mainstream. Earlier this year, the NZ International Film Fest showed Dreams of a Life, about a woman who died in her apartment and wasn’t discovered for three years. Maybe it’s because we feel instinctually that in an age of internet-mediated interaction, it should somehow be impossible for people to simply fall off the grid. Who would update my Facebook if I died tomorrow? Nobody. And surely someone would notice that, right? Right?

On the other hand, the internet makes it possible to live without having to physically interact with anyone. If technology has tied invisible connecting cords between us, these are ropes as much as they’re telephone lines. Perhaps the internet isn’t making us any more likely to become recluses, but – with ecommerce, next-day delivery, and at least the superficial façade of social contact – it’s certainly an enabler to those already predisposed to abandoning reality.

[caption id="attachment_4640" align="aligncenter" width="480"] The Japanese do disaffection real, real good.[/caption]

Sometimes it’s diagnosed as symptomatic of existing psychological disorders – usually some form of autism, or a social anxiety disorder. In other cases, though, the roots are less explicable. In some places, the practice of removing oneself from the world has become so widespread that there are several names for it. In Japan, such people – there are anywhere from 500,000 to a million in that country alone – are termed hikikomori. Experts suggest that in neurotypical sufferers, it might have something to do with strict cultural expectations.

In Britain, such people are called NEETs: Not in Education, Employment, or Training. In Mexico, ni-nis.

What’s responsible for this modern, nihilistic urban hermitism? Why doesn’t this happen more often? It’s a flagrant guess, but I’d suggest that it’s our extended relationships that save our worlds from narrowing to pin-pricks. Our obligations to aunts and cousins, clubs and schools and churches and, to some vague, ill-defined extent, our communities – they’re a part of what keeps us moving forward, or at least getting out of bed on Mondays. Perhaps this is another thing we have neoliberalism to thank for. Perhaps in another life I am a sociologist.

Regardless, I could never live alone with a cat. I'd be afraid that if I died suddenly, it would eat my face.


Rosabel
I've been thinking this week about still cities and slow ruins and unfamiliar modes of seeing.

Of course, Dau isn’t a city, not in the strictest sense of the word. Technically, it’s Russian filmmaker Ilya Khrzhanovsky’s latest project. But it’s been in production for five years now, and if you were to visit its east Ukrainian set you’d find a small totalitarian city the size of Eden Park. Each apartment is completely furnished. The toilets work. The iceboxes are stocked with fresh food. Fresh food with expiration dates from 1952. Everybody is dressed in Stalin-era clothing. There’s no sign of modern technology, no mention of Google or Facebook, no mention of the true nature of the city – words like ‘shoot’, ‘scene’, and ‘lighting’ are forbidden – and there are no cameras in sight.

It’s Synecdoche, New York, but it’s real: The actors enact entire lives on set. They eat, they sleep, they bathe, they work.

"It's almost slavery," writes one former crew member in a blog. "But Ilya managed to make everyone think they were part of something truly great." "Working here," notes another, "is like being that guy who wanted to be killed and eaten, and finding a maniac who wants to kill and eat you. Perfect reciprocity."


They have years of footage now. Years.

The film that will someday emerge from this footage can be anything—a great historical epic or a tedious tone poem—or nothing at all. Because Dau is not just a runaway shoot. It's a shoot running away from itself: the first film project in history whose director doesn't seem to want to make a movie. "What's going to happen to the set after the shooting is over?" I asked Khrzhanovsky once, and watched him plunge into an instant funk. "I don't know," he said, caressing a faux-marble wall of the cafeteria. "Right now, shooting is the only thing that justifies the enormous costs of keeping it up. I don't know what to do later."


But what is most amazing is Michael Idov’s disturbing account of how, in the short space of 48 hours, he becomes immersed in this alternate historical creation. I find it endlessly fascinating – what it means to be living in this other world, no less real than your own, yet completely not your own, somebody else’s, a fictional somebody else’s – and how will each of the hundreds of cast members navigate the return to their former lives? And will they even want to leave?

Kolmanskop is a settlement in the Namib desert, once a mining village, now a ghost town. The abandoned houses (photographed by Álvaro Sánchez-Montañés) have a sad and surreal quality to them. They are slowly disappearing, with a stillness that startles and speaks to some unknown fear deep in your gut.

Also stunning are these aerial shots of Iceland by Andre Ermolaev, marbled and mottled landscapes that feel foreign at first glance.

On the subject of photography and the sheer volume of images we process each day, worth reading is Teju Cole’s thoughtful and poetic essay on Gueorgui Pinkhassov and Instagram as poshlost, defined by Nabokov as “the falsely important, the falsely beautiful, the falsely clever, the falsely attractive.”

The piece touches on similar ideas explored in Social Media and the Arts of Memory – the piece Joe has linked to above – that is, that the way we use technology has changed drastically over the past few decades. Phones no longer function primarily to facilitate conversation across distances; they’re memory aids that we’ve become completely dependent on. Cameras – or more specifically, camera phones – are no longer used primarily for capturing special memories or breathtaking scenes; they document the minutiae of everyday life. In other words, they serve the same function as our eyes.

What is the fate of art in the age of metastasized mechanical reproduction? These are cheap images; they are in fact less than cheap, for each image costs nothing. Post-processing is easy and rampant: beautiful light is added after the fact, depth of field is manipulated, nostalgia is drizzled on in unctuous tints of orange and green. The result is briefly beguiling to the senses but ultimately annoying to the soul, like fake breasts or MSG-rich food. I like Matt Pearce’s thoughtful polemic on this subject, published on these pages: “Never before have we so rampantly exercised the ability to capture the way the world really looks and then so gorgeously disfigured it.”

...

There is of course nothing wrong with a photograph of your pug. But when you take that photograph without imagination and then put a “1979” filter on it—your pug wasn’t born in 1979—you are reaching for an invented past that has no relevance to the subject at hand. You make the image “better” in an empty way, thus making it worse. Your adoring fans or friends can instantly see your pug or your ham sandwich on which you have bestowed the patina of age. This immortal sandwich of yours is seen by hundreds of people even before you’ve finished eating it.


*
Oh! And, most excitingly, we have a sleek new logo, designed by the supremely talented Jeremy Unkovich of Scrublife fame:


(it's a pantograph punch)
