My last status update

It is five years since I posted my last status update to Facebook.

It was posted by a robot.

At the Splendour in the Grass music festival in 2013 the festival wristbands had RFID tags in them.

You could link your tag to your Facebook. Then, as you wandered about the festival going to see bands play, you could ‘swipe’ your wrist at these sensor stations to record your movements around the site. The festival said that if you did this they would send you, after the festival, a customised playlist of every band you saw.

This is algorithmic culture. A culture where machines are trained to make judgments about our tastes and experiences.

I couldn’t resist. I linked the wristband to my Facebook. Every time I swiped at a sensor point, it would automatically post a status update in my name on my Facebook timeline.

The first one was ‘So glad you’re back this year The Rubens #returntosplendour’, the next one was ‘Getting my second dose of You Am I #neverenough’ and on it went, merrily posting this remarkably lame promotional pap to my timeline.

My friends made hay. This sounded nothing like the wry snark I usually traded in.

The Rubens was a dead giveaway. The uncharitable things I’ve been known to say about suburban dad rock bands over the years! The machine was pretending to be me, but failing hopelessly.

I am drawn to these moments where the interplay between the rich texture of human experience and the computational machinery of media breaks down. They prompt us to think not just about what media technology can do today, but about what technologists are imagining media might do tomorrow.

This moment at Splendour is one of those moments where the open-ended performance of cultural life collides with the computational capacities of media technologies. The system could only send out the pre-loaded responses, and they sounded inauthentic coming out of the mouths of festival-goers. No one speaks like a music publicist.

Imagine, though, if when I swiped at the sensor an algorithm scanned my Spotify playlists to figure out whether I actually liked the band, then scanned my Facebook messages to learn the tone of my voice, and then created a post that simulated my actual voice, paired with live video from the performance, in real time?

Splendour is a cultural space rich with human judgments, feelings and expressions. And so, this moment makes me ask: can machines be trained to understand, predict and make these judgments?  

Let me take a little detour.

There is an episode of the British television series Black Mirror called Be Right Back.

Spoiler Alert. If you haven’t seen it and don’t want it spoiled, stop here and go watch it.

Be Right Back is about a young couple: Martha and Ash. Ash dies in a car accident.
Martha, grieving, seeks out a connection with him. This is how we grieve: old photos, old letters, clothes, places you visited together, songs you listened to.

A friend suggests Martha log into a service where an artificial intelligence simulates the voice of your dead loved one by learning their style of expression from their emails, messages, social media posts, voice mails and other media they created while they were alive.

Martha does it.

She becomes entangled in these conversations.

Immersed in them.

A new product becomes available, a robot that can be designed based on photographs and videos of Ash while he was alive.  

Martha orders it. It arrives. She activates it. The robot Ash is uncanny. His looks, his voice. The robot is much better in bed than Ash was. But, he’s spooky. How to put it? The robot has no human presence.

Lying in bed beside Martha, robot Ash is unnervingly still. It cannot ‘spend time’ with another human.

Things get complicated.

Martha: get out, you are not enough of him…
Robot: did I ever hit you?
Martha: of course not, but you might have done.
Robot: I can insult you, there is tons of invective in the archive, I like speaking my mind, I could throw some of that at you.

 

The robot can manipulate how Martha feels, but it can’t understand her feelings or feel for itself. The robot falls short of being able to communicate.

Here again, just like Splendour, we have another moment where machines fail to simulate human experience. And yet, we still enjoy their company. We still play with them.

The philosopher of communication John Durham Peters (1999: 30) writes:

Communication, in the deeper sense of establishing ways to share one’s hours meaningfully with others, is sooner a matter of faith and risk than of technique and method. Why others do not use words as I do or do not feel or see the world as I do is a problem not just in adjusting the transmission and reception of messages, but in orchestrating collective being, in making space in the world for each other. Whatever ‘communication’ might mean, it is more fundamental a political and ethical problem than a semantic one.

This is what Martha has encountered. The robot has no capacity to experience faith, politics and ethics as anything other than data to be processed.

Machines can displace forms of human knowing and doing in the world, but they do that by formatting communication as a series of logical procedures, calculations and predictions.

Martha takes robot Ash to the edge of a cliff they used to visit when Ash was alive. She asks him to jump off.

Martha: Jump
Robot: What, over there? I never expressed suicidal thoughts or self-harm.
Martha: Yeah well you aren’t you are you?
Robot: That’s another difficult one.
Martha: You’re just a few ripples of you, there is no history to you, you’re just a performance of stuff that he performed without thinking and it’s not enough.
Robot: C’mon I aim to please.
Martha: Aim to jump, just do it.
Robot: OK if you are absolutely sure.
Martha: See Ash would’ve been scared, he wouldn’t have just leapt off he would’ve been crying.

 

Ash doesn’t jump. Martha takes him home and puts him in the attic.

The writer Charlie Brooker is asking us to imagine what happens as we knit our lived experience together with intelligent machines.

Machines fail to make the textured judgments about human experience that we ourselves can make. And yet, we find ourselves in increasingly complicated relationships with them.

Splendour, a year later. Evening is falling and I stand watching the Skywhale, a hot air balloon whale with breasts for wings, being inflated. Festival goers, many affected by alcohol and drugs, are having visceral reactions. I join in with the hundreds of others taking images and videos, translating our experience into images that get circulated mostly through Snapchat and Instagram.

Where the RFID tag tried to tightly program the translation of festival moments into media content, I’m struck here by the continuous translation of the festival into images. Hundreds per hour are uploaded to the public festival hashtags.

We express our cultural experiences in flows of images. Those images double as data.

This is what a participatory media system is like. We address machines as much as we address each other. We are training machines, bit by bit, to make judgments about culture, and training ourselves to incorporate intelligent machines within our human experience.