Make your own reality

I hesitate to do this because there are a lot of people on Twitter these days tweeting very grim predictions. But, here goes. This is a thread posted in June 2017 by Justin Hendrix, the director of the NYC Media Lab. He plays out a scenario that makes clear the political stakes of the difference between representation and simulation.

Trust in the media is extremely low right now, but I think it may have a lot further to go, driven by new technologies. In the next few years technologies for simulating images, video and audio will emerge that will flood the internet with artificial content. Quite a lot of it will be very difficult to discern – photorealistic simulations. Combined with microtargeting, it will be very powerful. After a few high profile hoaxes, the public will get the message – none of what we see can be trusted. It might all be doctored in some way. Researchers will race to produce technologies for verification, but will not be able to keep up with the flood of machine generated content. The platforms will attempt to solve the problem, but it will be vexing. Some governments will look for ways to arbitrate what is real. The only way out of this now is to spend as much time trying to understand the externalities of these technologies as we do creating them. But this will not happen, because there is no market mechanism for it. Practically no one has anything to gain from solving these problems.

OK, so Hendrix is one of many who understand Trump, rightly I think, as a symptom of a media culture characterised by what Mark Andrejevic calls ‘infoglut’. The constant flood of views, opinions, theories and images amounts to a kind of disinformation. It becomes harder for us to mediate a shared reality that corresponds with lived experience, that coheres with history or that is jointly understood. In a situation of infoglut, actors like Putin and Trump emerge and thrive on information/disinformation overload.

Hendrix’s grim warning illuminates what is lost when representation gives way to simulation. A media culture organised around the logic of representation is one in which words and images denote or depict objects, people and events that actually exist in the material world. A media culture organised around the logic of simulation is one in which words and images can be experienced as real, even where there is no corresponding thing the sign refers to in the ‘real world’ or outside the simulation itself.

This is what Hendrix forewarns: the creation of artificially intelligent bots that can produce statements, images and videos that a human experiences as real. His point is this: as non-human actors like artificially intelligent bots begin to participate in our public discourse, they have a corrosive effect on the process through which we create a shared understanding of reality.

If we can’t be sure that something we see or read in ‘the media’ is even said by a human, we begin to lose trust in the very idea of using media to understand the world at all. We become reflexively cynical and sceptical about the very character of representation. If we begin to live in a world where we cannot even tell if a human is speaking, then what we lose is the capacity to make human judgments about the quality of representation.

Representation itself begins to break down.

A month after Hendrix’s prediction, computer scientists at the University of Washington reported that they had produced an AI that could create an extremely ‘believable’ video of Barack Obama appearing to say things he had said in another context. OK, so they are not yet at the point of creating a video where Obama says things he has not said, but they’re getting close.

This is how they described their study.

Given audio of President Barack Obama, we synthesize a high quality video of him speaking with accurate lip sync, composited into a target video clip. Trained on many hours of his weekly address footage, a recurrent neural network learns the mapping from raw audio features to mouth shapes. Given the mouth shape at each time instant, we synthesize high quality mouth texture, and composite it with proper 3D pose matching to change what he appears to be saying in a target video to match the input audio track. Our approach produces photorealistic results.
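To make the mechanics a little more concrete, here is a minimal sketch, in PyTorch, of the kind of audio-to-mouth-shape mapping the abstract describes: a recurrent network reads per-frame audio features and predicts a compact mouth shape for each frame. The feature counts, layer sizes and names below are my own illustrative assumptions, not the researchers’ actual code.

```python
# Illustrative sketch of an audio-to-mouth-shape network of the kind the
# abstract describes. All dimensions and names are assumptions made for
# the sake of the example, not the published model.
import torch
import torch.nn as nn

class AudioToMouthShape(nn.Module):
    def __init__(self, n_audio_features=28, hidden_size=60, n_shape_coeffs=18):
        super().__init__()
        # The recurrent layer carries context across time, so each
        # predicted mouth shape reflects the surrounding speech rather
        # than a single isolated audio frame.
        self.lstm = nn.LSTM(n_audio_features, hidden_size, batch_first=True)
        # Readout from hidden state to a low-dimensional mouth shape,
        # e.g. coefficients describing lip landmark positions.
        self.readout = nn.Linear(hidden_size, n_shape_coeffs)

    def forward(self, audio_features):
        # audio_features: (batch, time, n_audio_features) per-frame features
        hidden, _ = self.lstm(audio_features)
        return self.readout(hidden)  # (batch, time, n_shape_coeffs)

model = AudioToMouthShape()
audio = torch.randn(1, 500, 28)   # stand-in for a few seconds of audio features
mouth_shapes = model(audio)       # one predicted mouth shape per frame
print(mouth_shapes.shape)         # torch.Size([1, 500, 18])
```

The predicted shape sequence is only the first stage; as the abstract notes, it then drives the synthesis of photorealistic mouth texture and the compositing into a target video.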

So, Hendrix is right, it seems. We are entering an era where a neural network could produce video of someone saying something they never said. And we, as humans, would be unable to tell. If this kind of artificially constructed speech becomes widespread, then the consequence is a dramatic unravelling of the socially constructed institution of representation. In short, a falling apart of the process through which we as humans go about creating a shared understanding of the world.

If the ‘industrial’ media culture of the twentieth century exercised power via its mass reproduction of imagery, then the ‘digital’ media culture of today is learning to exercise power via simulation. To make a rough distinction, we might say that if television was culturally powerful in part because of its capacity to reproduce and circulate images through vast populations, then the power of digital media is different in part because of its capacity to use data-driven non-human machines to create, customise and modulate images and information.
Here’s the thing. Simulation is both a cultural and a technical condition.

Cultural in the sense of accepted practices of talking about the world – like journalism – that establish a commonly held understanding of reality; as a cultural practice, simulation is something people in the world do when they attempt to make reality conform with their predictions. Technical in the sense of the creation of tools and institutions that produce and disseminate these depictions of reality – like cameras, news organisations and television transmitters.

Digital media technologies, and particularly their capacity to process data, dramatically escalate the capacity to simulate.

It is one thing for, say, Trump to propagate the lie that Obama was not born in the United States. This, as just one of the many false statements Trump makes, illustrates part of the character of simulation. Trump says it, over and over, others repeat it, and public opinion polling begins to show a majority of his voters believe it. It becomes real to them. But imagine how this could be escalated if Trump or one of his supporters could create a video in which Obama appeared to admit that he was not born in the United States.

The capacity of digital media to simulate – to create images that appear real even when they have no basis in reality – dramatically intensifies a culture of simulation. And, a culture of simulation is one where the images we invent begin to change the real world.

This follows Jean Baudrillard’s logic when he explains that ‘someone who feigns an illness can simply go to bed and pretend he is ill. Someone who simulates an illness produces in himself some of the symptoms’.

The simulation begins to affect reality.

Baudrillard builds in part on Guy Debord’s notion that ‘the saturation of social space with mass media has generated a society defined by spectacular rather than real relations’.
According to Baudrillard, in a world characterised by immersion in media, simulation supersedes representation. The signs – images and words – we consume via media are no longer directly related to a ‘reality’ outside of the system of signs.

To return to Hendrix’s example from the outset, we might say that the ‘fake news’ that has been the subject of public debate since the 2016 Presidential election follows this logic of simulation. News that follows the logic of representation is ‘testable’ against reality. Representative news presents images of people saying things they actually said and accounts of events that actually happened. Simulated news, by contrast, presents a series of claims and images and stories that refer to one another but cannot be tested against reality.
Simulations feel real, or are perceived as real, when they immerse us in this self-referential system of signs.

We might say that the ‘fake news’ that went viral during the 2016 Presidential election followed this logic. For some people, their Facebook News Feeds began to fill up with repeated stories about the corruption of Hillary Clinton and the Democratic party: vast interwoven conspiracies involving murders, criminal activities and human trafficking. The more some users clicked on, liked and shared these stories, the more of them they saw. None of these stories stood up to any comparison with reality, yet their constant repetition within News Feeds made them feel real to many Facebook users. These fake news simulations produced symptoms in the bodies and minds of those consuming them. They began to act as if they were real.
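The feedback loop at work here is simple enough to sketch. What follows is a toy model, not Facebook’s actual News Feed algorithm: a hypothetical engagement-weighted ranker in which the stories a user engages with get up-weighted, so the feed drifts towards whatever that user already clicks on.

```python
# Toy model of an engagement-driven feed. Purely hypothetical: it is meant
# to illustrate the feedback dynamic, not any platform's real ranking code.
import random
from collections import defaultdict

def rank_feed(stories, topic_weights):
    # Rank stories by the user's accumulated engagement with their topic.
    return sorted(stories, key=lambda s: topic_weights[s["topic"]], reverse=True)

def simulate_feed(stories, rounds=10, feed_size=3):
    topic_weights = defaultdict(lambda: 1.0)  # start with no preference
    for _ in range(rounds):
        feed = rank_feed(stories, topic_weights)[:feed_size]
        for story in feed:
            w = topic_weights[story["topic"]]
            # Assume the user is more likely to click topics they have
            # engaged with before; each click raises future exposure.
            if random.random() < w / (w + 1.0):
                topic_weights[story["topic"]] += 1.0
    return dict(topic_weights)

stories = [{"topic": "conspiracy"}] * 2 + [{"topic": "reporting"}] * 2
print(simulate_feed(stories))
```

Run it a few times: whichever topic attracts early clicks fills more of the feed, gets more chances to be clicked, and comes to dominate the weights – a crude version of the self-referential loop described above.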

So, to reiterate. We might describe this kind of algorithmically-fuelled ‘fake news’ as following the logic of simulation. ‘Fake news’ is the circulation of stories that can be experienced as if they are real, even when there is no corresponding thing the sign refers to in the ‘real world’, or outside of the simulation itself.

Following Baudrillard’s way of thinking, we might say this creates a situation of hyperreality, where the basic relationship between signs and reality implodes. Let’s return, then, to Hendrix’s prediction we considered at the outset. A situation where non-human, artificially intelligent devices produce their own depictions of real people and events, and we as humans cannot tell if these things were really said or done by fleshy humans? That seems to me to be hyperreal in the sense that Baudrillard means.

That is, hyperreality as the situation where simulations are experienced as real and therefore produce how we experience ‘reality’. Imagine you are watching a video of the President of the United States speaking that looks absolutely real, even though he never said those things. That’s a situation where the relationship between signs and reality has imploded. You can no longer trust that the signs represent reality. The video is a simulation in the sense that the words coming from the President’s mouth do not actually refer to real words the real person named President Obama said. And yet you cannot really parse the difference. Simulation is no longer ‘referential’, but instead the production of a model or ‘world’ without an underlying reality. As Baudrillard describes it, ‘it is no longer a question of imitation, nor duplication, nor even parody. It is a question of substituting the signs of the real for the real itself.’

To illustrate this logic, listen to the writer Ron Suskind recount a conversation he had with Karl Rove, one of US President George W. Bush’s key political strategists.

Suskind said that Rove told him that reporters like him live ‘in what we call the reality-based community’, which he defined as people who ‘believe that solutions emerge from your judicious study of discernible reality… That’s not the way the world really works anymore… We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality – judiciously, as you will – we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors… and you, all of you, will be left to just study what we do.’

The order here is what matters. In the order of representation, reality happens, we study it, and then we use language to explain it. In the order of simulation, we imagine and predict a future, and then we set about making reality conform with our prediction. Think of Karl Rove’s claim that the American empire had moved into a phase where it could ‘create its own reality’ in relation to Baudrillard’s claim that ‘present-day simulators attempt to make the real, all of the real, coincide with their models of simulation’.

Simulation here is both a political and cultural condition and a technical achievement. The more computing power you have to collect and process data that enable you to make predictions, the more you begin to act as if reality conforms to your predictions.
The question we might ask, then, is this: in whose interests is it to pursue the development of cultures and technologies of simulation rather than representation?

Writing in the London Review of Books, John Lanchester remarks that ‘Facebook has no financial interest in telling the truth’. BuzzFeed reported that in the final three months of the US presidential election, fake news stories on Facebook generated more engagement than real news from reputable news sources. Facebook’s algorithmic architecture is geared for simulation, not representation: it uses data to produce immersive streams of images that conform with the moods and preconceptions of individual users.

With the rise of the major platforms, we need to contend with powerful actors whose business model is organised around the effort to simulate and augment reality. For us, as citizens of this world, the struggle is to articulate and defend the order of representation, because with it goes the possibility of shared human experience.