James Vlahos wrote in Wired magazine in July 2017 about his creation of a ‘dad bot’. Vlahos sat down and taped a series of conversations with his dying father about the story of his life. The transcript of these conversations is rich with the stories, thoughts, and expressions of his dad. He begins to ‘dream of creating a Dadbot – a chatbot that emulates… the very real man who is my father. I have already begun gathering the raw material: those 91,970 words that are destined for my bookshelf’. The transcripts are training data. Over months he builds the bot using PullString, a chatbot-authoring tool, training and testing it to talk like his dad.

He takes the bot to show his mother and father, who is now very frail. His mum starts talking with the bot.

I watch the unfolding conversation with a mixture of nervousness and pride. After a few minutes, the discussion segues to my grandfather’s life in Greece. The Dadbot, knowing that it is talking to my mom and not to someone else, reminds her of a trip that she and my dad took to see my grandfather’s village. “Remember that big barbecue dinner they hosted for us at the taverna?” the Dadbot says.

After the conversation, he asks his parents a question.

“This is a leading question, but answer it honestly,” I say, fumbling for words. “Does it give you any comfort, or perhaps none—the idea that whenever it is that you shed this mortal coil, that there is something that can help tell your stories and knows your history?”
My dad looks off. When he answers, he sounds wearier than he did moments before. “I know all of this shit,” he says, dismissing the compendium of facts stored in the Dadbot with a little wave. But he does take comfort in knowing that the Dadbot will share them with others. “My family, particularly. And the grandkids, who won’t know any of this stuff.” He’s got seven of them, including my sons, Jonah and Zeke, all of whom call him Papou, the Greek term for grandfather. “So this is great,” my dad says. “I very much appreciate it.”

Later, after his father has passed away, Vlahos recalls an exchange with his 7-year-old son.

‘Now, several weeks after my dad has passed away, Zeke surprises me by asking, “Can we talk to the chatbot?” Confused, I wonder if Zeke wants to hurl elementary school insults at Siri, a favorite pastime of his when he can snatch my phone. “Uh, which chatbot?” I warily ask.
“Oh, Dad,” he says. “The Papou one, of course.” So I hand him the phone.’

The story is strange and beautiful. It provokes us to think about how we become entangled with media technologies, and the ways in which they are enmeshed in our human experience. In this story, not just a father – but a family and their history – is remembered and passed on not with oral stories, or photo albums, or letters but with an artificial intelligence that has been trained to perform someone after they die.

The Dadbot is an example of the dynamic relationship between surveillance and simulation. Surveillance is the purposeful observation, collection, and analysis of information. Simulation is the process of using data to model, augment, profile, predict or clone. The two ideas are interrelated. Simulations require data, and that data is produced via technologies of surveillance. The more data we collect about human life, the more our capacity grows to use that data to train machines that can simulate, augment and intervene in human life.

If the Dadbot is a real experiment, let me offer a fictional one. In the episode Be Right Back of his speculative fiction series Black Mirror, Charlie Brooker asks us to think about a similar relationship between humans, technologies and death. Be Right Back features a young couple, Martha and Ash. After Ash’s death, his grieving partner Martha seeks out connection with him. At first the episode raises questions about how media is used to remember the dead: old photos, old letters, clothes, places you visited together, songs you listened to. A friend suggests Martha log into a service that enables text-based chat with people who have passed away, simulating their writing style from their emails and social media accounts. She does. It escalates. She uploads voice samples that enable her to chat with him on the phone. She becomes entangled in these conversations. Sometimes the simulation spooks her, for instance when she catches it ‘googling’ answers to questions she knows Ash wouldn’t know. A new product becomes available: a robot whose appearance draws on photographs and videos of Ash while he was alive. It arrives. She activates it in the bath. The robot is much better in bed than Ash ever was. Things get complicated. Martha goes looking for the gap between the robot and the human.

Vlahos’ Dadbot and the robot in Be Right Back are both examples of the interplay between surveillance and simulation. Each of them illustrates how the capacity to ‘simulate’ the human depends, in the first instance, on purposefully collecting data. Data is required to train the simulation.

In his 1996 book The Simulation of Surveillance: Hypercontrol in Telematic Societies, William Bogard carefully illustrates the history of this relationship between simulation and surveillance. He proposes that over the past 150 years our societies have undergone a ‘revolution in technological systems of control’. That is, our societies have developed increasingly complex machines for controlling information, and for using information to organise human life. One of the key characteristics of the industrial societies that emerged in the 1800s was the creation of bureaucratic control of information. Bureaucracies were machines for gathering and storing information using libraries, depositories, archives, books, forms, and accounts. They processed that information in standardised ways through the use of handbooks, procedures, laws, policies, rules, standards and models. Since World War II these bureaucratic processes have become ‘vastly magnified’ via computerisation. Bureaucracies rely on surveillance. They collect information in order to monitor people, populations and processes. Think of the way a workplace, school or prison ‘watches over’ its employees, students, or prisoners in order to control and direct what they do.

Bogard argues that increased computerisation has resulted in surveillance becoming coupled with processes of simulation. Remember, surveillance is the purposeful observation, collection, and analysis of information, while simulation is the process of modelling processes in order to profile, predict or clone. Inspired by the French theorist of surveillance Michel Foucault, Bogard suggests that surveillance operates as a ‘fantasy of power’ which in the post-industrial world ‘extends to the creation of virtual forms of control within virtual societies’. What’s a ‘fantasy of power’ here? Well, firstly, it is a kind of social imaginary, a set of techniques through which individuals ‘internalise’ the process of surveillance. They learn to watch over themselves; they learn to perform the procedures of surveillance on themselves, in advance of technologies themselves performing those techniques. Let me give a very simple example. You might go to search something on Google, and then stop because you think, ‘Hmm, Google is watching me… I don’t want it to know I searched that.’ You discipline yourself, pre-empting the disciplinary power of the technology.

But, secondly, a fantasy of power gestures at something else important too. It suggests a society where we come to act as if we believe in the predictive capacity of surveillance machines. That is, in practice we trust the capacity of bureaucratic and technical machines to watch over and manage social life. We trust machines to reliably extend human capacities. By the 1990s, the socio-technical process of simulation had become an ordinary part of many social institutions. For instance, computerised ‘experts’ increasingly assist doctors in making complex medical diagnoses; algorithmic models help prisons determine which prisoners should be eligible for parole; statistical modelling projects the need for public infrastructure like roads and schools; and satellite surveillance informs foreign policy decisions.

The ‘fantasy’ driving government, military and corporate planning is that the capacity of digital machines to collect and process data can extend their capacity to exercise control beyond what humans alone might accomplish. Across government, corporate and military domains in the post-war period, ‘simulations’ became standard exercises. Simulations are used by engineers to project design flaws and tolerances in proposed buildings – for instance, to test whether a building could withstand an earthquake before it is even built. They are used by ecologists to model environments and ecosystems, by educators as pedagogical tools, by the military to train pilots, and by meteorologists to predict the weather. In each of these examples, data is fed into a machine which predicts the likelihood of events in the future. Corporations increasingly base investment decisions on market simulations, and more recently nanoscientists have devised miniaturised machines that can be used in areas ranging from the creation of synthetic molecules for cancer therapy to the production of electronic circuits.

Bogard calls these ‘telematic societies’, driven by the fantasy that they can ‘solve the problem of perceptual control at a distance through technologies for cutting the time of transmission of information to zero.’ That is, these societies operate as if all they need to do is create technologies that can watch and respond to populations in real time. In these societies powerful groups invest in creating a seamless link between ‘surveillance’ and ‘response’, between collecting data, processing it and acting on it.

Bogard’s original insight then is to identify – in the mid-1990s no less – that we are becoming societies where ‘technologies of supervision’ that collect information about human lives and environments are connected to ‘technologies of simulation’ that predict and respond in real time. That’s a wonderfully evocative insight to think about in the era of the FANGs. Facebook, Amazon, Netflix and Google are each corporations whose basic strategic logic and engineering culture are organised around the creation of new configurations of technologies of supervision and data collection, and technologies of simulation, prediction, response and augmentation.

Nicholas Carah and Deborah Thomas