The Babadook, Intervention Technology, and Designing With, Not At.


Warning: A few light spoilers for The Babadook, but I’ve tried really hard not to ruin it too much.

In Jennifer Kent’s remarkably terrifying film, The Babadook, a mother and son slowly retreat into their house, away from school and family, held hostage (psychologically, emotionally) by a monster that has leapt straight out of a children’s picture book. An analogy for grief and the way it consumes, the monster, the Babadook, lingers in the house, growing stronger with each denial, more fearsome with each rebuttal. Grief can turn a home into a dark, inhospitable place, leaving monsters to occupy every cupboard, every shadow, every dark and stormy night. As the small, compacted family are forced inside their house, the outside world recedes, having all but forgotten them. Their grief is too much to deal with, and it’s this loss of control that is the most frightening part of the narrative.

As a film that looks at how paranoia, grief, and desperation manifest, The Babadook is excellent: there is no unnecessary gore and there are no jump scares (as many reviews have highlighted, with a collective sense of relief), but rather a slow unraveling that ultimately leads to a visceral, human climax, with monsters, of various kinds, overwhelming the house. This isn’t a film about the supernatural at all; it’s about people.

At the heart of The Babadook is a focus on the things that we don’t talk about, or don’t see. Kent’s film is rather sympathetic toward social services, but not toward the unhelpful schools that misread Samuel’s eccentric behavior as disobedience rather than brightness (a story familiar to many), causing the family to be further isolated from the world around them. Add to that a sister more concerned with how she’s perceived socially than with caring for her overworked, emotionally fraught sibling, and you’ve got a recipe for disaster. Aside from thinking about how grief can be the monster in the basement, The Babadook looks at the people who do fall through the cracks, into the basements of the world, never to be seen again. How we see these people, and how we find ways to reach them, is as important a question as it’s ever been, and one that we will, potentially, want to use technology to solve.

Recently there’s been a lot of controversy and criticism surrounding the Samaritans’ app, Radar, and rightfully so: the app’s core function is to monitor the people in your Twitter timeline for key trigger words and phrases and, on identifying them, alert you that this person might be in trouble. The conversation surrounding this has raised necessary discussions about what intervention is, where technology should and shouldn’t be used, and a whole multitude of issues of consent, surveillance, and online communities. Radar is intrusive, and although it essentially enables peer-to-peer surveillance and gross misunderstanding, there’s a reason why Samaritans thought it should exist. They were trying to do a good thing, in a bad, misguided way. Stavvers wrote an excellent summation of why she wants it pulled, for the very specific, very important reason that not everyone will have your best interests at heart: ‘not everyone is going to be operating from a position of good faith’. When you’re down, that’s the time for your enemies to kick the hardest, and by accidentally facilitating this, Radar hands the power to those who know exactly how to exploit it.
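Radar’s core mechanic, as described in the coverage, amounts to keyword matching over a timeline. A minimal sketch (the trigger phrases and tweets here are invented, not Radar’s actual list) shows how much context that kind of matching throws away:

```python
# A naive sketch of timeline trigger-phrase matching, in the spirit of
# what Radar reportedly did. Phrases and tweets are invented examples.
TRIGGER_PHRASES = ["can't go on", "tired of everything"]

def flag_tweets(timeline):
    """Return any tweet containing a trigger phrase, with no sense of context."""
    return [t for t in timeline if any(p in t.lower() for p in TRIGGER_PHRASES)]

timeline = [
    "I literally can't go on holiday until this deadline passes",  # a joke about work
    "lovely walk in the park today",
]
# The holiday joke gets flagged; the matcher cannot tell irony from crisis,
# and anyone watching the flags sees it, friend or troll alike.
print(flag_tweets(timeline))
```

The false positive is the point: a string match has no idea who is joking, and, as Stavvers argued, no idea who is reading the alert.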

The Samaritans, who do a spectacular job, saw technology as a way of stepping in before something bad happens, because sometimes it’s hard to tell the warning signs. As I mentioned earlier, in The Babadook, Amelia’s family are barely there, and when they are, they are too wrapped up in the apparent complexities of their own mediocre dilemmas to even notice, or take action, when a cry is heard. So, rather clumsily, Radar is trying to respond to that need to intervene more quickly, more effectively, albeit with an added dose of emotional distance and obligation (see Joanne McNeil’s piece on Facebook adding a layer of obligation to our everyday interactions for an interesting perspective on this). Like many IoT devices, this interaction runs the risk of being yet another thing that we can ignore, or get bored of.

Radar, as a means of intervention, is a key example of techno-solutionism: designing not with people but at them, without anticipating the full set of consequences or problems that could arise. As Stavvers mentioned, it is almost perfectly designed for trolling. In Dan Lockton’s excellent essay ‘As We May Understand’, he stresses an important problem with IoT innovation: that we seek to correct, or change, behavior, rather than working with people to learn what they actually want, designing with, not for:

‘People (‘the public’) are so often seen as targets to have behaviour change ‘done to them’, rather than being included in the design process. This means that the design ‘interventions’ developed end up being designed for a stereotyped, fictional model of the public rather than the nuanced reality.’

For devices that do intervene, or help, Dan suggests ‘helpful ghosts’: stone tapes that provide ‘ambient peer support’ through pre-recorded messages, triggered by a specific series of parameters, thrown carefully into a customised recipe using something like If This Then That. On the surface, this sounds vaguely like Radar, but there’s a hope in this: that it is tailored to the individual, by another individual, and isn’t invasive or awful. The Anti-Clippy, if you will (as Dan points out).
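The ‘helpful ghost’ idea can be sketched as a handful of self-chosen condition-and-message recipes, in the If This Then That spirit. Everything below (the conditions, the state keys, the messages) is invented for illustration; the point is that the person, and someone who knows them, writes the recipes:

```python
# A minimal sketch of Dan Lockton's 'helpful ghost': pre-recorded messages
# from someone who knows you, replayed when simple, self-chosen conditions
# are met. All recipes and messages here are hypothetical examples.
def make_ghost(recipes):
    """recipes: list of (condition_fn, message) pairs chosen by the person themselves."""
    def ghost(state):
        # Replay only the messages whose conditions hold right now.
        return [msg for cond, msg in recipes if cond(state)]
    return ghost

mum_ghost = make_ghost([
    (lambda s: s["days_since_leaving_house"] > 3,
     "Mum: put the kettle on and ring me, love."),
    (lambda s: s["hour"] >= 23 and s["still_working"],
     "Mum: nothing good gets written after midnight. Bed."),
])

print(mum_ghost({"days_since_leaving_house": 5, "hour": 10, "still_working": False}))
```

Unlike Radar, the triggers and the voice both belong to the relationship: nothing is broadcast, and no stranger decides what counts as a warning sign.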

As we look for more ways to make our daily lives convenient through connected objects, we’re almost certainly going to see more in the way of intervention devices. They already exist: take Glow, for example (thanks to Meg Rothstein for alerting me to this), a pregnancy app that allows the pregnant person to add their partner to the app and, using the recorded data, sends the partner prompts to make the pregnancy more comfortable. ‘Your partner has recorded that they aren’t feeling too good today, why don’t you get them some flowers?’ What the app doesn’t know (because it doesn’t ask) is that your partner might hate flowers, or be allergic, and might just need a hug, or to be left alone. This layer of obligation prompts you to be kind in a way that will never feel genuine, because there’s not enough of your relationship there; it’s not personal, or made with you.

What does the future look like here if everything becomes an obligation? Where can we look beyond, to where intervention could be helpful? One of my immediate thoughts was applying this to older people who might not have left the house in a while, where a gentle nudge to a nearby designated person could let them know to drop them a line. In The Babadook, this comes in the form of a friendly neighbor, who, although at first pushed away, is eventually let in, and in some ways becomes the subtle, ongoing hero of the story, beyond the close of the film. It’s certainly a level of obligation, but in some way it’s not masquerading as something else (such as romance, with Glow, or suicide prevention, in the case of Radar), because we do need reminding sometimes. As always, comments welcome.