Are algorithms damaging the movies?


They affect what’s made, what we watch and how films are written about: we take a look at the quiet power that algorithms have over modern cinema.

Around ten years ago, Noel Clarke was developing a movie set in and around the world of MMA fighting. He’d got the script together, and the project looked to be moving forward. And then the company that was set to press ahead with it ran the idea through a computer simulation it had at its disposal. Factoring in the underwhelming gross for the acclaimed, Tom Hardy-headlined Warrior in 2011, the computer, as the cliché goes, duly said no. Clarke’s film was put into turnaround, where it’s stayed ever since. That computer program was a large part of the reason why.

Computers have been helping determine big decisions in the movies for some time now, of course: from the huge simulated models required to plot major takeovers (Bob Iger’s book, The Ride Of A Lifetime, talks about this in relation to Disney’s planned takeover of Fox) to the analysis cinemas run when deciding which films to screen. And, to be clear, I think computer intelligence and algorithms have their place.

But what I struggle with is when they almost entirely override basic human instinct, from the choices over the films that are being made, to the way films are being written about, and how we get to see them. Algorithms are at the heart of so much of modern life, and yet we rarely see them, let alone have a chance to properly understand them (although in the UK, they’ve made headlines thanks to the current exam results fiasco, of course). What we’re seeing instead is their major role in decision making. And it’s easy to understand why: it’s much easier to blame a computer when something goes wrong for a start.

Most overtly in the movies, streaming services are fuelled by analytical data. In fact, Netflix’s first big self-made TV success was greenlit off the back of it.

The show in question was House Of Cards, and at the time it was greenlit, two things caught the attention. The first was that a streaming service was suddenly making a major TV series with its own money (that really was a big deal back when the deal was announced), and the shockwaves of that have rippled through the industry since. The second was that it ordered two seasons up front, for a cost of $100m, without seeing a single episode first. No traditional TV network could or would do that.

But Netflix had crunched its many, many numbers.

As Wired reported back in 2012, the streaming service was gambling hard on ‘big data’, analysing the viewing habits of its then-29 million subscribers (a number that has ballooned since). As Netflix’s Jonathan Friedland told Wired at the time, “we know what people watch on Netflix and we’re able with a high degree of confidence to understand how big a likely audience is for a given show based on people’s viewing habits … We want to continue to have something for everybody. But as time goes on, we get better at selecting what that something for everybody is that gets high engagement”.

There’s no question it managed that. House Of Cards became its first exclusive huge hit, and many more followed. And then, following that success, other streamers followed in its path. This transferred to Netflix movies too. As much as the industry sneered when it inked a hugely expensive exclusive deal with Adam Sandler for four movies (the first of which was The Ridiculous Six) – a deal that’s been extended twice since – the firm knew what it was doing. It had the numbers to back it up. Never mind whether the films were any good, it was certain that people would watch. They did.

Chatting to my ex-Den Of Geek colleague Ryan Lambie, we were musing that Netflix now feels better for TV shows than films, but then that’s the algorithm at work too. It’s registering engagement, and because it wants us watching for longer, it makes more sense to put most of its energy into television productions with multiple episodes. Sure, one big $150m film gets the headlines, but how many of us have watched 6 Underground more than once?

Major movie studios, without the annual budget of Netflix, have become more risk averse with their big bets as streaming has grown. And they’ve been calling more and more on algorithm-based models to choose their bets and manage their gambles.

The Verge, last year, ran a piece explaining how a tool called ScriptBook predicts the success of a movie purely from a computer analysis of its script (ScriptBook’s website gives an overview). A separate tool called Vault gives studios a detailed analysis of how movie trailers are being received online.

Pilot – the AI tool, not the excellent podcast and magazine – goes further, offering “unrivalled accuracy” as it predicts box office numbers 18 months ahead. Although how each of these tools would have accounted for the last six months is questionable.

In 2018, meanwhile, 20th Century Fox – just before it fell to Disney – admitted that it was using an AI tool to micro-analyse its movie trailers to work out which part of them would be most appealing.

To give you an idea of how human all of this is, the aforementioned ScriptBook describes itself as “democratizing storytelling through the art of AI”. A sentence that, for a start, looks like it’s never seen a human being. That hasn’t stopped films such as Knives Out, Baywatch and, er, Gotti being put through its servers. Even indie productions are amongst the many names the company boasts have utilised its services.

Countering this, there’s the story of Megan Ellison and her Annapurna Pictures company. She had the finance and the clout to press ahead with a slate of pictures based far more on human instinct and choice. Thanks to her, pictures such as Detroit, If Beale Street Could Talk, The Sisters Brothers and Booksmart got through the Hollywood system. But whilst the films were often hugely acclaimed, the box office wasn’t matching the expenditure, to the point where the company was reported to be fending off bankruptcy in 2019 (which, thankfully, it appears to have done).

It’s a grim picture. There are some who get films through such a system by knowing how to work it. James Mangold recognised he had a window of opportunity to get a passion project made after the success of Logan, and thus managed to get Fox to greenlight Le Mans 66 (on the proviso he could keep the budget under $100m, necessitating some clever script work). But Fox – which was making 15 or 16 films a year – will be lucky now to make a quarter of that number.

Hand in hand with this is the way algorithms are affecting how film is written about, particularly online.

For an entertainment website to thrive in the modern environment, it has to play by rules set by huge organisations: primarily Google and Facebook. The Google algorithm in particular is pivotal, and it keeps shifting (and, in fairness, keeps trying to do the right thing). But the bare basics usually involve hitting keywords, making sure stories are a certain length, writing headlines that hit those keywords, and, if there’s a successful topic, writing 20 articles about that rather than 20 articles about 20 different films. It pays to zero in on big hit movies, over and over, if you want lots of clicks.

If you wonder why lists, ‘everything you need to know’ features about upcoming films, ending-explainer articles, trailer breakdowns and ‘will there be a sequel to’-style articles are so prevalent, it’s because they get the clicks. I’d argue that in an algorithm-defined world, clicks are of more value, at least in the short term, than readers (although there’s obviously a strong argument that these are the same thing). And given how many websites have taken away comments on articles (there’s an article in that I may write one day), that level of engagement wasn’t required for a while.

That’s now reversing, though, as Google looks to prioritise a high time spent on an individual site. I’ve spoken to a few people working day-to-day in this area, and they all argue that Google is at least trying, and pushing in the right direction. But there’s also a huge industry in itself devoted to working out how best to play the Google system.

What these algorithms are also doing is artificially bloating articles. Google has a habit of punishing websites that spend 200 words on a topic even if that’s all the story needs, and will instead regard something of 400-500 words as ‘more substantive content’. Thus, if you’re reading an article that takes three paragraphs to get anywhere near the question asked in its headline, that’s why. It’s actually part of Google’s drive to keep people more engaged, but it’s being gamed by publishers looking for hits. My admiration for sites backing proper longform writing in this era – including my former employer – is sky-high. After all, the incentive to do more than Google requires isn’t always there.

What’s also gone less noticed is the growing number of sites deleting older news articles too. This is to keep load times fast as pages come under greater weight from growing numbers of adverts. Furthermore, there are quality issues with a lot of older material too. But still: older editorial is being sacrificed to keep a site working at a decent speed, and to avoid being penalised by Google. Again, it’s hard to blame the sites concerned, and this is very much not a dig at them.

I’ll happily dig at Facebook, though. The social media giant spent time persuading publishers to turn their attention to video rather than articles, and let’s just say that its algorithm proved ‘unreliable’. This BBC article goes into that in a bit more detail: the consequence was that many publishers dismissed their writers in favour of video teams, with false analytics one reason why. An instance where the algorithm was plainly wrong, but nobody would seemingly argue with it.

Then, of course, there’s clickbait, the key lure for getting noticed on social media platforms. Algorithms have a habit of rewarding pieces that get lots of clicks, and the surest way to get them is a clickbait headline. That much you probably guessed.

Finally, let’s go back to streaming. A report landed earlier this week that Netflix is going to take the effort out of choosing what to watch, by trialling automatically starting another film or TV show. It does so based on the algorithm’s analysis of what you’ve watched before.

Given it’s already impossible to get a straight list of what’s available on a streaming service, most of us are choosing what to watch on a film night from a list of 40 or 50 titles, curated by a computer. Now, we won’t even need to choose. We’re turning into The Jetsons, where even the act of pushing a single button is a strain, and cause for a sit down.
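To give a feel for the shape of the thing, here’s a deliberately crude sketch of the idea described above: rank everything by how closely it matches what you’ve already watched, show only a short curated slice, and auto-play the top scorer. The titles, tags and scoring below are invented purely for illustration; a real recommender is vastly more sophisticated than this, but the narrowing effect works the same way in outline.

```python
# Toy illustration (not any streamer's actual system) of engagement-matching:
# score each catalogue title by how much its tags overlap with a viewer's
# history, surface only a short shortlist, and auto-play the top result.

from collections import Counter

# Hypothetical catalogue: title -> descriptive tags
catalogue = {
    "Big Superhero Sequel": {"action", "franchise", "blockbuster"},
    "Quiet British Drama": {"drama", "ensemble", "true-story"},
    "Gritty Crime Series S1": {"crime", "drama", "series"},
    "Explosion Fest 3": {"action", "blockbuster"},
}

# Hypothetical viewing history, as counts of tags the viewer has watched
history = Counter({"action": 5, "blockbuster": 4, "drama": 1})

def score(tags):
    """Sum how often each of a title's tags appears in the viewing history."""
    return sum(history[tag] for tag in tags)

# Rank the whole catalogue, but only ever show the viewer a small slice of it
ranked = sorted(catalogue, key=lambda title: score(catalogue[title]), reverse=True)
shortlist = ranked[:2]      # the '40 or 50 titles', shrunk to two here
autoplay_next = ranked[0]   # what starts playing if you don't choose at all

print("You might like:", shortlist)
print("Up next:", autoplay_next)
```

Run it and the action blockbusters crowd out the quiet drama every time: nothing the viewer hasn’t already shown a taste for ever makes the shortlist, which is the narrowing the rest of this piece is worrying about.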

I think of a film like Pride in the midst of all of this. It’s one of my favourite films of the last decade, and I’ve long argued it’s a big mainstream movie, with a core of steel to it. But where does that fit into this current culture?

The only way it was made was via a mix of funding sources. It then struggled to get much in the way of attention from film outlets, because the clicks – and in turn, the revenue – are in Marvel movies and their ilk (no slight on Marvel movies there either). And finally, when it arrives on a streaming service, what chance does it have of bubbling to the top? It’s the same challenge faced by independent and smaller productions, which have always struggled for attention. But algorithms have diluted the impact of counter-programming for a start, and the irony of having so much choice at our fingertips is that algorithms are in turn narrowing the options presented to us. The films on the bottom shelf of the video store in the 80s have precious little chance of being found today.

I appreciate this all paints quite a cloudy, downbeat picture (and I’ve not even got to trying to search for a DVD you want on Amazon, and having to wade through its suggestions and sponsored results before you get to what you want), but I’d temper that by suggesting that a lot of good material comes out of all of this work. Algorithms clearly aren’t all bad. When it actually comes to making a film, once the greenlight has been given, I don’t believe a director is sat there with a computer printout telling them what they’re allowed to do (although editing and testing are another story).

But I also think that word of mouth has never been more important. As much as it’s fun to discover a movie from a list of great films you might have missed – ironically, just the kind of material that’ll bubble up on Google – what’d be more refreshing is if the deck weren’t so loaded against such a movie in the first place.

My suggestions would thus be obvious. If you see something you like, tell someone. Go on social media and recommend that book, film, website or article, especially if it’s gambling on going against what’s ordinarily successful.

Click on a like button, or even better a repost button. Do something to give a smaller, independent movie a fighting chance of getting noticed. Because the ramifications of a surprise success aren’t just about wrong-footing the computer. They’re about very slightly altering that algorithm, and giving a human being an extra argument when they’re trying to get a film made that a computer simulation would otherwise reject.

Algorithms have helped the movies in lots of ways, I do see that. But human beings? They’re really not bad either. And perhaps good old-fashioned instinct shouldn’t be entirely washed away.


