John Oliver covered the dangers of AI on his weekly HBO show, calling it “worryingly corrosive” for society.
On Last Week Tonight, Oliver said the “spread of AI generation tools has made it very easy to flood social media sites with cheap, professional-looking, often deeply weird content”, using the term “AI slop” to describe it all.
He referred to it as the “newest iteration of spam”, with weird images and videos flooding people’s feeds and some people having “absolutely no idea that it isn’t real”.
Oliver said that it’s “extremely likely that we are gonna be drowning in this shit for the foreseeable future”.
With content such as this, “the whole point is to grab your attention”, and given how easy it has become to make, the barrier to entry has been lowered.
Meta has not only joined the game with its own tool but has also tweaked its algorithm, meaning that more than a third of the content in your feed is now from accounts you don’t follow. “That’s how slop sneaks in without your permission,” he said.
There are monetisation programs for people who successfully go viral, and now a range of AI slop gurus who will teach people what to do for a small fee.
It’s “ultimately a volume game like any form of spam” and has led to AI generators ripping off the work of actual artists without crediting them. But “for all the talk of riches in those slop guru videos, the money involved here can be relatively small”.
It can earn as little as a few cents, and sometimes hundreds of dollars if it goes mega-viral, which means that a lot of it comes from countries where money goes further, such as India, Thailand, Indonesia and Pakistan.
One of the downsides is having to explain to parents that content isn’t real. “If you see an animal that’s so cute it defies reality and it’s not Moo Deng, odds are it’s AI,” he said.
There’s also an environmental impact to the resources needed to produce it as well as the worrying spread of misinformation.
Oliver spoke about the many fake disasters that have been created with images and videos showing tornados and explosions and plane crashes that don’t exist. “Air travel is scary enough now without people making up new disasters,” he said.
Generative AI has also been used during the Israel-Iran conflict, and it posed problems for first responders during the flooding in North Carolina last year. Republicans also used fake images to claim Biden was mishandling the latter situation, and figures on the right continued sharing them despite being told they weren’t real.
“It’s pretty fucking galling for the same people who spent the past decade screaming ‘fake news’ at any headline they didn’t like to be confronted with actual fake news and suddenly be extremely open to it,” he said.
While the spread wasn’t as damaging as some had feared during last year’s US election, AI is “already significantly better than it was then”.
He added: “It’s not just that we can get fooled by fake stuff, it’s that the very existence of it then empowers bad actors to dismiss real videos and images as fake.”
Oliver said it’s all “worryingly corrosive for the concept of objective reality”, with platforms finding it harder and harder to detect AI.
“I’m not saying some of this stuff isn’t fun to watch, I’m saying that some of this stuff is potentially very dangerous,” he said.