Fake Bot Content Is Hard to Spot – and the Problem Is Getting Worse

Posted October 28th, 2016 at 11:05 am (UTC-5)

(VOA/T. Benson)

Sometimes you see them, sometimes you don’t. But they are all over social media. Bots, short for robots, are automated applications – both good and bad – that either help with Web search and repetitive tasks or wreak havoc. And as more of them flood cyberspace, authenticating the content they push will be a major challenge.
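
To make that concrete, the sketch below shows roughly what a harmless, repetitive-task bot amounts to: a short script that checks a page on a schedule and drafts a post for anything new. The feed address and the publishing step are placeholders for illustration, not a real service or platform API.

```python
# A minimal sketch of a benign, self-declared bot: it checks a page for new
# headlines on a schedule and drafts a short post for each one. The URL and
# the publish() step are hypothetical; a real bot would call a platform API.
import time
import urllib.request

FEED_URL = "https://example.com/headlines.txt"  # hypothetical plain-text feed
seen = set()

def fetch_headlines():
    """Download the feed and return one headline per non-empty line."""
    with urllib.request.urlopen(FEED_URL) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    return [line.strip() for line in text.splitlines() if line.strip()]

def publish(post):
    """Placeholder for a posting API call; this sketch only prints the draft."""
    print("BOT DRAFT:", post)

while True:
    for headline in fetch_headlines():
        if headline not in seen:           # only react to items we haven't seen
            seen.add(headline)
            publish(f"New headline: {headline} #bot")
    time.sleep(300)                        # repeat the chore every five minutes
```

Swap the print statement for a social network’s posting interface and the same loop becomes the kind of account described below, for better or worse.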

Just this week, a supposed Facebook Live video went viral, purportedly showing a live feed of an International Space Station spacewalk. There was just one problem: it wasn’t real.

Neither NASA nor Facebook Live had any reference to this “event.” NASA typically announces spacewalks in advance and dedicates the day to them. No spacewalk was scheduled Wednesday, as NASA confirmed. Yet the story racked up mentions, views and likes across the internet. And by the way, did you notice the “2013” mention in the headline on the video?

In August, Facebook automated the descriptions for its Trending section. Then a story started trending that claimed Fox News’ Megyn Kelly had been labeled a “traitor” and was subsequently fired.

The story, one of many bot-curated suggestions, was false, and it was not the only one.

Facebook is working to refine the computer algorithm that learns from its human creators how to determine which stories are trending. Before automating the Trending section, humans monitored content to filter out offensive or inappropriate material.
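
Facebook has not published that algorithm, but the general approach is familiar: a classifier trained on examples that human reviewers have already judged. The minimal, hypothetical sketch below, using scikit-learn, shows the idea; the headlines and labels are invented.

```python
# Not Facebook's system: a generic sketch of an algorithm that "learns from
# its human creators" -- a text classifier trained on headlines human editors
# have already labeled. The headlines and labels below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

headlines = [
    "NASA schedules spacewalk for next month",
    "Senate passes budget bill after long debate",
    "SHOCKING: anchor fired for being a traitor, sources say",
    "You won't believe this one weird trick doctors hate",
]
labels = ["ok", "ok", "reject", "reject"]   # judgments supplied by human reviewers

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(headlines, labels)                # the machine learns from human decisions

print(model.predict(["Anchor fired as traitor, insiders claim"]))
```

A model like this is only as good as the examples its human teachers give it, which is exactly the limitation the Trending episode exposed.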

But “why couldn’t the machine tell what was real and what was false?” asked Christo Wilson, assistant professor of computer and information science at Northeastern University in Boston.

Humans are prone to make similar mistakes, he quickly added, and they are already having problems figuring out which bot-driven content is real.

Efforts are underway to find a way to authenticate bot-driven content. But Wilson warned that this will not be an easy task, compared with the ease with which an individual can produce online content and then “buy or rent bot armies to promote it.”

“And it’s not that difficult to make it look realistic and sophisticated,” he added.

Getting bots is easy – and cheap. Thousands of bots can be leased or bought for a few dollars and used to promote your business or your point of view, buy followers or retweets, mine for data, spam social media users or hijack their accounts.

A screenshot from the ‘Fake Bloomberg News’ bot account, now suspended, on Twitter. (Twitter)

All it takes is for people to start writing bots to create numerous social media accounts and then “start spreading a certain narrative using these accounts,” said Distil Networks CEO Rami Essaid.

That includes false opinion polls that seek to “convey a false reality.”

“We don’t know just how many of these are influenced by subjective users going to certain polls or if it is influenced by bots that are skewing the narrative,” he said. But that opinion, or this reality, “is real to someone,” he added, and it is “being magnified by orders of magnitude that aren’t real.”

So instead of one person voicing one opinion on social media, bots automate that opinion and simulate hundreds of people – “tens of thousands of people or more,” said Essaid – all replicating the same opinion.

“It sways the conversation,” he added. “It really influences the narrative and kind of tilts the scale for what could be just this one person’s opinion.”
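
One rough way researchers look for that kind of amplification is to check whether many accounts are posting nearly identical text. The sketch below illustrates only that single signal, with invented sample posts; real detection systems weigh many more features.

```python
# A rough illustration, not a production detector: one simple signal of the
# amplification Essaid describes is many accounts posting nearly identical
# text. The sample posts below are invented.
from collections import defaultdict
import re

posts = [
    ("user_a", "Candidate X clearly won tonight's debate!!!"),
    ("user_b", "Candidate X clearly won tonight's debate"),
    ("user_c", "candidate x CLEARLY won tonights debate!"),
    ("user_d", "I thought the moderators did a decent job."),
]

def normalize(text):
    """Lowercase and strip punctuation so near-duplicates collapse together."""
    return re.sub(r"[^a-z0-9 ]", "", text.lower()).strip()

clusters = defaultdict(list)
for account, text in posts:
    clusters[normalize(text)].append(account)

for text, accounts in clusters.items():
    if len(accounts) > 1:   # the same message from several accounts is suspicious
        print(f"{len(accounts)} accounts posted: {text!r} -> {accounts}")
```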

“It’s a very tough situation,” added Wilson. “There’s a lot more information. … The news cycle is so aggressive, it doesn’t leave that much time to vet things. But you have to be really careful because any one [post] can look authentic. You just have no idea what the veracity or the provenance of any of this information is.”

At some point, Essaid said, checks and balances will be necessary, and social media companies that take the number of bots in their networks seriously will need to be part of the discussion around what is real and what is not.

A screenshot from ‘Dear Assistant,’ a self-declared Twitter bot account. (Twitter)

“When you have a wide open network like Twitter, where … up to 30-50 percent of somebody’s followers are bots, in those cases, we should be more aware of it,” he said. “Even if you want to reference it as a journalist, you should reference it but also caveat it so just the legitimacy and the precaution used by each network or each poll – or whatever – should be part of the conversation so that people are more aware.”
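
There is no simple way to count bot followers exactly, but the kind of back-of-the-envelope check Essaid is gesturing at can be sketched: score each follower against a few crude red flags and report what share looks automated. The thresholds and sample accounts below are invented for illustration.

```python
# A back-of-the-envelope sketch of the kind of check a journalist could run:
# score an account's followers with a few crude heuristics and report what
# share looks automated. Thresholds are arbitrary and the sample data is
# invented; real bot detection is far more involved.
from dataclasses import dataclass

@dataclass
class Follower:
    has_profile_photo: bool
    followers: int
    tweets_per_day: float
    account_age_days: int

def looks_automated(f: Follower) -> bool:
    """Count crude red flags: default-looking, young, hyperactive accounts."""
    flags = 0
    flags += not f.has_profile_photo        # still using the default avatar
    flags += f.followers < 5                # almost nobody follows back
    flags += f.tweets_per_day > 100         # posts at an inhuman rate
    flags += f.account_age_days < 30        # freshly created
    return flags >= 2                       # two or more red flags

sample = [
    Follower(True, 250, 4, 900),
    Follower(False, 1, 300, 10),
    Follower(False, 2, 150, 12),
]
share = sum(looks_automated(f) for f in sample) / len(sample)
print(f"Estimated bot-like share of followers: {share:.0%}")
```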

Facebook, Twitter, Twitch and Instagram have all been waging a war against millions of fake bot accounts, “Twitterbots” and Instagram spam bots. While some accounts are suspended, others escape or crop up again.

Social media services that are serious about news content “could probably be doing a better job and try to fight these things more aggressively,” said Wilson. But they have a “difficult socio-technical challenge in front of them,” he said, as they balance ease of use and increasing their subscriber base with “trying to very tightly control all these accounts and determine what is real and what is fake.”

Aida Akl
Aida Akl is a journalist working on VOA's English Webdesk. She has written on a wide range of topics, although her more recent contributions have focused on technology. She has covered both domestic and international events since the mid-1980s as a VOA reporter and international broadcaster.

One response to “Fake Bot Content Is Hard to Spot – and the Problem Is Getting Worse”

  1. John Pike says:

    The problem is worse than described. The broadcast evening news, and cable news, are increasingly just summaries of what is trending on twitter. Editorial judgement has been replaced by market research, and if a story is trending on twitter it must have an engaged audience, or so they say.
