
How To Manipulate Elections In The 21st Century

10 May 2018

Propaganda isn’t a new idea, but we’ve been watching it evolve into a 21st century form. Throughout history, propaganda wasn’t always true, but it was usually believable. The word “propaganda” came into common use around 1914, but the practice it describes is far older: the ancient Greeks spread their ideas through games, theatre, courts, and religious festivals, the mass media of their day. As media evolved, so did propaganda’s channels. Today’s propagandist has far more outlets to choose from, including the Internet and social media, which can spread ideas faster than ever behind opaque fake identities that are hard to catch.

Although spreading stories to push a belief is nothing new, making them believable no longer matters. Starting the spread of a false story, or a series of them, is extremely easy online, and armies of fake social media accounts can be rented for very little. The goal is not to push a specific idea but to make it harder to trust anything: a scorched-earth approach intended to sow doubt, break faith in institutions that should otherwise be trustworthy, and manipulate politics with the resulting paranoia. Even publishing completely contradictory information, whether any of it is true, is effective, because it spreads like wildfire when shared enough times on social media. People lose track of what’s true and begin to question everything. A firsthand account from Elizabeth Flock at PBS is an interesting read.

Foreign-backed (primarily Russian) online misinformation campaigns have been widespread and growing. All the major social networks have reported removing hundreds, or far more, of accounts found to be part of misinformation “bot” networks, and Facebook launched a tool that showed users whether they had followed any Russia-backed information campaigns. The campaigns’ effects aren’t limited to the Internet, though they spread quickly through social networks. Russia-backed groups have launched real-life protests and counter-protests, everything from LGBTQ+ rallies to 2nd Amendment rallies; in summer 2017, two different Russia-backed pages organized dueling rallies at the same location in Texas. The campaigns have taken the form of Facebook pages and purchased ads as well as leaked emails, among other tactics.

It’s easy to hope that you would never be duped by a misinformation campaign or a foreign power trying to manipulate opinion, but even bona fide activists have been fooled into helping with rallies. We don’t really know how much recent events were affected. Congress recently released some 3,000 Facebook ads paid for by Russian groups, and Twitter said it removed more than 50,000 Russian bot accounts from its network in January 2018. Many of the campaigns, such as some “Antifa” accounts, appeared designed to make people angry, for example by posting about defacing an opposing group’s materials. Others, such as a “Blacktivist” group, tried to organize rallies in the wake of tense events, and still others advocated for the secession of Texas.

The issue of Russian bots and misinformation is more than an online conspiracy theory. Researchers have been slowly tracking down bot accounts and have found that human and bot accounts on Twitter behave quite differently. Sometimes it’s obvious even to the untrained eye: bots in the same campaign sometimes share the same message just seconds apart, posting in alphabetical order by account username. In some instances pointed out by Twitter users, the Russian owner of an account forgot to turn off location services, which tag the actual location a tweet was sent from (though this can be spoofed). In one case, bot networks started tweeting in defense of a Russian action before it actually happened.
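To make that telltale pattern concrete, here is a minimal sketch of how one might flag it in a batch of posts. The usernames, messages, and thresholds below are invented for illustration; real bot detection is far more involved, but the heuristic from the paragraph above (same message, seconds apart, alphabetical username order) is easy to express:

```python
from datetime import datetime, timedelta
from itertools import groupby

# Hypothetical post records: (username, message, timestamp).
posts = [
    ("acct_alpha", "Vote now!", datetime(2018, 5, 1, 12, 0, 0)),
    ("acct_bravo", "Vote now!", datetime(2018, 5, 1, 12, 0, 3)),
    ("acct_charlie", "Vote now!", datetime(2018, 5, 1, 12, 0, 6)),
    ("human_user", "Lunch was great", datetime(2018, 5, 1, 12, 0, 7)),
]

def flag_coordinated(posts, window=timedelta(seconds=10), min_accounts=3):
    """Flag groups of accounts that posted the identical message within a
    short time window, in alphabetical username order. Returns a list of
    suspicious username groups."""
    suspicious = []
    # Sort by message text so groupby clusters identical messages together.
    posts = sorted(posts, key=lambda p: p[1])
    for _message, group in groupby(posts, key=lambda p: p[1]):
        group = sorted(group, key=lambda p: p[2])  # order by timestamp
        if len(group) < min_accounts:
            continue
        usernames = [p[0] for p in group]
        within_window = group[-1][2] - group[0][2] <= window
        alphabetical = usernames == sorted(usernames)
        if within_window and alphabetical:
            suspicious.append(usernames)
    return suspicious

print(flag_coordinated(posts))
# → [['acct_alpha', 'acct_bravo', 'acct_charlie']]
```

Real campaigns randomize timing and wording precisely to defeat checks this simple, which is part of why catching them at scale is hard.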

Part of the problem, likely the more easily reparable part, is that social networks have little incentive to address misinformation campaigns and bots. The social media space is largely unregulated, and topics that rile up a group attract more attention, which means more ads served. However, as bots get better at looking human (and many are run by real people), they get harder to catch. Social media and the Internet at large need to get better at surfacing coordinated campaigns and dealing with the accounts behind them. To their credit, some social media sites appear to be taking the issue seriously: in 2017, Facebook announced plans to hire 1,000 more people to review ads, and more recently it stopped accepting foreign-funded ads about an abortion referendum in Ireland.

However, regardless of the actions social media sites take, whether on their own or due to government regulation, we need to be more diligent about vetting what we see online. Russia’s general foreign objective isn’t to advance a specific idea or interest; it’s to weaken anyone it sees as an adversary. It’s possible to contain the effects, as France did during its 2017 election, by being aware of manipulation attempts and having rules in place to counter them. At the very least, we need to remember that social media is not a reliable news source, especially because it can mirror our own beliefs back to us instead of giving us credible information.

Care about an open and neutral Internet? Check out my book, Please Upgrade for Access, at book.thenaterhood.com.
