It only takes about five minutes to buy a Facebook ad with a link to a fake anti-Muslim story and aim it at residents of a Michigan electoral battleground county whom the social network's algorithms perceive to be "very conservative."
This capability is at the heart of an ongoing controversy around revelations that a Russian troll farm gamed Facebook's ad targeting apparatus with exactly these sorts of purchases. Facebook has admitted that it sold 3,000 ads to entities aligned with the Russian government’s campaign to influence the presidential election. The Kremlin-linked actors reportedly used Facebook’s tools to target ads around electoral geography, feelings toward Muslims, and racial divisions, among other political pressure points.
Facebook claims around 10 million Americans saw them. But one social media analyst estimates the actual number to be at least double that once free, unpaid posts are accounted for. Upon being alerted to these findings, Facebook promptly scrubbed the trove of thousands of posts and data from which the number was derived and told the Washington Post that the analyst's ability to access it in the first place was a "bug."
But one need not have data sets or the backing of state intelligence agencies to follow in the footsteps of Russia's would-be election manipulators. Facebook offers the same means to sow chaos to anyone who bothers to create a page and spend a few dollars.
To demonstrate the extent of these creepy capabilities, we put ourselves in the shoes of a Russian troll. What are the most efficient means of causing maximum disruption to a country’s political system and societal fabric via Facebook?
The easiest way to ensure an audience of people susceptible to a message on Facebook is to use the information users have volunteered about their interests, hobbies, and demographics.
Say someone wanted to reach a set of people with anti-Muslim views, as the Russian actors reportedly did.
A good starting place might be the more than 800,000 Facebook users who’ve expressed interest or liked pages related to far-right firebrand Pamela Geller, whom the Southern Poverty Law Center (SPLC) describes as “the anti-Muslim movement's most visible and flamboyant figurehead.” Ditto for the 500,000 fans of SPLC-designated hate group leader Brigitte Gabriel or fellow traveler David Horowitz’s 200,000.
At this point, Facebook’s ad platform will begin generating automated suggestions to round out the group, including interests in Breitbart, Alex Jones’ Infowars, Ann Coulter, and, of course, Donald Trump.
Someone in the market for a different strain of political strife might exploit other available Facebook interests like the “9/11 Truth Movement” conspiracy theory (340,000 users), the SPLC-designated anti-LGBT hate group Family Research Council (380,000 users), or the “National Fascist Party” (1.3 million users).
More innocuous hot-button options might include the National Rifle Association, “Free Palestine,” or U.S. Border Patrol.
A ProPublica report last month exposed a number of outrageously offensive categories Facebook’s algorithms had created based on user activity, including “Jew Haters,” “Nazi Party,” and “How to burn Jews.”
Facebook claimed these terms had slipped through the cracks of its vetting operation because they applied to relatively few people. It hastily scrubbed them and later cracked down further by removing certain types of user-submitted topics altogether.
But as is the case with many of Facebook’s controversies, the social network's targeting capabilities remain so vast that it’s plausible for an advertiser to approximate an audience that includes people who might sympathize with these hateful sentiments anyway.
In some cases, blocking these alternatives might require Facebook to make a more subjective moral judgment than it’s comfortable doing. Who are Facebook employees to say whether an activist’s views on Islam cross into bigotry?
Other terms are potentially harmless—though they can seem unsavory. The 3.7 million Facebook users who’ve expressed interest in fascism could be scholars of the far right or World War II history buffs. They could also be Nazis.
Taking a cue from Larry Kim's investigation into Facebook's ad vetting this week, we took out an ad with a link to an anti-Muslim article on a site known to peddle fake news. The target audience was set to fans of Gabriel or Horowitz, a swathe of users that Facebook's ad tools described as "great!" in terms of size and reach.
Facebook approved the purchase several hours later, and we immediately ended it. No ads were actually displayed.
Facebook approved this ad several hours after submission.

The social network did take issue with another fake news-loaded ad we tried. In that case, a new program Facebook rolled out earlier this year to combat misinformation was able to recognize the suspicious lack of text and abundance of seedy ads on the page in question.
Facebook blocked this particular fake news link.

Online marketing forums have suggested that the more ads you buy and the more money you spend on a particular page, the quicker Facebook's approval process becomes. The sixth ad attempted on our shell page—a fake news link aimed at 557,000 "conservative" and "very conservative" residents of Macomb County, Michigan—was approved within a couple of minutes of submission.
Facebook allows location targeting by country or city with a radius of up to 50 miles.

User-submitted information is only one piece of Facebook’s targeting toolkit. The company’s software also tries to deduce all kinds of user attributes based on Facebook habits, outside browsing history collected through trackers and cookies embedded around the web, and third-party firms that match offline store purchases with corresponding Facebook users.
Facebook traces patterns in this massive data trove in order to arrange people into perceived groups for advertisers. It guesses your political affiliation, your level of interest in politics, and how much and what sort of television you watch. It tries to determine whether you’ve bought beer or hard spirits, cold medicine, or plus-size outfits in brick-and-mortar stores. It estimates your net worth, your income, and your “multicultural affinity,” an obvious stand-in for race.
Facebook's targeting options for types of alcohol (top left), television habits (top right), political and social causes (bottom left), and 'multicultural affinity' (bottom right). All of these categories use information not necessarily volunteered.

Some of these categories might seem oddly specific. One section is dedicated to whether you happen to be away from your family, your hometown, or your significant other. Others include the birthdays of your close friends, potential moving plans, and whether you’re a “corporate mom,” a “big-city mom,” and/or a “green mom.”
All of these determinations have innocent, if maybe off-putting, uses for legitimate advertisers. Many people actually prefer more personalized advertising that has a better chance of being relevant.
But some of these categories are fairly recent additions, and their more insidious potential may not be fully realized. For instance, another ProPublica investigation last fall determined that “multicultural affinity” could easily be used to direct discriminatory housing and jobs ads in violation of civil rights laws (Facebook has since taken various steps meant to prevent this).
Other identifiers also lend themselves to targeting people's vulnerabilities in obvious ways. For instance, Facebook has been testing a feature that lets users who might be recovering alcoholics opt out of liquor ads or those who may have recently lost a child block parenting promotions.
In terms of political subversion, the possibilities are hard to know. Obviously, actors benefit from data on traits like political affiliation and interest level, cable news consumption, and perceived proclivity towards certain political or social causes such as "veterans," "religion," or "environmentalism."
This section of Facebook's ad targeting panel is more comprehensive than whatever political camp you might purposefully indicate on the site.

But the significance of various combinations is less immediately clear. Does targeting political junkies who buy spirits, take “casino vacations," and earn a salary of less than $50,000 (99,000 users) yield any meaningful effect? Do the 83,000 people who buy hunting equipment and show interest in the chemtrails or the illuminati conspiracy theories share any exploitable political tendencies?
Facebook has determined that nearly 900,000 of its users are gambling enthusiasts with at least nine lines of credit.

Those are left-field examples, and each combination is of course subject to Facebook's individual ad vetting. But the point is that the possibilities are essentially limitless.
Again, we tried these tools out with a link to one of the top anti-Muslim stories recently debunked on Snopes and a target audience that included Geller fans who purchase hunting equipment and hard alcohol. Facebook similarly approved the combination, despite its ad tool's warning that the audience was too specific.
Facebook approved this ad several hours after submission.

To anyone with even a passing familiarity with the internet, the idea that online ads could sway an election might sound ridiculous on its face. Most digital advertising is cheap, annoying, and easily ignored.
Facebook is different. The ads it sells are marked with a small-print “sponsored” tag but are otherwise identical to every other post. They often consist of a simple article link or autoplay video.
Even the most media-savvy Facebook users can easily overlook this label on occasion. While Facebook scores better than other parts of the web, market research shows people generally have trouble distinguishing sponsored posts from surrounding content on the internet.
The Russia-linked Facebook ads that have been revealed often used colloquial language, amateur-ish memes, and conversational prompts to blend in with their target's feed. Their authors tried to pose as like-minded, opinionated individuals, the New York Times reported.
To an American public that increasingly relies on Facebook as a primary news source, the ads may have just appeared to be another voice in the fray. That's exactly how Facebook designed them to be—as long as advertisers press the right buttons.