Facebook just announced that it will start allowing users to opt out of political advertising, though it hasn’t yet said when or how. Will this decision cripple digital campaigning in a presidential election year?
First question: how many people will actually turn off political ads if given the chance? If Facebook makes the process obvious and easy, more users may decide to do it. If it’s a less-visible feature, fewer will exercise the choice. But browser-based ad-blockers have been around for years, and the commercial and political banner-ad industries have survived just fine. As an indicator, how many people actually update their privacy settings when Facebook gives them the option? Not a lot.
Like other restrictions on digital political advertising, if ad-blocking does take off, it will hurt down-ballot campaigns, small organizations and unpopular opinions the most. Big campaigns (and corporations) have plenty of ways to get their messages out; people trying to change the world usually have fewer. Plus, Facebook’s decision-making process about what constitutes “political” advertising is too opaque for us to let it unilaterally determine who can exercise what the U.S. Supreme Court has decided is protected free speech.
Consider a coal company touting the virtues of its carbon-capture technology in a Facebook promoted post. Corporate marketing, for sure, and almost certainly not blocked as political content. But an environmental group using an ad to call out that company for lying about said technology? Likely required to be marked as “political” content and eminently blockable.
Meanwhile, some Democrats propose similarly counterproductive regulations, while Trump’s Justice Department makes recommendations destined to go nowhere. A tense stand-off between Mark Zuckerberg and his critics is probably the best we can hope for this year, which is better than a cop-out like Twitter’s ad ban. But Facebook, Google and Twitter have made themselves central to our communications lives. If they don’t start recognizing the responsibility that comes with it, someone will eventually make them do it. A good start: put as much energy into watching for disinformation as they do into finding copyright violations. If it’s a priority, it’s a priority. If not, Congress or the courts may make it one, and the process probably won’t be pretty.