
Facebook ad boycott is unlikely to solve the problem — a social media standards board would


As more major U.S. corporate advertisers join the Facebook “Stop Hate for Profit” ad boycott, what happens when the boycott ends? The boycott’s leaders have proposed “10 Recommended Next Steps” to Facebook’s management to accelerate the removal of racist content and hate incitement, and they acknowledge these are not new proposals. That raises the same question posed over and over again in recent years: What will it take to finally compel social media companies to make the critical technical and policy reforms needed to counteract the avalanche of misinformation and extremist hate that triggered this latest boycott?

If the first such ad boycott, directed at Google and YouTube in 2017, is any guide, this new embargo will fall short of its objective.

In 2017, The Times of London reported that digital ads from major brands “unwittingly pay extremists and pornographers,” noting that ads on YouTube placed by The Guardian newspaper were appearing alongside ISIS promotional videos. The Guardian reacted by revoking its ad purchases from YouTube, ushering in the first digital ad boycott against a social media platform. The Guardian’s ad revolt soon triggered a broader exodus of major American digital advertisers from YouTube and Google, including AT&T, Walmart and General Motors, when they found out their own ads were popping up alongside extremist content, leading consumers to believe that blue-chip companies were endorsing it. In all, over 250 brands pulled their ad spending. The boycott cost YouTube and Google a combined $700 million in lost revenue, which, relatively speaking, was a drop in the bucket compared to Google’s overall ad income.

But nothing gets the undivided attention of Silicon Valley executives like the loss of ad revenue.

Confronted by the boycott, Google and YouTube launched a PR offensive, promising advertisers new technical safeguards on the automated ad-placement algorithms that had steered corporate ads onto terrorist and extremist hate content. The companies introduced new internal artificial intelligence circuit breakers to protect brands from extremist contagion, and within weeks of this concerted technical repair job, the volume of ads appearing alongside extremist content was substantially curtailed.

Initially mollified, boycotting advertisers ramped up their ad spending once again. They could not afford to keep their brands’ advertising off Google and YouTube very long for fear of losing consumer sales.

But it did not take long for the crushing volume of hate and extremist content to overwhelm the technical fixes. Within months after the boycott petered out, ads once again began reappearing alongside extremist content.

The lesson from the 2017 boycott: Without some form of independent monitoring, social media companies will fall back on their time-worn pledges that they are making progress taming the extremist beast — just leave them alone to decide when and how.

Formation of the ad industry’s Global Alliance for Responsible Media (GARM)

The 2017 ad boycott, however, opened the eyes of corporate advertisers to the need to better protect their brands on social media. To ensure better brand safety, the Association of National Advertisers (ANA) unveiled a new partnership with the World Economic Forum in 2019 to create GARM, the Global Alliance for Responsible Media. GARM’s mission was to build an unprecedented coalition of advertising agencies, media companies and social media platforms that would oversee social media’s efforts to rapidly improve brand safety, urgently develop and impose a new code of conduct on social media companies, and independently monitor their progress in curtailing the hate and extremist content engulfing their platforms.

Yet more than 18 months after its launch, GARM remains a good idea with no public face, no code of conduct, and no organizational structure through which to exert its potential financial leverage over social media companies. Had GARM taken off and begun fulfilling its mission, perhaps the latest ad boycott would not have been necessary.

Meanwhile, as public demand for reform and regulation of social media escalates, Congress and the Trump administration have tied themselves in knots debating whether and how to impose some form of legislative oversight on social media companies. Competing solutions unfailingly devolve into a false choice: whether Congress should or should not end social media’s Holy Grail — content immunity under Section 230 of the Communications Decency Act.

Create a private-sector social media consumer watchdog organization

To transform GARM into a workable solution, and to avoid further legislative deadlocks that deny the public any form of third-party oversight of social media, the Coalition for a Safer Web (of which I am president) proposes creating a new Social Media Standards Board.

The Social Media Standards Board (SMSB) would be a private, voluntary regulatory body formed as a partnership of corporate brand advertisers, social media companies, and independent consumer watchdog internet experts.

Modeled after the successful Financial Accounting Standards Board, the SMSB would oversee social media companies’ compliance with their own customer contracts and pledges to remove extremist and hate incitement under a new industry-wide code of conduct.

The SMSB would have a staff of independent web monitors serving as content moderators to protect the public and private industry, and it would issue regular reports analyzing Silicon Valley’s compliance with the new code of conduct envisioned by GARM’s founders. Social media companies that consistently breach the code would be sanctioned by a forfeiture of ad revenue, and by a forfeiture of Section 230 immunity if violations persist; the latter would depend on Congress. The SMSB is intended to solve Congress’s dilemma over what to do about Section 230 by deploying a scalpel rather than a sledgehammer against social media companies.

The SMSB would also serve as an innovation hub, identifying and promoting new technologies that accelerate the detection and interception of harmful web-based content endangering safety and security, including brand safety.

Short-term ad boycotts tend to produce good PR for corporate advertisers. They help mollify consumer anger at social media companies without permanently damaging the symbiotic tie between the buyers of digital ad space and the social media companies that sell it. But they do not lead to permanent fixes for social media’s mischief and mayhem.

That is why a new private sector watchdog agency such as the SMSB may begin solving the problem that a temporary ad boycott alone cannot.

Marc Ginsberg served as US Ambassador to Morocco under President Bill Clinton; he previously served as Deputy Senior Advisor to the President for Middle East Policy, and was a legislative assistant to Sen. Edward Kennedy. He is currently president of the Coalition for a Safer Web.


