It would have taken more than privacy laws to prevent the Cambridge Analytica scandal
In the wake of the Cambridge Analytica scandal, many privacy advocates see a golden opportunity to finally achieve their decades-long quest for the United States to follow in Europe’s footsteps by enacting strict privacy legislation. For example, Marc Rotenberg, the president of the Electronic Privacy Information Center (EPIC), recently told the Washington Post, “The United States needs a dedicated privacy agency and a comprehensive privacy law.” But a close look at the facts suggests that Europe’s privacy rules probably would not have prevented this incident. What they would have done is stymie the U.S. digital economy and hurt consumers.
There are two main problems with the suggestion that stricter privacy rules would have prevented Cambridge Analytica from acquiring data from as many as 87 million Facebook users. First, the European Union (EU) has long bragged that its privacy rules are superior to those in the United States, yet Cambridge Analytica, a London-based firm, is the company at the heart of the controversy, and it was already subject to EU regulations.
This suggests that merely replicating Europe’s privacy laws in the United States would have done little to change the outcome. Second, stricter rules do not stop unethical companies from violating contracts or flouting the law, as Cambridge Analytica appears to have done when it acquired users’ personal data. When Volkswagen cheated on its emissions tests, the solution was not to raise emissions standards, but rather to ensure regulators hold companies to account for bad behavior.
Those calling for stronger U.S. privacy laws conveniently ignore the fact that the existing laws appear to be working. The Federal Trade Commission (FTC) has already confirmed that it has opened an investigation into Facebook’s practices. In addition, a bipartisan coalition of 41 state attorneys general has begun soliciting more information about the social network’s activities, with one of those states launching an official investigation. Consumer watchdog groups have also lodged criminal complaints with the U.S. Department of Justice and the Federal Election Commission, alleging Cambridge Analytica violated federal election law.
In addition to privacy laws, market forces stand ready to penalize companies that are not responsive to public concerns about privacy. In the wake of the recent revelations, Facebook lost more than $100 billion in market capitalization and some users launched a #DeleteFacebook movement. In response, the company made significant changes to how it manages third-party access to user data and initiated a series of changes to how users can manage their privacy, including creating new tools and interfaces to simplify downloading and deleting personal data. And at Cambridge Analytica, the board suspended its CEO, Alexander Nix.
It may be gratifying to grab a virtual pitchfork and demand new laws, but the end result for consumers could be substantially worse. It is important to remember that Facebook and sites like it are digital platforms that enable other products and services: sharing information with third parties is a feature, not a bug. Countless mobile apps, for example, use Facebook’s login service to simplify how users authenticate themselves and provide personal information. While there are clearly additional steps Facebook can take to screen developers on its platform, curtailing this functionality would hurt a wide variety of digital apps and services, and the consumers who use them. Some app developers have already documented this kind of harm after Instagram made a number of unscheduled changes to its application programming interface, limiting how much data third parties could pull from it.
In Europe, digital platforms now find themselves caught between two different regulators. On one side are the data-protection agencies, which argue that social networks should sharply limit the information they make available to third parties to protect consumer privacy. On the other side are antitrust regulators, who claim sites like Facebook are data monopolies and should therefore increase third-party access to their data to avoid anti-competitive effects on the market. Such contradictory policies will offer little benefit to consumers but will impose a significant regulatory burden on companies, limiting pro-consumer innovation.
U.S. policymakers should provide a different response. First, they should see current investigations through to their conclusion. Those demanding new laws should be forced to show where existing ones fail to create sufficient accountability.
Second, they should focus on holding the real villain in the story accountable. When someone walks out of a 7-Eleven without paying for a Slurpee, do you blame the store or the shoplifter? You blame the shoplifter, which in this case is Cambridge Analytica.
Third, they should give the private sector the time it needs to develop and test solutions. For example, some companies may use “digital rights management” technology, the same technology that prevents users from sharing multiple copies of a song or movie, to track and limit how third parties share data, while others may use blockchain, a distributed digital ledger, to build and enforce smart contracts for data sharing. Either way, innovation can play a key role in limiting abuse.
But what policymakers should not do is overreact to an incident of foreign interference in U.S. elections by allowing Europe’s anti-innovation privacy laws to gain a toehold on U.S. soil.
Daniel Castro (@CastroTech) is vice president at the Information Technology and Innovation Foundation (ITIF), the world’s leading think tank for science and technology policy.