
2 April 2018

The hidden pitfalls of digital regulation

Regulation that hampers product innovation or user experience will help Facebook, which already benefits from network externalities

Facebook has been at the centre of public debate since the news of Russian trolls influencing the 2016 US presidential election. The Cambridge Analytica controversy has given new life to calls for government regulation. Tech companies are usually wary of government involvement and prefer a proactive approach of self-regulation instead. But Facebook founder and chief executive officer (CEO) Mark Zuckerberg has come out in favour of some government regulation. He told Wired, “The question isn’t, ‘Should there be regulation or shouldn’t there be?’ It’s ‘How do you do it?’”


As much as this is uncharacteristic of a tech company, it makes sense for Facebook’s long-term self-interest. First, Facebook is no longer the start-up that Zuckerberg founded in his dorm room. It’s a tech giant with the cash reserves and personnel needed to handle onerous regulations on advertising, data security and privacy. Should there be a violation, it’s capable of bearing the costs. These costs might be prohibitive for its competition, which is already conspicuously absent, given Facebook’s takeover of Instagram.

Second, the fiasco has heightened the advocacy for greater privacy. The logic is that since it was Facebook’s data-sharing policies that allowed Cambridge Analytica to extract user data, Facebook needs to reduce data-sharing in order to protect privacy. Zuckerberg told Recode, “I was maybe too idealistic on the side of data portability, that it would create more good experiences—and it created some—but I think what the clear feedback from our community was that people value privacy a lot more.” The logic is fine, but it ends up perpetuating the notion that privacy and data portability (or sharing) are somehow mutually exclusive.

They are not, except by Facebook’s own decision to become a data silo. If Facebook allowed users to control their data, and allowed app developers to create tools to help them do that, privacy and transparent data-sharing could coexist.

The 50 million American profiles that Cambridge Analytica obtained belonged to users who either consented to share personal data, or were friends of those who did and had privacy settings liberal enough to allow it. The episode showed how difficult it is for people to understand how they are vulnerable online. Privacy settings are complicated, and people need tools that can go through them and make the necessary changes. There are no such tools because Facebook doesn’t allow them.

Contrast this with web browsers or email, where extensions and apps block advertisements and stop marketers’ tracking pixels from loading, letting users take control of their privacy and experience. If programmers could build similar tools for Facebook, users could express their preferences and push back against bad behaviour long before the government could be convinced to take interest. But that’s not the direction in which Facebook is moving.

Opening up user data so that developers can build tools, and letting users control their news feeds, are the kinds of things a social network would do. In an ideal world, users would also be free to take their network to another platform, or inter-operate across platforms. But Facebook is an ads company masquerading as a social network, and what makes sense for an ads company doesn’t make for a good social network.

Facebook will continue to target ads, political or otherwise; it will just put restrictions on the data that third parties can collect and on how they can use it. This is a significant departure from the argument made by University of Chicago economist Luigi Zingales, among others, who would like the government to reassign property rights over a user’s social graph from Facebook to the user in order to overcome the positive network externalities that Facebook benefits from. Portability of data, like portability of phone numbers in the telecom industry, would make the industry more competitive. But that becomes unlikely if Facebook goes ahead and locks down users’ data.

As Facebook’s business model has become clearer to everyone, it has changed its algorithms to show more posts from friends. Perhaps privacy will also improve in the near future. But it would be naive to assume that its recent openness to regulation is public-spirited, especially when it will most likely get a seat at the table when the regulations are drawn up.

Apple CEO Tim Cook’s call for “well-crafted” privacy rules should be seen in a similar light. Apple has been a far more principled advocate of users’ privacy, but the demand is also convenient: the business model of its prime competitor Google, from its assistant to its ads business, relies on building profiles from people’s online behaviour, while Apple’s does not.

It’s easy for this situation to play into Facebook’s hands. Any regulation that hampers product innovation or user experience will help Facebook, which already benefits from network externalities. Facebook’s surveillance business model keeps it from becoming a good social network. Proper regulation should allow the creation of competitors, or at least not inhibit them. This is especially crucial right now, when Facebook is facing a crisis of legitimacy and compliance with regulations might become its crutch for winning back users’ trust.

What is the best approach to regulating tech companies like Facebook and Google? Tell us at views@livemint.com
