Regulating Social Media
I think there are three main paths to regulating FB and other social media apps: access, content, and personal data. In this post, I will use FB as the main example.
A good example of access regulation is currently playing out in China, where the government is setting limits on how many hours per day people can play video games. One could imagine a similar regulation limiting the number of hours people could spend on FB or other social media apps.
In a country with limited civil liberties, where state violence is used liberally, such measures might work. But in a free country like the U.S., that’s less likely to be the case.
In the 1920s, the Prohibition amendment tried to regulate access to alcohol. Enforcement was difficult and costly, and people found ways to access the product, often at a higher price and lower quality. In the end, Prohibition failed because people wanted to keep drinking alcohol.
In the U.S., we don’t allow people under 13 to use these apps. We could, if we think it’s important and there is strong supporting evidence, raise that age. But it is hard to imagine that a full-grown adult needs to be protected by the government from social media.
Regulating content is even harder. Aside from calls to violence, there is little consensus on what constitutes harmful content, and even less on what constitutes truth. There are plenty of improvements that could be made to these products to give more context about what might be more or less truthful: the platform can show how many times a post has been shared, limit the reach of new accounts, or mark when some authoritative institution disagrees with a statement. Many of these ideas, and more, have already been implemented, but the idea that FB or others can be the arbiter of truth seems like a fool’s errand to me.
People say hurtful and untruthful things on FB not because of FB, but because they want to. Regulating human nature would be like regulating a drug by asking the manufacturer to change its chemical composition. It’s a bit like saying: could you make ibuprofen not give me a stomach ache?
The way I see it, either the drug is worth the trade-offs, in which case we make it available to the market (perhaps limited to some people), or it is not worth the trade-offs, in which case we decide that people should not be allowed to judge for themselves whether the risk is worth it. Whether the product is on the market or not, anyone can try to compete and offer a better one.
This is not an exact science, but I can see how a product could reasonably fall on either side of the permission line. But the assumption that Congress can sit down and change the “chemical composition” of FB by regulating its feed seems unproductive. Congressmen are not running the company and often don’t understand even its most basic elements.
Congress is charged with being well versed in social media, and the internet, and energy, and education, and healthcare, and security, and internal politics, and so on. Mark and his team are focused on FB, so I am fairly confident that Mark and his team would run FB better.
Of course, if the legislature reaches a consensus on the risks associated with social media, it could, for example, mandate that these companies show a disclaimer every time the app opens, as tobacco companies were forced to do on their packaging. But as with TV, I think the negative effects of social media are very hard to prove and impossible to weigh against their innumerable benefits.
In my opinion, the most reasonable path to regulation is increasing the access to, and mobility of, personal data. The way I see it, FB owns the algorithms that organize content, the interface with all its features, the relationship with advertisers, and even all the revenue. But I own my data: my graph of relationships, my photos, videos, and texts.
If I created a ton of content on FB and I would rather have all that content live somewhere else, there should be an open-source protocol to easily (one-click) move it from one place to the other.
Today I can export my FB data, but not my graph. The export tool is somewhat hidden, and not very good. More importantly, it is thought of as a solution for access, not for interoperability (i.e., letting me use my data somewhere else).
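To make the idea concrete, here is a minimal sketch of what a portable, interoperable graph export could look like. Everything here is hypothetical: the field names, the functions, and the format are illustrative inventions, not any real FB export schema or existing protocol.

```python
import json

# Hypothetical portable-graph format: a flat list of nodes (people, posts)
# and typed edges between them. Any platform that speaks this format could
# import another platform's export.
my_graph = {
    "owner": "user:alice",
    "nodes": [
        {"id": "user:alice", "type": "person"},
        {"id": "user:bob", "type": "person"},
        {"id": "post:1", "type": "post", "text": "Hello, world"},
    ],
    "edges": [
        {"from": "user:alice", "to": "user:bob", "type": "friend"},
        {"from": "user:alice", "to": "post:1", "type": "authored"},
    ],
}

def export_graph(graph):
    """Serialize the graph so any other platform could import it."""
    return json.dumps(graph, indent=2)

def import_graph(serialized):
    """A receiving platform parses and minimally validates the same format."""
    graph = json.loads(serialized)
    if "nodes" not in graph or "edges" not in graph:
        raise ValueError("not a portable graph export")
    return graph

# One-click portability is, at bottom, just this round trip:
restored = import_graph(export_graph(my_graph))
```

The point of the sketch is that the hard part is not the serialization; it is agreeing on a shared, open schema that competing platforms are required to honor.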
In healthcare, there is a regulation of this nature, which aims to make it super easy to move my health data from one provider to another. The implementation is slow and mediocre, but it is the right idea, and it is moving in the right direction. I think a similar regulation should exist for social media.
In telecommunications, there is also a related regulation. The big infrastructure (the holes in the ground and the pipes that run through them, which carry the cables connecting your house to the outside world) was created by monopolies. To promote competition and protect the consumer, these companies were mandated to let other companies run their cables through those pipes. FB built the pipes of the social graph, and I think it is in the best interest of consumers to let them move their graph wherever they want.
I believe the idea that FB has problems because the people in it are bad or evil is wrong. As far as I can tell, in the time I spent working there, I mostly saw smart, well-intentioned people. Of course, I was only a middle manager, so my knowledge is limited - as is the case with the latest whistleblower - but in the times I had the opportunity to present to and talk with people on the leadership teams, I only saw people trying to build a big, useful, impactful company that would be loved by its users.
What I do think is a problem is power. To be powerful means to have an outsized influence on the lives of others. The bigger the influence, and the larger the group of people you influence, the more powerful you are.
FB is very, very powerful. If Mark and the leadership team can be clearly accused of anything, it is of their ability to accumulate power. By building a product people love, one that can serve billions of people at the same time and connects the globe for the first time in human history, they have amassed a great deal of power.
When a powerful entity makes decisions, it affects many people, often very deeply. When more than 3 billion people are involved, not everyone is going to be better off with every decision.
There are always trade-offs. This is true regardless of scale: for every decision you make, there are always some aspects of your life that are better off and others that are not. When you add a few people to the mix - your family, your small company, your closest group of friends - the number of potential configurations grows very fast, and managing trade-offs becomes very hard. For billions of people, trade-offs are really hard to compute and even harder to resolve.
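The combinatorial explosion above can be made concrete with a back-of-the-envelope calculation. Even counting only pairwise relationships (n choose 2), which drastically understates the real trade-off space, the numbers grow quadratically:

```python
from math import comb

# Number of distinct pairs among n people: n choose 2 = n * (n - 1) / 2.
# A rough illustration only; real trade-offs involve far more than pairs.
for n in [2, 5, 50, 1000, 3_000_000_000]:
    print(f"{n:>13,} people -> {comb(n, 2):,} possible pairs")
```

At 3 billion people, that is over four quintillion pairs, which is why no single decision can leave everyone better off.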
The challenge with FB is not that the people are not good - in fact, I think they are incredibly talented - it is that they have a lot of power, and their decisions affect many people in many profound ways.
I don’t think we solve for large, concentrated power by moving it around. The solution is not to take power away from Mark and give it to the chairman of the FCC, or to a bunch of congressmen. The solution, in my opinion, is to diffuse the power. From a regulatory standpoint, I think that starts by making it super easy to port my graph and data from FB to wherever I want.