Florida: Elon Musk’s takeover of Twitter, and his controversial statements and decisions as its owner, have fuelled a new wave of calls for regulating social media companies.
Elected officials and policy scholars have argued for years that companies like Twitter and Facebook – now Meta – have immense power over public discussions and can use that power to elevate some views and suppress others. Critics also accuse the companies of failing to protect users’ personal data and downplaying harmful impacts of using social media.
As an economist who studies the regulation of utilities such as electricity, gas and water, I wonder what that regulation would look like. There are many regulatory models in use around the world, but few seem to fit the realities of social media. However, observing how these models work can provide valuable insights.
Not really economic regulation
The central ideas behind economic regulation – safe, reliable service at fair and reasonable rates – have been around for centuries. The US has a rich history of economic regulation dating back to the late 19th century.
The first federal economic regulator in the US was the Interstate Commerce Commission, which was created by the Interstate Commerce Act of 1887. This law required railroads, which were growing dramatically and becoming a highly influential industry, to operate safely and fairly and to charge reasonable rates for service.
The Interstate Commerce Act reflected concerns that railroads – which were monopolies in the regions that they served and provided an essential service – could behave in any manner they chose and charge any price they wanted. This power threatened people who relied on rail service, such as farmers sending crops to market. Other industries, such as bus transportation and trucking, would later be subjected to similar regulation.
Individual social media companies don’t really fit this traditional mold of economic regulation. They are not monopolies, as we can see from people leaving Twitter and jumping to alternatives like Mastodon and Post.
While internet access is fast becoming an essential service in the information age, it’s debatable whether social media platforms provide essential services. And companies like Facebook and Twitter don’t directly charge people to use their platforms. So the traditional focus of economic regulation – fear of exorbitant rates – doesn’t apply.
Fairness and safety
In my view, a more relevant regulatory model for social media might be the way the US regulates electric grid and pipeline operations. These industries fall under the jurisdiction of the Federal Energy Regulatory Commission and state utility regulators.
Like those networks, social media carries a commodity – information, rather than electricity, oil or gas – and the public’s primary concern is that companies like Meta and Twitter deliver it safely and fairly.
In this context, regulation means establishing standards for safety and equity. If a company violates those standards, it faces fines. It sounds simple, but the practice is far more complicated.
First, establishing these standards requires a careful definition of the regulated company’s roles and responsibilities. For example, your local electric utility is responsible for delivering power safely to your home. Since social media companies continuously adapt to the needs and wants of their users, establishing these roles and responsibilities could prove challenging.
Texas attempted to do this in 2021 with HB 20, a law that barred social media companies from banning users based on their political views. Social media trade groups sued, arguing that the measure infringed on their members’ First Amendment rights. A federal district court blocked the law, but a federal appeals court upheld it in 2022, and the case is likely headed to the Supreme Court.
Setting appropriate levels of fines is also complicated. Theoretically, regulators should try to set a fine commensurate with the damage to society from the infraction. From a practical standpoint, however, regulators treat fines as a deterrent. If the regulator never has to assess the fine, it means that companies are adhering to the established standards for safety and equity.
But laws often prevent agencies from policing their target industries aggressively. For example, the Office of Enforcement at the Federal Energy Regulatory Commission oversees the safety and security of US energy markets, but under a 2005 law it cannot levy civil penalties higher than USD 1 million per day. By comparison, the cost to customers of the California power crisis of 2000-2001, fuelled partly by energy market manipulation, has been estimated at approximately USD 40 billion. At that rate, it would take more than a century of maximum daily penalties to approach that figure.
In 2022 the Office of Enforcement settled eight investigations of violations that occurred from 2017 to 2021 and levied a total of USD 55.5 million in penalties. In addition, it opened 21 new investigations. Clearly, the prospect of a fine from the regulator is not a sufficient deterrent in every instance.
From legislation to regulation
Congress writes the laws that create regulatory agencies and guide their actions, so that’s where any moves to regulate social media companies will start. Since these companies are controlled by some of the wealthiest people in the US, it’s likely that a law regulating social media would face legal challenges, potentially all the way to the Supreme Court. And the current Supreme Court has a strong pro-business record.
If a new law withstands legal challenges, a regulatory agency such as the Federal Communications Commission or the Federal Trade Commission, or perhaps a newly created agency, would have to write regulations establishing social media companies’ roles and responsibilities. In doing so, regulators would need to be mindful that changes in social preferences and tastes could render these roles moot.
Finally, the agency would have to create enforcement mechanisms, such as fines or other penalties. This would involve determining what kinds of actions are likely to deter social media companies from behaving in ways deemed harmful under the law.
In the time it would take to set up such a system, social media companies would almost certainly keep evolving, so regulators would be aiming at a moving target. As I see it, even if bipartisan support develops for regulating social media, it will be easier said than done. (The Conversation)