Maybe We Should Make Some Rules Here: A Framework for Social Media



Article co-authored by Ellie Vorhaben.

This piece is a follow-up to a prior article concerning Facebook’s limited legal liability in the wake of the whistleblower revelations.

We’ve watched the same scene play out over and over. Reporters reveal another negative impact of social media. Outrage and handwringing ensue. Then a mix of pundits and policymakers pound the proverbial table and call for accountability. Accountability seems about as likely to arrive as Godot.

As discussed in this prior article, accountability remains elusive at least partially because the Securities and Exchange Commission, the all-purpose enforcer against corporate malfeasance, has strong legal authority to protect investors and shareholders, but not the users of social media themselves. We need a new model of social media enforcement that centers on protecting the user. This new model should draw on two existing paradigms: financial and environmental regulation. While these frameworks may seem incongruous, they provide a roadmap for social media policy that fits into existing legal thinking.

The last major legislation to regulate social media was the Communications Decency Act (CDA), passed in 1996. This law governs most social media platforms, and Section 230 of the CDA in particular limits platforms’ liability for content published by users. Companies such as Instagram, Facebook, and Twitter have contended that they are non-editorial platforms and thus should not be held responsible for what is published on their sites. That interpretation may have been believable 25 years ago, before the growth of social media and the increasing power of algorithmic content curation, but it no longer reflects reality. The reality is that Facebook uses sophisticated algorithms to individually curate what users see in order to keep them engaged amid declining interest among younger users. To maintain user interaction amid the rapid rate of content generation, social media companies have felt compelled to curate content, an inherently editorial role.

Legislation has not kept up with the rapid development of these algorithms. Because of this, we have unwittingly entered into a social contract with these platforms that lets them reap astronomical profits without being held responsible for the negative externalities they inflict on society. We need to establish safeguards that prevent negligence and provide a reasonable set of guidelines protecting the user from malfeasance, even when the user is not the platform’s customer (the advertisers are the ones footing the bill). Two existing paradigms should anchor these new regulations: the financial regulation requiring disclosure of pricing models passed after the 2008 financial crisis, and the environmental regulation requiring impact assessments passed in 1969.

After the 2008 financial crisis, the Treasury Department mandated that all banks provide it with the proprietary pricing models used to price and assess risk on financial instruments. Banks act as a quasi-public good by offering financial services, but they do so only with the aid of insured deposits and low-cost credit from the Federal Reserve. In exchange, they must abide by strict regulation and capital requirements to avoid leaving the public on the hook. The regulation passed after the financial crisis gives the Treasury Department the oversight it needs, through stress testing of systemically important financial institutions (SIFIs), to ensure banks hold sufficient capital to avoid a bailout even in a future recession or rapid drop in asset prices. Under this model, banks can still have proprietary algorithms; regulators, however, are able to view them and verify that they generate reasonable valuations. Whenever a strategist changes one of these models, the change must go through a model management committee and then be re-approved by the Treasury.

Similarly, social media companies rely on our laws to limit their liability; in return, they owe us an understanding of the implications of their algorithms. To provide that understanding, these companies should be required to disclose their algorithms’ inputs. Any changes to their formulas should be submitted to a regulatory body, so that the public can be informed of the impact of those changes even if it cannot see the exact proprietary inputs of the algorithm. A re-empowered Office of Technology Assessment, the non-partisan federal agency that studied the implications of technology from 1972 until it was shut down in 1995, could play the Treasury Department’s role in monitoring the impact of social media algorithms.

In addition to requiring disclosure, social media regulation should require companies to produce impact assessments similar to those mandated by environmental legislation for construction projects. Before construction of any project that constitutes major federal action, developers must provide an environmental impact statement and a corresponding study, as required by the National Environmental Policy Act (NEPA) of 1969. The project can only go forward once the public is made aware of its externalities and is given a chance to comment. This legislation was passed after a report written by Senator Henry Jackson called on legislators to treat “a quality environment for all Americans [as] a top-priority national goal which takes precedence over a number of other, often competing, objectives”.

It is now increasingly clear that Congress must make a similar decision to secure a transparent online environment in the face of conflicting goals. We decided in 1969 to regulate developers and in 2010 to regulate banks. In 2022 we must decide to regulate social media. Legislation focused on social media could enforce a standard in which every significant algorithmic change comes with an impact statement describing the expected outcome for the community. With that information, the community could comment and consent on an informed basis. In the case of social media, disclosure may in and of itself be sufficient: once users know the impact of a given change, they can react and adjust their behavior accordingly. The European Union has already taken steps in this direction through its General Data Protection Regulation (GDPR), which requires any platform that gathers personal data to make clear to the user what data it is collecting and when.

We opened the Pandora’s box that is social media years ago. Now that social media platforms have matured and shape public discourse, Congress must update the regulations that govern them to reflect the new reality of their editorial responsibility. Social media platforms are not the same as traditional publishers; they cannot be held fully responsible for the user-generated content they host. That said, the curation they perform moves them beyond a simple bulletin board, and so they must be transparent about the algorithms that define their content and discourse.
