When Even the SEC Can’t Police Bad Behavior: The Facebook Whistleblower
When you think about corporations harming society, Facebook tends to jump to the top of the list. Whether fomenting civil unrest or enabling ethnic cleansing, the world’s largest social network seems besieged by a barrage of negative headlines. The recent testimony by whistleblower Frances Haugen before Congress only added to the list of woes: she told lawmakers that “the company systematically and repeatedly prioritized profits over the safety of its users” (Zakrzewski, 2021).
That sounds bad.
As far as sound bites are concerned, I am sure Facebook’s PR department canceled its weekend plans for the next six months, but what about the legal department? Is it possible they breathed a sigh of relief? At first blush, Facebook has much to fear from a potential tragedy involving bullying or a similar incident – liability from a wrongful death suit looms large. However, Section 230 of the Communications Decency Act likely shields Facebook from liability for user-generated content. The bigger bogeyman for Facebook isn’t an individual user or even a class action suit, but the greatest of all enforcers of nebulous US law: the Securities and Exchange Commission (SEC).
This has nothing to do with a stock issue, suspect trading activity, or accounting skullduggery. Recall the SEC’s favorite enforcement tool: disclosure. To quote Bloomberg columnist Matt Levine: “Everything is securities fraud.” The SEC has already shown that it considers Facebook malfeasance within its purview; it levied a $100 million fine stemming from the Cambridge Analytica scandal, claiming that Facebook did not sufficiently disclose to investors that it was providing data to outside brokers. Of all the victims of Cambridge Analytica, shareholders stand somewhere near the back of the line in most people’s eyes, but when it comes to punishing generalized corporate malfeasance in the United States, the SEC is most effective when it can say that companies failed to disclose their misdeeds. From a legally pragmatic point of view, proving failure to disclose is much easier than proving guilt. Thus, despite the PR disaster the whistleblower and congressional hearings may have caused, Facebook faces limited legal liability: negligence would be difficult to prove, and the SEC cannot step in as enforcer of last resort because Facebook had already disclosed these risks to investors – and, as the whistleblower herself said, its conduct was entirely profit maximizing.
Therein lies the irony: the whistleblower’s disclosures may be critical to limiting Facebook’s liability under securities law. In legalese, the managers of a firm have a fiduciary responsibility to shareholders to maximize shareholder value – more simply, executives are legally bound to maximize profits. Facebook’s internal research projects a 45% decrease in interaction from teenage users by 2023 (Roose, 2021), and its internal documents describe losing younger users as an existential threat. Therefore, they must continue to maximize interaction among younger users or resign themselves to the fate of MySpace and Friendster. As a result, they do all they can to increase engagement, even at the possible expense of user mental health.
If Facebook did anything other than maximize user engagement, it would be failing its responsibility to its shareholders. An activist shareholder could then make a case that management failed to maximize shareholder profit, and if management were found not to be maximizing profit, they could face removal by shareholders. (In Facebook’s specific case, this point is moot given that Mark Zuckerberg intentionally maintains a majority voting stake.) As a further defense of their actions, Facebook management could claim they were acting in the best interest of their customers – for Facebook’s paying customers are not users but advertisers. Any action Facebook takes to increase user interaction provides increased value to those customers and thus can be viewed as fiduciarily sound.
The important distinction here is that the Facebook user is, in fact, not a stakeholder at all. While user outrage makes for bad headlines, it is more often shareholder outrage that makes for big legal headaches.
This leaves the government without its favorite cudgel for when a corporation creates a negative externality: the SEC cannot fine Facebook for either failure to disclose or fiduciary failure. Facebook disclosed both that this was a business risk and that they anticipated possible legal or regulatory action. So, logically, they attacked the risks to their profitability in a way that protected them from both product liability and shareholder liability.
At the same time, the SEC may attempt to bring an enforcement action against Facebook for failing to disclose a known risk to user safety. That claim seems tenuous given the key point here: the user is not a stakeholder. Facebook created a more addictive platform, thus maximizing utility to the customer and increasing the value of the product. Had they not taken these steps, they would have been exposed to a much greater risk. In fact, the first risk listed in Facebook’s 10-K for 2021 is the risk of losing market share. (Class action lawsuits are also explicitly listed, as is the ability to maintain the brand.)
Facebook did as corporations are inclined to do: they maximized engagement and profits to maintain an active user base in a market where they must fight to remain dominant long term despite the transient nature of social media platform success. The hearings revealed plenty of reprehensible business practices, but reprehensible does not always mean actionable. With political polarization paralyzing congressional action, the SEC has often stepped in, using a mix of disclosure regimes and securities fraud lawsuits to castigate corporate malfeasance. The nature of this testimony removes that threat, so perhaps lawmakers will need to consider a new regulatory regime for social media companies – one that considers the rights of the user just as much as those of the customer or shareholder.