Author: David Earl Wietlispach
I. Introduction
In 1988, a federal jury in New Jersey awarded Antonio Cipollone $400,000 for the death of his wife Rose.[1]
…make these cases extremely burdensome and expensive for plaintiffs' lawyers, particularly sole practitioners. To paraphrase General Patton, the way we won these cases was not by spending all of [R.J. Reynolds]'s money, but by making that other son of a bitch spend all of his.[6]
By the 1990s, plaintiffs’ fortunes had changed. They had far more access to internal tobacco company documents through leaks, government investigations, and litigation.[7] At the end of the decade, the tobacco companies settled a joint action lawsuit brought by all fifty of the nation’s state attorneys general that has paid out $201 billion to date.[8] When Willie Evans brought a private tort case against Lorillard Tobacco Co. on behalf of his mother’s estate in 2011, he had access to an “overwhelming” amount of evidence that showed the tobacco company was liable for hooking his mother, at age thirteen, on the Newport cigarettes that caused her cancer and death.[9] The evidence developed in the case showed Evans’ mother was exactly the kind of customer Newport manufacturer Lorillard targeted in “an effort to attract and addict young smokers who would become lifelong smokers.”[10] A jury awarded Evans $152 million.[11] While the case was on appeal, Lorillard settled with Evans for $79 million.[12]
The reality of powerful companies employing aggressive litigation strategies against sympathetic plaintiffs is not unique to Big Tobacco.[13] But using the history of Big Tobacco litigation as an informed guide, this paper discusses the litigation strategy—and its potential collapse—of a modern-day equivalent: Big Tech. Part II of this paper briefly describes addiction research regarding social media platform design and the use of those platforms by children. Part II also draws comparisons between the intentional, addictive product design of platforms and cigarettes—along with their associated harms. Part III of this paper outlines the significance of emerging case law around Section 230,[14] which Big Tech defendants use as a shield to defeat plaintiff tort claims, and the First Amendment, which the industry uses in the plaintiff posture as a sword to strike down any new product design regulation. Recent court holdings relating specifically to Big Tech’s product design have cracked the Section 230 shield and blunted the First Amendment sword, both of which the industry wields in the early, pleading stages of litigation. Part IV of this paper discusses why pleading stage losses matter to Big Tech’s litigation strategy and how the balance of power might shift toward plaintiffs and regulators if the industry cannot defeat tort claims or stop new laws from taking effect in the pleading stage or when seeking pre-enforcement relief, respectively. The paper concludes with a cautionary tale that this analysis could change if the Supreme Court speaks decisively to the applicability of the First Amendment to Big Tech’s algorithms or the scope of Section 230 immunity, which one trial court handling hundreds of consolidated cases against the platforms has described as “an area of law in some flux.”[15]
II. Addiction and Harm in Popular Products
“It literally felt like I was quitting cigarettes.”[16]
Cigarettes are the classic case of a harmful product intentionally designed to addict its users.[17] Over the course of decades of tobacco industry investigations and litigation, regulators and plaintiffs uncovered evidence that addiction was at the center of the cigarette business model.[18] Cigarette manufacturers intentionally engineered higher amounts of naturally-occurring nicotine in tobacco to levels they knew would cause addiction in smokers.[19] They did this to keep people hooked on their products, despite having their own research showing the harm of prolonged cigarette use.[20] That harm included a host of health problems, the most serious of which was deadly lung cancer.[21] Scholars, reporters, and litigants alike are now drawing parallels between Big Tobacco and a modern-day equivalent: Big Tech.[22] Structurally, Big Tech and Big Tobacco share similarities. They are highly concentrated industries with deep pockets. The companies marketed the products as cool and fun. But more importantly for this comparison, the products are addictive and are associated with harm.
Research connects the use of Big Tech’s social media platforms to addiction through naturally-occurring dopamine.[23] The human brain releases dopamine as a reward for beneficial, evolutionary behaviors—such as finding good food, exploring, exercising, or having positive social interactions. The feel-good chemical reward is a motivator to repeat the activity. This brain chemistry was critically important over millions of years of human evolution. Dopamine-releasing behavior—such as making connections with each other—created a positive association with actions that increased the odds of human survival and procreation.[24] Simply put: “we’re wired to connect.”[25]
While modern humans no longer live in caves and struggle to survive in the wilderness, our evolutionary reactions to environmental stimuli are still the same.[26] That puts humans at risk of “dopamine-mediated addiction.”[27] Platform app developers have figured this out and have intentionally designed their products in three important ways to take advantage of dopamine’s power. First, they put individualized social networks in the pockets of millions of people and have embedded in those networks endless opportunity for social connection. By amplifying the feel-good aspects of connection, the platforms have “druggified”[28] the process. Second, the platforms deliver those drug hits at variable times—precluding the ability to predict when a dose of dopamine is on the way. If checking for a random reward comes at little immediate cost to a platform user, a user is more likely to check habitually. Third, the designers engineer dopamine hits by letting a user explore the platforms for new content that appeals to them. The platforms’ algorithms then learn what that user likes and suggest “new things that are similar but not exactly the same.”[29] Even though users are not sure what they will see next, the platforms know they will like it.
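To make those three design levers concrete, the following is a minimal, hypothetical sketch, written in Python purely for illustration, of how a variable-timing reward schedule and a “similar but not exactly the same” recommendation loop might look. The function names, probabilities, and scoring are assumptions chosen for demonstration, not any platform’s actual code.

```python
import random

# Illustrative sketch only: a toy model of two of the design levers described
# above -- variable-timing rewards and "similar but not identical" content.
# All names, numbers, and logic here are hypothetical.

def should_deliver_reward(probability: float = 0.3) -> bool:
    """Variable-ratio schedule: the user cannot predict which app check 'pays off'."""
    return random.random() < probability

def recommend_next(user_interest: dict, catalog: list) -> dict:
    """Favor content close to inferred interests, with a small dose of novelty."""
    def score(item):
        similarity = user_interest.get(item["topic"], 0.0)
        novelty = random.uniform(0.0, 0.2)  # keeps the feed "similar but not exactly the same"
        return similarity + novelty
    return max(catalog, key=score)

# Hypothetical usage: each app open may or may not surface a reward, and the
# feed keeps serving content adjacent to what the user already engages with.
interests = {"cats": 0.9, "cooking": 0.4}
catalog = [
    {"id": 1, "topic": "cats"},
    {"id": 2, "topic": "cooking"},
    {"id": 3, "topic": "news"},
]
if should_deliver_reward():
    print("show a notification badge")
print(recommend_next(interests, catalog))
```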
Industry insiders have confirmed their intent to harness and amplify the power of dopamine in their specific product designs—and have even admitted to falling prey to their own work. The technology engineer who designed the feature of infinite scroll[30] described his job as “taking [behavioral] cocaine and just sprinkling it all over your interface” to make smartphone apps such as social media platforms “maximally addicting.”[31] The co-inventor of Facebook’s Like Button described herself as “hooked” on the sense of self-worth and validation it created.[32] A former platform employee likened the variable rewards of social media to a “slot machine” designed to “suck as much time out of your life as possible.”[33] Perhaps what is most telling is that industry executives do not let their own children use their products.[34]
The work to addict users is part of an effort to boost the time they spend on the platform, which drives engagement numbers.[35] Those engagement numbers were used to attract early investor funding[36] and later advertising revenue.[37] The business model is straightforward. Addicted users provide more attention to the platform. The more attention a platform can capture from its users, the more effective it is as an advertising space.[38] A more effective advertising space can charge advertisers a premium for access.[39] Analysts have projected global social media ad spending to top $240 billion in 2024.[40] Ad spending on Meta-owned platforms alone will surpass all ad spending on linear TV in 2025.[41]
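A back-of-the-envelope illustration of that attention-to-revenue link follows; every figure in it is a hypothetical assumption chosen for arithmetic convenience, not a reported statistic from any platform.

```python
# Illustrative arithmetic only: all figures are hypothetical assumptions.
# The point is structural -- ad revenue scales directly with the attention
# (time and impressions) a platform captures.

daily_active_users = 100_000_000     # hypothetical user base
minutes_per_user_per_day = 45        # hypothetical average time on app
ads_per_minute = 4                   # hypothetical ad load
cpm_dollars = 10.0                   # hypothetical price per 1,000 impressions

daily_impressions = daily_active_users * minutes_per_user_per_day * ads_per_minute
daily_revenue = daily_impressions / 1_000 * cpm_dollars

print(f"Estimated daily ad revenue: ${daily_revenue:,.0f}")
# Doubling time-on-app (or ad load) doubles this figure, which is why
# engagement metrics translate so directly into advertising dollars.
```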
Insider accounts and related research about addictive design have predictably made their way into the complaints of those seeking to hold social media companies accountable for alleged harm through litigation. An attorney general action described TikTok’s For You Page as:
a literally endless series of short-form videos curated by algorithms specifically developed to hold a user’s attention for as long as possible … [It] is one of the numerous features designed to exploit the human body’s natural reaction to the receipt of small rewards through the release of the pleasure-creating neurotransmitter dopamine, and in turn promote addictive behavior.[42]
A massive, multi-district litigation brought by individual plaintiffs, school districts, and attorneys general accuses social media platforms of deploying design strategies to addict children—creating for themselves a “pipeline for growth.”[43] The 279-page complaint against TikTok’s parent company—along with Meta (Facebook and Instagram), Snap, and Alphabet (YouTube)—accuses the platforms of designing their products to “manipulate dopamine release” in children’s not-yet-fully developed brains.[44] Compulsive use of the platforms, plaintiffs allege, is fueling a youth mental health crisis that has caused children a variety of harms.[45] The litany of alleged harms to children includes “anxiety, depression, eating disorders, body dysmorphia, self-harm, sexual exploitation, suicidal ideations, other serious diseases and injuries, and suicide itself.”[46] Suicide is the second leading cause of death for young people.[47] The U.S. Surgeon General now wants to place warning labels on social media platforms, noting that the platforms are associated with “significant mental health harms for adolescents.”[48] In calling for the warning labels, the Surgeon General invoked lessons learned from the public health battles with Big Tobacco, citing “evidence from tobacco studies [that shows] warning labels can increase awareness and change behavior.”[49]
III. Cracking the Shield and Blunting the Sword
“‘Winning Without Trial’ is an oxymoron. The figure of speech is contradictory, but the idea makes perfectly good sense.”[50]
The pleading stage of litigation is a defendant’s first chance to get the case against them kicked out of court.[51] Indeed, after the Supreme Court heightened the pleading requirements in the late aughts,[52] the use of motions to dismiss under Fed. R. Civ. P. 12(b)(6) dramatically increased.[53] Defense attorneys at prominent law firms have remarked that they routinely file these motions.[54] Some litigators now consider the failure to file these motions legal malpractice.[55]
Big Tech’s attorneys are no different. Industry lawyers have, for decades, used Section 230 motions in the pleading stage of litigation to get tort claims “dismissed on a basically automatic basis.”[56] Because Congress created Section 230 to protect websites from “having to fight costly and protracted legal battles,” courts “aim to resolve the question of [Section] 230 immunity at the earliest possible stage of the case.”[57] And as states have tried to pass new regulations affecting platforms’ businesses, Big Tech has deployed a similar strategy in the First Amendment context.[58] By filing for pre-enforcement injunctive relief, the industry’s goal is to get “quick, paint-by-numbers court orders” that would “stop new regulations from getting off the ground” and preserve the status quo operating environment for their businesses.[59]
But recent decisions from the nation’s circuit courts, as well as the Supreme Court itself, have signaled the era of easy wins for Big Tech might be coming to an end. The decisions in Lemmon v. Snap, Anderson v. TikTok, and Moody v. NetChoice demonstrate the industry faces shifting terrain on its litigation battlefield. It may soon be impossible for companies running massive social media platforms to avoid the quagmire and cost—both pecuniary and existential—of complex civil litigation that targets their business models. At least one trial judge in a tech-heavy legal jurisdiction now describes the sweeping motion practice central to Big Tech’s litigation strategy as “a waste of [a judge’s] time” and has admonished the industry’s lawyers to “push back on your clients.”[60]
In exploring the effects of recent rulings on Big Tech’s litigation strategy, it is helpful to analogize litigants to warriors in the Colosseum. Those ancient warriors wielded both defensive and offensive weapons—with varying degrees of skill and success—to defeat their rivals. The stronger the shield, the more protection it provides. The sharper the sword, the more dangerous it is for those who face it. In modern court battles, a litigant’s posture—whether defendant or plaintiff—determines which weapon they wield. But litigants today, like warriors from long ago, know they are in a more perilous position if the weapons they wield are compromised—like a cracked shield or a blunted sword.
A. Defendant Posture: The Shield
For years, Section 230 has been an impervious shield deployed by tech companies facing lawsuits related to their platforms. Since its enactment, courts have broadened Section 230 to immunize the companies “from virtually any injury” that shares any connection to a platform’s content.[61] Big Tech companies deployed their Section 230 shield with success in case after case, but a fiery crash on a country road started to change their fortunes.[62]
1. Lemmon v. Snap: Ninth Circuit holds product design falls outside Section 230 Immunity
In May 2017, three boys traveling in a car ran off the road in Wisconsin at 123 miles per hour.[63] They collided with a tree, their vehicle burst into flames, and all three died.[64] Minutes before the crash, one of the boys opened Snapchat and used the app’s “Speed Filter” to document how fast the group was going.[65]
The boys’ parents sued Snap, Inc., the maker of the Snapchat app, claiming that Snapchat’s Speed Filter incentivized young drivers—including their children—to drive at dangerous speeds.[66] Despite plenty of warnings that the Speed Filter was leading to crashes, according to the parents, “Snap did not remove or restrict access to Snapchat while traveling at dangerous speeds.”[67] Their case was based on negligent design, and faulted Snap for its architecture—“contending that the app’s Speed Filter and reward system worked together to encourage users to drive at dangerous speeds.”[68] Snap filed a motion to dismiss under Fed. R. Civ. P. 12(b)(6), claiming that Section 230 immunized it from the parents’ claims.[69] The trial court agreed and dismissed the case.[70]
On appeal, the Ninth Circuit analyzed Snap’s Section 230 defense using the Barnes test.[71] To enjoy Section 230 immunity, a company must demonstrate that (1) it is a provider of an interactive computer service (2) which plaintiffs are seeking to treat as a publisher (3) of third-party content.[72] Neither party disputed the test’s first prong. So Snap pressed the argument that had worked in front of the trial court. Under the second Barnes prong, Snap contended, Section 230 protected the company because the plaintiffs were attacking Snap’s Speed Filter, which Snap characterized as a publishing tool. Under the third Barnes prong, Snap argued the harm alleged in the case—death during a high-speed crash for a social media stunt—required the publication of the high-speed content. According to Snap, its Speed Filter facilitated the publishing of third-party content—an activity well within the scope of Section 230 immunity.
But the Ninth Circuit rejected Snap’s argument, holding that Section 230 immunity was unavailable for two reasons. First, the parents sought to hold Snap liable for its conduct in designing a dangerous product, not its role as a publisher.[73] Snap’s duty, as alleged by the parents, had nothing to do with “editing, monitoring, or removing” user-generated content on the platform—which would create immunity under Section 230.[74] Instead, the duty flowed from Snap’s creation of a dangerous platform tool. Second, internet companies “remain on the hook” for their own content.[75] Even if Snap were a publisher in the context of its Speed Filter, Section 230 “cuts off liability only when a plaintiff’s claim faults the defendant for information provided by third parties.”[76] Here, the content was Snap’s alone.
In short, the Ninth Circuit held that Section 230 does not shield a platform from claims that it designed its product “in such a way that it allegedly encourages dangerous behavior.”[77]
2. Anderson v. TikTok: Third Circuit holds no Section 230 immunity for expressive conduct.
TikTok is “the most successful video app in the world” with more than a billion monthly users.[78] The platform recommends to its users, through an algorithm, videos to watch on a “For You Page.”[79] Some of these videos are challenges, which “urge users to post videos of themselves replicating the conduct depicted in the videos.”[80] One such challenge—called the “Blackout Challenge”—encouraged users to “choke themselves with belts, purse strings, or anything similar until passing out.”[81] When Nylah Anderson opened her TikTok app, one of the Blackout Challenge videos was waiting for her—served up by the platform’s “astonishingly good”[82] algorithm. The ten-year-old watched the video, replicated the conduct, and unintentionally hanged herself.[83] Anderson’s mother sued TikTok over the death of her daughter. The negligence lawsuit alleged TikTok’s algorithm was defectively designed and the platform failed to warn its users of the defects.[84] TikTok asserted that Section 230 barred those claims.[85] The district court agreed with TikTok, and it granted the platform’s motion to dismiss.[86]
However, on appeal, the Third Circuit reversed in part.[87] The court held platforms “are immunized [under Section 230] only if they are sued for someone else's expressive activity or content (i.e., third-party speech), but they are not immunized if they are sued for their own expressive activity or content (i.e., first-party speech).”[88] Relying on the Supreme Court’s observation that platforms’ curation of others’ content through an expressive algorithm amounted to protected first-party speech, the Third Circuit held “it follows that doing so amounts to first-party speech under [Section] 230, too.”[89] Because Section 230 immunity is only available to platforms if they are sued “for someone else’s expressive activity or content,”[90] the immunity would not be available for the choices of TikTok’s algorithm on what to display to a user.
In short, the Third Circuit held that Section 230 does not provide immunity to platforms if they face tort lawsuits over injury caused by the algorithms they design.
B. Plaintiff Posture: The Sword
Big Tech’s attorneys are not always playing defense. For years, the high-powered law firms retained by platform companies and their trade association, NetChoice, have aggressively targeted any regulations that seek to control the companies’ behavior or constrain their business model.
1. Moody v. NetChoice: the Supreme Court rejects the ‘paint-by-numbers’ approach.
In 2021, the states of Texas and Florida both passed legislation containing content-moderation provisions that would limit the ability of online platforms to “filter, prioritize, and label” the content their users post.[91] The tech industry’s trade association challenged, as plaintiff, the state regulations that would affect platforms’ businesses. Importantly, the trade association brought facial challenges against the laws—contending courts should strike them down entirely as violations of the First Amendment.[92]
The trade association got its preliminary injunctions blocking implementation of the laws at the trial court level.[93] But on appeal, the Fifth Circuit and the Eleventh Circuit diverged—the Fifth Circuit reversed the preliminary injunction against the Texas law, while the Eleventh Circuit upheld the injunction against Florida’s.[94] Thus, the case arrived at the Supreme Court in an early posture on the issue of whether the preliminary injunctions[95] were appropriate.
In constitutional litigation, facial challenges are disfavored.[96] They succeed only if litigants can “establish that no set of circumstances exists” under which the challenged law would be valid.[97] In the First Amendment context, however, the Supreme Court has lowered the threshold for finding a speech statute facially invalid. Instead of requiring a showing that there are no valid applications of the challenged law, the Court requires a showing that the law prohibits a “substantial amount of protected speech relative to its plainly legitimate sweep.”[98] The Court adopted this lower threshold out of concern that laws aimed at speech would chill constitutionally protected activity—that people would self-censor “out of fear of state sanctions.”[99] When faced with a First Amendment facial challenge to a state law, courts must determine whether the law is “substantially overbroad” and thus might have a chilling effect.[100] Courts are to “evaluate the full scope of the law’s coverage,” “decide which of the law’s applications are constitutionally permissible and which are not, and finally weigh” the constitutional applications against the unconstitutional ones.[101]
Applying that precedent in Moody, the Supreme Court raised serious doubts about issuing what some scholars have characterized as “paint-by-number[]”[102] preliminary injunctions on sparse records for tech companies challenging regulations in the First Amendment context. Moody signals that these challenges will require more litigation before courts can determine whether the tech industry can meet its burden for preliminary injunctions that would halt an entire law from going into effect. This, according to the Court, “is the price of [Big Tech’s] decision to challenge the laws as a whole.”[103]
C. Recent Applications of the Aforementioned Cases
1. NetChoice v. Bonta: How Moody blunted Big Tech's First Amendment sword at the appellate level.
Litigants did not have to wait long for an application of Moody. Less than two months after the Supreme Court articulated how it would approach First Amendment facial challenges to tech regulation, the Ninth Circuit applied the Moody holding in NetChoice v. Bonta.[104] In this case, the tech industry challenged a unanimously passed law[105]—the California Age-Appropriate Design Code Act (CAADCA)—that required online platforms to consider the well-being of children and default their privacy and safety settings to protect their mental and physical health.[106] The law also imposed some affirmative obligations for covered businesses.[107] While the trial court granted the tech industry’s request for a preliminary injunction on First Amendment grounds, halting the enforcement of the CAADCA, the Ninth Circuit reversed, in part, finding that it was “less certain” the tech industry would succeed on its facial challenge of certain provisions of the law.[108]
One of those provisions prohibited the platforms’ use of “dark patterns”[109] to target children.[110] The Ninth Circuit held in this instance that the litigation needed a more robust record to determine “whether a ‘dark pattern’ itself constitutes protected speech and whether a ban on using ‘dark patterns’ should always trigger First Amendment scrutiny.”[111] This finding is significant because of the design practices such regulations could reach. In a footnote, the Ninth Circuit suggested the trial court needed to grapple with whether “dark pattern” regulations capture certain key platform design features like X’s infinite scroll, TikTok and YouTube’s autoplay, or Snapchat’s streaks.[112] If those design features are considered “dark patterns,” and if “dark patterns” are not protected speech, then design features that are critical to platforms’ engagement efforts could suddenly be in the crosshairs of state regulation.
2. Cases against Big Tech for defective design move forward despite claims of Section 230 and First Amendment immunity at the trial level.
Some of the appellate decisions outlined in the previous sections are being put into practice at the trial court level as the tech platforms face significant and growing litigation on a host of defective design claims. Two significant collections of these cases are in California’s federal and state courts. Each of those courts, when presented with Section 230 and First Amendment defenses, has declined to entirely dispose of the claims brought by plaintiffs.
a. In re Social Media Adolescent Addiction Litigation (federal court)
Pending multi-district litigation in the Northern District of California has consolidated individual plaintiff lawsuits brought on behalf of children, more than 140 actions brought by school districts, and actions filed jointly by more than thirty state attorneys general.[113] The plaintiffs allege the world’s most-used platforms—Facebook, Instagram, YouTube, TikTok, and Snapchat—intentionally designed their products to addict children. [114] The plaintiffs relied on pre-discovery research literature, material from the Surgeon General and Congressional hearings, internal platform documents, and statements of former platform executives to craft their complaint.[115] The crux of their argument boils down to this:
Borrowing heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry, Defendants deliberately embedded in their products an array of design features aimed at maximizing youth engagement to drive advertising revenue. Defendants know children are in a developmental stage that leaves them particularly vulnerable to the addictive effects of these features. Defendants target them anyway, in pursuit of additional profit.[116]
The plaintiffs identified the following design defects: endless content, lack of screen time limitations, intermittent variable rewards, ephemeral content, limitations on content length, notifications, algorithmic prioritization of content, photo filters, barriers to deletion, connection of children and adult users, private chats, geolocation, age-verification, and lack of parental controls.[117] Plaintiffs’ theories of harm include claims of negligent design defects and failure to warn about those defective designs.[118]
The platform defendants moved to dismiss the case on “two global grounds” that are familiar strategies in the Big Tech litigation playbook: Section 230 immunity and the First Amendment.[119] In two key rulings, however, the trial court cracked the Section 230 shield and found the First Amendment inapplicable to certain claims. In its first ruling on negligent design defects, Social Media Litigation I, the court rejected an “all or nothing approach” to platform immunity and instead went defect-by-defect with its analysis.[120] While this approach filtered out some of the plaintiffs’ alleged design defects from the negligence claim, it allowed others to survive.[121] A later ruling swept further, with the court recognizing Section 230 as “an area of law in some flux.”[122] As such, Social Media Litigation II declined to filter out any of the plaintiffs’ alleged design defects as they related to their failure to warn claims.[123]
In Social Media Litigation I, the court focused on the plaintiffs’ negligent design claims. The court’s analysis rejected the ‘all or nothing’ approach of both the defendants (‘all’) and the plaintiffs (‘nothing’) regarding Section 230 and First Amendment immunity.[124] To determine whether either could act as a shield, the court elected to use a “conduct-specific approach,”[125] testing each of the alleged defective design features to see if they are eligible for the protections claimed by the platforms.
Alleged defective designs that qualified for Section 230 protection included short-form and ephemeral content, private content, timing and clustering of the delivery of third-party content, recommending minor accounts to adult strangers, a lack of screen time limitations, and endless content or infinite scroll.[126] Lemmon was distinguishable from these design defect claims because the court found curing some of these alleged defects would “necessarily require [platforms] to publish less third-party content.”[127] Others—such as determining the length of content, allowing private content, or recommending certain accounts—were “‘traditional editorial functions’ immune under Section 230, where exercised with regard to third-party content.”[128] Put another way, these design defects implicated the kind of third-party speech Section 230 traditionally immunized.
As for the timing and clustering of the platform’s own content, while that design was not entitled to Section 230 protection, the court held that it was entitled to First Amendment protection.[129] Those notifications, such as the awards platforms provide for engaging, “are speech.”[130] The court held there was “no way to interpret plaintiffs’ claim with respect to the frequency of the notifications that would not require [platforms] to change when and how much” of their own speech they publish.[131] After filtering the design defects through Section 230 and the First Amendment, the following remained: not providing effective parental controls, not providing options for users to self-restrict time spent on a platform, making it challenging to delete an account, not using robust age verification, and not allowing users or visitors of platforms to report child sexual abuse material or predator accounts without having an account themselves.[132] To this court, claims about these design defects are analogous to the defect found in Lemmon because they do not implicate in any way the publication of third-party speech.[133]
But as the court considered whether Section 230 would bar failure to warn claims in Social Media Litigation II, it jettisoned the defect-by-defect, conduct-specific approach used in Social Media Litigation I. Instead, the court took special care to note that “not one court has yet [held] that Section 230 bars . . . failure-to-warn theories in general.”[134] Further, the court described Section 230 as “an area of law in some flux,” noting that Anderson pushed Section 230 assessments “into new territory.”[135] Because the plaintiffs’ failure-to-warn claims presented novel theories, despite the court’s “skepticism of these claims,” it would not foreclose liability in the pleading stage as to “known risks of addiction attendant to any platform features”—even those it barred previously through a Section 230 analysis.[136]
b. Social Media Cases (state court)
The social media platforms face similar cases in California state court. Plaintiffs—who are minor children and their families—accuse the platforms of employing defective and dangerous product features that are purposely engineered to create compulsive use and addiction by young people.[137] The plaintiffs brought 13 causes of action against the defendant platforms, including negligence, negligent design, failure to warn, fraudulent concealment, wrongful death, and loss of consortium.[138] Among the notable allegations against specific platforms in the complaint, the plaintiffs allege Snapchat’s design features, such as the “Snap Streak,”[139] push notifications, and “Spotlight”[140] induce addiction, compulsive use, and other mental and physical harm to young users of the app.[141] Plaintiffs also specifically alleged that Meta failed to disclose “its detailed research regarding addiction to its products,” including Meta’s finding that problematic use “causes profound harms.”[142]
The platforms, predictably, moved to dismiss all the plaintiffs’ causes of action on Section 230 and First Amendment grounds.[143] The platforms successfully dismissed some of the causes of action, but the court refused to dismiss causes of action under common law negligence or fraudulent concealment.[144] The court found that the plaintiffs adequately stated a negligence claim “based on lack of reasonable care in the [platforms’] own conduct from which harm might reasonably be anticipated.”[145]
Under California negligence law, each person has a duty to exercise reasonable care for the safety of others.[146] California’s negligence statute expressly states that “everyone is responsible … for an injury occasioned to another by his or her want of ordinary care or skill in the management of his or her property.”[147] The duty of reasonable care extends to when a company makes its property, such as a product, “available for public use and one of those products” causes harm.[148] In this litigation, the plaintiffs:
allege that they were directly injured by Defendants' conduct in providing Plaintiffs with the use of Defendants' platforms. Because all persons are required to use ordinary care to prevent others from being injured as the result of their conduct, Defendants had a duty not to harm the users of Defendants' platforms through the design and/or operation of those platforms. [149]
In rejecting the Section 230 defense to the negligence claim, the court relied on the Ninth Circuit holding in Lemmon. If plaintiffs’ claims target the design features of a social media site that create harm and not the content of the material published on the site, Section 230 cannot provide immunity.[150] In analyzing the plaintiffs’ negligence claims, the court found their contentions about the addictive qualities of the platform’s interactive features “do not fall within the ‘blanket immunity from tort liability for online republication of third party content.’”[151] Instead, the plaintiffs contend the platform features “operate to addict and harm minor users of the platforms regardless of the particular third-party content viewed by the minor user.”[152] Importantly, the court noted that plaintiffs’ allegations about the platform features that maximize engagement “do not challenge algorithms that decide what content to publish.”[153]
The court also rejected the platforms’ First Amendment defense to the negligence claim.[154] In doing so, the court grappled with whether the design features attacked by plaintiffs could be protected speech.[155] But, the platform defendants failed to demonstrate their design features “must be understood at the pleadings stage to be protected speech or expression.”[156] Instead, the platforms focused on content-related First Amendment defenses.[157] The platforms argued the addictive design features identified by the plaintiffs “can be analogized to how a publisher chooses to make a compilation of information.”[158] While the First Amendment generally protects a publisher for editorial decisions, the court held “[d]esign features of the platforms (such as endless scroll or filters) cannot readily be analogized to mere editorial decisions made by a publisher.”[159] The court concluded platform design features have more to do with how users interact with the platform itself, not the nature of any protected content the users view.[160] And the court noted that even though certain allegations in the complaint might read as the plaintiffs tying their harm to the content they viewed on the platforms, the complaint can also read as a statement that the design features alone caused the plaintiffs’ harm.[161] That distinction was enough to survive a motion to dismiss.
Additionally, the court found no Section 230 or First Amendment defense for a claim of fraudulent concealment, which implicates a failure to warn.[162] Plaintiffs leveled this claim against Meta individually, alleging that its internal research on addiction showed a host of safety risks associated with the use of Facebook and Instagram.[163] Users could not have discovered those risks on their own, and thus, Meta should have warned about them.[164] The platform’s internal research showed, among other things, that:
up to 25 percent of people on Facebook experience addiction to the product,
that those using the product for long amounts of time were disproportionately younger on average,
that addictive use causes harms such as sleep disruption, relationship impacts, and safety risks,
that teens report Instagram as a source of anxiety and depression,
and that Meta’s researchers concluded that teens’ use of Instagram follows an addict’s narrative.[165]
Had the plaintiffs pressed for warnings only about problematic content—such as content about suicide or body image—that would have presented Section 230 issues.[166] But because the plaintiffs also contemplated warnings associated with platform design, Meta “could have fulfilled its duty to warn of these potential harms without referencing or deleting any content—the duty springs from its capacity as a creator of features designed to maximize engagement for minors, not from its role as publisher.”[167] Additionally, in the First Amendment context, warnings about addictive use do not involve any content on the platform.[168] “Therefore, the First Amendment is not implicated by failure to provide warnings concerning potential harms from features created by Defendants” that seek to maximize minors' use.[169]
IV. Why It Matters: Discovery Can Unlock Big Tech’s Secrets
“Litigation is notoriously time-consuming, inefficient, costly, and unpredictable.”[170]
A plaintiff’s complaint must survive the pleading stage of litigation—including motions to dismiss—before the court will unlock the doors of discovery.[171] But once a plaintiff successfully opens those doors, the stakes of the litigation grow higher. Discovery is “intrusive, unpleasant, time-consuming and costly.”[172] “It is, like life itself, ‘nasty and brutish’ . . . [but] it is not generally ‘short.’”[173] This stage of litigation, in the words of Judge Posner, is “the bane of modern litigation.”[174] Posner’s colleague on the Seventh Circuit, Judge Easterbrook, described this stage of litigation as “trench warfare”[175] and discovery as “both a tool for uncovering facts essential to accurate adjudication and a weapon capable of imposing large . . . costs on one's adversary.”[176] Indeed, this view of discovery—that it is a “sprawling, costly, and hugely time-consuming undertaking”—was one of the reasons the Supreme Court cited to rationalize the heightened pleading standard it imposed in Twombly.[177]
It is no surprise that discovery critiques include cost. In federal cases, discovery expenses comprise half of all litigation costs.[178] In the most expensive cases, 90 percent of the litigation cost comes from discovery.[179] More than a decade ago, information volunteered by corporate defendants showed discovery costs reaching $9.7 million on a single case.[180] That same 2010 survey showed corporate defendants claiming, on the high end, it cost more than $200,000 to collect, process, and review one gigabyte of electronically stored information.[181] In 2023, median legal spending—including litigation costs—amounted to $80 million for companies with more than $20 billion in revenue.[182] That was a 57 percent jump from the previous year.[183]
But, the discovery stage of litigation is incredibly important. After all, “discovery . . . is the battleground where civil suits are won and lost.”[184] While defendants may sometimes prevail by developing facts that could lead to a successful motion for summary judgment, in other instances, discovery yields key evidence that will help plaintiffs prevail at trial or secure a “favorable settlement.”[185] The plaintiffs in cases against social media platforms know this. While they have alleged harm caused by the platforms, they have also noted that
it is impractical to create a comprehensive list of addictive, harm-causing defects in the product until in-depth discovery occurs. Many product features, such as the inner workings of Meta’s algorithms, are secret and unobservable to users. Discovery during this litigation will reveal additional detail about the defective, addictive, and harmful design of Meta’s products.[186]
Each case outlined above, In re Social Media Addiction and Social Media Cases, has advanced to the discovery stage of litigation. Both the federal and state courts have issued discovery orders and set the bellwether cases for trial in 2025.
A. The Big Tobacco Litigation Lesson
While no one knows for sure what material plaintiffs will uncover during discovery against Big Tech, if the litigation follows the pattern of Big Tobacco, the public may see some damning documents that detail the inner workings of the platforms. Fourteen million tobacco industry documents produced during litigation are now housed by the University of California San Francisco Library and are digitally accessible. Among the more embarrassing documents are:
A confidential R. J. Reynolds memo asking, “Why, then, are younger adult smokers important to RJR? . . . Younger adults are the only source of replacement smokers.”[187]
A confidential memo from Brown & Williamson noting that “nicotine is addictive. We are, then, in the business of selling nicotine, an addictive drug.”[188]
A rationalization of the cancer caused by cigarettes because “with a general lengthening of the expectation of life we really need something for people to die of.”[189]
And the premise of marketing menthol cigarettes to African Americans—that their “desire for instant gratification reflects the inclination of a deprived people to get as much satisfaction as they can as soon as they can. And it may explain, to some degree, the tendency of Blacks to smoke cigarettes at a greater rate than the rest of the population.”[190]
But importantly for the context of addiction liability, Big Tobacco “designed their cigarettes to precisely control nicotine delivery levels and provide doses of nicotine sufficient to create and sustain addiction.”[191]
As Big Tobacco’s litigation losses mounted, and more of what the cigarette manufacturers knew and hid from consumers became public through discovery, the industry’s public perception plummeted.[192] Its public image became just as toxic as the products it manufactured and sold.
There is a parallel between the Big Tobacco litigation and what is unfolding against Big Tech. Indeed, while some early, damning documents were uncovered during litigation, it was not until an industry insider leaked a trove of incriminating documents that the litigation against Big Tobacco accelerated and started achieving success.[193] Big Tech, namely Meta, faced a similar situation when a former employee on its trust and safety team leaked internal company documents and reports that painted a very different picture of the platform than what it publicly portrayed.[194] Some of those leaked documents have been used in the litigation the platform now faces.[195]
B. A Predictable Outcome: Snap Settles
One month after the Ninth Circuit reversed the dismissal in Lemmon and remanded, the case was back in front of the trial court. In supplemental briefs, the references to Section 230 were gone.[196] Instead, Snap’s supplemental brief read much like that of a products manufacturer—arguing causation and contributory negligence under simple tort law theories.[197]
Without the Section 230 shield, Snap fared much worse in front of the trial court, which denied the motion to dismiss.[198] Importantly, the court found:
On causation: “The causal connection between the Speed Filter and the speeding accident is strong given that the accident occurred while the Plaintiffs were using the Speed Filter for the exact purpose for which it appears to have been designed: to record the user traveling at excessive speeds.”[199]
On contributory negligence: “The Court is not convinced that Plaintiff's negligence outweighs Defendant's negligence here.”[200]
On public policy: “this is not a distracted driving case—this case is about a mobile application feature that was seemingly designed solely for users to record themselves traveling at high speed.”[201]
Two months after the court denied the motion to dismiss, the parties in Lemmon filed a Fed. R. Civ. P. 26(f) discovery plan with the court.[202] But as discovery neared its end, the parties filed a joint stipulation asking the court to stay the process.[203] Snap reached a settlement with the plaintiffs after mediation,[204] and the court dismissed the case.[205] The settlement terms were not disclosed, but before the parties settled, Snap eliminated its Speed Filter feature.[206]
C. The Platforms are Acknowledging the Risk
Most of the platforms facing design defect lawsuits or design-related regulations are publicly traded companies.[207] As such, the Securities and Exchange Commission requires risk disclosures, including disclosure of “significant pending lawsuits.”[208] Companies will disclose these litigation risks—along with risks posed by regulation—in their annual 10-K filings.[209] Not disclosing the risks of pending litigation or evolving regulation would invite a different kind of plaintiff problem for the publicly traded platforms—lawsuits against them by their own shareholders.[210]
An analysis of recent 10-K filings from Meta, Alphabet, and Snap reveals that the companies have disclosed to their investors the risks posed to their businesses if courts turn hostile to their Section 230 immunity defenses. Meta noted that plaintiffs “are attempting to avoid or limit the application of Section 230.”[211] The company warned that efforts to remove or restrict the scope of Section 230 immunity “may increase our costs or require significant changes to our products. . . which could adversely affect our business and financial results.”[212] In commenting on litigation risks, Meta specifically referenced the federal In re Social Media Addiction litigation and state Social Media Cases litigation.[213] While Alphabet declined to name specific litigation, it remarked generally that court rulings affecting Section 230 immunity “may adversely affect us and may impose significant operational challenges.”[214] Snap acknowledged that the court presiding over the lawsuits consolidated in California state court allowed negligence claims against it to proceed and warned that “litigation is inherently uncertain, . . . an unfavorable outcome could seriously harm our business.”[215] Importantly, Snap noted that “if courts begin to interpret [Section 230] more narrowly than they have historically done, this could expose us to additional lawsuits and potential judgments.”[216]
The same 10-K filings have also highlighted the risks associated with new regulations—the very regulations Big Tech’s trade group NetChoice has tried to attack with facial First Amendment challenges. Alphabet warns its investors that its governing laws and regulations are “evolving and their applicability and scope, as interpreted by the courts, remain uncertain.”[217] Compliance could be “onerous,” according to Alphabet, and could “increase our cost of doing business, make our products and services less useful, limit our ability to pursue certain business models, cause us to change our business practices, affect our competitive position relative to our peers, [and] otherwise harm our business, reputation, financial condition, and operating results.”[218]
Snap notes that actual or perceived failure to comply with evolving laws and regulations “may lead to costly litigation or otherwise adversely impact our business.”[219] Meta warned its shareholders that regulation
concerning the manner in which we display content to our users, moderate content, provide our services to younger users, or are able to use data in various ways, including for advertising, could adversely affect user growth and engagement. Such actions could . . . adversely affect our financial results, including by imposing significant fines that increasingly may be calculated based on global revenue.[220]
And while Meta filed its most recent 10-K before the critical Supreme Court holding in Moody v. NetChoice regarding Big Tech’s First Amendment facial challenges, the platform noted the importance of its trade industry’s challenges to the regulations in Texas and Florida that culminated in that decision.[221]
V. Conclusion
As the federal trial court overseeing hundreds of consolidated cases against the world’s biggest platforms noted in a pre-trial ruling, Section 230 is an area of law “in some flux.”[222] While courts have historically expanded the breadth of the law’s immunity, it now appears they are starting to trim it back. This flux has affected the litigation strategy of Big Tech and seems poised to result in more cases against the platforms as defendants head to the discovery stage of litigation—the stage which can unearth benign or damning internal documents that can affect whether civil cases are won or lost.
Despite having opportunities to speak decisively on its scope, the Supreme Court has, to date, sidestepped Section 230 questions—even though one of its members openly questions whether the provision provides platforms too much protection.[223] If the Supreme Court takes a Section 230 case, its decision could upend the litigation strategy again—either enshrining broad protections for the platforms or cracking their shield, not just in certain federal circuits, but nationwide.
Of course, if the Supreme Court speaks to Section 230’s scope, it would be a matter of statutory interpretation, so Congress could act in response to a ruling it opposes. Congress could also act before a Section 230 interpretation case ever reaches the Supreme Court; plenty of political commentators and consumer advocates have called for changes to the liability shield the platforms enjoy. If Congress amends Section 230, that will profoundly affect the platforms’ litigation. But Congress acting in either instance—before a Supreme Court ruling or after—assumes that Congress can, in fact, function as a body and legislate on important issues.
Supreme Court holdings on constitutional requirements for First Amendment facial challenges are not open to the same kind of Congressional override as cases of statutory interpretation. After Moody, platforms as plaintiffs will struggle to preserve their business models by waging easy attacks on regulation through their First Amendment litigation strategy. As the Supreme Court made clear in Moody, Big Tech is not entitled to special First Amendment protection, but must adhere to the same constitutional litigation rules as any other plaintiff. Without the availability of facial challenges, Big Tech plaintiffs will be left with as-applied attacks—each company will have to employ its own litigation team to defeat the regulation as applied to its business model. This is much less efficient for the industry and could allow certain aspects of a regulation to take effect—demonstrating the regulation’s efficacy—as the platforms’ individual challenges to the regulation work their way through the court system.
As the title of this paper suggests, there are parallels between what happened to Big Tobacco and what could happen to Big Tech should its litigation strategy fail. Big Tech, like Big Tobacco before it, could stand to lose billions of dollars in judgments and settlements. More importantly, regulations—if allowed to take root—could upend Big Tech’s platform business model. Either outcome, or both, could fundamentally rewrite the relationship between Big Tech and its users. If Big Tech’s litigation strategy truly goes up in smoke, these colossal corporations could find themselves in a magnificent mess.
[1] Cipollone v. Liggett Grp., Inc., 693 F. Supp. 208, 219 (D.N.J. 1988), aff’d in part, rev’d in part, 505 U.S. 504 (1992).
[2] Id. at 210.
[3] Id. at 217.
[4] Cipollone v. Liggett Grp., Inc., 505 U.S. 504, 509 (1992).
[5] U.S. Dep’t of Health and Hum. Serv., The Health Consequences of Smoking—50 Years of Progress 32 (2014), https://www.ncbi.nlm.nih.gov/books/NBK294310/ [hereinafter Surgeon Gen. Rep.].
[6] Haines v. Liggett Grp., Inc., 814 F. Supp. 414, 421 (D.N.J. 1993).
[7] Surgeon Gen. Rep., supra note 5, at 32.
[8] Actual Annual Tobacco Settlement Payments Received By The States, Campaign for Tobacco-Free Kids, https://assets.tobaccofreekids.org/factsheets/0365.pdf (Nov. 7, 2024).
[9] Evans v. Lorillard Tobacco Co., 30 Mass.L.Rptr. 207, at *1 (Sept. 6, 2011), aff’d in part, rev’d in part, 990 N.E.2d 997 (Mass. 2013).
[10] Id.
[11] Id.
[12] Andrew Scurria, Lorillard Pays $79M to Settle Mass. Smoker Case, Law360 (Oct. 23, 2013, 7:33 PM), https://www.law360.com/articles/482660/lorillard-pays-79m-to-settle-mass-smoker-case.
[13] See generally David Enrich, How Abbott Kept Sick Babies from Becoming a Scandal, N.Y. Times (updated Sept. 8, 2022), https://www.nytimes.com/2022/09/06/business/abbott-baby-formula-lawsuits-jones-day.html (describing Abbott Laboratories litigation strategy against plaintiffs in baby formula litigation); Lauren Berg, 3M Sanctioned for Using Ch. 11, Subsidiary to Duck Liability, Law360 (Dec. 22, 2022, 9:41 PM), https://www.law360.com/articles/1561041/3m-sanctioned-for-using-ch-11-subsidiary-to-duck-liability (describing the products litigation over combat earplugs manufactured by 3M as a “[s]corched earth battle”).
[14] This paper will use Section 230 to refer to provisions of the Communications Decency Act that provide companies immunity for third-party content they host on their platforms. See 47 U.S.C. § 230.
[15] In re Soc. Media Adolescent Addiction/Pers. Inj. Prod. Liab. Litig., No. 23-cv-05448, 753 F.Supp.3d 849, 888 (N.D. Cal. Oct. 15, 2024) [hereinafter Soc. Media Litig. II].
[16] Hilary Andersson, Social Media Apps Are ‘Deliberately’ Addictive to Users, BBC Panorama (July 3, 2018), https://www.bbc.com/news/technology-44640959.
[17] Alison Kodjak, In Ads, Tobacco Companies Admit They Made Cigarettes More Addictive, Nat’l Pub. Radio (Nov. 27, 2017, 4:10 PM) https://www.npr.org/sections/health-shots/2017/11/27/566014966/in-ads-tobacco-companies-admit-they-made-cigarettes-more-addictive.
[18] Id.
[19] Id.
[20] Id.
[21] Id.
[22] See Jake Snow, Big Tech is Trying to Burn Privacy to the Ground—And They’re Using Big Tobacco’s Strategy to Do It, Tech Policy Press (Oct. 9, 2024), https://www.techpolicy.press/big-tech-is-trying-to-burn-privacy-to-the-ground-and-theyre-using-big-tobaccos-strategy-to-do-it/; Brad Wilcox & Riley Peterson, It’s Time to Treat Big Tech Like Big Tobacco, Inst. for Fam. Stud. (Jan. 20, 2023), https://ifstudies.org/blog/its-time-to-treat-big-tech-like-big-tobacco; Alexandra Sternlicht, The $200 billion playbook that kneecapped Big Tobacco is coming for Mark Zuckerberg and his social media offspring, Fortune (Oct. 26, 2023, 8:33 PM), https://fortune.com/europe/2023/10/26/lawsuits-meta-tiktok-snap-youtube-states-attorneys-general-echo-big-tobacco-litigation-playbook/.
[23] See Bruce Goldman, Addictive potential of social media, explained, Stanford Med. (Oct. 29, 2021), https://scopeblog.stanford.edu/2021/10/29/addictive-potential-of-social-media-explained/; Trevor Haynes, Dopamine, Smartphones & You: A battle for your time, Science in the News (May 1, 2018), https://sitn.hms.harvard.edu/flash/2018/dopamine-smartphones-battle-time/.
[24] Haynes, supra note 23.
[25] Goldman, supra note 23.
[26] Haynes, supra note 23.
[27] Goldman, supra note 23.
[28] Id.
[29] Id.
[30] See Infinite Scrolling, Interaction Design Found., https://www.interaction-design.org/literature/topics/infinite-scrolling?srsltid=AfmBOopZnlfpNfMDoPk33xLSIsQR74i60z5mLBDZqGBn6VYseeH4un5X (last visited Dec. 3, 2024) (“Infinite scrolling is [a digital] interaction design pattern in which a page loads content as the user scrolls down, allowing the user to explore a large amount of content with no distinct end. It is often used on social media platforms and feeds where content has no definite structure or sorting order.”).
[31] Andersson, supra note 16.
[32] Id.
[33] Id.
[34] Katie A. Paul, Tech Execs Protect Their Kids From Their Own Products. America’s Children Deserve The Same, Fast Co. (May 24, 2023), https://www.fastcompany.com/90900166/tech-social-media-protection-children; James Vincent, Former Facebook Exec Says Social Media Is Ripping Apart Society, The Verge (Dec. 11, 2017, 5:07 AM), https://www.theverge.com/2017/12/11/16761016/former-facebook-exec-ripping-apart-society (quoting early Facebook executive Chamath Palihapitiya as saying of social media that his kids “aren’t allowed to use that shit”).
[35] Andersson, supra note 16.
[36] Id.
[37] See Nicholas Thompson, Within Facebook, a Sigh of Relief Over the Zuckerberg Hearings, Wired (Apr. 13, 2018, 7:00 PM), https://www.wired.com/story/sigh-of-relief-inside-facebook/ (“Senator, we run ads.” This viral remark from Facebook (now Meta) founder Mark Zuckerberg during a 2018 Senate committee hearing was in response to a question about how the platform sustains a business model where users do not pay for the service.).
[38] Julian Morgans, The Secret Ways Social Media Is Built For Addiction, Vice (May 17, 2017, 11:09 PM), https://www.vice.com/en/article/the-secret-ways-social-media-is-built-for-addiction/.
[39] Id.
[40] Grace Gollasch, Global Social Media Ad Spend to Approach £200bn in 2024, MarketingWeek (May 2, 2024), https://www.marketingweek.com/social-media-spend-200bn/.
[41] Id.
[42] In re Soc. Media Adolescent Addiction/Pers. Inj. Prods. Liab. Litig., No. 4:22-MD-3047 (N.D. Cal. Feb. 17, 2023), https://storage.courtlistener.com/recap/gov.uscourts.cand.401490/gov.uscourts.cand.401490.138.0_1.pdf.
[43] Id. at ¶ 11.
[44] Id. at ¶ 12.
[45] Id. at ¶ 15.
[46] Id. at ¶ 18.
[47] Eileen McClory & Samantha Wildow, Kids in Crisis: Suicide a leading cause of death among young people. What can be done, Dayton Daily News (Nov. 13, 2024), https://www.daytondailynews.com/local/kids-in-crisis-suicide-a-leading-cause-of-death-among-young-people-what-can-be-done/S4QEOTQN5NDARNXALCQZCEQPNY/.
[48] Michelle Chapman, Tobacco-like warning label for social media sought by US surgeon general who asks Congress to act, Associated Press (June 17, 2024, 4:09 PM), https://apnews.com/article/surgeon-general-social-media-mental-health-df321c791493863001754401676f165c.
[49] Id.
[50] Winning Without Trial, 14 Litig. 5 (Winter 1988), https://lawcat.berkeley.edu/record/1112963/files/fulltext.pdf.
[51] Scott Dodson, A Closer Look at New Pleading in the Litigation Marketplace, Judicature, https://judicature.duke.edu/wp-content/uploads/2022/06/Dodson-2015.pdf (last visited Dec. 3, 2024) (Pleading standards require a complaint to survive a two-step test.).
[52] See generally Bell Atl. Corp. v. Twombly, 550 U.S. 544 (2007); Ashcroft v. Iqbal, 556 U.S. 662 (2009).
[53] Dodson, supra note 51.
[54] Id.
[55] Id.
[56] Kyle Langvardt & Alan Z. Rozenshtein, Moody v. NetChoice is a Blow to Silicon Valley’s Litigation Strategy, Lawfare (July 26, 2024, 9:41 AM), https://www.lawfaremedia.org/article/moody-v.-netchoice-is-a-blow-to-silicon-valley-s-litigation-strategy.
[57] Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250, 255 (4th Cir. 2009).
[58] See generally Moody v. NetChoice, LLC, 603 U.S. 707, 720 (2024) (discussing state laws restricting social media platforms in Texas and Florida).
[59] Langvardt & Rozenshtein, supra note 56.
[60] Isaiah Poritz, Silicon Valley Judges Ask Lawyers to Cut Out the Gamesmanship, Bloomberg (Feb. 5, 2025, 6:44 AM), https://news.bloomberglaw.com/artificial-intelligence/silicon-valley-judges-ask-lawyers-to-cut-out-the-gamesmanship?source=newsletter&item=read-text&region=digest&login=blaw.
[61] Matthew P. Bergman, Assaulting the Citadel of Section 230 Immunity: Products Liability, Social Media, and the Youth Mental Health Crisis, 26 Lewis & Clark L. Rev. 1159, 1161 (2023).
[62] Bobby Allyn, Snapchat Can Be Sued Over Role In Fatal Car Crash, Court Rules, Nat’l Pub. Radio (May 4, 2021, 7:51 PM), https://www.npr.org/2021/05/04/993579600/snapchat-can-be-sued-for-role-in-fatal-car-crash-court-rules.
[63] Id.
[64] Lemmon v. Snap, Inc., 995 F.3d 1085, 1088 (9th Cir. 2021).
[65] Id.
[66] Id. at 1089-90.
[67] Id. at 1090.
[68] Id. at 1093.
[69] Id.
[70] Id.
[71] See Barnes v. Yahoo!, Inc., 570 F.3d 1097, 1100–01 (9th Cir. 2009).
[72] Id.
[73] Lemmon, 995 F.3d at 1092.
[74] Id.
[75] Id. at 1093.
[76] Id.
[77] Id. at 1094.
[78] Ben Smith, How TikTok Reads Your Mind, N.Y. Times (Dec. 5, 2021), https://www.nytimes.com/2021/12/05/business/media/tiktok-algorithm.html.
[79] Anderson v. TikTok, Inc., 116 F.4th 180, 182 (3d Cir. 2024).
[80] Id.
[81] Id.
[82] Smith, supra note 78.
[83] Anderson, 116 F.4th at 181.
[84] Id. at 182.
[85] Anderson v. TikTok, Inc., 637 F. Supp. 3d 276, 279 (E.D. Pa. Oct. 25, 2022), rev’d in part, vacated in part, 116 F.4th 180 (3d Cir. 2024).
[86] Id. at 282.
[87] Anderson, 116 F.4th at 185.
[88] Id. at 183.
[89] Id. at 184.
[90] Id. at 183.
[91] Moody v. NetChoice, LLC, 603 U.S. 707, 717 (2024).
[92] Moody v. NetChoice, LLC, 144 S. Ct. 2383, 2397 (2024).
[93] See NetChoice, LLC v. Moody, 546 F. Supp. 3d 1082, 1096 (N.D. Fla. June 30, 2021); NetChoice, LLC v. Paxton, 573 F. Supp. 3d 1092, 1117 (W.D. Tex. Dec. 1, 2021).
[94] See NetChoice, LLC v. Att’y Gen., Fla., 34 F.4th 1196, 1231 (11th Cir. 2022); NetChoice, LLC v. Paxton, 49 F.4th 439, 494 (5th Cir. 2022).
[95] See generally Winter v. Nat. Res. Def. Council, Inc., 555 U.S. 7 (2008) (A preliminary injunction is an “extraordinary remedy,” and a plaintiff seeking this relief must demonstrate that they are likely to succeed on the merits, that they are likely to suffer irreparable harm without the preliminary relief, that the balance of equities tips in their favor, and that the injunction is in the public interest.).
[96] Alex Kreit, Making Sense of Facial and As-Applied Challenges, 18 Wm. & Mary Bill of Rights J. 657, 658 (2010) (noting that litigants should use facial challenges “sparingly and only in exceptional circumstances”).
[97] United States v. Salerno, 481 U.S. 739, 745 (1987).
[98] Moody v. NetChoice, LLC, 603 U.S. 707, 723 (2024) (quoting United States v. Hansen, 599 U.S. 762, 770 (2023)).
[99] David H. Gans, Strategic Facial Challenges, 85 B.U. L. Rev. 1333, 1342 (2005).
[100] Virginia v. Hicks, 539 U.S. 113, 119–20 (2003) (“To ensure that these costs do not swallow the social benefits of declaring a law overbroad, we have insisted that a law's application to protected speech be substantial, not only in an absolute sense, but also relative to the scope of the law's plainly legitimate applications.”).
[101] Moody, 603 U.S. at 744.
[102] Langvardt & Rozenshtein, supra note 56.
[103] Moody, 603 U.S. at 744.
[104] NetChoice, LLC v. Bonta, 113 F.4th 1101, 1122 (9th Cir. 2024).
[105] Jesús Alvarado & Dean Jackson, The California Age Appropriate Design Code Act May Be the Most Important Piece of Tech Legislation You’ve Never Heard Of, Tech Policy Press (July 9, 2024), https://www.techpolicy.press/the-california-age-appropriate-design-code-act-may-be-the-most-important-piece-of-tech-legislation-youve-never-heard-of/.
[106] See Press Release, Gavin Newsom, California Governor, Governor Newsom Signs First-in-Nation Bill Protecting Children’s Online Data and Privacy (Sept. 15, 2022), https://www.gov.ca.gov/2022/09/15/governor-newsom-signs-first-in-nation-bill-protecting-childrens-online-data-and-privacy/.
[107] The CAADCA required “online businesses to create a Data Protection Impact Assessment (DPIA) report identifying for each offered online service, product, or feature likely to be accessed by children, any risk of ‘material detriment to children that arise from the data management practices of the business.’” See Bonta, 113 F.4th at 1109–10.
[108] Bonta, 113 F.4th at 1122.
[109] California law defines a “dark pattern” as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decisionmaking, or choice.” See Cal. Civ. Code § 1798.140(l) (West 2024).
[110] The CAADCA prohibits the use of dark patterns to “lead or encourage children to provide personal information beyond what is reasonably expected to provide that online service … to forego privacy protections, or to take any action that the business knows, or has reason to know, is materially detrimental to the child’s physical health, mental health, or well-being.” See Cal. Civ. Code § 1798.99.31(b)(7) (West 2024).
[111] Bonta, 113 F.4th at 1123.
[112] Id. at 1123 n.8.
[113] In re Soc. Media Adolescent Addiction/Pers. Inj. Prod. Liab. Litig., No. 22-md-03047, 702 F. Supp. 3d 809, 817 (N.D. Cal. 2023) [hereinafter Soc. Media Litig. I].
[114] Id. at 819.
[115] Matthew B. Lawrence, Public Health Law’s Digital Frontier: Addictive Design, Section 230, and the Freedom of Speech, 4 J. Free Speech L. 299, 319 (2024).
[116] Plaintiff’s Amended Master Complaint (Personal Injury) ¶ 2, In re Soc. Media Adolescent Addiction/Pers. Inj. Prods. Liab. Litig., No. 4:22-MD-3047 (N.D. Cal. Apr. 14, 2023).
[117] Soc. Media Litig. I, 702 F. Supp. 3d at 819–22.
[118] Id. at 818.
[119] Lawrence, supra note 115, at 320–21.
[120] Soc. Media Litig. I, 702 F. Supp. 3d at 818.
[121] Id. at 862–63 (granting in part and denying in part the pending motions to dismiss).
[122] Soc. Media Litig. II, 753 F. Supp. 3d 849, 889 (N.D. Cal. 2024).
[123] Id.
[124] Soc. Media Litig. I, 702 F. Supp. 3d at 829.
[125] Id.
[126] Id. at 830–31.
[127] Id. at 831.
[128] Id. at 832.
[129] Id. at 837.
[130] Id.
[131] Id.
[132] Id. at 836.
[133] Id.
[134] Soc. Media Litig. II, 753 F. Supp. 3d 849, 887 (N.D. Cal. 2024).
[135] Id. at 888–89.
[136] Id. at 889 (emphasis added).
[137] Soc. Media Cases, No. 22STCV21355, 2023 WL 6847378, at *1 (Cal. Super. Ct. L.A. County Oct. 13, 2023) [hereinafter Soc. Media Cases].
[138] Id. at *1–2.
[139] Snap Streak encourages use of the platform by rewarding users for sending daily snaps to their friends. A user must both send and receive a snap within 24 hours for the day to count toward the streak. The length of a streak, measured in days, is displayed on the platform along with emojis, which act as rewards for particularly long streaks. See William Antonelli, How to start a Snapchat Streak and keep it alive to boost your Snap Score, Bus. Insider (Aug. 18, 2022, 9:25 AM), https://www.businessinsider.com/guides/tech/snapchat-streak.
[140] Snapchat’s Spotlight feature is a dedicated section that promotes short viral videos created by the platform’s users. It functions similarly to TikTok’s For You Page, although it lacks a comments section and includes some safeguards for account holders under the age of 18. See Dave Johnson, What is Snapchat Spotlight? How to promote your videos on the TikTok-like feature of the app, Bus. Insider (Mar. 19, 2021, 1:34 PM), https://www.businessinsider.com/guides/tech/snapchat-spotlight.
[141] Soc. Media Cases, 2023 WL 6847378, at *5.
[142] Soc. Media Cases, 2023 WL 6847378, at *44.
[143] Id. at *9.
[146] Id. at *22 (quoting Brown v. USA Taekwondo, 483 P.3d 159, 164 (Cal. 2021)).
[147] Soc. Media Cases, 2023 WL 6847378, at *22 (emphasis added).
[148] Id. at *23.
[149] Id.
[150] Id. at *30.
[151] Id. at *32 (quoting Barrett v. Rosenthal, 146 P.3d 510, 525 (Cal. 2006)).
[152] Soc. Media Cases, 2023 WL 6847378, at *31.
[153] Id. at *35.
[154] Id. at *39.
[155] See id. at *37.
[156] Id.
[157] Id.
[158] Id.
[159] Id. at *38.
[160] Id.
[161] Id. at *39.
[162] Id. at *45.
[163] Id. at *46.
[164] Id.
[165] Id. at *44.
[166] Id. at *46.
[167] Id.
[168] Id. at *47.
[169] Id.
[170] See Brief of Washington Legal Foundation as Amicus Curiae in Support of Defendant-Appellant, Laramie v. Philip Morris USA Inc., 173 N.E.3d 731 (Mass. 2021), 2021 WL 1568759 (quoting Roger Lowenstein, Buffett: The Making of an American Capitalist 217 (reprint ed. 2013)).
[171] Mujica v. AirScan Inc., 771 F.3d 580, 593 (9th Cir. 2014).
[172] Flentye v. Kathrein, No. 06-CV-3492, 2007 WL 2903128, at *2 (N.D. Ill. Oct. 2, 2007).
[173] Id.
[174] Rossetto v. Pabst Brewing Co., Inc., 217 F.3d 539, 542 (7th Cir. 2000).
[175] Frank H. Easterbrook, Discovery as Abuse, 69 B.U. L. Rev. 635, 635 (1989).
[176] Id. at 636.
[177] Bell Atl. Corp. v. Twombly, 550 U.S. 544, 560 n.6 (2007).
[178] Scott A. Moss, Litigation Discovery Cannot be Optimal but Could be Better: The Economics of Improving Discovery Timing in a Digital Age, 58 Duke L.J. 889, 892 (2009).
[179] Id.
[180] Mary Nold Larimore & Matthew J. Hamilton, Cost-Shifting Can Stimulate More Focused, Efficient Discovery in MDL Proceedings, Washington Legal Found. (June 1, 2018), https://www.wlf.org/2018/06/01/publishing/cost-shifting-can-stimulate-more-focused-efficient-discovery-in-mdl-proceedings/#easy-footnote-bottom-7-7159.
[181] Id.
[182] Lyle Moran, Companies’ legal spend has risen nearly 30%, survey finds, Legal Dive (June 16, 2023), https://www.legaldive.com/news/legal-spend-benchmarking-outside-counsel-association-of-corporate-counsel-major-lindsey-and-africa/653260/.
[183] Id.
[184] Moss, supra note 178, at 892 (alteration in original).
[185] Id.
[186] Plaintiff’s Amended Master Complaint (Personal Injury) ¶ 263, In re Soc. Media Adolescent Addiction/Pers. Inj. Prods. Liab. Litig., No. 4:22-MD-03047 (N.D. Cal. Apr. 14, 2023).
[187] The Importance of Younger Adults, R.J. Reynolds Records Master Settlement Agreement 1, https://industrydocuments.ucsf.edu/docs/#id=jzyl0056 (last visited Dec. 4, 2024).
[188] Implications of Battelle Hippo I & II and the Griffith Filter, UCSF Brown & Williamson Collection 4, https://www.industrydocuments.ucsf.edu/docs/#id=hrwh0097 (last visited Dec. 4, 2024).
[189] See L.T. Kozlowski, First, Tell the Truth: A Dialogue on Human Rights, Deception, and the Use of Smokeless Tobacco as a Substitute for Cigarettes, 12 Tobacco Control 34, 36 (2003).
[190] How Black Consumers Are Different, R.J. Reynolds Records Master Settlement Agreement 2, https://industrydocuments.ucsf.edu/docs/#id=ghjw0057 (last visited Dec. 4, 2024).
[191] United States v. Philip Morris USA, Inc., 449 F. Supp. 2d 1, 309 (D.D.C. 2006), aff’d in part, vacated in part, 566 F.3d 1095 (D.C. Cir. 2009).
[192] Kristi Keck, Big Tobacco: A history of its decline, CNN Politics (June 19, 2009), https://edition.cnn.com/2009/POLITICS/06/19/tobacco.decline/.
[193] See Douglas Martin, Merrell Williams Jr., Paralegal Who Bared Big Tobacco, Dies at 72, N.Y. Times (Nov. 26, 2013), https://www.nytimes.com/2013/11/27/business/merrell-williams-jr-paralegal-who-bared-big-tobacco-dies-at-72.html.
[194] See Ryan Mac & Cecilia Kang, Whistle-Blower Says Facebook ‘Chooses Profits Over Safety,’ N.Y. Times (updated June 23, 2023), https://www.nytimes.com/2021/10/03/technology/whistle-blower-facebook-frances-haugen.html.
[195] Avi Asher-Schapiro, How leaked Facebook documents propelled social media lawsuits, Context News (Feb. 8, 2023), https://www.context.news/big-tech/how-leaked-facebook-documents-propelled-social-media-lawsuits.
[196] See Lemmon v. Snap, Inc., No. CV 19-4504-MWF (KSx), 2022 WL 1407936, at *1 (C.D. Cal. Mar. 31, 2022).
[197] Id. at *9–12.
[198] Id. at *12.
[199] Id. at *10.
[200] Id. at *12.
[201] Id.
[202] Joint Rule 26(f) Report, Lemmon et al. v. Snap, Inc., No. 2:19-cv-04504, Filing No. 90 (C.D. Cal. May 27, 2022).
[203] Id.
[204] Id.
[205] Order Granting Stipulation of Dismissal with Prejudice, Lemmon et al. v. Snap, Inc., No. 2:19-cv-04504, Filing No. 105 (C.D. Cal. May 23, 2019).
[206] Bobby Allyn, Snapchat Ends ‘Speed Filter’ That Critics Say Encouraged Reckless Driving, Nat’l Pub. Radio (Jun. 17, 2021, 11:58 AM), https://www.npr.org/2021/06/17/1007385955/snapchat-ends-speed-filter-that-critics-say-encouraged-reckless-driving.
[207] For example, the platforms involved in the In re Social Media Addiction litigation—Facebook and Instagram, YouTube, and Snapchat—are owned by Meta (trading on the NASDAQ exchange under the ticker symbol META), Alphabet (trading on the NASDAQ exchange under the ticker symbol GOOG), and Snap (trading on the New York Stock Exchange under the ticker SNAP), respectively. At the end of 2023, Meta and Alphabet alone had a combined market capitalization of more than $2.6 trillion.
[208] U.S. Sec. & Exch. Comm’n, Investor Bulletin: How to Read a 10-K 2 (2011), https://www.sec.gov/files/reada10k.pdf.
[209] See 17 C.F.R. § 229.103 (requiring disclosure of material legal proceedings affecting a company).
[210] Virginia Milstead & Mark Foster, Beware of potential securities litigation over risk-factor disclosures, Reuters (Jan. 24, 2024, 11:50 AM), https://www.reuters.com/legal/legalindustry/beware-potential-securities-litigation-over-risk-factor-disclosures-2024-01-24/.
[211] Meta Platforms, Inc., Annual Report at 52 (Form 10-K) (Feb. 1, 2024).
[212] Id. at 10.
[213] Id. at 56.
[214] Alphabet, Inc., Annual Report at 19 (Form 10-K) (Jan. 30, 2024).
[215] Snap, Inc., Annual Report at 51 (Form 10-K) (Feb. 6, 2024).
[216] Id. at 36.
[217] Alphabet, Inc., supra note 214, at 10.
[218] Id.
[219] Snap, Inc., supra note 215, at 4.
[220] Meta Platforms, Inc., supra note 211, at 40.
[221] See generally Moody v. NetChoice, LLC, 603 U.S. 707 (2024).
[222] Soc. Media Litig. II, 753 F. Supp. 3d 849, 889 (N.D. Cal. 2024).
[223] See generally Sarah S. Seo, Failed Analogies: Justice Thomas’s Concurrence in Biden v. Knight First Amendment Institute, 32 Fordham Intell. Prop. Media & Ent. L.J. 1070 (2022).