By Matthew Lewis
October 10, 2022
Social media, while a valued tool for enhancing communication and community, has caused deep polarization and societal harm to the American public. Recent research shows that social media is driving rising rates of depression and anxiety in children. As social media becomes a persistent staple of life for American children, the platforms continue to operate unregulated as tools through which children are made to feel isolated and judged, in ways that are proving to be a source of long-term trauma. Social media executives are not blind to the harm they are causing to children; what remains unclear is how much they knew about that harm and the extent to which they exploited it for their own financial gain.
Presently, as Congress remains stalled in its attempts to regulate social media firms,1 state attorneys general are meeting the moment by launching their own investigations into the unique and pervasive harm of social media to children. In two independent investigations, coalitions of state attorneys general are seeking to examine the sources of social media’s harm to children and to uncover what executives at major social media sites knew about, and how they exploited, that harm. These investigations are a vital step, but they should go further. State attorneys general have yet to issue subpoenas and file complaints that would require disclosure of social media platform operations. Moreover, state attorneys general hold additional untapped power to curb the harms of social media through legislative and public advocacy.
This piece begins by exploring the unique and challenging harm that social media presents to children. It then examines the power state attorneys general have to address that harm in the public interest.
I. Social Media Operation and its Harms
Despite social media becoming an important part of everyday social life for many children – approximately 38 percent of children aged eight to twelve report using social media2 – the process by which these platforms serve content to users remains highly opaque to the general public. This process, known as an “algorithm,” filters content for users based on their disclosed preferences and past interactions on the platform.3 Social media firms design algorithms that weigh a series of factors against one another in order to maximize a user’s time on the site. The firms then sell user data to companies seeking to advertise to their target demographics. The harm to children resides in how, for profitmaking purposes, the algorithms amplify content that exacerbates a child’s insecurities and vulnerabilities. Social media sites are commercially driven to maximize the time children spend on their platforms. But in amplifying the content most likely to drive user engagement, firms are also promoting the content most likely to leave children feeling alienation, self-doubt, depression, and anxiety.
A wide swath of data supports the hypothesis that social media harms the mental health of children. In 2017, the independent research charity Royal Society for Public Health found that social media platforms harm children by exacerbating anxiety and loneliness and by worsening body image and sleep. The study, which asked teens to rate how different platforms affected them, consistently ranked Instagram as the most harmful. Moreover, an additional study by University of Cambridge researchers found that girls aged eleven to thirteen and boys aged fourteen to fifteen were less satisfied with their lives just a year after increasing their time on social media. In a 2021 Atlantic piece, New York University Professor Jonathan Haidt diagnosed the ills of social media by explaining that teenagers post and wait for likes in a judgmental game in which those who don’t “play” may be ostracized from social circles. Ultimately, this leaves parents in a difficult bind: permit their children to use an app likely to harm their mental health, or risk their exclusion from social circles centered on the platforms.
While the sites purport to operate under a self-policing form of regulation meant to stifle the harms of cyberbullying, these practices do little to stem the harmful social pressure perpetuated on the platforms.4 Moreover, the self-regulation system ignores executives’ ability to break their own rules. For example, 2021 reporting on the “Facebook Papers” found that Facebook (now known as Meta) maintains a list of VIPs who are permitted to break Facebook’s own anti-violence and misinformation policies because they are influential users. Given the futility of social media firms’ self-regulation, state attorneys general are needed to chart a path toward reform that better protects their states’ children and informs parents of the harms of social media. As Nebraska Attorney General Doug Peterson stated when launching a multi-state investigation into the ills of social media platforms: “When social media platforms treat our kids as mere commodities to manipulate for longer screen time engagement and data extraction, it becomes imperative for state attorneys general to engage our investigative authority under our consumer protection laws.”
II. Federal Inaction on Social Media’s Harm to Children
Congress’s lack of comprehensive reform of social media firms is a driving force behind the need for state attorneys general to pursue investigation and reform. Presently, Congress is pursuing a multitude of reforms and investigations into social media operations, with limited success. Congress’s legislative agenda on social media reform includes proposals to develop a digital bureau within the Federal Trade Commission, expand liability against social media firms through reform of Section 230 of the Communications Decency Act, and permit independent audits of platforms’ risk management practices.5 Additionally, Congress has held a series of hearings with senior social media executives in which legislators have demanded deeper knowledge of the field before legislating.6 Specific to social media’s effect on children, Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) have proposed an array of new rules for social media firms under the Kids Online Safety Act, which would enhance parental supervision and impose a “duty of care” to protect children from self-harm content.7
However, while reform proposals and hearings are worthy of Congress’s consideration, Congress’s legislative inertia and disjointed approach to policymaking stifle its ability to adequately address the unique problem of social media’s harm to children.8
Most recently, California Governor Gavin Newsom signed into law AB 2273, which prohibits social media platforms and other online service providers likely to be accessed by children “from using a child’s personal information; collecting, selling, or retaining a child’s geolocation; profiling a child by default; and leading or encouraging children to provide personal information.” The new law will require social media companies to study the harm they cause to children and will allow the state attorney general to access those reports on request. The bipartisan bill passed after the failure of another social media reform bill that would have permitted government lawyers to sue social media firms that cause addiction in child users.
The United States is not alone in its efforts to curb social media’s influence on children. Both the United Kingdom and Australia are seeking to limit the rising power of unchecked algorithms. In the United Kingdom, a pending bill known as the Online Safety Bill would penalize social media firms that fail to limit the spread of abusive content harmful to children – fines may reach as high as 10 percent of global turnover. Additionally, Australian legislators are weighing a bill that would require parental consent before social media companies use or release the personal information of users younger than sixteen.
III. State Attorneys General’s Power to Limit Social Media Harm
State attorneys general are equipped in three ways to serve the public interest and address the harms of social media to children: (A) investigating and litigating against social media platforms; (B) advocating for policy reform in their state legislatures, in Congress, and directly to platforms; and (C) educating the public. This section provides a roadmap for action by state attorneys general to address the rising harm of social media to children’s mental and emotional wellness.
A. Attorney General Investigation and Litigation into Social Media Platforms
Attorneys general have the power to fill the information gap on executive decision-making and the algorithmic operations of social media firms through their broad investigatory powers. A multi-state investigation into TikTok and Meta, involving California, Florida, Kentucky, Nebraska, New Jersey, Tennessee, and Vermont, is being co-led by Massachusetts Attorney General Maura Healey; an analysis of her office’s power to investigate and subpoena private companies is thus instructive. In public statements regarding the open investigation, Attorney General Healey has explained that the investigation into TikTok will focus on “the methods and techniques utilized by TikTok to boost young user engagement, including increasing the duration of time spent on the platform” as a means to uncover the “harm such usage may cause young people and what TikTok knew about those harms.”
The Massachusetts Attorney General is statutorily authorized to conduct investigations, including taking testimony under oath from parties involved in alleged misconduct, based on a “belief” that a person is violating the laws of the Commonwealth. Massachusetts General Laws chapter 93A, § 6 provides:
“The attorney general, whenever he believes a person has engaged in or is engaging in any method, act or practice declared to be unlawful by this chapter, may conduct an investigation to ascertain whether in fact such person has engaged in or is engaging in such method, act or practice. In conducting such investigation he may (a) take testimony under oath concerning such alleged unlawful method, act or practice; (b) examine or cause to be examined any documentary material of whatever nature relevant to such alleged unlawful method, act or practice….”9
In a case involving alleged unfair trade practices by Exxon Mobil, arising from potentially misleading statements to investors about the risks of climate change, a Massachusetts Superior Court found that the attorney general satisfied the § 6 requirements.10 The court held that Attorney General Healey satisfied both the belief and particularity requirements in her request for information related to climate change. First, the court found that “[t]here is no requirement that the Attorney General have probable cause to believe that a violation of G.L.c. 93A has occurred; she need only have a belief that a person has engaged in or is engaging in conduct declared to be unlawful.” Second, the court found that Attorney General Healey articulated with sufficient particularity the material required for the investigation in her request to Exxon for “essentially all documents related to climate change.”11 Allowing the attorney general to seek such a broad set of information in an investigation is what places attorneys general at the forefront of oversight while Congress fails to deliver federal reform.
In a social media investigation, Attorney General Healey may use the Exxon case as a template for pursuing information on the harms of social media and what executives knew at the time. Given the studies and past evidence of social media’s harm to children, and of the exploitation of that harm for higher screen time, Attorney General Healey would likely satisfy the “belief” standard under § 6 through her assertion that TikTok is violating consumer protection laws by “designing, operating, and promoting its social media platform to children, teens, and young adults in a manner that causes or exacerbates physical and mental health harms.” Furthermore, under the Exxon standard, Attorney General Healey may request “all documents related” to the operation of algorithms on adolescent users from any social media platform. Additionally, § 6 allows the Massachusetts Attorney General to compel testimony from any person who “has engaged in or is engaging in such method, act, or practice,” permitting the attorney general to pursue disclosure of the platforms’ algorithmic operations from top executives.12
Ultimately, the Massachusetts attorney general’s investigative power represents a viable path to disclosure where Congress has failed. By compelling disclosure, the attorney general will be better positioned to address social media’s harm to children through litigation, advocacy in the state legislature, and public education. As noted above, the Massachusetts attorney general is not alone in her pursuit of oversight and reform of social media companies. States such as New York, California, and Nebraska similarly maintain broad investigatory powers that should be harnessed to enact change in the industry.
B. Advocating for Social Media Reform in Congress, Directly to Platforms, and in State Legislatures
The current model of federal enforcement for the online protection of children is outdated, and state attorneys general must fill the gap with legislative advocacy. Under the present federal enforcement regime, websites are not permitted to collect personal data on children under the age of thirteen without parental consent per the Children’s Online Privacy Protection Act (COPPA), passed in 1998.13 The act, which was modernized in 2010, continues to provide the basis for federal action against online platforms.14 For example, in 2019 the FTC and the State of New York settled with YouTube’s owner Google under COPPA for $170 million over violations related to a lack of child identification on content and of notice of parental consent requirements.15 However, the Act’s rules on parental consent do not protect children from the newer harms of social media, such as depression and anxiety.
State attorneys general can advocate for legislative change in three arenas. First, they can press Congress to update federal law, either through independent letters or through an organizational forum. For example, in September 2021, a bipartisan coalition of state attorneys general argued for an update to federal antitrust laws in a letter to Congress. Second, state attorneys general may press for reform directly with private firms that threaten the public interest. Most recently, in May 2021, the National Association of Attorneys General addressed a letter to Facebook CEO Mark Zuckerberg urging the platform to end its plan for an exclusive under-thirteen version of Instagram – an initiative that Facebook ultimately abandoned. Finally, because federal law is often intended to provide only a floor for safety, state attorneys general may advocate in their own state legislatures for laws that heighten protections for children online.16 As the highest law enforcement officers in their states, attorneys general thus maintain legislative and direct advocacy outlets that can positively shape social media platforms for teenagers.
C. Public Education of Social Media Harms
According to famed Eastern District of New York Judge Jack Weinstein, the core of a government attorney’s role is to serve the public interest.17 One way an attorney general serves the public is by issuing warnings and tips to help consumers avoid the malfeasance of nefarious actors. Each March, the nation observes “National Consumer Protection Week,” which includes statements from the White House, the Federal Trade Commission, and state attorneys general, among others, on how consumers can be “fully informed about their rights and the potential risks in the marketplace.” As research on the harms of social media to children continues, public education directed toward parents represents a viable path toward curbing widespread trauma to children.
Presently, a number of attorneys general offer tips for parents on protecting children on social media sites. For example, New York Attorney General Letitia James provides a webpage with information and links on how parents can best be involved in their child’s use of social media. While informative, these sites lack comprehensive information on how social media platforms cause harm not only through cyberbullying but also through social comparison that systematically produces greater levels of anxiety and depression in children. It is therefore in the public interest for state attorneys general to modernize their public education programs by incorporating information about social media’s harms into consumer protection week and updating their websites with recent studies.
This piece serves as a basis for action by state attorneys general seeking disclosure and reform from social media platforms regarding their algorithmic operations on adolescent users. While Congress remains stalled in its efforts to probe executive decision-making around social media’s targeting of children, international reporting and scientific studies have confirmed social media’s role in exacerbating depression and anxiety among children. Given the legal standards and powers entrusted to their offices, state attorneys general should act to limit the harms of social media on children through investigation and litigation, direct advocacy to social media firms and legislatures, and public education for their residents.
Matthew Lewis, J.D. Class of 2022, N.Y.U. School of Law.
Suggested Citation: Matthew Lewis, The Role of the Attorney General in Reforming Social Media for Children, N.Y.U. J. Legis. & Pub. Pol’y Quorum (2022).
- 1Paul Barrett, Justin Hendrix, & J. Sims, Fueling the Fire: How Social Media Intensifies U.S. Political Polarization And What Can Be Done About it, N.Y.U. Stern Center for Business and Human Rights 23 (Sept. 2021), https://static1.squarespace.com/static/5b6df958f8370af3217d4178/t/613a4d4cc86b9d3810eb35aa/1631210832122/NYU+CBHR+Fueling+The+Fire_FINAL+ONLINE+REVISED+Sep7.pdf.
- 2Melinda Wenner Moyer, Kids as Young as 8 Are Using Social Media More Than Ever, Study Finds, N.Y. Times (Mar. 24, 2022), https://www.nytimes.com/2022/03/24/well/family/child-social-media-use.html.
- 3Id.
- 4Paul M. Barrett, The metaverse is the world’s strongest argument for social media regulation, The Hill (Feb. 23, 2022, 11:31 AM), https://thehill.com/opinion/technology/595333-the-metaverse-is-the-worlds-strongest-argument-for-social-media-regulation [https://perma.cc/UWZ7-T5EL].
- 5Barrett, supra note 1.
- 6For example, in a December 2021 hearing with the head of Instagram, Adam Mosseri, Congress demanded that Instagram share internal data on its algorithmic ranking system related to children to support its legislative agenda on heightened social media protections for adolescents. Cecilia Kang, Lawmakers urge the head of Instagram to better protect children, N.Y. Times (Dec. 8, 2021), https://www.nytimes.com/2021/12/08/technology/adam-mosseri-instagram-senate.html.
- 7The bill would impose upon covered platforms “a duty to prevent and mitigate the heightened risks of physical, emotional, developmental, or material harms to minors posed by materials on, or engagement with, the platform” including self-harm content, bullying, and “patterns of use that indicate or encourage addiction-like behaviors.” Kids Online Safety Act, S. 3663, 117th Cong. § 3 (2022).
- 8Former President Donald Trump, Senator Ted Cruz, and others repeatedly threaten regulation on social media platforms for its alleged anti-conservative bias despite independent studies finding no such prejudice. Without a more cohesive non-partisan approach to social media policy, reform remains unlikely to surmount partisan gridlock in Congress. Adam Gabbatt, Claim of anti-conservative bias by social media firms is baseless, report finds, The Guardian (Feb. 1, 2021, 12:19 PM), https://www.theguardian.com/media/2021/feb/01/facebook-youtube-twitter-anti-conservative-claims-baseless-report-finds [https://perma.cc/XW4Q-PZZA].
- 9Mass. Gen. Laws Ann. ch. 93A, § 6 (West 2022).
- 10In re Civ. Investigative Demand No. 2016-EPD-36, No. SUCV20161888F, 2017 WL 627305, at *5 (Mass. Super. Ct. Jan. 11, 2017).
- 11See id. at *1.
- 12Mass. Gen. Laws Ann. ch. 93A, § 6 (West 2022).
- 13Children’s Online Privacy Protection Rule, 16 C.F.R. § 312.2 (2013) (implementing the Children’s Online Privacy Protection Act of 1998, 15 U.S.C. § 6501-6508).
- 14See id.
- 15FTC v. Google, Case No. 1:19-cv-02642 (D.D.C. 2019) (Stipulated Order), www.ftc.gov/system/files/documents/cases/172_3083_youtube_coppa_consent_order_signed.pdf.
- 16For example, California is advancing a bill aimed at allowing parents to sue social media firms for damage caused by their children becoming addicted to the sites. Brian Contreras, California bill would let parents sue social media companies for addicting kids, L.A. Times (Mar. 16, 2022, 12:37 PM) https://www.latimes.com/business/technology/story/2022-03-16/california-bill-would-let-parents-sue-social-media-companies-for-addicting-kids. See also Geier v. Am. Honda Motor Co., 529 U.S. 861, 862 (2000) (discussing how a federal law may preserve the ability of states to “to establish greater safety than the minimum safety achieved by a federal regulation intended to provide a floor”).
- 17Jack B. Weinstein, Some Ethical and Political Problems of a Government Attorney, 18 Me. L. Rev. 155 (1966).