The First Amendment as a Parchment Barrier – the US Supreme Court’s TikTok Decision



[This is a guest post by Rudraksh Lakra.]


Introduction

The Supreme Court of the United States (“SCOTUS”), in TikTok Inc. v. Garland on January 17, 2025, rejected TikTok’s appeal, thereby upholding the Protecting Americans from Foreign Adversary Controlled Applications Act, 2024 (“PAFACA Act”). Passed in April 2024, the Act required ByteDance Ltd., TikTok’s parent company, to divest its interest in TikTok or face a ban. The Act does not ban the platform directly; rather, it imposes onerous penalties on companies like Apple and Google for distributing, maintaining, or updating an application classified as a foreign adversary-controlled application, in this case TikTok. The law gave ByteDance 270 days to divest its interest in TikTok. During this period, TikTok challenged the measure unsuccessfully, first before the District of Columbia Circuit in September 2024, and later before SCOTUS in December of the same year. On January 19, TikTok went offline in the United States, but came back online after 12 hours on the basis of assurances from then-President-elect Donald Trump. On January 20, the day of his inauguration, he followed up on those assurances by issuing an executive order delaying the ban for 75 days. However, this decision has faced opposition, and its ultimate impact on the platform remains uncertain.

This analysis examines SCOTUS’s decision in TikTok Inc. v. Garland, focusing on both the standard of review applied by the Court and the merits of the case. The critique argues that SCOTUS’s reasoning was excessively deferential to the government’s position. The Court achieved this deference through two strategic moves: (1) incorrectly applying a lower standard of review, intermediate scrutiny, and (2) failing to meaningfully engage with the government’s invocation of speculative national security concerns and with the alternative measures proposed by the petitioners.

Analysis

Standard for Review

SCOTUS held that the PAFACA Act is a content- and viewpoint-neutral measure, making it subject to intermediate scrutiny rather than strict scrutiny (Page 13). Intermediate scrutiny, a lower standard of review, requires the government to demonstrate an important or substantial interest, as opposed to the compelling interest required under strict scrutiny. Unlike strict scrutiny, intermediate scrutiny does not require the adoption of the least restrictive means; it asks instead whether the measure is narrowly tailored. Additionally, intermediate scrutiny grants greater deference to the state. Strict scrutiny is thus a significantly more rigorous standard, and laws subjected to it have historically rarely survived.

The adoption of the intermediate scrutiny standard set the tone for the entire case. It enabled the Court to grant wide deference to the state, avoiding a meaningful examination of alternatives and of the national security claims made by the government. The Court applied intermediate scrutiny because the “TikTok-specific designation and divestiture requirement” were found to regulate TikTok based on a content-neutral interest—namely, data protection linked to national security (Page 11-12). The cited data security concern was that TikTok collected “personal data from 170 million U.S. users” (ibid.). Thus, according to SCOTUS, the rationale for the PAFACA Act was data security. As Chief Justice Roberts noted during the oral proceedings, “Congress doesn’t care what’s on TikTok” (Page 81).

However, the Court overlooked the government’s briefs submitted to SCOTUS and to the Court of Appeals for the DC Circuit, which highlighted TikTok’s potential to enable China-backed covert propaganda and manipulation as a key reason for the restrictions. The DC Circuit, in its impugned decision, noted that the government justified the Act substantially by referencing “a foreign adversary’s ability to manipulate content seen by Americans” (Page 25). This was also reflected in the congressional hearings on TikTok, where there were repeated attempts to characterize TikTok’s Singaporean CEO as a Chinese citizen or a CCP member, tapping into anti-China sentiment (Page 7). Statements by lawmakers further indicate their objections to certain types of content on the platform, such as pro-Palestinian content.

The content- and viewpoint-neutral assumption is also undermined by the text of the law itself. The statute targets only applications that host expressive “content,” and it explicitly singles out TikTok by name. It creates exceptions for certain types of platforms, such as e-commerce or review platforms, which could raise data security concerns similar to TikTok’s. At the heart of the “qualified divestiture” provision is a concern with TikTok’s algorithm rather than with data. The provision precludes any purchaser from obtaining input from the platform’s former ownership concerning the “content recommendation algorithm.” Consequently, even if the platform were sold, the Act would still require changes to how TikTok curates and recommends “content.”

This indicates that TikTok’s content curation is integral to the PAFACA Act. TikTok’s content recommendation algorithm is itself an expressive function protected by the First Amendment (Moody v. NetChoice, 2024). Further, the Act affects the ability of millions of U.S. users and creators (also petitioners in this case) to speak, collaborate with editors and publishers of their choice, and hear ideas from others—rights tied to freedom of association on the platform. SCOTUS held in Brown v. Entertainment Merchants Association, 2011, that “the government has no power to restrict expression because of its message, its ideas, its subject matter, or its content.” This protection extends even when foreign entities are involved in directing speech (Meese v. Keene, 1987; Lamont v. Postmaster General, 1965).

From the law’s language, stated objectives, and broader political context, it becomes clear that PAFACA aims to regulate specific types of user-generated content on TikTok while controlling viewpoints promoted by its algorithm. The law either mandates divestiture—potentially altering editorial choices—or leads to a platform shutdown, thereby restricting the dissemination of viewpoints disfavored by the government. Thus, PAFACA should have been subjected to strict scrutiny due to its implications for free expression and viewpoint discrimination.

Least Restrictive Measure

SCOTUS noted that the “petitioners parade a series of alternatives” and that “those alternatives do not alter our tailoring analysis” (Page 16). However, it provided no explanation for why these alternatives do not affect the analysis. The Court immediately followed this remark by noting that “petitioners’ proposed alternatives ignore the ‘latitude’ we afford the Government to design regulatory solutions to address content-neutral interests” (Page 17). Further, the Court stated:

“[T]he validity of the challenged provisions does not turn on whether we agree with the Government’s conclusion that its chosen regulatory path is best or ‘most appropriate’… We cannot displace [the Government’s] judgment respecting content-neutral regulations with our own, so long as its policy is grounded on reasonable factual findings supported by evidence that is substantial for a legislative determination.” (ibid.)(Citing United States v. Albertini 1985 and Turner Broadcasting System, Inc. v. FCC 1997)

Thus, the Court did not meaningfully engage with any alternative measures, justifying its approach by deferring to the government under the rubric of content-neutral regulation. If strict scrutiny had been applied—as argued earlier, it should have been—it would have been much more difficult for the Court to evade this duty. The alternatives offered by the petitioners included, inter alia, disclosure requirements, data-sharing restrictions, a proposed national security agreement, the general designation provision, and data protection requirements for social media platforms (Page 16). Disclosure requirements have traditionally been a remedy for consumer welfare concerns (Zauderer v. Office of Disciplinary Counsel, 1985; American Meat Institute v. Department of Agriculture, 2014). The next three alternatives would have addressed the data security concerns, if data security were indeed the primary objective of the law.

The petitioners also raised concerns regarding the underinclusiveness of the PAFACA Act, as it exempts from designation as “covered companies” those entities whose “primary purpose is to allow users to post product reviews, business reviews, or travel information and reviews” (Page 42). The petitioners argued that there is no reason why TikTok should be a more pressing concern than large e-commerce companies, especially those actually based in China (Page 43-44). This underinclusiveness singles out an expressive entity, i.e., TikTok, and SCOTUS has held that discrimination based on the “identity of the speaker is offensive to the First Amendment” (Citizens United v. Federal Election Commission, 2010; Williams-Yulee v. Florida Bar, 2015).

National Security and Data-security Related Concerns

SCOTUS’s approach to the government’s national security claims involved granting considerable deference and accepting speculative concerns. The primary data security issue revolved around TikTok’s collection of data from millions of Americans, which is allegedly sent back to its parent company, ByteDance (Page 13-15). Under Chinese law, the government has extensive powers to access or monitor this data (ibid.). However, the government did not present any concrete evidence that China has utilized this data for strategic national security purposes. The Court recognized this lack of evidence, stating, “even if China has not yet leveraged its relationship with ByteDance Ltd. to access U.S. TikTok users’ data, petitioners offer no basis for concluding that the Government’s determination that China might do so is not at least a ‘reasonable inference based on substantial evidence’” (Page 14-15). Before the DC Circuit Court, the government had acknowledged “that it lacks specific intelligence that shows the PRC has in the past or is now coercing TikTok into manipulating content in the United States” (Page 47).

The Court’s rationale was predicated on a notion of a ‘reasonable possibility’ of harm, which diverges from the standard established in Brandenburg v. Ohio, 1969, built around the “imminent lawless action” principle. In cases like Landmark Communications, Inc. v. Virginia, 1978, and New York Times Co. v. United States, 1971, the Court held that a security justification for suppressing speech “must not be remote or even probable; it must immediately imperil.” Similarly, in Holder v. Humanitarian Law Project, 2010, SCOTUS required immediacy for a national security invocation, accepting the law at issue as necessary to “prevent imminent harms,” not speculative risks. The Court here failed to engage with these standards and accepted the government’s speculative concerns.

SCOTUS noted the government’s worry that China’s access to TikTok data may enable “it to track the locations of federal employees and contractors, build dossiers of personal information for blackmail, and conduct corporate espionage” (Page 14). Mueller and Farhat noted in their national security threat analysis of TikTok that the data collected by the platform “can only be of espionage value if it comes from users who are intimately connected to national security functions and use the app in ways that expose sensitive information. These risks arise from the use of any social media app, not just TikTok, and cannot be mitigated by arbitrarily banning one app.” Overall, there is minimal empirical evidence to support claims that the Chinese government has exploited TikTok in a manner that poses an immediate threat to United States national security, particularly through content manipulation or downstream harm resulting from data collection (see here and here).

Conclusion

When the different strands of analysis above are brought together, there is, in my opinion, a compelling argument that the PAFACA Act failed to meet First Amendment requirements. The Act relied primarily on speculative assertions regarding national security threats, overlooked less restrictive and more precisely tailored alternatives, and was ultimately underinclusive. Moreover, it disproportionately impacted TikTok’s creators and users, who depend on the platform to exercise their rights to free speech and association. Their voices were notably absent from the majority’s decision, which contained only one line acknowledging their concerns: “There is no doubt that, for more than 170 million Americans, TikTok offers a distinctive and expansive outlet for expression, means of engagement, and source of community.” This acknowledgement appeared at the end of the decision and lacked substantive engagement with the rights of these users. In contrast, the concurrence by Justice Sotomayor at least recognized that “the Act implicates content creators’ ‘right to associate’ with their preferred publisher ‘for the purpose of speaking.’” Similarly, Justice Gorsuch, in his concurring opinion, raised concerns, “harbor[ing] serious reservations about whether the law before us is ‘content neutral’ and thus escapes ‘strict scrutiny,’” referencing the petitioners’ submissions.

In Whitney v. California (1927), decided in the wake of the First Red Scare that followed World War I, the Supreme Court upheld the conviction of an individual under a law that criminalized even mere advocacy of and participation in Communist Labor Party activities. The Court concluded that the goals of the party were “inimical to the public welfare, tending to incite crime, disturb the peace, or endanger the foundations of organized government and threaten its overthrow,” relying on what was essentially the “bad tendency test.” TikTok Inc. v. Garland bears certain similarities to Whitney v. California. In the latter, anxieties during the First Red Scare arose from the perceived threat of what was termed “radicalism at home”; today, there is the spectre of China as a looming “superpower,” and U.S.-China relations have been strained over the last decade. Based on speculative data security and systemic manipulation concerns, the law was passed by both houses of Congress and subsequently upheld by the DC Circuit Court and SCOTUS on similarly thinly veiled national security grounds. The real reason for the ban, it should be evident, was the government’s disagreement with the substance of the content posted by TikTok’s users and with the platform’s alleged editorial choices in disseminating that content. Interestingly, Justice Kagan remarked during the oral proceedings that the Solicitor General employed arguments reminiscent of those used during the Red Scare (Page 133-34). By upholding the PAFACA Act, SCOTUS has effectively weakened First Amendment protections and significantly expanded the government’s authority to restrict speech under the pretext of national security. This precedent could pave the way for more repressive policies in the future.


