The Supreme Court may be ready to weigh in on an issue that, even more than other far-reaching matters on its docket, affects almost every citizen almost every day: the Internet. In doing so, the justices have an opportunity to make a confusing area of governance less murky. They also have a chance to do a lot of damage along the way.
A divided panel of the US Court of Appeals for the Fifth Circuit last week upheld a Texas law that prohibits online platforms from removing user-generated material from their sites based on the viewpoint of the user or the viewpoint expressed in a post. Earlier this year, a unanimous panel of the US Court of Appeals for the 11th Circuit found that a Florida law similarly restricting tech companies violated the First Amendment. Now Florida has asked the Supreme Court to weigh in. The court, if it agrees to take the case, will face questions about governments’ ability to regulate speech in the digital age, questions that both sides have so far approached as all or nothing but that require nuance and care.
Those two attributes were conspicuously missing from Judge Andrew Oldham’s majority opinion in NetChoice v. Paxton, the Fifth Circuit case, which denies any First Amendment protection for what most people call content moderation by platforms, and what its author insists on calling censorship. This conflicts with ample precedent on the right of corporations to decide what kind of speech they will present. But most alarming are the blatant mischaracterizations of social media sites the opinion uses to justify this position. The claim that neo-Nazi and terrorist materials are “borderline hypothetical” ignores the platforms’ documented and ongoing game of whack-a-mole with that kind of hate. The claim that the sites “exercise virtually no editorial control or judgment” somehow misses the millions of pieces of content they review daily, and the many more that algorithmic filters prevent from ever showing up.
This last point is supposed to prove that the government can classify platforms as “common carriers,” like railroads or phone providers, and require them not to discriminate. Those on the opposite side of this debate argue that this is the wrong analogy, and it is. But the alternative they propose is just as shaky: that these platforms are more like newspapers or broadcasters. The truth lies somewhere in between. Social media sites act as a kind of public utility; they also exercise the editorial control and judgment that are essential to the value they provide. They exist in a category of their own, and so far no court has figured out what standard should apply to them, or what types of speech regulation the Constitution allows, from the extreme restrictions in Texas and Florida, to the more moderate transparency mandates under consideration elsewhere, to nothing at all.
It seems more likely than ever that the Supreme Court will confront these questions in the near future. When it does, the justices must resist the temptation of seemingly easy answers that ignore the more difficult realities of the digital age.
The Post’s View | About the Editorial Board
Editorials represent the views of The Washington Post as an institution, as determined through debate among members of the Editorial Board, based in the Opinions section and separate from the newsroom.
Members of the Editorial Board and areas of focus: Deputy Editorial Page Editor Karen Tumulty; Deputy Editorial Page Editor Ruth Marcus; Editorial Page Associate Editor Jo-Ann Armao (Education, DC Affairs); Jonathan Capehart (national politics); Lee Hockstader (immigration; issues affecting Virginia and Maryland); David E. Hoffman (global public health); Charles Lane (foreign affairs, national security, international economics); Heather Long (economics); Molly Roberts (technology and society); and Stephen Stromberg (elections, White House, Congress, legal, energy, environment, health).