
This Article addresses a critical but underexplored aspect of content moderation: if a user’s online content or actions violate an Internet service’s rules, what should happen next? The longstanding expectation is that Internet services should remove violative content or accounts from their services as quickly as possible, and many laws mandate that result. However, Internet services have a wide range of other options—what I call “remedies”—they can use to redress content or accounts that violate the applicable rules. This Article describes dozens of remedies that Internet services have actually imposed. It then provides a normative framework to help Internet services and regulators navigate these remedial options to address the many difficult tradeoffs involved in content moderation. By moving past the binary remove-or-not remedy framework that dominates the current discourse about content moderation, this Article helps to improve the efficacy of content moderation, promote free expression, promote competition among Internet services, and improve Internet services’ community-building functions.
