Dominant social media platforms increasingly use automation and AI to find and remove problematic content. While this helps stop some of the worst material from spreading online, algorithmic content moderation can remove content that should stay up (“overblocking”) or discriminate against minorities. Crucially, platforms offer very little transparency about how algorithmic content moderation works, how accurate these technologies are believed to be, and how much content they remove, especially without human review. As the use of such technologies is only likely to increase, regulators should take the initiative on transparency by requiring platforms to make disclosures.
Keywords: transparency, accountability, multi-level disclosure rules, platform governance, algorithmic content moderation, redress mechanism
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the “Influence” indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| Popularity | Reflects the “current” impact/attention (the “hype”) of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
| Views | Usage count provided by UsageCounts. | 26 |
| Downloads | Usage count provided by UsageCounts. | 16 |