The role of dominant content platforms like Facebook and Twitter in facilitating Russian election interference in the 2016 US presidential election has precipitated a backlash against “big tech,” and now the pendulum is swinging toward greater regulation of platforms for what their users say and do. This paper examines an illustration of that effort in the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), the first-ever statutory amendment to Section 230 of the Communications Decency Act, and the corporate turn to algorithmic moderation in the shadow of regulation in the United States and abroad. We urge caution in both legislative and corporate responses to the current crisis of confidence in technology, while counseling openness to solutions for improving speech environments online.
Platform Justice: Content Moderation at an Inflection Point, Hoover Institution.