When users abuse a platform, who is responsible?

What do we expect of content moderation? And what do we expect of platforms?

Last week, Logan Paul, a 22-year-old YouTube star with 15 million-plus subscribers, posted a controversial video that brought these questions into focus. Paul’s videos, a relentless barrage of boasts, pranks, and stunts, have garnered him legions of adoring fans. But he faced public backlash after posting a video in which he and his buddies ventured into the Aokigahara Forest of Japan, sometimes called the “suicide forest,” only to find the body of a young man who appeared to have recently hanged himself.

Rather than turning off the camera, Paul continued his antics, pinballing between awe and irreverence, showing the body up close and then turning the attention back to his own reaction. The video lingered on the body, including close-ups of the man’s swollen hand. Paul’s reactions were self-centered and cruel.

After a blistering wave of criticism in the comment threads and on Twitter, Paul removed the video and issued a written apology, which was itself criticized for not striking the right tone. A somewhat more heartfelt video apology followed. He later announced he would be taking a break from YouTube. YouTube went on to remove Paul from its top-tier monetization program, and announced yesterday that he would face “further consequences.”

The controversy surrounding Paul and his video highlights the need, now more than ever, to reconsider the public responsibilities of social media platforms. For too long, platforms have enjoyed generous legal protections and an equally generous cultural allowance: to be treated as “mere conduits,” not liable for what users post to them.

In the shadow of this protection, they have constructed baroque moderation mechanisms: flagging, review teams, crowdworkers, automatic detection tools, age barriers, suspensions, and verification status.