8chan is the anonymous forum that became the staging point for #GamerGate after its participants were booted off 4chan for coordinating harassment and doxxing “operations.”
8chan bills itself as being a so-called pure bastion of free speech, and in the midst of Reddit’s recent meltdown over the banning of hateful and abusive subs, it became the new home of many disaffected Redditors.
Well, apologies to those who predicted that a small amount of moderation would be the death of Reddit: they were wrong. In fact, 8chan is the forum facing marginalization, as Google recently stopped listing the site in its search results due to reports of child pornography being hosted there.
Voat, the other major Reddit-style alternative site that users flocked to in the wake of the #RedditRevolution, faces similar problems:
First Voat’s hosting provider, Host Europe, shut down its servers on the grounds that the site tolerated “illegal right-wing extremist content.”
Days later, PayPal — the vendor that Voat was using to process more than $8,000 in donations — froze the site’s account, ruling that it violated PayPal’s policy against “certain sexually oriented materials or services.”
Not five hours later, those “sexually oriented materials” came to light on Voat’s front page: Many of the girls pictured nude in the site’s “jailbait” forum, one user observed, are clearly underage.
And it’s not just the sites that GamerGate and the like frequent. The message boards for the virtual pet website Neopets descended into chaos when their filtering system failed during a facility move, turning the previously kid-friendly site into this:
These incidents drive home the importance of some basic level of moderation and sorting in any online community. Typical narratives about the importance of free speech use the image of the town hall where all ideas, good, bad, and ugly, are voiced and the best ideas rise to the top because the bad ones are vigorously debated and publicly demonstrated to be erroneous.
But online communities, as these episodes demonstrate, cannot function in the same way for several reasons. First, the assumption with the town hall model is that everyone who attends the town hall has the same goal in mind: the preservation and improvement of their community. Some folks might have awful or ignorant ideas about how to achieve this goal, but the assumption is that everyone genuinely wants to contribute to the welfare of the whole. In fact, we can further assume that, according to this model, if, say, a spy were to arrive from another town whose goal it was to undermine the community from the inside, it would be very difficult for him to do so in any lasting way because he would be so outnumbered by rational people working for success that his plans would never take permanent hold. They might be tested out, but once they failed, they would be replaced with other plans.
Sounds great, right?
Well, on the Internet, we cannot assume that every participant in a community has the community’s best interest at heart. Everyone who spends their time online has encountered trolls, people whose sole purpose is to disrupt and destroy a community. Their motivations may be ideological or they might simply be in it for the lulz, but they function like the spy in the example above. In moderated communities, they pop up constantly and are banned once they are discovered. But this doesn’t stop the trolls from trying to do harm to the community. They just make new accounts and start the process all over again. Moderating communities online is like playing Whack-A-Mole. You just keep knocking down the same instigators over and over again. But that work is what allows the rest of the community to function. Spies keep getting kicked out of the town hall while they change their disguises over and over again, but overall, a majority of members who are actually invested in the community is maintained.
On a board with no moderation, however, the spies will quickly come to outnumber the earnest participants. Trolls arrive, sow mischief, and are allowed to stay. And so the news spreads that there is a space where trolls can gather and operate in peace. As 8chan’s affiliation with GamerGate demonstrates, these sites even become bases from which trolls can launch attacks into other communities (the town hall becomes a spy headquarters intent on upending order in all the neighboring towns). And so the balance quickly tips over so that those whose primary goal is to destroy other communities become the majority.
But the great irony is that these spies end up undoing themselves by wrecking the community that welcomed them in addition to the communities they target. Trolls care not whether a community will grow and prosper or even remain in existence. They are happy to jump from site to site leaving nothing but ruins in their wake. They don’t care if they use up and discard 8chan or Voat. They don’t care if their town hall is quarantined from the rest of the world because of their own bad or illegal behavior. When the champions of free speech who invited them in find themselves in legal trouble, or labeled as child pornography supporters by the world’s biggest and most popular search engine, or denied access to funds and hosting services, well, that’s just more lulz for the trolls to enjoy.
Now I’m not saying that all speech online should be carefully monitored and scrutinized. I agree with the town hall model that debate is important and censorship can damage the integrity and intellectual capacity of a community. However, there do need to be SOME rules governing how a town hall meeting is to be conducted. Otherwise, don’t be surprised when it burns to the ground.
Update: Check out this piece for some more ideas about how certain forms of moderation can actually encourage mobbing/brigading behavior: “GamerGate is Going after SXSW Panels: How ‘The Downvote’ Give Power to the Mob”