If you are reading this article on the Internet, stop afterward and think about it. Then scroll to the bottom and read the comments. Then recheck your views.
Chances are your thinking will have changed, especially if you have read a series of insulting, negative, or mocking remarks—as so often you will. Once upon a time, it seemed as if the Internet would be a place of civilized and open debate; now, unedited forums often deteriorate into insult exchanges. Like it or not, this matters: Multiple experiments have shown that perceptions of an article, its writer, or its subject can be profoundly shaped by anonymous online commentary, especially if it is harsh.
. . . .
Some news organizations have responded by heavily curating comments. One Twitter campaigner, @AvoidComments, periodically reminds readers to ignore anonymous posters: “You wouldn’t listen to someone named Bonerman26 in real life. Don’t read the comments.” But none of that can prevent waves of insulting commentary from periodically washing over other parts of the Internet, infiltrating Facebook or overwhelming Twitter.
If all of this commentary were spontaneous, it would simply be an interesting psychological phenomenon. But it is not. A friend who worked for a public relations company in Europe tells of firms that hire people to post anonymous comments: positive ones on behalf of their clients, negative ones about rivals. Political parties of various kinds, in various countries, are rumored to do the same.
. . . .
For democracies, this is a serious challenge. Online commentary subtly shapes what voters think and feel, even if it just raises the level of irritation, or gives readers the impression that certain views are “controversial,” or makes them wonder what the “mainstream” version of events is concealing. For the most part, the Russian trolls aren’t supplying classic propaganda, designed to trumpet the glories of Soviet agriculture. Instead, as journalists Peter Pomerantsev and Michael Weiss have written in a paper analyzing the new tactics of disinformation, their purpose is rather “to sow confusion via conspiracy theories and proliferate falsehoods.” In a world where traditional journalism is weak and information is plentiful, that isn’t very difficult to do.
But no Western government wants to "censor" the Internet, either, and objections will always be raised if government money is spent even on studying this phenomenon. Perhaps, as Pomerantsev and Weiss have also argued, we therefore need civic organizations or charities that can identify deliberately false messages and bring them to public attention. Perhaps schools, as they once taught students how to read newspapers, now need to teach a new sort of etiquette: how to recognize an Internet troll, how to distinguish truth from state-sponsored fiction.