But the Wiki-PR operation disclosed in the Signpost letter is the largest network of sock-puppet user accounts ever discovered on Wikipedia, reflecting the growing efforts companies are making to influence their online profiles.
The Wikipedia pages involved run the gamut from small technology startups and minor rock bands to large multinationals such as Viacom Inc. The investigation has also led to the blocking of 323 accounts for violating the site’s terms of service, with a further 84 under suspicion.
The investigation was prompted by editors’ suspicions about patterns of contributions made by certain users.
Wikipedia administrators, or admins, are the elite of the website’s huge army of volunteer editors, and have special privileges to resolve disputes and promote or block users for violating terms of service.
The idea that some of them might be in the pay of PR companies is disturbing to the Wikipedia community, because it challenges the integrity of the website’s self-policing ethos.
But the fact that Wiki-PR’s efforts were exposed says something about the site’s transparency, said Shane Greenstein, a professor at the Kellogg School of Management, Northwestern University, and the author of a study on the neutrality of Wikipedia.
“There is a working assumption: Sunlight is healthy,” he said Monday at an event organized by the Hudson Institute, a conservative Washington think tank, noting that an editor’s record was viewable.
The longest-running of the Wiki-PR sock-puppet accounts, for instance, known as Morning277, had been active since 2008 and made more than 6,000 edits.
By examining the editing histories of suspect accounts like Morning277, and connecting them to other accounts that worked on the same pages or logged on from the same computers, the volunteer investigators uncovered the whole sock-puppet network.
“The Wikipedia community is suspicious of edits from people with no track record,” Mr. Greenstein said.
In his own research, dealing with political bias, Mr. Greenstein said, he found that there was evidence to support the site’s vision of a self-correcting collaborative process.
He examined more than 70,000 pages relating to U.S. politics and found that the articles that were more heavily edited — with the most readers and contributors — tended to be less biased and more accurate.
“The entries that have gotten the most attention within Wikipedia look most like the [comparable entries in Encyclopedia] Britannica,” he said.
The problem was that, even with more than 3,000 regular editors, “there is only so much editorial attention to go around,” he said.