Click capitalism: PR firms cash in cleaning up clients’ Wikipedia pages

- The Washington Times - Monday, October 21, 2013

A public relations and reputation management firm has been using hundreds of hidden identities to create and edit Wikipedia entries for its clients, according to an investigation that underscores questions about the credibility and reliability of the sixth most visited site on the Internet.

An investigation by volunteer contributors to the online, anyone-can-edit-it encyclopedia uncovered "a sophisticated array of concealed user accounts," known in Internet slang as "sock puppets," employed by a company called Wiki-PR.

The false accounts "created, edited, or maintained several thousand Wikipedia articles for paying clients," according to Signpost, an online newsletter written by and for Wikipedia's army of volunteer editors.

"Paid advocacy editing is extremely problematic [and] we consider it a 'black hat' practice," said Sue Gardner, executive director of the Wikimedia Foundation, the nonprofit that runs Wikipedia.

Wiki-PR dismissed charges of any wrongdoing but acknowledged that mistakes had been made.

"The 'PR' in Wiki-PR is a misnomer — we're a research and writing firm," founder and CEO Jordan French said in an email to The Washington Times. "We counsel our clients on how to adhere to Wikipedia's rules. We research the subject and write in an accurate and properly referenced way about it. What we do is get Wikipedia to enforce the rules so our clients are presented accurately."

He acknowledged that the firm's editors sometimes made "bad calls on 'notability,'" the question of whether or not a subject has enough news value to merit a Wikipedia page.

"We do paid editing and not paid advocacy," he added, insisting that the linked user accounts uncovered by the investigation all belonged to real people, many of whom also edited Wikipedia as volunteers in their spare time. As a result, he said, "Volumes of Wikipedia pages we didn't work on were wrongly swept into that investigation."

But Signpost said the "problems with these articles were far from limited to notability."

For example, footnotes in the paid-for pages were often labeled as if they pointed to news organizations such as CNN or Yahoo, but in reality linked to citizen-journalism services that accept unsolicited content, such as CNN iReport and Yahoo Voices. Sometimes the stories referenced did not even mention the company or product the Wikipedia page was about.

Tellingly, investigators found, many of the suspect user accounts had worked together to modify pages, without communicating on the special talk pages provided for editors to discuss changes.

Longtime users confirm that it is unusual for two or more editors to work together on an article and never communicate, suggesting either that the accounts were really a single person or that they had been collaborating offline.

Wiki-PR's own website claims it will "directly edit your [Wikipedia] page using our network of established Wikipedia editors and admins."

Wiki-PR is one of a host of companies that have sprung up in the past five years in the field of for-profit online reputation management. These firms provide companies with services, some considered ethical and others not, ranging from shaping Google search results to countering bad reviews or inaccurate information on community sites such as Yelp.

This is not the first time that PR professionals have been accused of abusing the voluntary, self-policing character of Wikipedia to try to make clients' pages more favorable, nor the first time false user accounts have been exposed. For example, in March, Wikipedia editors said a BP PLC employee had edited the company page to "whitewash" its environmental record.

But the Wiki-PR operation disclosed in the Signpost newsletter is the largest network of sock-puppet user accounts ever discovered on Wikipedia, reflecting the increasing efforts companies are making to influence their online profiles.

The Wikipedia pages involved run the gamut from small technology startups and minor rock bands to large multinationals such as Viacom Inc. The investigation also has led to the blocking of 323 accounts for violating the site's terms of service, with another 84 under suspicion.

The investigation was prompted by editors' suspicions about patterns of contributions made by certain users.

Wikipedia administrators, or admins, are the elite of the website's huge army of volunteer editors, and have special privileges to resolve disputes and promote or block users for violating terms of service.

The idea that some of them might be in the pay of PR companies is disturbing to the Wikipedia community because it undermines the website's ethos of a self-policing community.

But the fact that Wiki-PR's efforts were exposed says something about the site's transparency, said Shane Greenstein, a professor at Northwestern University's Kellogg School of Management and the author of a study on the neutrality of Wikipedia.

"There is a working assumption: Sunlight is healthy," he said Monday at an event organized by the Hudson Institute, a conservative Washington think tank, noting that an editor's record was viewable.

The longest-running of the Wiki-PR sock-puppet accounts, for instance, called Morning277, had been active since 2008 and made more than 6,000 edits.

It was by examining the editing histories of suspect accounts like Morning277, then connecting them to other user accounts that worked on the same pages or logged on from the same computers, that the volunteer investigators uncovered the whole sock-puppet network.

"The Wikipedia community is suspicious of edits from people with no track record," Mr. Greenstein said.

In his own research, dealing with political bias, Mr. Greenstein said, he found that there was evidence to support the site's vision of a self-correcting collaborative process.

He examined more than 70,000 pages relating to U.S. politics and found that the articles that were more heavily edited — with the most readers and contributors — tended to be less biased and more accurate.

"The entries that have gotten the most attention within Wikipedia look most like the [comparable entries in Encyclopedia] Britannica," he said.

The problem was that, even with more than 3,000 regular editors, "there is only so much editorial attention to go around," he said.

© Copyright 2014 The Washington Times, LLC.