- The Washington Times - Sunday, November 16, 2014

Americans began heading anew this weekend to President Obama’s official Obamacare Facebook page to gather information on the new round of health care enrollment, share their experiences shopping for insurance on the federal exchange and voice their opinions on the president’s signature domestic achievement.

However, what some would view as a robust marketplace of ideas is actually controlled by just a few, an analysis of the Web page shows.

Sixty percent of the site’s 226,838 comments generated from September 2012 to early last month can be attributed to fewer than 100 unique profiles, according to an analysis completed by The Washington Times with assistance from an outside data analytics team. Many of those profiles belong to single individuals who created multiple aliases or personas to widen their influence and multiply their voices.
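The kind of concentration analysis described above can be sketched in a few lines. The data, names and toy figures below are hypothetical stand-ins, not The Times’ actual dataset or method:

```python
from collections import Counter

def top_poster_share(comments, top_n=100):
    """Fraction of comments attributable to the top_n most active profiles.

    `comments` is a list of (profile_id, text) pairs -- a stand-in for
    scraped Facebook comment data, not the newspaper's actual dataset.
    """
    counts = Counter(profile for profile, _ in comments)
    top = sum(n for _, n in counts.most_common(top_n))
    return top / len(comments) if comments else 0.0

# Toy data: 3 heavy posters and 7 occasional ones.
toy = [("heavy%d" % (i % 3), "post") for i in range(90)]
toy += [("casual%d" % i, "post") for i in range(7)]
print(round(top_poster_share(toy, top_n=3), 2))  # -> 0.93
```

A share anywhere near the 60 percent the analysis found, from under 100 profiles out of hundreds of thousands of comments, is the signature of a very small group doing most of the talking.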

Cindi Huynh, an Obamacare supporter in California, posted an average of 59 times a day over a 60-day stretch, making her the No. 1 poster in that period. She posted only during work hours, as is the trend among the top 25 posters on the site, and never on weekends.

Over the past two years, Ms. Huynh has been a prolific poster — ranking twice in the top 25 profiles contributing to the site — once under the name “Cindi Huynh” and again as “Cyndi Huynh Vellucci.”

Ms. Huynh has had at least four Facebook profiles, she confirmed.


Ms. Huynh told The Times that she has never been paid for her posts but has volunteered for the California Democratic Party and was approached to become an Obamacare patient advocate. She said she was too busy to contribute in that way and felt she could better spread the message online. She has a full-time job but has declined to name her employer.

Not all of the prolific posters on the Obamacare site were as easily identifiable as Ms. Huynh. Once The Times made it known that it was conducting an investigation into the site’s audience, at least three pro-Obamacare commentators disappeared or deactivated their accounts.

Wanda Milner posted 4,695 times over the entire period evaluated, putting her in the top 25 commentators. Ms. Milner, who is from Canada, told The Times she was passionate about the issue and decided to become active on it. She denied having any aliases or being paid for her activity, but said fake pages were created to mock her. She has “liked” many of Ms. Huynh’s comments as well as those of other pro-Obamacare posters.

Paul J. Nunley is an anti-Obamacare poster who made 2,316 posts in 60 days, ranking behind only Ms. Huynh. A retired veteran from New Mexico, Mr. Nunley said he spends his days on the site trying to “rid it of misinformation.” In the process, he has made friends with Ms. Huynh and other top posters and has engaged in online fights that led to multiple timeouts of his profile.

Eileen A. Wolf, from North Dakota, posted 5,870 times in the entire period evaluated and 325 times in 60 days. She also used the account of her husband, James Wolf, to post prolifically under his name, Mr. Wolf said.

The official Obamacare page is controlled by Organizing for Action, the president’s former political action committee and now a nonprofit group. It has more than 771,000 Facebook “likes” and is updated every day with a new link promoting the policy.

The more likes, links, content and comments a page generates, the higher it moves in Google search results, meaning the very few who post on the site obsessively can drive it to the top of another person’s Google search, no matter the ideological tilt of the comments.

Organizing for Action declined to comment to The Times when asked whether it hired paid commentators to post on the site during high-traffic days or tried to spur online conversation through volunteers.

Organizing for Action also handles the president’s Twitter feed. This summer, it was found that nearly half of the president’s 43 million followers at the time appeared to be fake, according to researchers at Barracuda, a computer security company in Campbell, California. Organizing for Action also declined to comment at that time.

Organizing for Action employs Blue State Digital, the digital public relations mastermind behind Mr. Obama’s revolutionary online campaigns in both 2008 and 2012, according to Blue State Digital’s website. The firm is cited by its peers as a leader in using digital tools to create an online forum that is both persuasive and informative.

It was Blue State Digital that invented an online tool letting users organize a campaign event in their hometowns and then watch a Web “thermometer” rise as their contacts increased. Once somebody sent an email address to the organization, influential figures such as first lady Michelle Obama would respond, relaying the president’s talking points for that week.

With Blue State’s help, the Obama campaign in 2008 collected the email addresses of 13 million supporters and organized them into lists, segmenting them by social networks, contributions and favored issues, according to a Boston Globe profile written about the firm shortly after Mr. Obama’s first winning presidential election.

Blue State Digital, which is based in Boston, declined to comment for this article.

With Facebook’s 1.23 billion monthly users and Twitter’s more than 240 million, a campaign or political point of view can reach millions instantly, and trends on social media often become segments on the nightly news.

E.J. Dionne told his readers at The Washington Post during the 2012 presidential contest to watch their Twitter feeds to get an unfettered view of who won the second debate.

However, social media is anything but unfettered. It can be leveraged to spread rumors, undercut the opposition or create a false sense of public pressure, computer analysts and public relations representatives say.

“There have been smear campaigns since Adams and Jefferson in the early 1800s and we’re seeing the same thing here, with just a new set of tools,” said Richard Levick, chairman and chief executive officer of Levick, a public relations and strategic communications firm in Washington.

“Where do undecided voters, journalists go to get their information? Google. So controlling the search engine is hugely important. We need to know who is our audience, how do we reach them, how do we engage them, and then, how do we control the territory?” he said.

Many digital public relations shops are using social media robots, or “socialbots,” built by software programmers and deployed by political campaigns to block Twitter feeds, build support for otherwise unpopular opinions and create a flood of content for their own agendas, in the hope that it will end up in someone’s Facebook or Twitter feed and then be liked or reposted by a real person.

The technique has become so widespread that the National Science Foundation and the Defense Department have awarded a grant to Indiana University to study the phenomenon and develop an application to detect whether a social media message is being sent by a real person or by an automated account.
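A real detector of the kind described relies on many machine-learned features, but the intuition can be illustrated with a deliberately crude heuristic. The signals and thresholds below are invented for illustration and are not taken from the Indiana University project:

```python
def looks_automated(posts_per_day, active_hours, duplicate_ratio):
    """Crude bot-likeness heuristic (illustrative only; real detectors
    use many more features with machine-learned weights).

    posts_per_day   -- average daily posting rate
    active_hours    -- distinct hours of the day the account posts in
    duplicate_ratio -- fraction of posts repeating earlier text verbatim
    """
    score = 0
    if posts_per_day > 50:        # sustained superhuman volume
        score += 1
    if active_hours > 20:         # posting around the clock, no sleep gap
        score += 1
    if duplicate_ratio > 0.5:     # mostly copy-pasted content
        score += 1
    return score >= 2  # flag when at least two signals fire

print(looks_automated(200, 24, 0.8))  # -> True
print(looks_automated(5, 10, 0.0))   # -> False
```

Notably, the work-hours-only, weekdays-only rhythm of the top Obamacare posters is the kind of human pattern such signals are designed to distinguish from round-the-clock automation.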

The select few masquerading as a crowd are hijacking social media to “effectively attack the democratic process,” said P. Takis Metaxas, a computer science professor at Wellesley College who studied social media manipulation in the 2012 election cycle.

“As social animals, we are influenced by our peers’ opinions in many ways. In our research we have seen political zealots interested in the electoral results, time and time again. They do not want to let chance, or their enemies, determine an electoral outcome. So they try to create content that would fool social networks into promoting their own candidates and, in some cases, into spreading lies to the detriment of their opponents,” Mr. Metaxas said.

For example, during the 2010 U.S. Senate special election campaign in Massachusetts, Mr. Metaxas found an attacker used nine Twitter alias accounts to send about 1,000 tweets in a little over two hours. Real users retweeted the message and it reached more than 60,000 people in one day.
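Taking the article’s round figures at face value, the posting rate alone points to automation rather than nine concerned citizens. This is a back-of-the-envelope sketch, with 130 minutes as an assumed reading of “a little over two hours”:

```python
tweets = 1000   # approximate total tweets in the attack
accounts = 9    # alias accounts used
minutes = 130   # assumed: "a little over two hours"

per_account = tweets / accounts   # ~111 tweets from each alias
per_minute = tweets / minutes     # a tweet roughly every 8 seconds overall
print(round(per_account), round(per_minute, 1))  # -> 111 7.7
```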

The “Twitter bomb” seemed to come from a group of concerned citizens, but it likely started with one person or political group trying to alter public opinion, Mr. Metaxas said.

In its annual report, Twitter estimated that less than 5 percent of its 240 million users, about 12 million accounts, are fake or alias accounts.

“As operatives we have to remember that Twitter is not a representative sample. One or two Twitter loudmouths can make minor issues seem tremendously important when they are, in fact, completely irrelevant,” Mark Harris, a partner at Cold Spark Media and former campaign manager for Sen. Patrick J. Toomey, Pennsylvania Republican, wrote in the July/August issue of Campaigns and Elections.

“If Twitter is your only news source, which too often it is for many political reporters, some random malfeasance would appear to have seismic repercussions when survey research would show 80 percent of voters are unaware of the issue at all,” Mr. Harris said.

Facebook, for its part, is trying to put an end to fake profiles and fake “likes,” understanding how misinformation and manipulation can spread through its network.

Facebook, in its annual report, estimated last year that 5.5 percent to 11.2 percent of its accounts were fake. In January, the company said that meant at least 67.7 million fake accounts, a number that could be as high as 137.8 million using the company’s highest estimate.
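Those figures are internally consistent: applying Facebook’s own percentage range to the 1.23 billion monthly users cited earlier reproduces both endpoints, allowing for rounding. This is a quick arithmetic check, not Facebook’s methodology:

```python
monthly_users = 1.23e9  # Facebook monthly users cited in the article

low_rate, high_rate = 0.055, 0.112       # Facebook's fake-account estimates
low_fakes = monthly_users * low_rate     # about 67.7 million accounts
high_fakes = monthly_users * high_rate   # about 137.8 million accounts
print(low_fakes, high_fakes)
```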

Fake “likes” are another big concern for Facebook.

Facebook has vowed to “aggressively get rid of fake likes” on its network, according to an Oct. 4 posting on its security blog.

The site said it had won more than $2 billion in judgments against scam artists who sold fake likes to make pages seem more popular.

“We have a strong incentive to aggressively get rid of fake likes because businesses and people who use our platform want real connections and results, not fakes,” Facebook’s security blog said. “Fake likes are only profitable when they can spread at scale. To make it harder for these scams to be profitable, our abuse-fighting team builds and constantly updates a combination of automated and manual systems that help us catch suspicious activity at various points of interaction on the site, including registration, friending, liking, and messaging.”

The social and digital battlefield has been set, and those running for office must be able to play, Mr. Levick said.

“The ‘aha’ moment has not occurred yet for most politicians,” said Mr. Levick. “Fifteen years after the Model T was developed, buggy whips were still selling exceptionally well. Social and digital media is not a tactic, it’s a revolution — just like the Industrial Revolution. It changes everything.”

• Kelly Riddell can be reached at kriddell@washingtontimes.com.

Copyright © 2022 The Washington Times, LLC.
