Facebook manipulated the news feeds of 1.9 million American users during the 2012 election without telling them.

The manipulation led to a 3 percent increase in voter turnout, according to the company's own data scientist.

In a stunning revelation, for three months prior to Election Day in 2012, a Facebook employee "tweaked" the feeds of 1.9 million Americans, surfacing their friends' hard news posts rather than the usual personal posts. The effect was felt most by occasional Facebook users, who reported in a survey that they paid more attention to the government because of their friends' hard news posts. Facebook didn't tell users about this psychology experiment, but the effects ended up boosting voter turnout by 3 percent.

The experiment and its results were shared with the public in two talks given by Facebook's data scientist, Lada Adamic, in the fall of 2012, and first disclosed by Mother Jones. In those talks, Adamic said a colleague at Facebook, Solomon Messing, "tweaked" the feeds. Afterward, Messing surveyed the group and found that self-reported voter turnout and political engagement grew from 64 percent to more than 67 percent.


Michael Buckley, Facebook's vice president for global business communications, said the Messing study was an "in-product" test designed to see how users would react when news feeds were more prominent.

"This was literally some of the earliest learning we had on news," Buckley told Mother Jones. "Now, we've literally changed News Feed, to reduce spam and increase quality of content."

Buckley said the public will not receive full answers about that experiment until sometime in 2015, when the academic papers are expected to be published.

It is not the first time that Facebook has been exposed for conducting psychological experiments on its users without their knowledge. In June, it was revealed that Facebook had tried to manipulate the emotions of 689,003 randomly selected English-speaking users by changing the contents of their news feeds, according to a paper published in the June edition of the journal Proceedings of the National Academy of Sciences (PNAS). During a week-long period in January 2012, researchers staged two parallel experiments, reducing the number of positive or negative updates in each user's news feed.

"These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network," the researchers wrote in the paper.

The Facebook users were not notified of the experiment. However, according to Facebook's terms of service -- to which every person agrees when they register on the social network -- users' data may be used "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

The researchers argue that their experiment was consistent with Facebook's Data Use Policy.

The experiments are raising concerns about the manipulation of voters in an era when big data can be used to "engineer the public" without the public's knowledge, according to sociologist Zeynep Tufekci.

"At minimum, this environment favors incumbents who already have troves of data, and favors entrenched and moneyed candidates within parties, as well as the data-rich among existing parties. The trends are clear. The selling of politicians -- as if they were "products" -- will become more expansive and improved, if more expensive. In this light, it is not a complete coincidence that the 'chief data scientist' for the Obama 2012 campaign was previously employed by a supermarket to 'maximize the efficiency of sales promotions.' And while the data advantage is held, for the moment, by the Democratic party in the United States, it will likely be available to the highest bidder in future campaigns," said Tufekci.