What happens when a research study has 689,003 participants?
Facebook and Cornell University recently partnered to study how emotions spread on social media. They ran a typical A/B test: one control group, and a second group whose news feeds Facebook altered, to see whether the emotional tone of users' posts changed in response. They found that users who saw more positive posts tended to post more positive messages, while those who saw more negative posts tended to post more negative ones.
The controversy around this study centers on the ethics of experimenting on participants who have agreed, in general terms, to participate in research but do not know they are part of a particular study. Is that ethical? That is the raging debate.
Let's focus instead on the implications of the findings. Our society functions best when we have a diversity of opinions and perspectives shaped around a common core of values. That diversity is supported by diverse information sources (magazines, newspapers, television, radio, etc.), each providing a different viewpoint and collectively providing the spectrum of viewpoints that sustains diversity in society.
Unlike any other information source, Facebook is an immense media outlet with 1.3 BILLION users. Now they are learning how to use their technology to manipulate user emotions. That kind of understanding, applied to such a huge share of the world's population, creates an ethical responsibility for Facebook. Facebook should be agnostic to users and posts. But can they be? Of course not. They already have a policy defining "acceptable" posts and a team that enforces it. As a business, they need that policy and that team to keep users from defecting. But are they on a slippery slope?
Facebook is free to users, so it must rely on advertisers to survive. It is also learning how to manipulate the emotions of 1.3 BILLION users. The question, then, is whether Facebook will be able to resist the urge to manipulate its huge following to meet the needs of its biggest advertisers.
If Facebook resists and allows its 1.3 BILLION users to drive content, free of manipulation and censorship, it can be the most diverse information resource in the world. If instead it uses its immense power to promote its advertisers' viewpoint and/or its own, it threatens the diversity that makes our society strong. What will they choose? How will they act?
This interests me because all traditional media presents a viewpoint: each outlet is ultimately controlled by one, or a few, people with opinions and an agenda to please advertisers. Television, radio, newspapers, etc. must make daily decisions about which news to cover and which to skip. Those decisions reflect each outlet's point of view and shape its news product. That is why some people love Fox News and others CNN. Diversity is important, and across outlets we citizens get a diversity of news and a diversity of viewpoints.
Social media (Facebook) is different. It is user driven. Facebook potentially contains 1.3 BILLION viewpoints. In its purest form, this is a rich treasure trove of diversity. But Facebook is a corporation that must deliver ever-increasing returns to shareholders. Given that pressure, and its ever-increasing knowledge of how to manipulate news feeds, can it remain a rich treasure trove of diversity, or will it become more like other media outlets that espouse a specific viewpoint? The pressure is on; the decisions are tough; the stakes are huge. What will they do?