Manipulate Facebook users at your own risk.

It was recently made public that Facebook altered the content of some users’ News Feeds in an attempt to study the psychology behind what causes people to post emotional material.

Adam Kramer, a Facebook data scientist and co-author of the study, explained, “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”


According to news reports, the study affected 0.04 percent of Facebook’s users (approximately 690,000 users) over a one-week period in 2012. The study’s findings contradicted conventional wisdom that seeing positive emotions on Facebook would encourage similarly positive content. In a statement, Facebook said the research was designed to “improve our services and to make the content people see on Facebook as relevant and engaging as possible.”

Reaction to the news of this experiment was immediate and intense. “I wonder if Facebook KILLED anyone with their emotion manipulation stunt,” privacy activist Lauren Weinstein said on Twitter. “At their scale and with depressed people out there, it’s possible.”

That reaction poses some interesting questions for marketers. Facebook, Forbes Magazine says, “is the best human research lab ever.” And for marketers, research is the foundation on which our work is built. If we don’t know what people think and feel, how can we craft messages they’ll relate to?

Most marketers probably rationalize their use of Facebook user data with the argument that Facebook users voluntarily put themselves “out there” on a daily basis, sharing (or oversharing) information that becomes public pretty much forever.

So how do we as marketers tap into the goldmine of user data available on Facebook without alienating the very users we’re attempting to attract?

The answer seems to be to tread carefully. Determining how users think and feel is very different from trying to manipulate those thoughts and feelings. And, at least for now, users seem to be drawing a pretty clear line between having their online content monitored and becoming digital guinea pigs.

Plus, judging by how people are reacting to this study, Facebook probably won’t be repeating this kind of experiment anytime soon.