By Pam Martens and Russ Martens: July 3, 2014
Let us see if we have this straight: Facebook is a company that has been publicly traded for just slightly more than two years. It pays no dividend, so its key attraction for shareholders is the premise that it knows how to run and grow its business. Its initial public offering was one of the biggest fiascos in modern finance. Its core asset, the one from which its revenues flow, is the loyalty and growth of its user base, upon whom it decided to conduct secret psychological experiments and then publish the findings.
But wait. It gets worse.
Facebook’s secret human lab rat study on a self-described “massive” 689,003 of its users was published just last month in the Proceedings of the National Academy of Sciences under the title “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks.” The study said the significant finding was that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”
According to Facebook, this is what it did to manipulate the behavior of its unpaid and involuntary human lab rats:
“In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and non-verbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.”
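For readers wondering what “reducing the amount of emotional content in the News Feed” means in practice, here is a minimal sketch of our own in Python, not Facebook’s code: posts are labeled positive or negative by simple word matching, a fraction of the targeted posts is withheld from a user’s feed depending on the experimental group, and the sentiment of the user’s subsequent posts is tallied. The word lists, function names, and 50 percent suppression rate are hypothetical assumptions for illustration; the study itself reportedly relied on the LIWC word-count software to classify posts.

```python
import random

# Illustrative word lists -- stand-ins for the LIWC dictionaries the study
# reportedly used; these are assumptions, not Facebook's actual classifier.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify(post: str) -> str:
    """Label a post positive, negative, or neutral by simple word matching."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(feed, condition, suppression_rate=0.5, rng=random):
    """Withhold a fraction of emotional posts from a user's feed.

    condition: 'reduce_positive' or 'reduce_negative', mirroring the two
    experimental groups described in the study's abstract.
    """
    target = "positive" if condition == "reduce_positive" else "negative"
    return [p for p in feed
            if classify(p) != target or rng.random() > suppression_rate]

def sentiment_tally(posts):
    """Count how many of a user's own posts are positive vs. negative."""
    counts = {"positive": 0, "negative": 0, "neutral": 0}
    for p in posts:
        counts[classify(p)] += 1
    return counts

# Example: show a user a feed with positive content reduced, then tally
# the sentiment of what that user goes on to post.
feed = ["I love this wonderful day", "traffic was terrible", "meeting at noon"]
print(filter_feed(feed, "reduce_positive"))
print(sentiment_tally(["feeling sad today", "what an awful commute"]))
```

The point of the sketch is only to show how little machinery such an experiment requires: a classifier, a filter, and a tally, applied to people who never agreed to be measured.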
Facebook, along with the rest of America, has observed the fallout from the revelations of secret government surveillance of citizens. Somehow that public outcry did not send a “non-verbal cue” to Facebook that there might be a similar outcry over revelations that it was using an algorithm to manipulate the emotional mood of its users without their knowledge or informed consent.
What if some of these users were under psychiatric care for depression? What if they had just lost their job, or their marriage, or their home, or experienced the death of a loved one? How outrageously irresponsible is it to secretly attempt to manipulate the mood of an already depressed person to a more negative state?
But wait. It gets worse.
In 1994, the CIA declassified a secret paper outlining other attempts to manipulate a person’s behavior without their knowledge. The document, “The Operational Potential of Subliminal Perception” by Richard Gafford, notes the following:
“Usually the purpose is to produce behavior of which the individual is unaware. The use of subliminal perception, on the other hand, is a device to keep him unaware of the source of his stimulation. The desire here is not to keep him unaware of what he is doing, but rather to keep him unaware of why he is doing it, by masking the external cue or message with subliminal presentation and so stimulating an unrecognized motive.”
We’re also informed by the CIA that “The operational potential of other techniques for stimulating a person to take a specific controlled action without his being aware of the stimulus, or the source of stimulation, has in the past caught the attention of imaginative intelligence officers.”
And, the CIA offers some other helpful tips that Facebook may want to consider in its next human lab rat study:
“In order to develop the subliminal perception process for use as a reliable operational technique, it would be necessary a) to define the composition of a subliminal cue or message which will trigger an appropriate preexisting motive, b) to determine the limits of intensity between which this stimulus is effective but not consciously perceived, c) to determine what preexisting motive will produce the desired abnormal action and under what conditions it is operative, and d) to overcome the defenses aroused by consciousness of the action itself.”
But wait. It gets worse.
The jury is still out on whether this study had a military connection. The original press release issued by Cornell University, which was involved in the research, indicated that the U.S. Army Research Office was one of the funders of the study. After a public uproar over the study, this correction appeared at the bottom of the press release:
“Correction: An earlier version of this story reported that the study was funded in part by the James S. McDonnell Foundation and the Army Research Office. In fact, the study received no external funding.”
While questions continue to swirl around this dubious study, one thing is not in doubt: Facebook has a unique talent for brand suicide.