Facebook “communicated really badly” about a controversial study in which it secretly manipulated users’ feelings, the social network’s chief operating officer Sheryl Sandberg admitted Wednesday.
The company clandestinely altered the emotional content of feeds of nearly 700,000 users for a week in 2012, giving some sadder news and others happier news in a study aimed at better understanding “emotional contagion.”
The research, published last month, has prompted online anger and questions about the ethics of the research and forced Facebook on the defensive.
It was an experiment as part of product testing, Sandberg told a women’s business seminar in New Delhi when asked whether the study was ethical.
“We communicated really badly on this subject,” she said, before adding: “We take privacy at Facebook really seriously.”
UK TO PROBE EXPERIMENT
Sandberg, who was in India to promote her gender-equality book Lean In and meet leaders of Indian companies and senior politicians, declined to speak to reporters asking further questions.
The comments came as several European data protection regulators began looking into whether Facebook broke privacy laws when it carried out the study.
British authorities will question Facebook over the experiment, officials said Wednesday.
The Information Commissioner’s Office, Britain’s independent data watchdog, is liaising with the Irish data protection authority and seeking “to learn more about the circumstances,” a spokesman said.
The study, by researchers affiliated with Facebook, Cornell University, and the University of California at San Francisco, appeared in the 17 June edition of the Proceedings of the National Academy of Sciences.
In the study, Facebook placed positive or negative posts in users’ feeds to see how this affected their mood — all without their explicit consent or knowledge.
ETHICAL REGULATIONS
The results indicate that “emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks”, the researchers concluded, noting that emotion is relevant to human health.
Given Facebook’s billion users and widespread influence, the experiment has raised worries over its ability to influence users’ moods and thinking.
Critics say research on people is normally governed by strict ethical regulations.
In a statement earlier in the week, Facebook said none of the data used in the study was associated with a specific person’s account.
It said it did research to make its content “as relevant and engaging as possible,” which meant understanding how people respond to positive or negative information.
The researchers said the study was consistent with Facebook’s Data Use Policy, to which all users agree before creating a Facebook account.
But a number of users criticised the psychological experiment, posting words like “super disturbing,” “creepy” and “evil” as well as angry expletives.