Aug 14 2014 : The Economic Times (Delhi)
Researchers Grapple with Ethics of Studying Users
@The New York Times
Scientists can now analyse personal data on millions of people online without their knowledge, and some want to bring ethical guidelines to such studies, writes Vindu Goel
Scholars are exhilarated by the prospect of tapping into the vast troves of personal data collected by Facebook, Google, Amazon and a host of startups, which they say could transform social science research.
Once forced to conduct painstaking personal interviews with subjects, scientists can now sit at a screen and instantly play with the digital experiences of millions of internet users.
It's the frontier of social science: experiments on people who may never even know they are subjects of study, let alone explicitly consent.
“This is a new era,” said Jeffrey T Hancock, a Cornell University professor. “I liken it a little bit to when chemistry got the microscope.”
But the new era has brought some controversy with it. Hancock was a co-author of the Facebook study in which the social network quietly manipulated the news feeds of nearly 700,000 people to learn how the changes affected their emotions. When the research was published in June, the outrage was immediate.
Now Hancock and other university and corporate researchers are grappling with how to create ethical guidelines for this kind of research.
In his first interview since the Facebook study was made public, Hancock said he would help develop such guidelines by leading a series of discussions among academics, corporate researchers and government agencies.
Scholars from MIT and Stanford are planning panels and conferences on the topic, and several academic journals are working on special issues devoted to ethics.
Microsoft Research, a quasi-independent arm of the software company, hosted a panel last month on the Facebook research with Hancock and is offering a software tool to scholars to help them quickly survey consumers about the ethics of a project in its early stages.
Much of the research done by the internet companies is in-house and aimed at product adjustments, like whether people prefer news articles or cat videos in their Facebook feeds or how to make Google's search results more accurate.
But bigger social questions are studied as well, often in partnership with academic institutions, and scientists are eager to conduct even more ambitious research.
The Facebook emotion experiment was in that vein. The brainchild of a company data scientist, Adam D I Kramer, but shaped and analysed with help from Hancock and another academic researcher, Jamie E Guillory, it was intended to shed light on how emotions spread through large populations.
Such testing raises fundamental questions. What types of experiments are so intrusive that they need prior consent or prompt disclosure after the fact? How do companies make sure that customers have a clear understanding of how their personal information might be used? Who even decides what the rules should be?
Existing US federal rules governing research on human subjects, intended for medical research, generally require consent from those studied unless the potential for harm is minimal.
Mary L Gray, a senior researcher at Microsoft Research and associate professor at Indiana University, said that too often, researchers conducting digital experiments work in isolation with little outside guidance.
Gray advocates a simple litmus test for researchers: If you're afraid to ask your subjects for their permission to conduct the research, there's probably a deeper ethical issue that must be considered.