Social scientists have to get better at recognising and responding to ethical problems. Although economists, political scientists and psychologists have not been responsible for the same level of abuses that have occurred in biomedical research, the social sciences have witnessed their share of old-fashioned scandalous behaviour.
Social scientists were co-opted into American intelligence and military operations in Iraq and Afghanistan. Political scientists at Stanford University and Dartmouth College involved in experimentation on voter participation may have disrupted judicial elections in Montana. Harvard sociologists studying Facebook failed to protect the anonymity of their students.
Australian experimental psychologists at La Trobe University replicating Stanley Milgram’s studies in the 1970s were found to have caused long-term distress to some of their participants. There are examples of American historians and Dutch social psychologists engaging in fabrication and falsification. Meanwhile, Chinese anthropologists at Beijing University have been censured for plagiarism, and economists became mired in conflicts of interest following the global financial crisis.
It would be comforting to believe we can leave ethical dilemmas like these in the hands of university committees governing academic research ethics. But the sad truth is that in many places, the relationship between researchers and ethics committees is seen as adversarial. Current regulations are not always fit for purpose, lagging behind in their understanding of the new kinds of complex work that researchers are doing.
New ways of working in our disciplines, as well as broader social, political and economic shifts in our societies, bring their own ethical dilemmas. Some issues are familiar. How do researchers secure sensitive data when their institutions use cloud computing – particularly as we learn more about the information-gathering capabilities of national intelligence agencies?
Other areas, such as research involving big data and human-computer interaction, have raised new ethical questions. These are also relevant for scientific disciplines that gather social science data as they study interactions between technology and people.
Problems with petabytes
For instance, the emergence of predictive analytics and the mining of petabyte-sized datasets (1,000 terabytes) containing billions of pieces of information – so-called “big data” – threatens to undercut the concept of informed consent in research. Political scientists studying everyday political communication in Scandinavia were able to gather over 100,000 tweets using just one election-related hashtag on Twitter over a one-month period in the lead-up to the Swedish national election in 2010.
It was impractical to obtain consent from all the people who tweeted and, even if it had been possible, this might have introduced a bias into the sample. So the researchers had to argue, ultimately successfully, with a research ethics committee in Sweden and a commission for privacy protection in research in Norway that publishing on Twitter constituted a public rather than a private act.
Problems with video
Other examples involve videoethnography, the video recording of everyday settings for research purposes. Following increased interest in videoethnography as a research tool, scholars now need to identify the potential uses for images. They must ensure the anonymity of participants and obtain consent from incidental people who enter the frame. It’s also important to maintain data security and control any secondary use of the research videos.
This is particularly important for videoethnographers working in hospitals, who also need to minimise the possibility of causing distress to the families of participating patients. Many education researchers are already familiar with protecting students who opt out of research, but not necessarily with the social and educational consequences of seating students who do not wish to be filmed in a “blind spot” in the classroom.
Scientists doing social science
New disciplines are also engaging in research using human subjects. Empirical research is growing in the computer sciences, particularly in those parts of the discipline that investigate human-computer interaction. But the line between human and non-human data can be blurred. In some cases, field testing of these new types of technology can even place human operators and other people in the vicinity at physical risk.
Where information and communication technologies are used for development and to promote social good, some research may be interventionist, as new technology is trialled in locations targeted for aid. But researchers need to assess what impact their interventions may have in a disadvantaged community. Who might be harmed and who might benefit from research? Where should they draw the line between coercive inducement and fair compensation? As researchers working in Bangalore slums reported, in poor communities even small gifts to children might compromise a family’s ability to make an autonomous decision.
Do the hard graft
Existing codes, regulations and even training materials may offer researchers little help in facing complex new conditions, as my forthcoming research documents. Rather than helping social scientists respond to such new challenges, the regulation of research ethics in many countries seems to be underpinned by an unsettling combination of models originally designed for biomedical research or for institutional risk minimisation.
Individually and collectively, researchers have little choice but to do the hard graft themselves. They need to get used to identifying and working through ethical issues, reflecting upon and standing up for their decisions in public as best they can.