IEEE Talks Big Data: Lyria Bennett Moses & Greg Adamson
Lyria Bennett Moses chairs IEEE’s Society on Social Implications of Technology (IEEE-SSIT) Australia Chapter and is an Associate Professor in the Faculty of Law, University of New South Wales (UNSW) in Kensington, Australia. She is also a member of the IEEE Big Data Initiative Steering Committee. Greg Adamson serves as president of IEEE-SSIT and is a Principal Fellow at the Melbourne School of Engineering, University of Melbourne, Australia. Both Bennett Moses and Adamson take a deep personal and professional interest in the social implications of technology. They share their personal views on the topic in this interview.
Question: What draws each of you to this topic?
Greg Adamson: The world has so many problems and so many great technologies. A lot of those problems could be solved by existing or nascent technologies. Why aren’t they? Why does every major city in the world have homeless people when we’ve been building shelters for several hundred thousand years? For me it’s a long-term question. It involves philosophy and policy. To my mind, how we use technology and effectively derive value from it is more important than simply inventing more technologies.
Lyria Bennett Moses: I came to the topic a bit sideways. My doctoral research was on the interaction between law and technology. Throughout that process I was very interested in the legal issues that technologies generate and I came to look more broadly at the relationship between law, society and technology. Like Greg, I’m interested in how policy that addresses technology can create desirable outcomes.
Question: How does one go about assessing the social implications of technology?
Adamson: The wording of that question is quantitative. That approach has a role. We see organizations with an office of technology assessment and we often quantify in, say, risk management. Those approaches work in the right context. The language I prefer to use would include “understand.” Say I’m working on a technology. I must think about the way it could change the world. I can’t know how – nobody can – but I must think about it. Because of the almost infinite diversity of technology possibilities – my technology could change the world for good or ill, both or somewhere in between – quantitative measures tend to fail us. I think it’s more useful to talk qualitatively and understand the attitudes that people bring to technology or that they can be encouraged to bring to technology.
Bennett Moses: An enquiry into the social implications of technology invokes a very big question. Asking about the social implications of changes in technology focuses the discussion a bit more. Think of society as an ongoing system with elements of law, economics, science, what have you. And changes in technology have potentially far-reaching influence over all of those elements. Quantitative measures don’t get us very far. The real question is largely one of working out the implications. I agree with Greg, that’s really a qualitative question.
Question: Would each of you give us an example of your current work?
Bennett Moses: I’ve been looking at the implications of data analytics tools in decision making. I’m a “key researcher” at the Data to Decisions Cooperative Research Centre here in Australia, which is examining big data collection and data analytic tools for law enforcement and national security purposes. Our challenge is to understand the ethical and legal implications of those tools in that highly sensitive context. We’re asking, “What do agencies want to do with data?” Then, “What are the implications?” “What legal frameworks should society require agencies to comply with for positive outcomes?” One challenge with data analytic tools is not necessarily, say, overt profiling, but the ways in which profiling can take place hidden within relatively complicated algorithms used to analyze that data. If that issue exists, what action do we take? Build better analytical tools? Evaluate the assumptions we perhaps unconsciously make as we draw inferences from data? Evaluation, review, oversight, accountability, and legal frameworks all seem appropriate if, in this example, the use of big data and analytics for profiling terrorism suspects has undesirable impacts on some communities.
Adamson: I’ve been working to raise awareness of the social implications of technology through a particular figure from 20th-century technology history, Norbert Wiener. Wiener served in the mathematics department at MIT from 1919 until his death in 1964. He’s best known for an influential book, Cybernetics: Or Control and Communication in the Animal and the Machine. Yet his name is virtually unknown today. From my observation, of all the 20th-century technologists he went the furthest in drawing people’s attention to the implications of factory automation and the automation of jobs and other activities. I recently chaired a conference in Melbourne on his work, Thinking Machines in the Physical World. At the conference, I presented a paper comparing his call for scientists to take responsibility for their work, made in a couple of letters that appeared in The Atlantic Monthly in 1947, with General Eisenhower’s military-industrial complex speech of 1961. I compared and contrasted these two sources as ways to review the role of technologists in modern industrial society. Eisenhower’s speech has generally been analyzed from a policy or an economic point of view, but not so much from a technology practitioner’s point of view, which is Wiener’s frame of reference.
Question: Where do IEEE and the SSIT fit in?
Adamson: SSIT was one of the Melbourne conference sponsors. And SSIT is heavily involved in IEEE policy work now. We’re looking at, say, the U.N. sustainable development goals. Instead of using value-laden framings like “progress: good or bad?”, we are trying to build on a platform that others have developed, such as the seventeen sustainable development goals. Then we can ask specific questions. Does a particular technology contribute to those goals? What are the potential outcomes of various technology choices? Again, we don’t want to burden a technology with value judgments, because it is likely to have both potentially beneficial uses and potentially injurious ones.
Bennett Moses: It has been said that “technology is neither good nor bad, nor is it neutral.” Technology has “effects,” which might be a more neutral term. And those effects have to be measured against some kind of societal value system. The sustainable development goals are a good list of internationally applicable goals that address societal values. That’s a starting point for understanding the range of implications of our technology choices. Sometimes we say “no” to technologies. We say something runs counter to our values. Human cloning is an example of that. But sometimes we can decide to look at a technology’s particular effects and ask, “How can we change the design of that technology to better align with our values?”
Adamson: SSIT is also involved with one of the subcommittees in the IEEE Internet Initiative that’s looking at policy and engagement. We are working on the “Ethics, Society and Technology” initiative organized within Technical Activities. And we’ve committed volunteers to all of the Future Directions initiatives.
We like to say that “the idea of any technologist not taking an interest in the social implications of their work is disturbing, but not unthinkable.” I invite readers to join SSIT. We think that technologists, in fact, have a profound responsibility to consider the social implications of their work every step of the way.