News
Gartner calls Digital Ethics a strategic trend for 2019 – but ethics are not enough
In its most recent predictions of trends for the coming year, Gartner identifies digital ethics and privacy as a top strategic trend for companies. As the technology and financial industries, governments and even research institutions grapple with ethical issues raised by their data practices and the social impact of those practices, ethics is starting to fill the vacuum left by regulators and by a largely absent political debate on these issues.
At Data School and Datafied Society, we have been studying and teaching digital ethics and developing methods for responsible data practices for years. In a joint effort with Utrecht University’s programme for Applied Ethics, Data School developed the course Digital Ethics, which is taught by Dr. Joel Anderson and Dr. Mirko Tobias Schäfer. Our Data Ethics Decision Aid (DEDA) helps organisations review data projects with regard to responsibilities and ethics. Recently, Maranke Wieringa started her PhD research with us, investigating how to develop accountable algorithms. Last year, our DEDA developer Aline Franzke started a PhD project at the University of Duisburg-Essen, where she researches data practices and ethical frameworks in public administration. Our research is carried out within societal organisations that are facing the challenges of algorithmic governance and data practices. We are glad we were on time. We are here to help organisations tackle the challenges of digital ethics. But we also think that ethics are not enough.
As much as we appreciate Gartner pointing out digital ethics as an essential asset for organisations working with data, algorithms and digital technology in general, we are convinced that ethics alone will not solve these issues in the long run. A society-wide debate on values and on how we want to live in the digital age is desperately needed. In our opinion, political parties should be able to review technology through the lens of their specific world-views and formulate political positions accordingly. A party that has no position on how its values relate to digital technology (or the environment) cannot be expected to develop any useful agenda for the challenges we are facing in the 21st century. “The technology we are building and the environment we are destroying constitute the main political challenges for the 21st century,” says Data School project leader Mirko Tobias Schäfer.
Ethics helps guide our decisions where the law does not apply or does not sufficiently cover the available options. When it comes to data projects, the complexity and the long-term implications are often underestimated. Developers, users and policy makers alike underestimate the impact these practices can have on personal livelihoods, on knowledge development and, most profoundly, on the results these processes are expected to deliver. Microsoft was surprised by how quickly its AI-powered Twitter account Tay turned into a white-supremacist, hate-spitting troll.
After rolling out new software for evaluating social welfare claims, the Michigan Unemployment Agency noted an unusually high number of applications labelled as fraudulent. The indicators the algorithm checked to assess credibility were impossible to meet in the real world, where applicants’ addresses change and forms are filled out with errors or are simply incomplete.
ProPublica’s excellent review of an algorithm for estimating the risk of recidivism revealed racial bias and incorrect risk assessments of defendants. Frank Pasquale’s book The Black Box Society sheds light on the dangers of corporate algorithms in the financial industry. In Weapons of Math Destruction, data scientist Cathy O’Neil presents a wide range of examples of how the results of analysis processes fail to represent empirical reality and how algorithms enforce a feedback loop that affects personal livelihoods. Virginia Eubanks shows in Automating Inequality how profiling and prediction in social service programmes target poor people and reinforce their precarious situation. This incomplete list indicates that algorithms need accountability and that there is a growing debate calling for deliberation and regulation.
Tech companies respond to these challenges by claiming they will build new technology that tackles all the downsides of the technology they built in the first place. And ethics comes in as an apologetic discourse, often intended to evade regulation and political debate. At Data School, we are convinced that ethical decision making is essential in data projects. But we also argue that it should lead to a culture of accountability. Stakeholders, politicians, critical audiences and journalists must be able to call for scrutiny of tools, projects and services, especially when they perform public administration tasks or profoundly affect individual livelihoods. We do not only need ethics, we also need politics!