Over the past six months or so I have been focused on my work for the UnBias project, which is looking at the issues of algorithmic bias, online fairness and trust in order to provide policy recommendations, ethical guidelines and a ‘fairness toolkit’ co-produced with young people and other stakeholders. My role has been to lead the participatory design process for the Fairness Toolkit, which has involved devising and facilitating a series of workshops with young people in schools and a community group, as well as with stakeholders in the ICT industry, policy, civil society and research fields. My colleagues in the Human Centred Computing group at the University of Oxford and the Horizon Digital Economy Institute at the University of Nottingham, as well as Informatics at the University of Edinburgh, have been wonderful collaborators – providing a rich intellectual and pragmatic context for developing the tools.
The co-design workshops with two schools (in Harpenden and in Islington) and with a young women’s group in Oxfordshire explored what their levels of awareness of the issues were, how relevant to their own lives they perceived them to be, and what they thought should be done. In each workshop, and with each group, we consistently encountered quite different perceptions and experiences – often unexpected and surprising – whilst also observing certain commonalities, which were echoed in the findings of the Youth Juries that our colleagues at Nottingham have been running for UnBias since late 2016. Many of the young people expressed a certain fatalism and lack of agency regarding how they use technology, which seems to foster a sense of isolation and an inability to effect change. This was coupled with a very limited sense of their rights and of how the law protects them in their interactions with service providers, institutions and big companies. Unsurprisingly, they often feel that their voice is not listened to, even when they are the targets of some of the most aggressive marketing techniques.
The tools have thus been informed and shaped by young people’s perceptions and their burgeoning understanding of the scale and depth of algorithmic processes affecting modern everyday life. The tools have also been designed to address the atomising effect that personalised technologies are increasingly understood to have – whereby the growing personalisation of platforms and services isolates our experiences of media and the mediated world from each other. Where broadcast technologies used to be understood to have a homogenising effect on societies, networked technologies, and the highly personalised software services running on them, are creating a sense of isolation from other people’s cultural and social experiences as they serve each of us something more bespoke to our own tastes and preferences. Recent controversies over the use of targeted advertising in US and UK elections have exposed the iniquitous consequences of such hyper-specific campaigning, and offered a new set of insights into the wider and deeper social and cultural impacts happening around us.
I have tried to design a toolkit that could build awareness of these issues, offer a means to articulate how we feel about them, and provide a mechanism for ‘stakeholders’ (in the ICT industry, policymakers, regulators, the public sector and civil society) to respond to them. What has emerged is something I call a ‘civic thinking tool’ for people to participate in a public civic dialogue. By this I mean a mode of critical engagement with the issues that goes beyond just a personal dimension (“how does this affect me?”) and embraces a civic one (“how does this affect me in relation to everyone else?”). And then, when we participate in a public dialogue about these issues, it is not simply conducted in public: it embraces the co-construction of our society and acknowledges everyone as having a stake and a voice within it. It is about trying to find co-constructive and non-confrontational means to engage people in critical reflection about what kind of world we want to have (and the roles algorithmic systems in particular should play in it).
On Monday we held a workshop to preview the first draft of the toolkit and seek feedback from a variety of stakeholders. Take a look at the presentation below to find out more:
The response has been very encouraging – highlighting the toolkit’s strengths and revealing weaknesses and areas that need further development. The next stage is to start a testing phase with young people and with stakeholders to refine and polish the toolkit.
We are also developing relationships with “trusted intermediaries” – organisations and individuals who are willing to adopt and use the toolkit with their own communities. As the UnBias project concludes in August, our aim is to have the toolkit ready for deployment by whoever wants to use it this Autumn.