
Civic Agency: a vision & plan

Civic Agency is an initiative aimed at encouraging people, at grassroots level, to engage with the social, cultural and political issues at the heart of our increasingly automated and divisive digital world. Social media and the hyper-personalisation of digital experiences are becoming ever more prevalent as the interface between us and society. As we come to rely more and more on digital systems and technologies to run everyday life, we are realising that society needs new ways to face the issues these present, and new strategies for people to navigate their implications successfully.

AI, Machine Learning, Personalisation, Algorithm Bias, Automated Decision Making, Big Data
* * *
Ethics, Regulation, Responsible Innovation, Rights, Information/Media/Digital Literacy

The first list above names some of the headline issues; the second, some of the proposed solutions. However, we believe that more needs to be done to engage ordinary people in developing their own critical and civic thinking skills: to identify potential harms and to make better informed choices about what they do online, which services they use and how their data is protected from exploitation.

Our aim is to help people feel that they have agency and are empowered to make good decisions and choices, and to feel that their voice is being listened to and heeded in the corridors and places of power where laws, rights and regulations are determined. Practical ethics at grassroots level, meeting in the middle with top-down regulation and codes of practice in industry and public institutions.

Enabling Literacy

Awareness and literacy are crucial for people to be able to navigate our increasingly mediated world – Stéphane Goldstein has recently written an excellent argument for why this matters so much now.

“We cannot act wisely without making sense of the world and making sense of the world is in itself a profoundly practical action that informs how we experience reality, how we act, and the relationships we form. Without questioning our worldview and the narrative that has shaped our culture, are we not likely to repeat the same mistakes over and over again?”
Daniel Christian Wahl, Designing Regenerative Cultures

In the workshops I ran with young people that informed the creation of the UnBias Fairness Toolkit, it was clear that they had only the vaguest understanding of what their rights as children were (and would soon be as adults), and what laws already existed to protect them. Their general sense of disempowerment when using online services (such as buying clothes, shoes or other products) went as far as statements to the effect that they were powerless and unprotected whenever they interacted with the big internet companies (GAFA) or even small online retailers – almost as if all digital services were a gift of the companies involved and could not be challenged, even when those companies were doing wrong or questionable things. The young people had almost no conception of the scale on which they are being tracked online, across multiple sites and services, no matter what devices they use. When we created mappings of what they did online and how their personal data was being distributed across a huge range of platforms and services, they were shocked and, to some degree, incensed – feeling that they had been duped into giving up their data so freely, every time they go online.

On the positive side, at least in one school, the young people felt it was their duty to challenge this and to call for a safer internet. I think this was an early indication that this generation is more empowered to speak up and demand to be listened to, as the recent SchoolStrike4Climate/FridaysForTheFuture protests have demonstrated even more palpably. It is possible that the seeds already exist for a society which expects ‘responsible’, sustainable innovation and development to be the default for designers and developers, whether they work for a public institution, a non-profit organisation or a profit-making corporation. We have seen the consequences of unbridled, irresponsible innovation play out and cause tremendous damage to democracy and to the societies we live in.

Now public dialogue and deliberation need to be stimulated, to bring ordinary people’s concerns and desires to the same level of consideration as the privileged influence of gatekeepers, corporate lobbyists and policy makers. We are all stakeholders in this society, and we must not let lobbyists capture the agenda and subvert democratic principles. Concepts such as duty of care and the precautionary principle – proactive and a priori approaches – could be baked into the culture of innovation and development, not tacked on as afterthoughts or funded through marketing and corporate social responsibility budgets. Digital safety, not digital security – social justice, not breaking things because they get in your way.

A Plan for Grassroots Engagement

Our proposal is simple: using the UnBias Fairness Toolkit as our building block, we propose to stimulate civic agency through:

  • Access: place copies of the toolkit in every school and in public libraries, and make them available to any community that wants to get to grips with these issues for themselves.
  • Literacy: create an organic train-the-trainer programme and additional facilitation tools that lay the foundation for a participatory, grassroots-based approach to demystifying the issues – making the abstract tangible and actionable.
  • Engagement: train teachers, youth & community workers and public librarians in using the toolkit to engage people in developing their critical and civic thinking skills.
  • Collaboration: establish an organic network of people who can guide others to learn more and devise their own strategies – to have agency.

Expanding the Frame

Alongside this, it is important that the toolkit can be adapted for a variety of contexts and situations, age groups and experiences – for instance, to discuss very specific topics such as security, online banking and finance, or medical ethics and patient data. It is equally important that the training materials be templates that people can build on themselves, not just rely on us to define and deliver.

We propose to collaborate with other key participants in these spaces to develop additional materials – Expansion sets – that make the toolkit modular and useful to more people (for example, new Example Cards for specific issues; a much expanded set of Glossary Cards etc). We may create additional worksheets and materials for teams to use as practical ‘responsible innovation’ tools. There may also be other tools and toolkits we can introduce and share.

How?

The tricky part is funding something like this – rather amorphous, profoundly unbusinesslike and with a return on investment that will definitely not be financial. I’ve been finding fellow travellers and talking with a variety of public and private organisations whose interests align with some of the above. But what this needs is resources to make it a reality. We have the basic toolkit; we just need funds to roll out the rest of the process, bit by bit.

Get in touch if you can help [giles at proboscis dot org dot uk].

Stimulating and Inspiring Civic Agency

Over the past couple of weeks – at the V&A Digital Design Weekend and the UnBias Showcase at Digital Catapult – I’ve been sharing and demonstrating the UnBias Fairness Toolkit to people from all walks of life. The response has been enormously enthusiastic as people have immediately imagined using it in the contexts of their own working lives and interests. They have instantly grasped its power to stimulate critical thinking, find and share people’s voices on these issues (bias, trust and fairness in algorithmic systems) and see how this can contribute to a public civic dialogue that involves industry, government, the public sector and civil society too.

What the Toolkit Offers

  • A pragmatic and practical way to raise awareness and stimulate dialogue about bias, trust and fairness in algorithms and digital technologies.
  • It is designed to make complex and often abstract ideas tangible and accessible to young people and to non-experts across society.
  • It supports critical thinking skills that can help people feel empowered to make better informed choices and decisions about how they interact with algorithmic systems.
  • It helps collect evidence of how people feel about the issues and what motivates them to share their concerns by contributing to a public civic dialogue.
  • It provides a communication channel for stakeholders in industry, policy, regulation and civil society to respond to public concerns about these issues.
  • It can also be used by developers of algorithms and digital systems to reflect on ethical issues and as a practical method for implementing Responsible Research and Innovation.

Where Next?

The next stage is slowly becoming clear. What I believe we need is a national programme to train people, especially those working with young people, in using the toolkit, and to inspire people working in industry, regulation and policy to use it as an applied responsible research and innovation tool. We want to get the toolkit into as many schools, libraries and other places as possible, so that young people, and others of all ages, can build their awareness, their critical thinking skills and their understanding of digital literacy and of the profound effects that digital technologies are having on our society and democracy.

Over the coming months I will be sounding out potential partners and sponsors/funders to make this possible.

This would be the first step in a more expansive programme on enabling agency, building on this and much of my and Proboscis’s previous work. It’s not something I expect to achieve alone, so I am hoping to bring like-minded collaborators together under the umbrella of this concept of civic agency: to grow our capabilities and capacities for engaging people in new forms of critical thinking and of autonomous and collective action, addressing the challenges we face as communities and as a society, today and for the future.

Civic Thinking for Civic Dialogue

Over the past six months or so I have been focused on my work for the UnBias project which is looking at the issues of algorithmic bias, online fairness and trust to provide policy recommendations, ethical guidelines and a ‘fairness toolkit’ co-produced with young people and other stakeholders. My role has been to lead the participatory design process on the Fairness Toolkit, which has involved devising and facilitating a series of workshops with young people in schools and a community group, as well as with stakeholders in the ICT industry, policy, civil society and research fields. My colleagues in the Human Centred Computing group at the University of Oxford and the Horizon Digital Economy Institute at the University of Nottingham, as well as Informatics at the University of Edinburgh, have been wonderful collaborators – providing a rich intellectual and pragmatic context for developing the tools.

The co-design workshops with two schools (in Harpenden and in Islington) and with a young women’s group in Oxfordshire explored what their levels of awareness of the issues were, how relevant to their own lives they perceived them to be, and what they thought should be done. In each workshop, and with each group, we consistently encountered quite different perceptions and experiences – often unexpected and surprising – whilst also observing certain commonalities, which were echoed in the findings of the Youth Juries which our colleagues at Nottingham have been running for UnBias since late 2016. Many of the young people expressed a certain fatalism and lack of agency regarding how they use technology which seems to foster a sense of isolation and inability to effect change. This was coupled with a very limited sense of their rights and how the law protects them in their interactions with service providers, institutions and big companies. Unsurprisingly, they often feel that their voice is not listened to, even when they are the targets of some of the most aggressive marketing techniques.

The tools have thus been informed and shaped by young people’s perceptions and their burgeoning understanding of the scale and depth of algorithmic processes affecting modern everyday life. The tools have also been designed to address the atomising effect that personalised technologies are increasingly understood to have – whereby the increasing personalisation of platforms and services isolates our experiences of media and the mediated world from each other. Where broadcast technologies used to be understood to have a homogenising effect on societies, networked technologies, and the highly personalised software services running on them, are creating a sense of isolation from other people’s cultural and social experiences as they serve each of us something more bespoke to our own tastes and preferences. Recent controversies over the use of targeted advertising in US and UK elections have exposed the iniquitous consequences of such hyper-specific campaigning, and offered a new set of insights into the wider and deeper social and cultural impacts happening around us.

I have tried to design a toolkit that could build awareness of these issues, offer a means to articulate how we feel about them, and provide a mechanism for ‘stakeholders’ (in the ICT industry, policymakers, regulators, public sector and civil society) to respond to them. What has emerged is something I call a ‘civic thinking tool’ for people to participate in a public civic dialogue. By this I mean a mode of critical engagement with the issues that goes beyond just a personal dimension (“how does this affect me?”) and embraces a civic one (“how does this affect me in relation to everyone else?”). And then, when we participate in a public dialogue about these issues, it is not simply conducted in public, but it embraces the co-construction of our society and acknowledges everyone as having a stake and a voice within it. It is about trying to find co-constructive and non-confrontational means to engage people in critical reflection about what kind of world we want to have (and the roles algorithmic systems in particular should play in it).

On Monday we held a workshop to preview the first draft of the toolkit and seek feedback from a variety of stakeholders. Take a look at the presentation below to find out more:

The response has been very encouraging – highlighting the strengths and revealing weaknesses and areas that need additional development. The next stage is to start a testing phase with young people and with stakeholders to refine and polish the toolkit.

We are also developing relationships with “trusted intermediaries” – organisations and individuals who are willing to adopt and use the toolkit with their own communities. As the UnBias project concludes in August, our aim is to have the toolkit ready for deployment by whoever wants to use it this Autumn.

Fairness and Bias in an Algorithmic Age


Last month a new research project of which I am part got underway – UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy. It’s a collaboration between the Universities of Nottingham (Horizon Digital Economy Institute), Edinburgh (Informatics) and Oxford (Human Centred Computing) funded by the EPSRC through its Trust, Identity, Privacy and Security in the Digital Economy strand. Over the next two years it will look at the complex relationships between people and systems increasingly driven by personalisation algorithms and explore whether, and to what degree, citizens can judge their trustworthiness.

My role will be to lead a co-design process that will create a ‘fairness toolkit’: raising awareness about the impact of algorithms on everyday behaviours; devising pragmatic strategies to adapt around them; and engaging policymakers and online providers. We will be working with schools and young people to co-develop the toolkit – following in the wake of previous projects exploring young people and social media, such as Digital Wildfire.

For me this project cuts to the quick of concerns at the heart of today’s society: empathy, agency, transparency and control. I will be bringing ideas and practices to the project I have been exploring from a number of different trajectories over the past few years, from my work on the Pallion project to data manifestation and reciprocal entanglements. I am particularly excited as this marks my first formal collaboration with Oxford’s Human Centred Computing research group with whom I’ve been in dialogue for a couple of years.