Daemons of the Shadow World

It is my firm belief that my role as an artist is to imagine the unthinkable – to perceive beyond the horizon of the probable and to stretch the imagination beyond the limits of the normative everyday. By opening up such vistas it becomes possible to anticipate impacts and consequences of actions and decisions – acquiring uncommon insights into potential futures we may come to inhabit.

Daemons of the Shadow World is a proposal for an artwork that recasts privacy and the role of individual or personal data; that rethinks how data subjects are commodified; that explores what it could be like to unbalance how power is expressed and exercised through data analysis and use.

Almost every aspect of modern life is now measured, sensed, datafied, transmitted, analysed and transacted. Those transactions bloom like flower banks to encompass not just each individual’s data profile and traces, but everything and everyone they are connected to. This quantification and measurement of each interaction – the inferences that are drawn, the biases that result and the effects which ensue – is propelling us towards an ever more normative society. A social and cultural entropy. Each individual is becoming ever more tightly defined, less fluid. We are being reduced to a singular concept of identity, one that assumes repetition is truth, and that predictability is a desirable quality.

But, of course, the history of humanity is also that of diversity, divergence and struggle – especially the struggle of those upon whom power is exercised against those who wield it. Conformity has long been enforced through such means as religions and ideologies, conventions and traditions, all of which have the habit of making people behave in a predictable and controllable manner – consumerism and the digital society are merely another manifestation of this. The inducements offered in our consumer society to accept socially normative concepts of identity act like a feedback mechanism that reinforces itself and entrenches asymmetries of power. The same mechanism discriminates against those for whom fluidity of identity is a necessity – people who are often the most vulnerable in society: anyone who diverges from the norm, whether by virtue of age, ethnicity, gender, sexuality or status – such as refugees.

Data profiling is clearly having normative effects, reinforcing and entrenching privileges for those who are already best served by society and the status quo. What about those for whom no singular identity is possible or desirable – those whose identities are fluid, in construction or even deconstruction? These are the people most at risk of being excluded, segregated and even criminalised. The subtleties, quirks and nuances that allow us to defy definition are all too easily captured, measured and sorted into data points to be exploited.

Any transparency in data traffic goes only one way – those of us who share our data with the big systems are not privy to how those acquiring it use that data to commodify our behaviours ever further, ever deeper – to trade them with whom they choose, and to extract whatever benefits they desire. We are only just beginning to become aware of how egregious such uses have been – from the manipulation of voting intentions in elections via social media, to ‘nudge’ systems adopted by governments and public agencies, to total digital surveillance by the Five Eyes network of intelligence agencies. Different cultures have markedly different attitudes to ‘privacy’ – as evinced by China’s state-sponsored social credit system (itself perhaps less different from Western commercial data capture, monetisation and behavioural nudges than we might suppose).

Privacy, as commonly defined in Western industrial societies, is itself a relatively modern concept – most likely emerging in Europe during the Reformation and Counter-Reformation. Its roots are bound together with the rise of mercantilism and the equally modern concept of the individual. It found early articulation in the shifts in domestic architecture from the 1500s on – the creation of private spaces (such as rooms) in shared households, especially where there was a need to worship in secret as religious conformity began to fracture between Protestantism and Catholicism. It also found articulation in the commonplace books in which a newly literate populace began to record their internal, private thoughts, interests and reflections. This individual subjectivity reached a critical mass in Descartes’ formulation of the self as a discrete entity separate from all else.

It should, however, be no surprise that now, in an age of near total surveillance, privacy is on the verge of a complete reconfiguration. It is, coincidentally, happening alongside the realisation that Western industrial capitalism is also facing its own zero-sum game, in which not just humanity but all life teeters on an edge. Unbridled consumption of finite resources, leading to rampant ecocide and mass extinction, presents a distinct trajectory that humans – our cultures, societies and civilisations – cannot sidestep.

To safeguard individuals and their personal data, privacy has for some time been proposed as a human right that should be inalienable. But what if an alternative, perhaps even complementary, strategy could be to turn the tools of data analysis against those who seek to define us and measure us as singular commodities by synthesising a plurality, a multiplicity of identities – camouflage of a kind? What if privacy were re-thought as a condition not a commodity – a dynamic sequence of states that we flow through rather than a static position to cling to? How do, and how have, other cultures navigated the duality of individuals within communities and shared spaces? What might we learn from cultures which do not privilege the sense of individuality as ours does?

MyLoki – a daemon for digital dazzle

This project is a thought experiment exploring how it might be possible to devise ‘autonomous agents’ (daemons) that synthesise and propagate additional data – using neural networks and employing techniques such as ‘generative adversarial networks’ – to mask our data traces and transactions across systems. The effect would be to create a ‘data dissensus’ in the accuracy of our individual ‘shadow profiles’, undermining their statistical value through massive duplication and the generation of duplicitous activities that resemble our actions but create multiplicities of possible identities. The aim: to overwhelm the algorithms of oppression with so many statistically similar variables that their ability to ‘predict’ and shape our behaviours is confounded.

Instead of referring to “Artificial Intelligence” and anthropomorphising it with qualities it is far from having, let’s call such software a “Model for Partial Statistical Probability”. How could we devise and use such programs to act as software agents – daemons – for each of us, to dazzle the data harvesters with a blizzard of statistically probable profiles, endlessly generated to camouflage the data traces of our actions and behaviours in the digital world? Each of us would thereby become a portal to an infinite number of selves, all bifurcating in myriad ways – perhaps by just a hair’s breadth – each one polluting the value of our data trail by injecting just enough uncertainty to render the data as junk. A Trickster, like the Norse god Loki, working on our behalf to frustrate the will of the corporates, political parties, special interest groups and governments that seek to use personal data to commodify us and profit by our, often unwitting, collusion.
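
As a purely illustrative sketch – not a design, and far simpler than any ‘generative adversarial’ approach – the core of the idea can be caricatured in a few lines of Python: resample one’s own activity with a little random jitter, then emit many near-copies of it, so that the harvested profile dissolves into a crowd of plausible selves. The names used here (Event, sample_decoys, the jitter parameter) are hypothetical inventions for the sake of the sketch.

    import random
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    # Hypothetical sketch only: all names here (Event, sample_decoys, jitter_minutes)
    # are illustrative inventions, not part of any existing system or proposal.

    @dataclass
    class Event:
        timestamp: datetime
        category: str   # e.g. "search", "purchase", "location"
        detail: str

    def sample_decoys(real_events, n, jitter_minutes=90):
        """Draw n decoy events from the empirical distribution of real ones,
        nudging each by a small random offset so that every copy bifurcates
        from its original by just a hair's breadth."""
        decoys = []
        for _ in range(n):
            template = random.choice(real_events)
            offset = timedelta(minutes=random.uniform(-jitter_minutes, jitter_minutes))
            decoys.append(Event(template.timestamp + offset, template.category, template.detail))
        return decoys

    if __name__ == "__main__":
        now = datetime.now()
        real = [
            Event(now - timedelta(hours=3), "search", "train times to Brighton"),
            Event(now - timedelta(hours=2), "purchase", "coffee"),
            Event(now - timedelta(hours=1), "location", "London SE8"),
        ]
        # Ten statistically similar decoys per real event, to dilute the true trace.
        for decoy in sample_decoys(real, n=30):
            print(decoy)

A real daemon would of course require far richer models of behaviour, and channels through which to release its decoys into the world – which is exactly where the questions below begin.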

What could the features of such agents be? What limitations might need to be placed on their use? How might we need to re-think our entire digital economy – not to see data as a commodity, but as a condition?

I invoke Loki and the figure of the Trickster precisely because they are ambiguous – causing mayhem but bringing luck and fortune. Sometimes misfortune. Uncertain. Are they not the type of gods we might want to align ourselves with against the patrician, all-seeing, all-knowing Olympian Algorithmic gods of our datafied society? Or perhaps like a kind of Orphic mystery wherein the exuberance of multiple data selves being propagated into the shadow digital world allows us a moment of escape from being subjectified and commodified ad nauseam?

A Conceptual Projection

Since it is unlikely to be built (from a technical standpoint) and could possibly present unknown dangers if released (from an ethical perspective), this thought experiment requires some kind of conceptual prototype. It might take the form of a set of ‘blueprints’ for the conditions under which a MyLoki daemon might be activated and operate; or a diagram of the actions and possible consequences of what could happen when an individual’s data becomes a plurality – not just duplicitous but multiplicitous.

And this is the next part – with such a set of blueprints, I would need to devise a forum or space in which I can invite a group of people with knowledges and skills from a range of disciplines and sectors to come together and explore the ramifications of such an idea. What theoretical frameworks could emerge from such an unreasonable, improbable and irrational possibility?

By proposing something that is as lateral and excessive a conceit for resolving the conundrum of privacy and personal data as the Judgement of Solomon was for determining the maternity of a disputed child, I hope to explore things which might indeed be truly unthinkable in our current situation. If we can think beyond the bounds of reason and the horizon of the probable, what uncommon insights might emerge that we cannot fathom now?

Giles Lane
London, October 2019

(Originally developed with the support of the Open Data Institute’s Data as Culture Copy That theme, December 2018)

A Calculation is Not a Judgement

When human judgement is drained from a system and reductionist rules are applied to complex situations, the results can lead to terrible injustices and harms. If we privilege the procedural outcomes of artificial systems over the importance of humanity, life and experience, and the more-than-human world, we will likely face a self-reinforcing feedback loop of such effects, not unlike the existential threat of runaway climate change.

I wish to advance a proposition… a distinction. Namely, that there is a significant gulf between the mathematical operation by which a calculation can be arrived at, and the emergent process of evaluation by which a judgement is made. I think this is an important distinction for our times, because it describes the difference between a procedure of abstraction and a process of conscious deliberation. A calculation can be determined by a non-sentient entity following a series of steps to accomplish an end (such as an algorithm). Humans have created machines that can do this at scales and speeds far beyond our own individual capabilities. A judgement, however, requires a sentient being, imbued with consciousness and the capacity to exercise discernment and perception, to arrive at an authoritative opinion.
And time.
It takes time to absorb and reflect, to ruminate and pass judgement. What I hope to argue here is that consciousness itself is an irreducible constant fundamental to fair and trustworthy judgement.

I am, perhaps, re-treading old ground: the argument between quantitative and qualitative methods has rolled on for at least two centuries – rooted in the slow rise to dominance of a kind of scientism as the prevailing order of knowledge and worldview. Both methods have merits, and their integration or synthesis can lead to remarkable achievements. Both are rooted in very human beliefs and traditions of how knowledge comes about. Wielded together, they stimulate extraordinary benefits; but when they are asymmetrical in influence and power, the drawbacks are considerable.

We now live in a world where the quantitative has achieved ascendancy in almost all areas of life, where computations and automated decision-making affect the everyday lives of billions of people. Tremendous advantages in speed, efficiency and technical capabilities across the panoply of human activity have resulted. But they also amplify injustices and inequalities, and compound environmental and ecological over-exploitation and destruction. In doing so, their scale and speed disempower and degrade the intrinsic agency of human beings in favour of inflexible and unfeeling systems. It is crucial to see that it is a deliberate choice to quantify and sort the world in this way, not an impartial effect of some immutable logic that cannot be challenged.

As Oscar Wilde might have framed the distinction, the difference is one of knowing “the price of everything and the value of nothing” (from Lady Windermere’s Fan). It is trivial to calculate the price of something according to a formula of tangible inputs and costs – yet far more elusive to judge its value. That demands a broader spectrum of parameters, such as context, emotion, culture and other intangibles. Our human fallibilities lead us to both extremes. Judgements, too, can be unsound. Intention and ethos determine how and why we adopt a particular trajectory – as much as our adherence to one method or another, one disciplinary process or another.

The predicaments outlined above are, I believe, at the very root of the proliferating existential dilemmas which humans, indeed all life, now face. The stabilities of our ways of living are being challenged everywhere by changes in natural forces we have clearly, recklessly, contributed to – possibly beyond our capability to re-balance, to say nothing of an irrevocable and devastating loss of biodiversity. I believe that the over-exploitation of the natural world, of other creatures and lifeforms, has been facilitated by precisely the unfeeling calculation of systems based on abstracting life into discrete parts that can be separated from a complex whole and used indiscriminately without repercussion. It is a brutal and destructive alienation that does not factor into its calculation of profit and loss the consequences and costs of its atomistic unravelling of mutual interdependence. We see the results across the planet in the systematic extraction of specific resources causing catastrophic loss of the entire environments and ecologies surrounding them. There is no doubt that this can only persist for so long.

We also see it in the human sphere when bureaucratic systems over-emphasise adherence to rules above consideration of individual, or even collective, circumstances. One of the most appalling examples in recent years has been the terrible injustices and harms inflicted on the Windrush Generation by Theresa May and the UK Home Office’s “Hostile environment” policy. And these are just the most visible examples of intentional applications of the technology of bureaucracy, and its component methods and tools, to harm the vulnerable. They are almost certainly intended more as a distraction, or sleight of hand, whilst other yet more egregious activities are kept in the shadows. It seems to me that much of this is being done as a climactic frenzy of industrial capitalism – to squeeze every last drop of advantage from a system that is so weighty with its own entropy that it cannot possibly endure indefinitely. Banking the last pennies to hedge against an uncertain future where, it is assumed, the wealthiest will command the most safety, luxury and authority.

But, I doubt it will go the way anyone currently anticipates – the speed of environmental and ecological transformation will most likely confound our best models and projections, since none can reliably forecast the full range of interdependent, interwoven forces and factors we have interrupted.

I doubt the wisdom of investing our civilisation’s faith too heavily in systems that use automated, statistical calculation of probabilities to make future-facing decisions on our behalf, let alone in the here-and-now. It would itself be a further profound disconnection from our very humanity to hope that such technologies will ‘save’ us from the disconnection of the human from the more-than-human natural world – a disconnection that has been gathering pace for hundreds of years, since at least the European discovery of the Americas and the growth of modern industry and global capitalism. Our technologies are reflections of our cultures and societies, not simply neutral, inevitable outcomes of rational enquiry and engineering. They arise out of our cultures, beliefs and behaviours – they are value-driven… the products of choices, intentional or unconscious.

Evidence is growing (as documented by ProPublica among others) that algorithmic decision-making has a tendency to amplify existing biases, leading to exacerbated injustices and inequalities as well as other pernicious effects. Instead of the promise of impartiality that has justified an increasing reliance on both bureaucracy and algorithmic systems, we have come to realise that they have all of our human fallibilities coded in, but with the additional twin enhancements of speed and scale – rippling the effects out further and faster. Now would be an apposite time to check the headlong rush to automate how we manage our societies and everyday lives, especially as we must shift our economies and industries from extractive and destructive activities to ones which preserve and maintain life and ecologies. The two are inextricably linked.

“… some are already engaged in experiments that try to make the possibility of a future that isn’t barbaric, now. Those who have chosen to desert, to flee this ‘dirty’ economic war, but who, in ‘fleeing, seek a weapon,’ as Deleuze said. And seeking, here, means, in the first place, creating, creating a life ‘after economic growth,’ a life that explores connections with new powers of acting, feeling, imagining, and thinking.”
Isabelle Stengers, In Catastrophic Times (2015)

Knowledge, Skill Acquisition & Competence

Stuart and Hubert Dreyfus’ model of skill acquisition is a useful guide in discerning the distinction between a calculation and a judgement, tracing the path from novice via advanced beginner, competent and proficient through to expert. It describes how, in the early stages, the novice must learn the rules and understand how to use them. As their experience grows (and, presumably, their confidence in applying the ‘right’ skills), they rely less on formal analytical application of the rules and more on their intuitive knowledge of what will work best in the given situation.

“Dreyfus and Dreyfus’ essential point is to assert that analytical thinking and intuition are not two mutually conflicting ways of understanding or of making judgements. Rather they are seen to be complementary factors which work together but with growing importance centred on intuition when the skilled performer becomes more experienced. Highly experienced people seem to be able to recognise whole scenarios without decomposing them into elements or separate features.”
Mike Cooley, Architect or Bee? The Human Price of Technology (1980)

This model complements the four stages of competence (often attributed to Abraham Maslow), which describe the path from unconscious incompetence via conscious incompetence and conscious competence to unconscious competence. Again, from a baseline of lack of ability, and even a lack of awareness of that inability, there is a trajectory towards competency becoming innate. It becomes embodied not just in the mind but absorbed into a whole sense of self, such that the delivery of expertise is often described as the expert having an intuitive feeling for the right thing to do.

Experience then becomes the key to transcending the application of rigid rules-based approaches and developing craft, skills and expertise. It is also the domain of art and creative practices. What this amounts to is another order of knowledge, which Michael Polanyi called “tacit knowledge”. It is not the procedural, codifiable, step-by-step “explicit knowledge” approach that calculation and computation are so excellent at, but something transmitted through experience itself, so that the learner eventually acquires the ability to judge what is right to do. Not simply a linear problem-solving trajectory, but a holistic awareness of the whole problem or task. It is committed and informed, acquired by desire and often with passion and with care – a praxis established through dialogue and reciprocal exchange. Being relational, it is a foundation for cooperation and collaboration.

“While tacit knowledge can be possessed by itself, explicit knowledge must rely on being tacitly understood and applied. Hence all knowledge is either tacit or rooted in tacit knowledge. A wholly explicit knowledge is unthinkable.”
Michael Polanyi, Knowing and Being (1969)

Irrational Logics

The Judgement of Solomon (Hebrew Bible/Old Testament, 1 Kings 3:16–28) offers a classic example of wisdom in a judgement that realises justice not through a direct procedure, but through what could be described as an irrational logical path. The story tells of King Solomon being called to rule between two women, both claiming a baby as their own, as to which is the actual mother and should keep the infant. With no other way to tell which woman was the true mother, his perverse solution was to propose cutting the baby in half, dividing it equally between them. His wisdom is reflected in the story when one woman gives up her claim to save the life of the child, thus revealing her as the (most likely) true mother.

The story is of a classic type that has parallels in the literatures and storytelling traditions of other cultures. Such stories illustrate how, sometimes, there is no rational path to truth or a just decision; instead, an irrational, counter-intuitive approach can reveal it in unexpected ways. It is imaginative and transgressive, employing techniques familiar in creative, artistic practices – excessive, surreal and disturbing. These are not quantities but qualities of imagination. It may be perfectly possible to compose a fiction or a piece of music or an artwork to order, by following rules and formulae (for instance the ‘police procedural’ novel or many a three-minute pop song). Yet something else is needed for it to become art or literature that transcends the skeleton of its construction and rises above hackneyed cliché and routine prosaicness. Our entire mode of existence and civilisation now hinges on dilemmas as knotty and seemingly irreconcilable as the problem faced by Solomon – or even more so. We are going to need the wisdom of irrational logics and unfettered imaginations to devise visionary, engaging and realistic ways to resolve them.

“Hard times are coming, when we’ll be wanting the voices of writers who can see alternatives to how we live now, can see through our fear-stricken society and its obsessive technologies to other ways of being, and even imagine real grounds for hope. We’ll need writers who can remember freedom – poets, visionaries – realists of a larger reality.”
Ursula K. Le Guin, “Freedom” in Words Are My Matter (2016)

Beyond Measurement: the incalculable heart of humanity

Fairness and trust are both qualities or conditions of human experience rather than fixed rules that can be applied indiscriminately. Neither is particularly amenable to formulaic measurement; indeed, they are often critiqued precisely because they are almost impossible to quantify. In the context of automated algorithmic decision-making systems (e.g. in Artificial Intelligence and Machine Learning) this lack of fixity and highly subjective nature is frequently alluded to. The lack of stable frames of reference for what is fair at any one time is a feature of its contingent nature. Likewise with trust – what constitutes trust in any given situation is highly contingent and almost impossible to codify into a stable matrix of elements and factors.

Yet we instinctively know what feels fair or unfair, and what trust feels like – and equally when it switches to distrust. Thus it appears that consciousness is also a necessary factor in experiencing fairness and trust, just as I reckon it is for arriving at a judgement. And, since feeling is such an important aspect of both fairness and trust, it could be that these two conditions, like our human intelligence, are bound up not only in the mind and thinking, but are co-located and co-created in our embodied experience of knowing. Perhaps neither is at all suitable for programmatic calculation.

What, then, drives some to persist in trying to automate trust and fairness in an effort to remove the human from the loop in deciding what is fair or trustworthy? It seems perverse to me to be using such technologies to replace the human, instead of defining alternatives that could enhance our understanding and judgement by doing what computers and systems do best – classifying, sorting and ordering huge quantities of information to reveal patterns that are not immediately obvious. The analysis and calculation of data could then inform human-derived judgements that encompass broader contexts and situations, including mitigating factors and contradictory states not suited to binary classifications. Better together, one might say.

The upshot of the successes of Deep Blue against Garry Kasparov in 1997 and AlphaGo against Lee Sedol in 2016 has been to invigorate both chess and Go with new approaches and strategies, enhancing the potential and pleasure of the process of playing. The successes of these systems have not diminished either game, but suggested new possibilities. And here there may be a lesson in determining the difference between a sentient player with consciousness, for whom the playing itself may be the point, and a procedural system wholly focused on achieving a finite goal: winning. By focusing on the objective of an end as the goal, those seeking to train “artificial intelligences” might be missing the fundamental point – and value – of playing; that is, the sensations it provides a sentient being of being alive and of existing in relation to something other than themself. A continuity of consciousness.

A deeper question to be addressed is cui bono? Who ultimately benefits from the increasing automation of aspects of our society? Just as the Industrial Revolution and factory production reduced the independence and skills of many craftspeople, so too the automation of everyday life is removing ordinary people from participating in decision-making. It places the power to define parameters higher and higher within a social hierarchy increasingly isolated and removed from the experience of living among ordinary people. Such a rarefied extraction of authority, without direct connection to context and situation, also shrugs off responsibility and provides an effective insulation against culpability. Witness the degeneration of our politicians and political system – how lies, deceit and incompetence have become normalised, even venerated, without meaningful consequence.

I perceive a parallel between the political imposition of strict rules and the mechanistic fallacy of atomising everything into discrete parts without perceiving the crucial balance of relations between them. Both ignore the basic truth of life that, while everything is indeed made up of the same elementary particles, their unique composition into the infinite variety of matter and life is absolutely particular. Local specificity is a feature of life’s mutability – how everything is in constant flux and adaptation in relation to its local context and environment. Scale seems to be a crucial issue here – universal laws function well at the atomic level and at the cosmic, but clearly not so unambiguously at the scales we inhabit as lived reality. There, achieving any effective equilibrium is contingent on diversity and locality.

Complex living systems just don’t seem to obey laws and rules that are based on reductionist concepts. Perhaps at the extremes it is possible for static rules to operate seamlessly, but in the elastic middle we need flow and dynamism. As atoms themselves are held together by the forces, or relations, between electrons, protons and neutrons, so all of matter and reality are bound by the multifarious forces and relations that govern the natures of different entities. To overlook the reality of our relational existence and to reduce everything down to inert and unconnected bits is, inevitably, to miss a key part of the whole equation – a series of calculations that will never completely add up. Connection, interdependence and reciprocity are the fundamental forces that bind and make whole the matter of life.

We will need all of humanity’s diverse knowledges and skills, from poetry, art and music together with mathematics, physics and many others, to find the necessary paths to a fair future for all life on the planet. We shall need the ability to calculate and build machines that can help us sort and make sense of vast amounts of data, whilst simultaneously we must retain our independent, fluent, human capability to judge – soundly – what decisions are most appropriate for each circumstance we encounter and must respond to.

The Uses of Not
Thirty spokes
meet in the hub.
Where the wheel isn’t
is where it’s useful.

Hollowed out,
clay makes a pot.
Where the pot’s not,
is where it’s useful.

Cut doors and windows
to make a room.
Where the room isn’t,
there’s room for you.

So the profit in what is
is in the use of what isn’t.
Lao Tzu, from Tao Te Ching: a book about the way and the power of the way
(a new English version by Ursula K. Le Guin, 1997)

London, July 2019