What is the probability of artificial intelligence running amok and causing havoc on an extreme scale?

This idea has increasingly been hitting the headlines and was given a serious place at the table of world politics with the first international AI Safety Summit this month.

Regulation is obviously worth pursuing, but what about the values embodied by AI itself? Rather than focusing on prohibition, how can we proactively rig the system for good? AI trains on data sets, and these data sets are shaped by human behaviour – meaning sin is built into the algorithms. The result is that people are manipulated, commodified, and drawn into patterns of behaviour that are addictive, mindless, or even malicious.

Instead of simply accepting that we suffer from the ‘sin of the parents’ (Exodus 20:5), researchers have begun to ask whether specific virtues should be deliberately built into these data sets. The Christian philosopher Dr Rachel Siow Robertson and her colleagues have been looking at how we can engage more fruitfully online by focusing on joy. Their working definition of joy is ‘an intense feeling of fulfilment and a deep alignment between some good in the world, and oneself and others’.

This principle has been used to develop an alternative framework for testing technology called MIIND, whose criteria ask: does the product encourage good motivation, such as creativity, and promote healthy integration with the world, self, and others? What is the user’s intensity of experience? How is the product normative (i.e. establishing moral or aesthetic norms), and does it enable users to recognise that they depend on external factors for their wellbeing?

Other measures of the impact of technology on individuals are somewhat ‘thin’, looking at short-term satisfaction and the ‘stickiness’ of applications such as news apps that can keep users doom scrolling for hours. MIIND, however, helps developers look at ‘thick’ user experiences that enable people to pay loving attention to others. For example, virtual reality can be used to raise awareness of issues in a way that leads to hopeful action rather than capitalising on attention-grabbing stories of suffering.

Jesus modelled a way of relating to people that recognised them as individuals worth paying attention to, helping them to grow in their unique character and capabilities. So let’s ask a new question: how can we be a voice and a catalyst for realistic practical action to both use and create AI-based technologies to build lasting joy?


This article was originally published on the London Institute for Contemporary Christianity blog, and is reproduced here with permission.

The academic work of Rachel and her colleagues is linked here.