Human-AI relationships: are AI systems already our companions?

Human-AI relationships have long fascinated us, with the notion of AI companions repeatedly appearing in science fiction. Stanley Kubrick’s 1968 epic 2001: A Space Odyssey brought us HAL, a supercomputer that, after becoming self-aware and paranoid – in a dark exploration of the ethics of human-AI relationships – began to murder its human crewmembers. And in 2013, Spike Jonze’s Her depicted a man falling in love with his AI operating system. These are very different depictions of the future of human-AI relationships, but with AI now a reality and no longer confined to science fiction, it’s time to ask whether AI systems are already our companions.

Talking with AI

A huge part of the psychology of human-AI relationships comes from the fact that we can have conversations with these systems, which, through advances in AI and natural language processing, can communicate with us. These developments are not exactly new. Siri has been taking questions from iPhone users for over a decade, and Alexa has long been a fixture in millions of homes around the world. Going further, AI and user-experience innovations have made chatbots increasingly the norm. These are a step beyond the simple questions we might ask Siri or Alexa, and a move towards something more interactive and collaborative. A well-coded chatbot will ask questions back, informed by the responses already received.

Defining this as an emotional human-AI relationship is a stretch. If you have a conversation with your bank’s customer service chatbot, it will be limited to a script and the parameters coded into it. If you started speaking nonsense to it, it wouldn’t know what to do. And it wouldn’t get annoyed.
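To make the point concrete, here is a minimal sketch of how such a scripted chatbot behaves – the keywords and replies are invented for illustration, not taken from any real bank’s system. Anything outside its fixed script hits the same canned fallback, every time:

```python
# Hypothetical scripted chatbot: it only recognises a fixed set of keywords,
# and anything off-script gets the same fallback reply. It never improvises,
# and it never gets annoyed.

SCRIPT = {
    "balance": "Your balance is available under 'Accounts' in the app.",
    "card": "To report a lost card, please call the number on our website.",
}

def reply(user_message: str) -> str:
    for keyword, answer in SCRIPT.items():
        if keyword in user_message.lower():
            return answer
    # Nonsense, small talk, jokes – all land here, identically.
    return "Sorry, I didn't understand. Try asking about 'balance' or 'card'."

print(reply("What's my balance?"))
print(reply("colourless green ideas sleep furiously"))
```

However hostile or nonsensical the input, the bot simply falls through to its fallback line – which is exactly why this kind of system is hard to mistake for a companion.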

However, AI is becoming so advanced that chatbot conversations are now being elevated to the next level. Systems can be coded to recognise human emotions and responses, and given enough data they can generate replies to any number of prompts. This has led to a growing number of emotional companion AI systems being launched. Like 2013’s Her, in which Joaquin Phoenix’s character falls in love with an AI operating system voiced by Scarlett Johansson after she is able to make him laugh and feel understood, these systems are being created to combat human loneliness.

Even though technology has made us more connected than ever, loneliness is rife, and many are still searching for human connection. The irony is that machines – and more specifically AI – are starting to fill this void.

AI sentience

This is where things get (even more) complicated. AI and machine learning are demonstrating remarkable feats of engineering, with ChatGPT showing the world the power of learning from vast amounts of data and then generating intricate results from prompts in near-instant time. But though AI may be able to create a recipe or even hold a conversation that mimics human emotional breakthroughs, this is not AI personhood.

AI is able to do these things because of the data it was trained on. If you ask ChatGPT for a lasagne recipe, it draws on patterns learned from countless lasagne recipes – cooking videos, interviews with celebrity chefs, even online reviews – to give you the best recipe it can. Companionship AI systems are created in much the same way: they learn from human interactions, jokes, romance novels, rom-com films and so on.

The responses generated are fascinating and very impressive. But this is not a blurring of AI and consciousness, or even AI and identity. And what of AI and free will? ChatGPT – as yet – will not have an emotional response. It won’t ignore a question, be sarcastic, or ask a question back in irritation. The question then becomes: what makes a human, and what shapes our personalities in terms of emotions and psychology?

Mimic or replacement?

AI is impressive, and it can certainly mimic humans and their interactions, with deepfakes increasingly fooling people. However, this is not the same as the real thing. In the same way AI can write a song or a play, it can only do so by scouring existing examples and then generating a mimic of these using the trends and patterns it identifies. AI has not truly created anything, because art is created by humans and inspired by the things that shape the human experience – love, trauma, fear and so on. AI can know every single fact about the Sistine Chapel, but it won’t feel anything about it.

The same could be said for love and companionship with humans. An AI companion could do a serviceable job of building a relationship with a human – understanding their personality and generating responses, jokes, affirmations and affections tailored to that individual. This is surface-level stuff, though; for an AI system to form a real relationship with a human, it would likely need its own personality and all the emotions that come with it. Could this happen one day? It’s certainly a possibility, and the last 18 months have shown that breakthroughs in AI are at our doorstep – but artificial consciousness and emotion may still be much further away.

To get more insights from the Cardaq team as they’re published, sign up to a newsletter below:
