
Proximity is an Active Ingredient: Thoughts on the Future Portrayal of AI in Speculative Fiction

With a new novel inspired by the power of language in society, and by the lively debate around AI in the context of language and the written word, Karen Langston considers how speculative fiction writers can ethically engage with that favourite human-versus-machine trope now that the future is here.

Since the launch of OpenAI’s ChatGPT in November 2022, Artificial Intelligence (AI) has leapt in the social consciousness from background hum to air-raid klaxon. A large part of the noise is the rate of escalation in open-source and generative AI. Attention is now on generative text-to-image models and deep-learning capability. In little over a year, machine learning has moved up a class – or three.

No wonder, then, the klaxon. No wonder the social preoccupation with the pace, the power and the potential of the horse that has already bolted.

Escalation in AI and the dynamic between human and machine is old news in the context of science fiction. For decades, the cautionary tale has been a familiar one: teach a machine to learn, and it might just exceed your intelligence and assume control. The trope works in fiction, playing to our existential fears and the expectation for a conflict-centred, binary relationship of human versus machine. Timely to continue the trend, then. Or is it?

Writing while the gap between science fiction and science fact is narrowing

As the gap between science fiction and science fact appears to narrow, the immediate reality of AI poses a challenge to this doomsday portrayal in speculative fiction. To my mind, as a writer and reader of speculative fiction, more compelling than a far-future, worst-case scenario that is unlikely to play out are the far more plausible ways in which future AI might affect society, culture and the human-machine dynamic.

It is worth clarifying what I mean by ‘speculative fiction’, as opposed to purely science fiction, which may or may not be speculative. I refer to non-realist fiction that examines some aspect of reality within the creative freedom of an imagined otherworld – an alternative past, present, or future where the impossible is possible and feels plausibly real. 

The important element here is proximity. It is the nearness of possibility that charges our engagement with the imagined and unreal. Plausibility is set in bold relief by the real-world relevance of the fictional, encouraging its relation to the now of human consciousness and experience.

It is from this perspective that I view the portrayal of AI in speculative storytelling as one of valuable opportunity. The pros and cons posed by AI are myriad and diverse. Alarmist headlines aside, there are real and immediate threats, such as risks to jobs, intellectual property, data security, misrepresentation and bias reinforcement. There are also evident benefits, such as improvements in medical diagnoses, fraud detection, advanced data analysis, improved weather forecasting and natural disaster prediction.

Speculative fiction’s role in probing the possible

Outside of Big Tech, there is a palpable sense of gearing up for the use of AI tools in the workplace across most sectors. It feels pragmatic, but not entirely front-footed. There is, perhaps, a contribution speculative fiction can make here: lab-testing scenarios, simulating the what-ifs, probing the possible outside the context of productivity, capitalisation and competition – and viewing the results through the lens of society and humanity.

Indeed, there is a pervading human element to speculative storytelling, the thread that links the narrative back to us, that can challenge assumptions, provoke thoughts, reframe narratives. AI is already here, and it is here to stay. As the collaborative dynamic of man and machine evolves, a broadening of representation, of alternative portrayals, could make a meaningful contribution to the cultural frame of reference around AI.

Many writers are already taking this approach. The AI characters Lovey/Sidra and Owl, in Becky Chambers’ Wayfarers series, are a heart-warming example. The relationship between Meg and Roger, an AI ‘blot’ in Kate Folk’s short story Big Sur, is a relatable portrayal of AI, characterised in the context of the humans that created it. The motive in the creators’ design is the only monster in the machine.

The question of agency is a pertinent one. At present, machine intelligence is in its infancy. AI does not have human cognition or emotional intelligence. Machines cannot feel. However, as machine intelligence deepens, potential ethical implications emerge. If AI eventually develops a degree of self-awareness, there will be ethical considerations regarding the relationship between humans and the machines they create – for example, how AI is treated, maintained and, eventually, decommissioned or made obsolete, a concept that Ted Chiang explores with tender insight in his novella The Lifecycle of Software Objects.

I have particular concerns about AI. However, as a speculative fiction writer, I see an opportunity to park the AI singularity narrative in favour of probing the vast, fertile territory beyond it. Proximity is the active ingredient. The reaction is already happening. I believe there is value in future thinking through speculative storytelling, a fictional gearing up for the changes to come.


In this piece I have written about AI within fiction, not as a tool to create it. The use of AI applications in the creation of literature, art and music is a personal choice; mine is not to use AI tools in my writing. I am also concerned about the potential risks and detrimental impact of their use in content creation – in particular, the impact on artists, performers, other practitioners and all those who support them across the creative industries. I support the Human Artistry Campaign and its core principles for AI applications.

Photos by Drew Dizzy Graham, David Spiers, and Taiki Ishikawa on Unsplash

Meet the author

Karen Langston is the author of Klova, a work of speculative science fiction in which all language is artificial and has started to disappear, and Bluemantle, speculative dystopian fiction inspired by her love of live music. Both are published by The Book Guild Ltd.

For more information about the author and her work, please visit:

Photo: Rachel Hayes
