Yesterday I read a short story in the Guardian
My story 'Emily Presents' is in today's Guardian! I had no idea. Just heard from a Facebook friend. Funny old industry, this one. Enjoy! @GriffithReview @scribepub
— Alison Gibbs Author (@aligibbsauthor) January 2, 2022
Short summer fiction: Emily Presents by Alison Gibbs https://t.co/LWnY6cvQyg
I enjoyed the story - without giving too much away, it imagines a world in which dead authors are reincarnated via AI as Blade Runner-like avatars and wheeled out to various 'meet the author' events like those found at literary festivals worldwide.
It's fiction, but it got me thinking about things that I hadn't thought about for a long time.
It would be entirely possible to train an AI bot on any large corpus of material - the digitised letters and notebooks of Charles Darwin, for example - and have it reply to questions based on that corpus.
What exactly you would get out of it I'm unsure, but it would at least generate replies in the style of Charles Darwin. It would of course get things hilariously wrong, which is how we would know it was a computer program and not a reincarnated version of Old Beardy himself.
Chatbots and interactive digital assistants are of course all around - Siri's an example, as is Alexa, but one of the more interesting examples is Codi, the one Telstra uses.
If you ever have the dubious pleasure of contacting Telstra about a problem, you first have to have a chat session with Codi, which tries to route your problem appropriately rather than have you transferred between numerous helpdesks.
Personally I've always found Codi completely useless, and got better service from the humans, who always seem to be called Ben or Susi, but whom you suspect are really Ravi and Sunita sequestered in a cube farm in Bangalore.
What is interesting is the way that Telstra have tried to make Codi part of the workflow and indistinguishable from a human.
All these bots and assistants take their inspiration from ELIZA, the first chatbot: a program that takes input from a human, processes it, and replies based on whatever model it has.
So Codi replies on the basis of whatever world model Telstra has created, and the fictional Emilybot replies based on the corpus of Emily Brontë's writings.
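The ELIZA approach is simpler than it might sound: the "model" is little more than a list of patterns and canned reply templates. The sketch below is a minimal illustration of that idea in Python - the rules are invented for this post, not Weizenbaum's original script, and a real assistant like Codi would of course have a far richer model behind it.

```python
import re

# Illustrative rules only: each pair is (input pattern, reply template).
# The "model" a chatbot holds is, at its crudest, just a table like this.
RULES = [
    (r"my (.+) is broken", "How long has your {0} been broken?"),
    (r"i need (.+)", "Why do you need {0}?"),
]

def reply(text: str) -> str:
    """Match the input against each rule and fill in the reply template."""
    text = text.lower().strip()
    for pattern, template in RULES:
        match = re.fullmatch(pattern, text)
        if match:
            return template.format(*match.groups())
    # Fallback when nothing matches - the classic ELIZA deflection.
    return "Please tell me more."

print(reply("My modem is broken"))  # How long has your modem been broken?
print(reply("Hello there"))         # Please tell me more.
```

Everything the program "knows" lives in that rule table, which is why such bots seem clever right up until your question falls outside it - at which point you get the fallback, or something hilariously wrong.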
Now, a long time ago Adrian Desmond wrote a book called The Ape's Reflexion, in which he argued that all those 1960s experiments in which researchers attempted to teach chimpanzees American Sign Language were failures.
What Desmond argued was that apes are very bright: if you made a particular series of gestures and then presented a slice of watermelon, the apes would get the idea that making the gestures would get them more watermelon, and given how bright apes are, the complexity of these learned responses is indistinguishable from a true linguistic response.
Which of course opens up a whole can of philosophical and psychological worms about the nature of language and communication.
Let's take a simple case: when my cats want to go out they sit by the back door - they know they're only allowed out the back door, into a fenced-off play area. If you ask them what they want they'll look at you and then at the door, and if you point at the door they'll get up and walk towards it, expecting to be let out.
No one taught them this; they worked out for themselves that a particular set of interactions would result in them being allowed out. Is it language? No. Communication? Definitely.
And so it is with digital assistants. The ELIZA effect may lead us to think that they are cleverer than they are, but fundamentally they are only parsing their inputs in a certain way and responding on the basis of the dataset they hold.
So if there was ever such a thing as an Emilybot, she would appear to tell you things about Charlotte, say, but only what was in her and Charlotte's letters and diaries. An uncritical observer might think they were being told secrets, but really it would only be stuff you could find in Juliet Barker's book on the Brontë sisters.
She wouldn't, for example, be able to tell you whether her sister Charlotte smelled, or any other personal secrets ...