“A lot of people think I’m batshit crazy,” says Justin Harrison of Grieftech.
I don’t. I think he’s a delusional ghoul.
Harrison has cobbled together a chatbot that uses an imitation of his late mother’s voice and predictive text built from her online communications, and he thinks it’s a cure for grief because it lets him talk to his “mother”. It does no such thing. There is no person there. It’s a selfish version of grief, in which he can deny her death and pretend everything is OK because a superficial, fake emulation of his mother will pay attention to him. It’s gross and creepy.
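To be clear about just how little is under the hood here: something along these lines can be thrown together in an afternoon. Below is a toy sketch in Python (my own illustration, not Harrison’s actual system, which I haven’t seen), in which the “bot” simply echoes back whichever of the dead person’s saved messages shares the most words with what you type. The sample messages and function names are all invented for the example.

import math
import re
from collections import Counter

# A handful of saved messages stands in for the "training set".
corpus = [
    "Don't forget to eat something proper, not just coffee.",
    "I'm so proud of you, you know that.",
    "Call your brother, he misses you.",
]

def words(text):
    # Lowercase word counts, punctuation stripped.
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a, b):
    # Plain bag-of-words cosine similarity.
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def reply(prompt):
    # "Mom" is just whichever old message best matches the prompt.
    return max(corpus, key=lambda msg: cosine(words(prompt), words(msg)))

print(reply("are you proud of me?"))  # prints the closest-matching old text

That is the whole trick, dressed up with a cloned voice and a much bigger statistical model: pattern-matching against a pile of old messages. There is no one on the other end.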
In the last few years, I’ve lost my mother and a brother; in years gone by, I’ve lost my father and a sister. They’re dead. The grief comes from the loss of living, thinking, feeling human beings who can’t be resurrected by some fraud with a collection of words they may have uttered. But this shallow idiot thinks a chatbot is a substitute.
Harrison is being interviewed, and he thinks he’s being clever by throwing some publicly recorded videos of his interviewer into the chatbot’s database and then conversing with the computer. The interviewer is not impressed. So Harrison and another team member argue with him, insisting that the computer used a spot-on turn of phrase. I guess if all we are is a series of turns of phrase, then the simulacrum is perfect. Except we aren’t. There’s no person, no thinking mind, behind the chatbot.
Then the interviewer goes off to talk to a series of people: one who imagined seeing a dead person after taking drugs, another who dreamed that they were visited by the ghost of their father, a medium who claims, with many weird jerky expressions, that they can communicate psychically with a friend. They’re all the same thing: frauds, liars, or deluded people who have convinced themselves that their loved ones are nothing but superficial reflections of their own minds. Justin Harrison is just more of the same, a phony like all the other phonies who have leeched off other people’s honest grief for profit.
After I’m dead, at least I’m reassured that no ghoul is going to be tormenting me with banalities; I’ll be gone. Don’t be fooled that my chatbot copy’s banalities are coming from me, though.
kome says
Psychics and mediums have been pulling this grift for centuries, and now techbros have found a way to automate it. Say what you will about that ghoulish piece of garbage Sylvia Browne, but at least she had to interact directly with the people she was exploiting.
StevoR says
There was an episode of Star Trek TNG that already discussed pretty much this exact scenario. It didn’t end well.
Source: https://en.wikipedia.org/wiki/Silicon_Avatar
cag says
Billions of people believe that they are talking to god/allah/jesus. What a business opportunity.
profpedant says
After my father died there were several times that I ‘saw my Dad’ in my peripheral vision. I knew perfectly well that his ‘being there’ was my brain misinterpreting what was actually in my peripheral vision, but it was nice to be able to (sort of) see my father again. No magic involved, just a pleasant misinterpretation of reality.
HidariMak says
I imagine that the porn industry will be early adopters of this. Pay by the minute to talk with a “Debbie Does A.I.” chatbot, for starters.
acroyear says
Once again, “Max Headroom” (the series itself, not the character) was hugely ahead of its time.
An episode in season 2 was called “Deities”, and it involved an ex of Edison Carter’s who ran a “religion” based on a simpler version of the tech that created Max Headroom from Carter’s memories and personality. Of course, they couldn’t scale up, so the whole thing (like all religions) was something of a money scam preying on the vulnerability of the elderly.
Well, here we are – using tech to give a very rough (and poor) simulation of a loved one to scam money from the elderly.
“Oh, that’s WONDERFUL, isn’t it?”
Raging Bee says
Billions of people believe that they are talking to god/allah/jesus. What a business opportunity.
Coming up next: Churches offering Zoom or MS Teams meetings with Jesus Christ, who will of course assure everyone listening that [church organizing the meeting] is the true representative of his Word and Will.
Larry says
cag @ #3
Have you been trying to talk with Jesus but you get nothing back. God’s not returning your calls? Well, suffer no more! ChristOnLine®, a new app from Ronco, maker of Mr. Microphone, uses AI on your phone or device to allow you to actually speak to and interact with your deity of choice. Available in the app store now for only $29.99/month. Install it today!
AstrySol says
I think the need for psychological support is real for a non-trivial number of people, so if there is a way to honestly provide some kind of relief there, IMO it is not necessarily bad on its own.
However, tech bros tend to be cheapskates: they make half-assed products, skirt regulations, lie about them, and charge exorbitant prices. And I think that is the problem.
It’s like quality magic shows vs whatever is done by those con artists.
Akira MacKenzie says
Didn’t Kanye West do something like this a couple of years back as a birthday present to Kim Kardashian? He had an AI “hologram” of her late father made to wish her a happy birthday and tell her what a great guy West was.
(Checks Google)
Yup, he did.
https://www.bbc.com/news/entertainment-arts-54731382
Tethys says
It’s slightly less creepy than the Norman Bates method of keeping his mom around. It’s hardly surprising that a chatbot parrots her in a “spot-on turn of phrase”, given that his mother’s texts were its training set.
Pretending your Mother hasn’t died is not a healthy method of dealing with grief.
Jean says
That’s similar to the other deluded people who want to “upload” themselves into a computer. Just because you may at some point be able to create a simulacrum of a person that could fool others does not mean you have a real, self-conscious individual.
tacitus says
Everyone processes their grief in their own way. My mom lost her husband of 66 years last year — they were very close — and continued to “feel” he was alive for several months afterward even though she was well aware that he was dead.
As long as you’re not duping anyone into believing their loved one is still alive, I suspect an AI simulacrum of their lost loved one will be quite popular, and significantly beneficial to some. I doubt my mom would have wanted one, but her sister-in-law lost her husband a few years ago and her life essentially ended the day he died since she never got over his loss. All she ever wanted to do was be with him, and she became a recluse after he died.
Ideally, robust mental health care is the answer of course, but we’ve never had that level of care in the US or UK in all the decades I’ve been around, so if affordable AI companions (simulating loved ones or not) become a reality, there are millions of elderly, lonely, and housebound individuals who could stand to benefit from the company.
No doubt there will be plenty of research on their potential uses in healthcare in the years ahead, but not everyone is as fortunate as my parents have been to have access to friends and family to help them and encourage them through an increasingly complex world as their faculties start to decline.
spiderj says
It starts as science fiction.
https://youtu.be/LU6U2B4VBqQ?si=LapChc7g3eisWoCn
Jazzlet says
I have had many dreams about deceased relatives over the years, but that’s all they were: dreams. Which is fine by me, as a lot of them were not pleasant dreams. The one where my mother was eating and talking, missed her mouth with the fork, which then hit her cheek, was particularly disturbing, I mean it, as her cheek fell apart like a slow-braised cut of beef while she continued to eat and talk. That dream happened something like forty years ago, but it still disturbs me when I remember it.
Jazzlet says
Sorry obviously didn’t close the tag, italics should stop at end of penultimate sentence.
seachange says
Universities are going from tenured professors to adjunct professors to TAs all while charging more to fill their proliferating administration’s already fat pockets.
You already have some recorded genetics lessons. We here have been privileged to access some of them.
…
…
…
LykeX says
Does he? Or is he just so emotionally stunted that he can’t tell the difference? Is he so lacking in humanity that, to him, it IS a substitute?
nomdeplume says
Every day, new evidence of a world gone mad.
KG says
London Standard’s AI imitation of Brian Sewell proves art critics cannot be easily replaced. Sewell was a rather nasty piece of work IMO, but he didn’t deserve this.
John Morales says
[OT]
In the news: https://www.theguardian.com/world/2024/sep/26/nazca-lines-peru-new-geoglyphs
gijoel says
Blasphemy!!??!?! Repent now or Roko’s basilisk will torture your chatbot at some indeterminate point in the far future. :)
On a personal note, when I was a teenager my mother floated the idea of being freeze-dried after her death. The plan was that my brother and I would share her corpse and keep her in our homes for six months a year. She got really angry when we both said no.
bcw bcw says
The series “Black Mirror” explored this idea in “Be Right Back” (series 2, episode 1, 2013), in which a robot is programmed with all the digital records of a pregnant woman’s dead husband and configured to imitate his body.
The wife both wants and hates the non-real version of her husband. The show does a good job of showing both the attraction and the agony of the imitation. She eventually ends up with an odd middle choice.
bcw bcw says
The loss of the person was really brought out by my father’s slow death from Alzheimer’s, where the death of the person preceded the physical death, and included a period when a similar but different person lived in my father’s body once he had no memory of us or his life.
vucodlak says
A couple of decades ago, my best friend and I decided to get married. It’s kind of a funny story, one which involved an old waterbed, a poorly-secured ankle knife, and a metric fucktonne of swearing, but our engagement was a short one.
Two weeks after that magical night, she was dead, her life cut short by a drunk coming off an all-night bender. The bitter irony is that she was on her way to her old home town down in Texas, to find us a place to live far away from her horrible, abusive drunk of a father, and the threat he posed to her well-being.
I don’t get grief. I don’t understand how one is supposed to grieve. My own grief, throughout my childhood, was greeted with scorn and contempt by my family, so I got the message that one is never, ever supposed to share one’s grief, or discuss it with others. As such, I never learned how to grieve. This, you are saying, is another wrong way to do it.
So what’s the right way to grieve?
I’ve tried drinking, but that just makes me sad and sick. I’ve tried hurting myself, and the physical pain is sort of a release, but it doesn’t last long, and hiding the evidence is a hassle. Crying, screaming, and talking about it is clearly unacceptable. It cannot be shared. And now deluding oneself is apparently off the list.
So what’s left? Do we hold that pain inside of us, nursing that poison in our hearts until they swell up with sickness and finally explode in a welter of pure rage and hate, and all we can think about is how much pain and suffering we can inflict on everyone we can reach until we’ve shown everyone what it means to lose, before we’re shot down in the street like rabid dogs?
I never tried that, obviously, but I’d be lying if I said it didn’t cross my mind every time some fucker asked me why I was so gloomy all the time.
One day, a few months after my beloved’s death, I got an IM (Instant Message) on the very same service that she and I used to talk on. It was her name. I felt this sense of vertigo, like the world had fallen away beneath my feet. The avatar of the person contacting me even bore a vague resemblance to my beloved.
This person immediately engaged me in a sexual chat, and I just… went along with it. The “conversation” ended relatively quickly, with my chatting partner throwing a link to some sketchy sex chat site.
It was a bot. I knew it after the first reply, but still I engaged with it. It came back a couple of weeks later, with exactly the same script, and I engaged with it again. I kept engaging with it, because I was so desperate to talk to her one last time, even though I knew that this bot was just a bot.
Finally, I begged it to stop contacting me. It wasn’t helping me. Its rote recitation of lewd acts was no more responsive than a tombstone. Something I said must have put me on the bot’s do-not-contact list, because it stopped, and never bothered me again.
Was it delusional of me to engage with this thing? To try to talk to it as though it was her? Sure. But, also, so fucking what? I’m supposed to be rational about emotional pain, now? That’s not how it works.
Grief is not a rational process.
Is there such a thing as an unselfish version of grief? For that matter, is it so wrong to be a little selfish in grief?
It’s gross and shitty to try to make a company based around selling things like the AI in question but, as far as him creating one for himself and using it… I gotta say, I don’t see the problem. It may or may not be helpful or healthy, but that kind of thing is hard to measure. Who the fuck am I, or anyone else, to tell someone else how to process their pain over losing a loved one?
As long as they don’t go on to hurt anyone else, I don’t care if someone takes drugs or holds a séance or builds a chatbot. It’s wrong to take advantage of grieving people, as the techbro in this post seems to be planning on doing. I judge him for that. But, as far as his personal grief goes, I don’t judge him for his “delusional” or “selfish” way of coping.
I don’t see this as being any worse than talking to a tombstone or photograph of the dead person. Just because you wouldn’t do it doesn’t make it bad or wrong.
shermanj says
Speaking of Artificial Intelligence, guess who is calling for Harris to drop out immediately without giving any real reason?
http://theartsinarizona.org/SenPotatoBrain.png
NitricAcid says
Wasn’t that also part of The Fall of the House of Usher?
Nevermore.
Nevermore.
Nevermore.
microraptor says
Great, now AI companies are going to make money off your dead friends and family.
cartomancer says
I think there is a debate to be had over the role of this kind of thing in processing grief. Instinctively I find it a creepy and distasteful idea, particularly if it’s going to be monetised for evil capitalist purposes, but as has been said upthread, people do all kinds of strange things to cope with their grief. I would expect a lot of the people who buy into this in a moment of stress and desperation to realise, as things calm down, that it’s just a poor facsimile of a loved one, and gain no further benefit from it. If it were a free service, that wouldn’t seem too harmful.
cartomancer says
I am reminded a little of ancient Roman funerary customs by this. Aristocratic Roman families usually kept wax masks of their departed family members in the atrium of their houses – probably cast from their faces when alive or only recently dead. When a family member died and a new mask was added to the display, they would take the collection out to the graveside and have living family members wear them – taking care to choose a person who was as close as possible in stature and demeanour to the deceased. These family members then recited the great deeds of the ancestors in order down to the present day, whereupon the newly deceased’s deeds were added to the roll call.
It seems to me that this kind of thing has a similar memorial purpose.
benedic says
« Desine, Paule, meum lacrimis urgere sepulcrum: panditur ad nullas ianua nigra preces. » As Propertius, a friend of Maecenas, wrote: “Cease, Paullus, to press my tomb with tears: the black gate opens to no prayers.”
shermanj says
Slightly off topic for this post, but:
This is an article I think PZ (and the rest of us) might be interested in:
http://www.smirkingchimp.com/thread/amanda-marcotte/111672/rfk-jr-s-tour-with-jordan-peterson-make-america-healthy-again-shows-why-alt-medicine-went-maga
Paul K says
vucodlak, @25: My own grief, for those I’ve lost in terrible ways — particularly to suicide — is just a part of who I am. I was also raised by people whose job it was to love and care for me, but who actually, and literally, beat it into me that I’m garbage, so the trauma of that makes sinking into despair, when feelings of grief rise up, all too easy.
I have no way to help you, other than to sympathize and acknowledge the pain you described. I hope you have ways and times that you get beyond the pain. For me, I look to the good. Nature, especially the night sky, helps me. So do good people, who I know are both here with me and out there in the world, including on this blog of fierce righteousness, which I come to many times a day for a dose of moral clarity.
trollofreason says
There was a friggin’ Dr. Who episode about why this is a terrible idea, & some witless, delusional techbro with no understanding of human mental health or psychology education prolly saw that cautionary tale & went, “No, but for real.”
Raging Bee says
I can see watching a video of a deceased person giving last bits of advice, encouragement or whatever to their surviving loved ones, if said person had chosen to record any such thing in advance. Anything beyond that — especially anything involving “AI” software cobbled up by techbros — should be considered fishy at best.