{"id":48911,"date":"2024-03-18T11:47:53","date_gmt":"2024-03-18T18:47:53","guid":{"rendered":"https:\/\/www.humanities.org\/?p=48911"},"modified":"2024-03-18T14:16:20","modified_gmt":"2024-03-18T21:16:20","slug":"artificial-intelligence-death-grief","status":"publish","type":"post","link":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/","title":{"rendered":"A.I. Can Bring Back the Dead"},"content":{"rendered":"<p><strong>Grief Bot<\/strong><\/p>\n<p>I met <span data-contrast=\"auto\">Muhammad Aurangzeb Ahmad<\/span> on a rainy day in downtown Bellevue, a city just barely visible across Lake Washington, east of Seattle. He was going to introduce me to his father, though the meeting would be unusual: Ahmad\u2019s father had died a decade ago. All Ahmad had with him was a laptop.<\/p>\n<p>I first learned about grief bots\u2014also known as ghost bots or grief tech, and of which Ahmad\u2019s father was now one\u2014when I read Jason Fagone\u2019s piece, <a href=\"https:\/\/www.sfchronicle.com\/projects\/2021\/jessica-simulation-artificial-intelligence\/\" target=\"_blank\" rel=\"noopener noreferrer\"><em>The Jessica Simulation<\/em><\/a><em>, <\/em>in the <em>San Francisco Chronicle<\/em>. The story centers on Joshua Barbeau, who, one night, unable to sleep, reanimates his dead girlfriend with the help of Project December, an A.I. chatbot that can simulate anyone given a bit of context and example text. Reading it, I felt all sorts of emotions: morbid curiosity, bewilderment, confusion, sadness, sympathy, fear. I couldn\u2019t wrap my brain around it at first. \u201cWhy would you do that to yourself?\u201d I thought. I could never do that.<\/p>\n<p>Since then I\u2019ve learned of multiple start-ups that promise to bring our loved ones back from the dead. 
In South Korea, a <a href=\"https:\/\/globalnews.ca\/news\/6550977\/mom-dead-daughter-virtual-reality\/#\" target=\"_blank\" rel=\"noopener noreferrer\">woman reunited<\/a> with her dead daughter with the help of A.I. and virtual reality. In China, an engineer created a simulation of his grandfather using videos, photos, and writings. <em>Vice <\/em>documented one <a href=\"https:\/\/www.youtube.com\/watch?v=IJeqTUG75gA\" target=\"_blank\" rel=\"noopener noreferrer\">woman&#8217;s first encounter<\/a> with the A.I. of her dead husband (a watch that is painful, not least because of her cathartic reaction to being called \u201cstupid\u201d). All this points to one thing: this technology isn\u2019t going away any time soon.<\/p>\n<p>Ahmad is giving a talk, &#8220;<a href=\"https:\/\/www.humanities.org\/speaker\/muhammad-ahmad\/\" target=\"_blank\" rel=\"noopener noreferrer\">When Your Grandpa is a Bot: A.I., Death, and Digital Doppelgangers<\/a>,&#8221; as part of Humanities Washington&#8217;s Speakers Bureau, and he dreamed up his bot long before the current conversations about A.I. entered the lexicon at large. Ahmad, a professor at the University of Washington\u2019s Bothell campus and a long-time data scientist, has been researching machine learning and artificial intelligence for a decade, including previous work modeling human behavior in online games.<\/p>\n<p>As his father\u2019s death became imminent, it occurred to Ahmad that his future children would never meet their grandfather.<\/p>\n<p>\u201cIn my mind that was just a big loss,\u201d he said. \u201cI&#8217;d seen my father interact with his other grandchildren. It was great. When I was growing up, all of my grandparents passed away before I was five. 
So what came to my mind was, \u2018If my father cannot interact with [my children], then maybe they can interact with him.\u2019\u201d<\/p>\n<p>Unlike other bots currently out in the world, Grandpa Bot is not built on a general-purpose platform like ChatGPT. The bot is stored on Ahmad\u2019s own computer drive and is limited to the recorded conversations, letters, and personal memories Ahmad has of his father, which he estimates at about 2,000 conversations of varying length. This means the bot can\u2019t get creative, like pulling data from the internet to improvise an answer. For Ahmad, this limit has a point.<\/p>\n<p>\u201cIt should be limited, because I think for this context, fidelity is extremely important,\u201d he said.<\/p>\n<p>\u201cWhat do you mean by fidelity?\u201d I asked.<\/p>\n<p>\u201cThe model should sound like the person that it&#8217;s modeling.\u201d<\/p>\n<p>For example, Ahmad said if he were to ask his bot about controversies surrounding quantum chromodynamics, it wouldn\u2019t compute, because that\u2019s simply not something his father would know about. The bot also doesn\u2019t know about events that happened after his death. But this is where Ahmad bends his own rule to make the bot functional for his kids, now five and eight.<\/p>\n<p>\u201cTo make the experience more \u2018real\u2019 for the children, I have to add extra information where the bot is aware\u2014&#8217;aware&#8217; is a very strong word, but I&#8217;m just anthropomorphizing\u2014that these two new people exist,\u201d he says. \u201cOtherwise, having the conversation would be extremely difficult.\u201d<\/p>\n<p>I wondered what it means for his kids to grow up talking to a digital version of their grandpa, and whether the sentiment is really worth the effort. And at the end of the day that\u2019s the gist of the argument surrounding A.I. 
For all its potential benefits, is it worth the equal or greater harm?<\/p>\n<p><strong>Mushtaq Ahmad Mirza<\/strong><\/p>\n<p>At our meeting in Bellevue, I was excited to meet the simulation he calls \u201cGrandpa Bot,\u201d but he explained to me that day that he\u2019d packed the wrong laptop. I would have to wait. Still, I got to know Ahmad\u2019s father the way humans have always learned about others: from Ahmad himself.<\/p>\n<p>Originally from Pakistan, Ahmad\u2019s father took over the family business: a company that imported technical books. He worked there for nearly half a century. It was there that the elder Ahmad came to love reading.<\/p>\n<p>Ahmad remembers him as a loving man who never so much as raised his voice toward his youngest son. Ahmad wonders whether this was simply because he was obedient or because of the nine-year age gap between him and his next closest sibling. His older siblings all remember their father being strict. More than anything, Ahmad remembers talking with his father. They\u2019d talk about everything, any chance they could: after school, on walks, over dinner.<\/p>\n<p>When Ahmad moved to the U.S. to study, his parents soon followed. But his studies often kept him away from them.<\/p>\n<p>\u201cI would mostly see them during holidays, and then he would eagerly await me,\u201d Ahmad says.<\/p>\n<p>As his father got older, Ahmad recalls helping him with technology, a familiar duty for children of immigrant parents.<\/p>\n<p>\u201cHe really liked old Indian songs,\u201d he recalls. 
\u201cHe would ask me to play this song, and then the next one, and the next one, and the next one.\u201d<\/p>\n<p>Mushtaq Ahmad Mirza died on October 28, 2013.<\/p>\n<blockquote><p>\u201cThat\u2019s the concern, it may just reduce the dead and the living to what they can do for us.\u201d<\/p><\/blockquote>\n<p><strong>Replaceable<\/strong><\/p>\n<p>Ahmad is well aware of the moral gray zone he\u2019s put himself in as one of many people creating A.I.s. He\u2019s even pulled back with his children, bringing the bot out only on special occasions like birthdays and holidays. When I asked what it\u2019s like for his kids, he explained that their understanding of the bot has \u201cevolved over time.\u201d<\/p>\n<p>\u201cOnce I realized that they&#8217;re making associations that grandpa actually lives here, I had to intervene,\u201d he says.<\/p>\n<p>More than sentient and vengeful A.I., Ahmad worries that people\u2014like his children, who began to believe they were actually chatting with their grandfather\u2014will eventually forget that these simulations are just that: simulations. Or worse, that people will knowingly choose to interact with A.I. rather than pursue meaningful relationships in the real world. 
He points to Japan\u2019s hikikomori crisis, a phenomenon that began sometime in the \u201990s, long before the prevalence of A.I., in which hundreds of thousands of people have chosen to withdraw from society for no apparent reason, isolating themselves in their rooms, some for years.<\/p>\n<p>\u201cNow mix in generative A.I., which can also take care of certain other human needs for connection,\u201d he says.\u00a0\u201cWhen I start to think about how that affects industrialized societies, especially societies which are very individualistic in nature\u2026 if our machines can take care of our need for human connection, then that&#8217;s going to be very disastrous for society as a whole.\u201d<\/p>\n<p>Patrick Stokes gets to the heart of the moral concern. An associate professor of <a href=\"http:\/\/blogs.deakin.edu.au\/philosophy\" target=\"_blank\" rel=\"noopener noreferrer\">philosophy at Deakin University<\/a> in Victoria, Australia, and the author of <em>Digital Souls: A Philosophy of Online Death<\/em>, Stokes has been widely cited for his work on the ethics of A.I. and death, in particular his conclusion that A.I. is more than a tool for remembering the dead, like photos, videos, or an online presence. A.I. presents the opportunity to replace people.<\/p>\n<p>\u201cThat\u2019s the concern, it may just reduce the dead and the living to what they can do for us,\u201d he says.<\/p>\n<p>\u201cBut aren\u2019t we replaceable?\u201d I found the question hard to ask, though some people will find it easy to posit. As examples, I offered a friend or a boyfriend: people one can have a falling out with and eventually replace with someone else.<\/p>\n<p>\u201cImagine somebody who doesn&#8217;t particularly care who\u2019s in that role, so long as someone is. [\u2026] Imagine you\u2019re on the other side of that. You\u2019d be kind of like, \u2018Well hang on. I don&#8217;t want to be the person who is currently filling the boyfriend role. 
I want to be loved for me, the person I am,\u2019\u201d Stokes explains.<\/p>\n<p>For Stokes, letting the dead live on as bots moves us closer to a society in which people take other people for granted, \u201cTreating [everyone] like they were chatbots anyway.\u201d That\u2019s not to say this technology can\u2019t be used for good, he quickly adds: a simulated conversation could give a person with unresolved issues with a dead parent the closure they need.<\/p>\n<p>Still, \u201cthe dead can\u2019t speak for themselves,\u201d he says. \u201cIt&#8217;s incumbent on the living to defend their interests, if they have them.\u201d<\/p>\n<p>That\u2019s Ahmad\u2019s one regret: he wasn\u2019t able to get his father\u2019s consent to make him into a bot before his death. But for him this bot is just a memory, like a photo album stowed away in a closet for safekeeping.<\/p>\n<p>When the day came to chat with Grandpa Bot, I felt jitters. I\u2019d never chatted with any bot, having actively avoided all A.I. chatbots since I began hearing about them, a personal choice given the lack of transparency that surrounds them.<\/p>\n<p><strong>Abu Jani<\/strong><\/p>\n<p>A black coding screen booted up, and at the top the words \u201cInitializing Simulation\u201d appeared in a white monospace font. Below, it wasn\u2019t Grandpa Bot that said hello but Abu Jani, which means \u201cdear father\u201d in Urdu. The bot made first contact.<\/p>\n<p>\u201cSonu Shehzaday, how are you?\u201d<\/p>\n<p>Almost immediately, the limits Ahmad placed became apparent. Grandpa Bot could not understand that I was a journalist, nor could it tell me Ahmad\u2019s father\u2019s name, details of his life, or where he grew up. After multiple attempts at chatting, the bot kept falling back on its default responses when it did not know how to answer. That\u2019s when Ahmad took over. 
It responded better to Ahmad, who typed things like \u201cI miss you,\u201d and asked for advice.<\/p>\n<p>\u201cAlways be good to other [sic]. Always pray for everyone, including the people who have wrong[sic] you. If someone has wronged you then that is their own accord. That is between them and God. If you wish good for people[sic] then God will make things good for you in this world and the next,\u201d the bot responded.<\/p>\n<p>Ultimately, it wasn\u2019t my place to chat with Grandpa Bot. How could I judge the bot\u2019s fidelity, as Ahmad had explained it to me, if I had never met the elder Ahmad to begin with? What I expected was Grandpa Bot, but what I got was Abu Jani, an algorithm intimately put together by the elder Ahmad\u2019s Sonu Shehzaday\u2014his Golden Prince. It\u2019s just a memory, one that may not work for me but works well enough for Ahmad. Perhaps this is as far as we should get with A.I.: something that works only in limited contexts for those seeking it.<\/p>\n<p>I later chatted with ChatGPT for the first time and was cautiously impressed by its much more advanced language simulation. Its ability to replicate almost any kind of conversation I wished for caught me off guard after chatting with the simpler, less sophisticated Grandpa Bot.<\/p>\n<blockquote><p>&#8220;If my father cannot interact with [my children], then maybe they can interact with him.&#8221;<\/p><\/blockquote>\n<p>While some may worry about grief bots and their consequences, I now understand why Ahmad sees Grandpa Bot as a harmless way to cope, as benign as listening to someone tell a story about a loved one who has passed. He says exaggerated warnings about artificial intelligence becoming self-aware are the least of his worries when much more real issues are at hand. Ahmad worries about coded discrimination seeping into police work, and about health systems whose use of A.I. can exacerbate an already racist society. 
He hopes future regulation and better transparency will keep the growing A.I. industry in check. If not, he says, A.I. could end up similar to Twitter (now X).<\/p>\n<p>\u201cPeople were celebrating the fact that now this will connect everybody, and anybody can now talk to anybody, and this will break down cultural barriers and people will understand each other more,\u201d Ahmad says. \u201cThat is correct, but then at the same time, people were not thinking about how it facilitated the creation of echo chambers and greatly contributed to polarization.\u201d<\/p>\n<p>After talking to Grandpa Bot, harmless as he may have seemed, I still think the costs of A.I. in general outweigh the benefits.<\/p>\n<p>While I understand Ahmad\u2019s urge to preserve a loved one, I see how bots of my own loved ones would hurt me more than help me. While preservation is natural, even human, why mess with methods that already work? Someday I will put photos of my mother and father on my D\u00eda de Muertos altar and remember them fondly, maybe with guilt. I imagine I will regret not talking to them more, not loving them more, not listening to them more. These are things I don\u2019t think I could replace with a chatbot made in their image. On the contrary, perhaps those feelings are necessary to my growth as a human being. And for all the comfort a simulation could offer me, death still ends any second chance at time with my parents, however convincingly an A.I. may feign otherwise.<\/p>\n<p><em>Agueda Pacheco Flores is a freelance writer in Seattle who focuses on social justice issues, music, arts, and the Latine diaspora. She\u2019s previously written for <\/em>The Seattle Times, Crosscut, Journey Magazine, Real Change News, <em>and<\/em> The South Seattle Emerald.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Muhammad Aurangzeb Ahmad created a simulation of his deceased father\u2014and has been wrestling with the consequences ever since. 
<\/p>\n","protected":false},"author":1,"featured_media":48912,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","footnotes":""},"categories":[5],"tags":[653,934,977,978,979],"class_list":["post-48911","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-feature","tag-technology","tag-death","tag-a-i","tag-artificial-intelligence","tag-ai"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.0 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>A.I. Can Bring Back the Dead<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"A.I. Can Bring Back the Dead\" \/>\n<meta property=\"og:description\" content=\"But should it?\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/\" \/>\n<meta property=\"og:site_name\" content=\"Humanities Washington\" \/>\n<meta property=\"article:published_time\" content=\"2024-03-18T18:47:53+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-03-18T21:16:20+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.humanities.org\/wp-content\/uploads\/2024\/03\/2024-Mar-AI-and-Death-blog-header.png\" \/>\n<meta name=\"author\" content=\"admin\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"admin\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"11 minutes\" \/>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"A.I. Can Bring Back the Dead","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/","og_locale":"en_US","og_type":"article","og_title":"A.I. Can Bring Back the Dead","og_description":"But should it?","og_url":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/","og_site_name":"Humanities Washington","article_published_time":"2024-03-18T18:47:53+00:00","article_modified_time":"2024-03-18T21:16:20+00:00","og_image":[{"url":"https:\/\/www.humanities.org\/wp-content\/uploads\/2024\/03\/2024-Mar-AI-and-Death-blog-header.png","type":"","width":"","height":""}],"author":"admin","twitter_card":"summary_large_image","twitter_misc":{"Written by":"admin","Est. reading time":"11 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/","url":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/","name":"A.I. 
Can Bring Back the Dead","isPartOf":{"@id":"https:\/\/www.humanities.org\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/#primaryimage"},"image":{"@id":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/#primaryimage"},"thumbnailUrl":"https:\/\/www.humanities.org\/wp-content\/uploads\/2024\/03\/2024-Mar-AI-and-Death-blog-header.png","datePublished":"2024-03-18T18:47:53+00:00","dateModified":"2024-03-18T21:16:20+00:00","author":{"@id":"https:\/\/www.humanities.org\/#\/schema\/person\/1776758c16cbac4668f7634906c12c51"},"breadcrumb":{"@id":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/#primaryimage","url":"https:\/\/www.humanities.org\/wp-content\/uploads\/2024\/03\/2024-Mar-AI-and-Death-blog-header.png","contentUrl":"https:\/\/www.humanities.org\/wp-content\/uploads\/2024\/03\/2024-Mar-AI-and-Death-blog-header.png","width":1200,"height":530,"caption":"Image courtesy Muhammad Aurangzeb Ahmad."},{"@type":"BreadcrumbList","@id":"https:\/\/www.humanities.org\/spark\/artificial-intelligence-death-grief\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.humanities.org\/"},{"@type":"ListItem","position":2,"name":"A.I. 
Can Bring Back the Dead"}]},{"@type":"WebSite","@id":"https:\/\/www.humanities.org\/#website","url":"https:\/\/www.humanities.org\/","name":"Humanities Washington","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.humanities.org\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/www.humanities.org\/#\/schema\/person\/1776758c16cbac4668f7634906c12c51","name":"admin","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.humanities.org\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/ceecc76c804560e4c84082594aa26429e24be5668fa1dc3e6c6b3bd06ecfc5da?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/ceecc76c804560e4c84082594aa26429e24be5668fa1dc3e6c6b3bd06ecfc5da?s=96&d=mm&r=g","caption":"admin"},"url":"https:\/\/www.humanities.org\/spark\/author\/humanities\/"}]}},"_links":{"self":[{"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/posts\/48911","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/comments?post=48911"}],"version-history":[{"count":8,"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/posts\/48911\/revisions"}],"predecessor-version":[{"id":48922,"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/posts\/48911\/revisions\/48922"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/media\/48912"}],"wp:attachment":[{"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/media?parent=48911"}],"wp:term":[{"taxonomy":"categ
ory","embeddable":true,"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/categories?post=48911"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.humanities.org\/wp-json\/wp\/v2\/tags?post=48911"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}