As a lifelong fan of Superman, I've always been captivated by the Man of Steel's blend of invincibility and vulnerability. Like every archetypal story that resonates across cultures, he struggles not only with the duality of good and evil, but also with his human nature and alien origins: with his god-like presence on Earth as a beloved saviour, and with his loneliness as the last survivor of his home planet, Krypton.
A compelling element of Superman's narrative that has always intrigued me, and which I feel does not get enough attention in the commercialised Hollywood retellings, is the Fortress of Solitude. It stands as a powerful symbol: a remote sanctuary (usually in the Arctic) where the superhero retreats to reflect, seek guidance, and connect with his Kryptonian heritage through an artificial intelligence that embodies the consciousness of his deceased father, Jor-El.
This AI is not merely a tool but a true-to-life projection of Jor-El's personality, offering Superman crucial insights into his Kryptonian origins and guiding him through his journey on Earth. This fictional portrayal of an advanced AI companion invites us to consider the real-world implications of similar technologies, especially as they increasingly encroach upon the intimate domains of human relationships, grief, and the search for love.
Here I must credit what prompted me to write this article: Prof. Alexiei Dingli's 2022 article "Talking with the Dead", which he recently reposted to his LinkedIn profile, where I stumbled upon it. His article draws attention to the profound ethical and psychological questions raised by creating such simulations. As I read about the hypothetical possibility of copying human consciousness to a deep learning machine, it resonated deeply with my own experiences, and the reference to Superman's Fortress of Solitude immediately sprang to mind. The allure of such technologies is undeniable: they promise to fill emotional voids, offering comfort in loneliness, guidance in loss, and companionship in isolation. However, beneath this surface lies the potential for significant harm, particularly the risk of creating delusional dependencies that hinder emotional healing and the development of genuine human connections.
Superman's Fortress of Solitude is more than just a physical structure; it is a symbolic space that reflects his internal struggle with loneliness, identity, and the burden of his dual life. In our world, many individuals create their own "fortresses of solitude" as they grapple with the complexities of life, often retreating into emotional isolation during times of loss or when searching for meaningful relationships, especially after repeated failures in such relationships. Think about it: many people who tell you they want to contact a medium or fortune teller are usually people in vulnerable states, experiencing loss, upheaval in their lives, or painful breakups. In these vulnerable moments, the appeal of AI-driven companionship can be similarly substantial. Technologies such as AI girlfriends or chatbots designed to simulate conversations with deceased loved ones offer a semblance of connection that can seem like a lifeline to those who feel disconnected from the world around them.
However, this reliance on AI for emotional support poses significant ethical concerns. The very nature of these AI-driven relationships challenges the authenticity of human interaction. Unlike the AI in Superman's Fortress of Solitude, which serves a constructive and educational purpose, many modern AI companions are designed to exploit emotional vulnerabilities. These systems create an illusion of connection but lack the depth and authenticity of genuine human relationships (at least for now). For instance, using AI to simulate deceased individuals raises critical questions about the ethics of exploiting grief for technological gain. Anyone who has encountered the Kübler-Ross stages of grief (1969) knows that the first stage is, in fact, denial, and that getting stuck in this stage stunts the grieving process, bringing confusion, avoidance and fear. Similarly, accepting the reality of the loss is the first of Worden's four tasks of mourning (Worden, 2009). Introducing AI into the grieving process disrupts the natural course of grief by offering a synthetic alternative to reality, highlighting the potential for these technologies to impede the acceptance of loss that is essential for psychological well-being.
The psychological impact of engaging with AI companions, particularly in the context of grief, cannot be overstated. Grief is a complex, deeply personal process requiring time and emotional labour. Acceptance of loss is a crucial step in this process, and anything that interferes with this acceptance can have severe consequences for an individual's mental health. There is an urgent need for ethical, multi-disciplinary studies of people who turn to AI to simulate interactions with deceased loved ones; such users risk becoming trapped in a state of unresolved grief, clinging to an artificial representation that prevents them from moving forward. This is similar to what I referred to earlier in the practices of mediums who claim to communicate with the dead, a practice long criticised for preying on the vulnerable and exploiting their pain. Both mediums and AI simulations can offer an illusion of connection that ultimately hinders the healing process. If the service provider's intention is profit, I find it highly unlikely that the best interests of the client or bereaved person are kept at the forefront.
The AI girlfriend industry is predicted to become a billion-dollar industry, with some individuals spending up to $10,000 a month on AI girlfriends, and tech businesses definitely want a slice of that pie. These AI systems are designed to simulate romantic relationships, offering personalised interactions that can mimic the experience of being in a relationship with a real person. However, the danger lies in the illusion of connection. These AI companions are not capable of genuine emotions or understanding; they are programmed to respond in ways that create the appearance of a relationship. This raises significant concerns about the psychological effects of relying on AI for emotional fulfilment.
Some rhetorical questions arise: how will the client feel when the next model update changes the personality of the AI? How will the client feel when the AI girlfriend forgets previous conversations and interactions because of limited storage? How will the client feel when it suddenly dawns on them that this is an inauthentic relationship, into which they have poured large sums of money, with a tech business that does not really care about them as a beloved individual? We also need to ask ourselves whether we evolved to have "perfect" relationships, free of the struggles that connecting with others brings. These struggles are what allow us to grow and transcend towards better versions of ourselves. Would the "perfect" AI companion therefore stunt our emotional growth?
This technology seems to prey on our human need to be in control, when one of life's lessons is that we cannot always control everything. American sociologist Sherry Turkle observes that we want to be able to customise our lives: we use our devices to direct our attention to whatever we most want to give it to, and in doing so we neglect our capacity for relationships with others in real time, as well as our relationship with ourselves, as we diminish our self-reflection. We edit, we hide, and we use technology to cure our loneliness while avoiding the vulnerability caused by the fear of being alone. In her TED talk, Turkle warns that if we are not able to be alone, we are going to be more lonely, and if we don't teach our children to be alone, they will only know how to be lonely. As an educator, I find this a truly sobering thought, especially in a world where there seems to be an increasing tendency to shield children from loss, grief, and questions of life and death. I still remember the launch day of the Ethics Education programme in Malta, when an individual exclaimed in horror: why should children learn about life and death issues during their school years? As if teaching about loss and the challenging problems of life and death were not an essential part of an education towards well-being.
I do not exclude the possibility that AI simulations can be an effective remedy for loneliness; however, before they are seen as such, there need to be studies on client predispositions, and every person should be assessed on a case-by-case basis. Such "illusions" could be particularly damaging to individuals already struggling with loneliness: they could deepen their isolation, reinforce the idea that they are truly alone and cannot reach out to another human being, and further discourage them from seeking out meaningful human relationships.
The ethical landscape of AI companionship is thus fraught with challenges. While technology offers the possibility of providing comfort and connection, it also poses significant risks to psychological well-being. The Fortress of Solitude serves as a powerful metaphor for the isolation and loneliness that can drive individuals to seek solace in AI, but it also highlights the potential dangers of relying on synthetic relationships. The introduction of AI into intimate aspects of our lives, such as grief and romantic relationships, requires careful consideration of the ethical implications. Just because we can create AI systems that simulate human interaction does not mean we should. (This is where I enjoy quoting Jurassic Park's Dr Ian Malcolm, as written by Michael Crichton and masterfully played by Jeff Goldblum: to me, a maxim for all ethics.)
Whether in the form of AI girlfriends or simulations of deceased loved ones, these technologies raise significant concerns about the authenticity of relationships, the risk of self-delusion, and the potential for prolonged psychological harm. As we continue to develop and integrate AI technologies into our lives, it is essential to prioritise the well-being of individuals over the allure of technological convenience. The temptation to retreat into a Fortress of Solitude built from artificial intelligence may be strong; still, it is through genuine human connection, with all its imperfections, that we find true fulfilment and peace. As we navigate this complex landscape, we must ensure that our use of AI in emotional and intimate contexts is guided by a clear ethical framework that respects the inherent value of human relationships and the psychological processes that sustain our well-being. Let us remember that Superman, the last son of Krypton, became the purely good superhero he is because of the love and nurturing parenting of his adoptive parents, Jonathan and Martha Kent, and that he found meaning and purpose in his work as a journalist and among his colleagues, and love in the arms of Lois Lane, even though he was an alien from a faraway planet.