Digital Necromancy is a Two Pound Gimmick and You Are the Ghost

The bargain-bin afterlife has arrived. For the price of a cheap cup of coffee, you can now purchase a "digital clone" of your deceased grandmother or your childhood golden retriever. The tabloids call it a breakthrough. The tech bros call it "democratizing immortality."

I call it a high-speed collision between predatory marketing and psychological self-harm.

The current narrative around "grief tech" is built on a fundamental lie: that data is equivalent to presence. We are being sold the idea that if you feed enough WhatsApp logs, voice memos, and photos into a Large Language Model (LLM), the resulting output is somehow a continuation of a soul. It isn't. It is a statistical approximation of a corpse.

The Latent Space of the Departed

Let’s be precise about the mechanics. When you pay £2 for an AI clone, you aren't buying an entity. You are buying a specialized "fine-tuning" of a pre-existing model.

The software isn't "remembering" your father’s laugh. It is calculating the mathematical probability of which word should follow another based on the syntax of his old emails. If your father used the word "brilliant" frequently, the model will weight that token more heavily.
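That token-weighting claim can be illustrated with a toy bigram model. This is a deliberate simplification (real systems fine-tune billions of weights, not word-pair counts), and the sample "emails" below are invented for the example:

```python
from collections import Counter, defaultdict

# Hypothetical stand-in for a relative's old emails.
emails = [
    "that film was brilliant",
    "the weather was brilliant today",
    "that dinner was lovely",
]

# Count which word follows which: a crude bigram model.
follows = defaultdict(Counter)
for line in emails:
    words = line.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

# The "clone" remembers nothing. It just ranks tokens by how
# often they followed "was" in the source data.
counts = follows["was"]
total = sum(counts.values())
probs = {word: n / total for word, n in counts.items()}
print(probs)  # "brilliant" dominates because it appeared most often
```

Because "brilliant" followed "was" twice and "lovely" only once, the model assigns it twice the probability. That is the entire trick, scaled up: frequency in the data becomes weight in the output.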

This creates an "uncanny valley" of the spirit. The output is close enough to be recognizable, but just far enough off to trigger a visceral sense of wrongness. We are witnessing the birth of a new form of cognitive dissonance, one in which the bereaved try to force a chatbot into the shape of a person, ignoring the glitches because the alternative, silence, is too heavy to bear.

I’ve watched venture capitalists pour money into these startups, and the pitch is always the same: "Comfort at scale." But comfort is the enemy of recovery. Grief is a functional biological process. It is the brain re-wiring itself to account for a permanent absence. By inserting a £2 simulation into that process, you aren't "preserving" a memory. You are stalling the hardware update. You are keeping the brain in a state of perpetual haunting.

The Myth of Digital Immortality

The glowing write-ups focus on the "affordability" of these tools. They want you to think it's a win for the consumer that "immortality" is no longer reserved for the elite.

This is a category error.

True immortality requires agency. It requires the ability to change, to react to new stimuli, and to grow. A digital clone is a closed loop. It cannot learn your new favorite song. It cannot react to a world that changed after its data source died. It is a frozen snapshot, a wax figure made of silicon.

When you interact with these "clones," you aren't talking to the dead. You are talking to a mirror. The AI is simply reflecting your own prompts back at you using the vocabulary of the deceased. It is a sophisticated form of ventriloquism.

The Security Risk of the Soul

Beyond the philosophical rot, there is a massive technical vulnerability that nobody in the "lifestyle" press wants to discuss: data persistence and identity theft.

When you upload your dead relative’s life to a £2 service, where does that data go?

  • Training Sets: Most of these low-cost services are built on the backs of open-source models or API calls to major providers. Your personal history becomes fodder for the next iteration of the model.
  • The Zombie Identity: We are creating a goldmine for social engineering. An AI that can perfectly mimic a dead patriarch’s voice and writing style is the ultimate tool for defrauding the living heirs.
  • The Terms of Service Trap: You don't own the clone. The company does. If the startup goes bust (and most of these "dead-tech" firms eventually will), your "digital loved one" vanishes or, worse, is sold off as a distressed asset.

Imagine a scenario where your mother’s digital likeness is sold to a debt collection agency or used to shill life insurance because you didn't read the fine print in the £2 app. This isn't science fiction. It is the logical endpoint of turning human identity into a subscription service.

Pets and the Devaluation of Reality

The mention of "immortalizing" pets is perhaps the most cynical part of this entire industry. A dog’s presence is 90% non-verbal. It is scent, touch, and rhythmic breathing. Reducing a pet to a text-based chatbot or a generated voice is an admission that we no longer value the physical reality of living things.

We are training ourselves to accept "good enough" substitutes for everything. If you can replace a dog with a script, you can replace a friend with a script. We are optimizing for convenience at the cost of connection.

The "lazy consensus" says this tech is a tool for healing. The research on bereavement suggests otherwise: prolonged reliance on "simulated" presence is associated with complicated grief, a state in which the individual remains stuck in the acute phase of loss, unable to integrate the reality of the death into their life.

The High Cost of a Two Pound Ghost

If you want to honor someone, write a book. Paint a picture. Build something in their name. Do the hard, manual labor of remembering.

Using an LLM to bypass the pain of loss is like using a local anesthetic on a broken leg so you can keep running. You might feel better for twenty minutes, but you are destroying the limb.

The industry wants you to believe that "cloning" is an act of love. It’s actually an act of ego. It’s the refusal to let go, packaged as a technological triumph.

Stop trying to resurrect the dead with cheap code. They aren't in the machine. They aren't in the £2 app. They are gone, and the most disrespectful thing you can do to their memory is to replace their complex, flawed, beautiful humanity with a predictive text algorithm that has the same depth as a weather report.

Delete the app. Feel the silence. That is where the actual healing starts.

Nathan Barnes

Nathan Barnes is known for uncovering stories others miss, combining investigative skills with a knack for accessible, compelling writing.