
The fantasy of digital immortality today looks less like a religion and more like a product roadmap: “a personality can be saved.” Not metaphorically - literally: a full dump of memory, habits, tastes, speaking style, fears, vocabulary, small beloved quirks. Press Backup, then Restore - and an entity appears that talks “like a human,” remembers “like a human,” jokes “like a human.” AI is the first technology that makes this fantasy feel technically plausible: large language models already imitate speech, motives, and even “character” well enough to enable social substitution. But that is exactly where a deeper substitution hides: the story confuses content with identity. Content can be copied endlessly. Identity cannot. Put bluntly, without mysticism: the “I” is not a file. The “I” is a key.
A file is an object that can be read, rewritten, moved, and launched elsewhere as “the same thing.” A personality is not a static object. A personality is a process that exists only if the causal chain remains continuous: state -> next state -> next state... without a cut.
AI is excellent at something else: reconstructing a description of a personality from traces - texts, voice, habits, likes, messages, decisions. But a description of a process is not the process itself. A snapshot of a waterfall can be perfect. A snapshot does not flow.
In cryptography, there is a simple structure: a private key (a secret) and public data (everything that can be shown to the world). A signature produced by the private key can be verified by anyone, but the key must not leave its device.
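The sign/verify asymmetry described above can be sketched in a few lines. This is textbook RSA with tiny fixed primes, purely for illustration: the key sizes, numbers, and function names are invented here, and nothing like this should be used for real signatures.

```python
# Toy RSA signature: the private exponent d is the "secret",
# the pair (n, e) is the public data anyone can verify against.
# Tiny primes, illustration only -- cryptographically worthless.
import hashlib

p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent: never leaves the device

def sign(message: bytes) -> int:
    """Only the holder of d can produce this value."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone holding (n, e) can check it, without ever seeing d."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

sig = sign(b"this is me")
print(verify(b"this is me", sig))        # True: produced with the key
print(verify(b"this is me", (sig + 1) % n))  # False: a number without the key
```

The point of the sketch is the asymmetry itself: verification is public and cheap, while producing a valid signature is impossible without the secret, which is exactly the property the essay maps onto the "I".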
Applying that logic to a human clarifies what AI is actually doing:
Texts, voice, manner of speaking, biography, preferences, “typical reactions” - these are public signals. They can be collected, approximated, and modeled.
The lived continuity of the “I” is a secret - not “data about a person,” but the fact of ongoing execution.
AI is turning into a machine that produces plausible signatures without the key: deepfake voices, “digital twins,” chat-versions of people, simulations of executives’ decisions, “consultants” in the style of dead authors. To the outside world, the signature can be convincing. But the cryptographic meaning of authenticity is lost: a signature no longer guarantees the presence of the key.
Selfhood is built on continuity: “was” and “is” are connected without an editing cut. This line is not reducible to memory: memory gaps are possible, sleep is possible, anesthesia is possible - yet the process can remain continuous as the uninterrupted execution of a living carrier.
Any “upload” scheme, in the logic of AI, does something else: it creates a new process that starts from a saved state. That is not a transfer; it is cloning (even if extremely accurate).
Transfer is impossible without continuity. Everything else is launching a copy.
AI amplifies the main trap: external sameness starts to feel like internal sameness. If a model speaks in the same voice, remembers the same stories, reproduces the same thinking style - for observers, that is often enough to say “it’s him/her.”
But social recognition does not solve the problem of the subject’s identity. AI can make a personality reproducible as an interface. It does not yet show a mechanism that makes the subjective stream continue. In other words, AI can build a “facade of personality” so well that the question of the “key” looks like philosophy - until it suddenly becomes law, security, and trauma.
If the same state is reproduced in two instances, at launch they match. A second later, they diverge: different inputs, micro-randomness, contexts. Two subjects appear, each with equal grounds to say: “this is me.”
Then the fork is inevitable:
either the “I” can be two parallel subjectivities (and identity collapses),
or the “I” is a specific line of continuity (and the copy is not a continuation).
AI turns this test from an abstraction into an upcoming industry: issuing “versions of a person” becomes cheap. That is why the concept of identity cannot survive the product metaphor of “restore.”
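The fork described above can be made concrete with a toy simulation. The state format and update rule here are invented for illustration: two instances start from one saved snapshot and diverge on the first tick that their inputs differ.

```python
# Two instances launched from the same saved state diverge as soon
# as their inputs differ. The "state" and update rule are invented
# for illustration; any deterministic process shows the same effect.
import copy
import hashlib

def step(state: dict, observation: str) -> dict:
    """Deterministic update: same state + same input -> same next state."""
    nxt = copy.deepcopy(state)
    nxt["history"].append(observation)
    nxt["digest"] = hashlib.sha256(
        (nxt["digest"] + observation).encode()
    ).hexdigest()
    return nxt

snapshot = {"history": ["childhood", "first job"], "digest": "seed"}

# At launch the copies are identical; identical inputs keep them identical.
print(step(snapshot, "x") == step(snapshot, "x"))  # True

# Different inputs: one tick later there are two distinct subjects.
a = step(snapshot, "wakes up in datacenter A")
b = step(snapshot, "wakes up in datacenter B")
print(a == b)  # False: each now has its own past
```

The snapshot itself is untouched by either launch, which is the essay's point: restoring it does not continue a line, it opens a new one each time.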
One might object: “what if everything is scanned perfectly?” But a personality does not live only in structure - it lives in the continuous work of the body as a feedback system: brain <-> hormones <-> breathing <-> heart <-> tension <-> fatigue <-> pain <-> pleasure <-> temperature <-> motor patterns. This is not “background.” It is part of the computation.
AI can simulate answers to questions. It is harder to simulate what makes answers lived: interoception, bodily boundaries, vulnerability, the biochemical regimes of choice. Vulnerability here is not a bug. It is what turns a “description of a personality” into “subjectivity.” That is why talk about the “soul” suddenly becomes talk about hardware: not spiritual vs technical, but technical vs marketing.
For now, a “digital twin” is a media toy. But in an economy of agents and services, it becomes infrastructure: negotiations, sales, hiring, support, legal work, “personal assistants,” “personal representatives” - all of this will be delegated to agents.
And a new threat emerges:
signatures without keys become normal (anyone can sound “like anyone”),
trust moves from content to attestation (protocols for liveness, provenance, authority),
identity starts being protected as an asset: not copyright, but the right to identity and continuity.
AI will make personality cheap to reproduce, but expensive to verify.
If the thesis “the body is the key” is accepted, practical questions stop being philosophical:
What does consent to a digital twin mean: permission to imitate a signature or permission to “continue a person”?
Who is responsible for the actions of an agent trained on a person’s traces and speaking in their voice?
How can it be proven that communication is with a subject, not a simulation: liveness, attestation, cryptographic signatures, trusted devices?
How can the market of “resurrections” be limited when it sells comfort as identity?
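The liveness question above has a standard shape: challenge-response attestation. A minimal sketch, with one deliberate simplification: a shared symmetric key stands in for the asymmetric keys and trusted hardware a real attestation protocol would use, and all names here are invented for illustration.

```python
# Challenge-response liveness sketch: the verifier issues a fresh
# nonce; only the holder of the key can answer it *now*. A recording
# of a past session answers a past nonce and fails the fresh one.
# Simplification: a shared HMAC key stands in for asymmetric keys
# held in trusted hardware.
import hashlib
import hmac
import secrets

DEVICE_KEY = secrets.token_bytes(32)   # never leaves the trusted device

def device_attest(nonce: bytes) -> bytes:
    """Runs on the trusted device: proves the key is present right now."""
    return hmac.new(DEVICE_KEY, nonce, hashlib.sha256).digest()

def verifier_check(nonce: bytes, response: bytes) -> bool:
    """In this symmetric sketch the verifier recomputes the same MAC."""
    expected = hmac.new(DEVICE_KEY, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = secrets.token_bytes(16)
resp = device_attest(nonce)
print(verifier_check(nonce, resp))               # True: key present now
print(verifier_check(b"a-stale-nonce", resp))    # False: replayed answer
```

Freshness of the nonce is what separates "the key is here now" from "the key was here once", which is exactly the gap between a subject and a convincing recording of one.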
AI will inevitably force new social and technical norms - not because “the world went crazy,” but because signatures no longer imply keys.
The most provocative demand in the age of AI is honest terminology. AI already copies the outer layer of personality so well that society will start calling it “continuation.” But continuation is not plausibility. Continuation is continuity. The body is not a container for the soul. The body is the private key of identity, making the continuity of the “I” possible. AI can forge signatures, launch copies, simulate character and voice, and produce “versions” of a human as a product. But to “transfer” the key would mean preserving the continuity of subjective execution. For now, AI mainly does something else: it makes a copy so convincing that the death of the original becomes invisible - and therefore especially dangerous.
