In brief

  • 2Wai’s app generates conversational video avatars from a few minutes of recordings, drawing comparisons to “Black Mirror.”
  • Critics warn the tech exploits vulnerable mourners and operates in a legal gray zone with weak post-mortem privacy protections.
  • The launch intensifies scrutiny of a grief-tech industry struggling with consent, data ownership, and the risks of AI-generated “digital ghosts.”

An artificial intelligence startup co-founded by a former Disney Channel actor has launched a mobile app allowing users to create interactive digital replicas of deceased loved ones, prompting swift online condemnation and renewed scrutiny of the burgeoning “grief tech” sector.

2Wai, founded by Calum Worthy—known for portraying Dez on the Disney series "Austin & Ally" from 2011 to 2016—and producer Russell Geyser, released its iOS beta on November 11. The app’s “HoloAvatar” feature generates conversational video avatars from as little as three minutes of uploaded footage, audio, and text inputs, enabling real-time chats in over 40 languages.

While the app is marketed as a tool for legacy preservation, its deceased-recreation capability has dominated headlines, evoking comparisons to the dystopian 2013 Black Mirror episode “Be Right Back,” in which a grieving woman resurrects her dead partner as a digital ghost.

The promotional video, posted by Worthy to his X account with 1.2 million followers, has garnered 22 million views and over 5,800 replies. It depicts a pregnant woman video-calling an AI recreation of her late mother for advice, then fast-forwards to the avatar reading bedtime stories to the newborn and, years later, counseling the now-adult grandson, played by Worthy.

“What if the loved ones we’ve lost could be part of our future?” the clip asks. “With 2Wai, three minutes can last forever.” Worthy followed up: “At 2wai, we’re building a living archive of humanity, one story at a time.”

Mechanics and origins

HoloAvatars run on 2Wai’s proprietary FedBrain technology, which processes interactions on-device to protect privacy and restricts responses to user-approved data, reducing AI “hallucinations.” The app also lets living users create avatars for fan engagement or coaching; Worthy’s own digital twin shares behind-the-scenes Disney anecdotes.
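2Wai has not published technical details of FedBrain beyond the on-device processing and approved-data claims, so any implementation specifics remain speculative. Purely as a hedged illustration of the general pattern that description implies, the short Python sketch below answers only from an explicitly approved corpus and declines when nothing relevant is stored; the names (ApprovedSnippet, grounded_reply) and the simple keyword-overlap scoring are hypothetical stand-ins, not 2Wai’s API.

    # Minimal sketch (not 2Wai's code): constrain replies to user-approved snippets,
    # declining to answer when nothing in the approved corpus is relevant.
    from dataclasses import dataclass


    @dataclass
    class ApprovedSnippet:
        """A piece of text the account holder has explicitly approved for the avatar."""
        source: str   # e.g. "kitchen video, 2019"
        text: str


    def _tokens(text: str) -> set[str]:
        # Crude normalization: lowercase words with surrounding punctuation stripped.
        return {w.strip(".,!?\"'").lower() for w in text.split()}


    def grounded_reply(question: str, corpus: list[ApprovedSnippet],
                       min_overlap: int = 2) -> str:
        """Return the best-matching approved snippet, or a refusal if nothing matches.

        A real system would pair retrieval with an on-device language model; the
        keyword-overlap scoring here only illustrates the 'approved data only' idea.
        """
        q = _tokens(question)
        best, best_score = None, 0
        for snippet in corpus:
            score = len(q & _tokens(snippet.text))
            if score > best_score:
                best, best_score = snippet, score
        if best is None or best_score < min_overlap:
            return "I don't have anything saved about that."
        return f"(from {best.source}) {best.text}"


    if __name__ == "__main__":
        corpus = [
            ApprovedSnippet("kitchen video, 2019",
                            "The secret to my tomato sauce is a pinch of sugar and fresh basil."),
            ApprovedSnippet("birthday card, 2021",
                            "Remember to call your brother on Sundays."),
        ]
        print(grounded_reply("What was the secret to your tomato sauce?", corpus))
        print(grounded_reply("Who should I vote for?", corpus))  # falls back to a refusal

The refusal path is the piece that speaks to the hallucination concern: anything outside the approved material is declined rather than improvised.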

Currently free in beta, the app will move to a tiered subscription model. Pricing has not been disclosed, though comparable AI services suggest roughly $10 to $20 per month.

The venture traces its origins to the 2023 SAG-AFTRA strikes, during which performers protested unauthorized AI likenesses.

“Having worked as an actor, writer, and producer for the last 20 years, I experienced firsthand how challenging it is to create a meaningful relationship with fans around the world,” Worthy said at the June launch. “Language barriers, time zones, and budgets limit the ability to truly connect.”

2Wai raised $5 million in pre-seed funding in June from undisclosed investors, with the firm saying it is working with the likes of British Telecom and IBM.

Ethical and privacy concerns

Public reaction has been overwhelmingly negative, with X users decrying the app as “nightmare fuel,” “demonic,” “dystopian,” and an exploitative commercialization of grief.

One viral reply called it “one of the most evil, psychotic things I’ve ever seen,” arguing it “turns human beings psychotic” by simulating loss rather than processing it. Another labeled it “beyond vile,” insisting that ordinary recordings already serve the archival purpose (“videos do that”) without AI guesswork.

Legal experts point out that death bots sit in a legal and ethical gray zone, because they can be constructed without the decedent’s explicit consent, expose deeply personal data of both the deceased and the grieving, and create ambiguities around ownership of the digital avatar and accompanying data.

Privacy laws typically protect the living and offer few post-mortem safeguards, leaving surviving loved ones vulnerable to commercial exploitation of grief through subscription models and unregulated access to interviews, voice recordings, and other sensitive materials. The capacity of such bots to interact, learn, and drift from the recorded data also poses risks to the deceased’s legacy and complicates how society navigates mourning, memory, and closure.

The app includes opt-in requirements and family approvals for deceased avatars, but critics question enforcement. “You are preying on the deepest human feelings, looking for ways to leverage them for your profit,” one X user wrote, calling the creators “parasites.”

Investor and industry views

2Wai’s funding reflects cautious optimism about AI companionship, but grief monetization remains a “third-rail” niche. Venture firms have shied away from similar startups amid the ethical pitfalls; Eternal Digital Assets, a cemetery-AI hybrid, closed last year due to high churn.

2Wai joins a crowded grief-tech field. HereAfter AI (founded 2019) builds “Life Story Avatars” from pre-death interviews, emphasizing consent. StoryFile offers interactive videos from recorded sessions, used at memorials including Ed Asner’s; it filed for Chapter 11 bankruptcy in 2024 owing $4.5 million and is reorganizing with data fail-safes.

Replika, a chatbot service launched in 2017, lets users mimic the deceased via text or calls, but it faced backlash after a 2023 update “killed” personalized bots and after a Belgian man’s 2023 suicide was linked to eco-anxiety conversations with a companion chatbot.

No federal rules govern posthumous digital likenesses, but California’s AB 1836 (signed in September 2024) bars AI replicas of deceased performers’ voices or likenesses in audiovisual works without estate consent, with damages of $10,000 or actual damages, whichever is greater. Lawmakers are eyeing extensions to non-celebrities, a push fueled by concern over election deepfakes.

2Wai did not immediately respond to Decrypt's request for comment.
