
I just watched Biggie Smalls perform ‘live’ in the metaverse

An avatar of the rapper, who died in 1997, performed with live rappers on Meta’s Horizon Worlds.

December 16, 2022
still from "The Notorious B.I.G. Sky’s The Limit: A VR Concert Experience" showing the avatar of Biggie Smalls on the virtual stage
Courtesy of Meta

For a moment on Friday, Biggie Smalls was the only man on stage. A spotlight shone on him in his red velvet suit, and amid pre-recorded cheers, he rapped the lyrics to “Mo Money Mo Problems,” swiveling to the beat in his orange sneakers.

You wouldn’t be wrong to be confused. Smalls died in 1997 when he was shot at the age of 24, leaving an outsize musical and cultural legacy as one of the greatest rappers of all time. But Smalls—whose real name was Christopher Wallace—was in fine form on Meta’s Horizon Worlds metaverse platform on Friday: heaving between verses, pumping his fist rhythmically, and seeming very much alive. The performance is available to watch online, though it may require logging into Facebook.

Smalls’s hyperrealistic avatar is not just an impressive technical feat. It also poses an early test of two big questions we’ll soon face if metaverse platforms gain traction: will people pay to see an avatar of a dead artist perform, and is that business ethical?

Smalls isn’t the first dead artist to be resurrected. Hologram performances have long been a controversial but popular way of reanimating musicians who have passed away: Buddy Holly, Whitney Houston, Michael Jackson, and Amy Winehouse have all been turned into holograms for gigs held after they died. One of the most notable hologram shows was by Smalls’s rival Tupac Shakur, who died in 1996 but “performed” at Coachella in 2012.

Holograms, however, are inherently limited. They require audiences to sit at a specific angle to get the illusion of the artist performing in 3D. The metaverse offers a way for people to see a more lifelike avatar and even potentially interact with it—something the team behind Smalls’s gig hopes to be able to offer in the near future.

What was remarkable about Smalls’s performance on Friday was its realism. His moves, mannerisms, and facial expressions were stunningly lifelike.

But there were some hiccups to remind viewers that Smalls was an avatar. In scenes with live rappers, Smalls seemed to stumble into his co-performers. And when other rappers backed him up on his lyrics, Smalls would sometimes wander out of the central circle where he was performing, failing to respond to his fellow performers the way a living artist would.

Smalls’s avatar looked more natural away from the stage, in pre-recorded digital segments where his likeness roamed through ’90s-era Brooklyn. His movements flowed smoothly, his clothes wrinkled convincingly, and his head turned and his hands moved in ways that made it hard to tell this person was a digital creation.

The technology behind this visual feat has been years in the making, says Remington Scott, the VFX director responsible for creating the Smalls avatar. Scott is the founder of Hyperreal, the studio that built the avatar; earlier in his career, he worked on the motion capture that brought Andy Serkis’s Gollum to life in The Lord of the Rings. (In Smalls’s case, an actor performed the movements, and the avatar was built using the same motion capture techniques.) “When we used this technology in feature films, it would take six months and millions of dollars,” Scott says. “Now, we can do it in six weeks and at a much lower cost.”

The team gathered dozens of hours of home-video footage, along with family photos, to help create Smalls’s avatar, Scott says. This reference imagery was used to incorporate minuscule details into the avatar, down to the corners of Smalls’s eyes or the way his skin furrowed when he made certain expressions.

The team created a database of “micro-expression reference materials,” analyzed “pore-level resolution imagery,” and tracked the elasticity of sub-skin layers to understand how Smalls’s facial skin moved, Scott explains. Those minute changes in facial expression were crucial to creating as real an avatar as possible. 

All that research paid off. “I have seen the avatar throughout the process of building … and it looks very real to me. I see my son’s characteristics in the detailing,” his mother, Voletta Wallace, said via email. “The avatar turned out to be all that I hoped for.” Scott says that when the team unveiled Smalls’s avatar to Wallace, she said, “That’s my Christopher.”

“There wasn’t a dry eye in the room,” Scott recalls. “At that moment, we surpassed any technical achievements we were striving for and were in the realm of emotionally real simulations.”

Part of the reason Smalls was a prime contender for a VR concert was that he was a star with almost no recorded live performances. “Biggie lived through two albums and never went on tour,” says Elliot Osagie, founder of Willingie, a digital media company that collaborated on the event. The virtual performance was a chance for fans to finally see their hero live—and to introduce a new generation to a legendary rapper.

That’s where Wallace, who is also executor of his estate (estimated to be worth around $160 million), comes in. Although it was an emotional project, there’s no question that it was also a business opportunity: Scott says that Wallace and her son’s estate had been searching for “opportunities to bring him back to reengage with his fans and build a new fan base.” The latter part is particularly important: Smalls’s peers are Gen Xers who are only getting older. Putting Smalls in the metaverse, an arena that is dominated by younger generations, could expand his audience. Wallace confirms this: “I envision more concerts, videos of his music, commercials, animation, films, and more opportunities in the metaverse.”

Wallace, Hyperreal, Willingie, and Meta declined to disclose how much Wallace’s estate paid for the avatar, or how much Meta paid to exclusively host the VR concert. Meta also did not answer MIT Technology Review’s questions about its role in the concert, though it did insist that the event—which was held on the company’s flagship metaverse platform, Horizon Worlds—took place not in the metaverse but in virtual reality. When asked to clarify what the metaverse was, Meta did not respond.

However, Scott says that what differentiates his company’s avatars from traditional ones is ownership. With other avatars, “the actors and performers don’t subsequently have rights,” he says. “But our model is to flip that. We create digital identities for talent and then move forward.” In Smalls’s case, his estate had full input into creating his digital twin.

But how do you ensure an artist has a say in what can or cannot be reproduced? “That’s the million—or should I say billions-of-dollars question,” says Theo Tzanidis, a senior lecturer in digital marketing at the University of the West of Scotland, where he has written about the hologram and metaverse music business. 

For the most part, celebrities and artists do not currently include clauses in contracts or wills about how they would like their likeness used in the metaverse or by artificial intelligence, but Tzanidis would not be surprised if the practice were to begin soon. 

There is no way to know whether Smalls would have consented to this use of his likeness, though—and no way he could have conceived of a platform like Horizon Worlds.

To Osagie, it’s important to make sure an avatar remains true to a given artist’s era and doesn’t do anything that person couldn’t have conceived of. He uses an upcoming metaverse project with a jazz legend as an example: “Miles Davis had a career that lasted decades. If you wanted to tell a story about his music, that’s cool. If you wanted to animate his avatar and have him playing cards with Drake—well, that’s not something that could have happened. The real line for me is that the artist is doing what they were doing.”

That may make sense. But in a future where avatars become increasingly lifelike, business expands, and the line between the metaverse and real life blurs, it may be entirely possible for Miles Davis to play cards with Drake, with or without the approval of the artists or their estates.

Even the creators of Smalls’s concert took creative liberties. One scene showed Smalls’s avatar on the balcony of what is presumably his apartment; the camera pans over a portrait of former president Barack Obama embracing Smalls, an event that could not have happened because Obama was elected more than 10 years after the musician’s death. At least twice, Smalls is shown answering a smartphone, a product that wasn’t available during his lifetime.

Tzanidis thinks the lack of a legal framework is problematic. And the issue goes far beyond the traditional confines of art, in his opinion: “What if you could return back and ask people [historical figures] what they did? What if you could get training from people in your field? What will happen when we can re-create previous timelines?”

That vision is already taking shape: a digital version of the American golfer Jack Nicklaus is set to launch soon on an as-yet-undisclosed virtual platform. Fans will be able to interact with him, and he’ll offer golfing tips and stories recounting his wins.

Nicklaus was fully involved in the creation of his avatar. But Smalls wasn’t. And there is no way to confirm that his wishes matched his mother’s. “For the metaverse, there is no rulebook, no rules,” Tzanidis says. “There should be.”

Osagie says that Friday’s concert is not the end for Smalls’s avatar. He and Scott are exploring expanding into other gigs and games, as well as putting on a Coachella performance by Smalls. Scott is excited by the prospect. “The metaverse is another reality, and within this one, Biggie is still alive, and I love that world,” he says. “I think a lot of fans will love that world.”
