
Chris Umé, the man behind the Tom Cruise deepfake


Chris Umé has reached the final of “America’s Got Talent” after an impressive deepfake act in which the show’s male judges appeared to perform the opera aria “Nessun dorma” via artificial intelligence. Knack spoke with the Flemish AI artist earlier this summer. Reread the interview here.

This interview was originally published in Knack in July 2022.

Artificial intelligence technologies are moving ever closer to the consumer. Dall-E 2, for example, is a real hype on the internet: type in a few words and Dall-E 2 turns them into an image. “In terms of creativity, this is a huge step forward,” says Flemish AI artist Chris Umé.

Type in ‘cuckoo’, ‘strawberry’, ‘alphorn’ and ‘omelet’ and see what Dall-E does with them. The platform does not search for existing images of the words you enter, but creates entirely new images itself. That the results can be quite absurd and comical is no surprise, but it is also a form of creative intelligence that opens up endless new possibilities.

“Models like Dall-E are computer models trained on millions of images of just about everything that exists,” says Chris Umé (32) from his current home in Bangkok. Since the Limburger went viral last year with his Tom Cruise deepfakes, he has become world famous as a pioneer of deepfake technology. “If you enter Picasso, Marilyn Monroe and black-and-white, you get a black-and-white picture of Marilyn Monroe in the style of Picasso. What I like is that it’s a huge leap forward in terms of creativity.”
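
To give a sense of how such prompt-to-image tools are driven in practice: Dall-E 2 itself is only available through OpenAI’s hosted service, but the same idea can be sketched with the open-source Stable Diffusion model via Hugging Face’s diffusers library. The checkpoint name and prompt below are purely illustrative assumptions, not part of Umé’s or Metaphysic’s toolchain.

```python
# Illustrative only: Dall-E 2 is a hosted service, so this sketch uses the
# open-source Stable Diffusion model through Hugging Face's diffusers library.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed/illustrative model checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# The prompt mirrors the example from the interview.
prompt = "a black-and-white portrait of Marilyn Monroe in the style of Picasso"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("marilyn_picasso.png")
```

As in the interview’s description, the model does not retrieve an existing photo; it generates a new image that combines the concepts named in the prompt.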

What possibilities do you see there?

Picasso-style Marilyn Monroe © Dall-E/Metaphysics

Chris Ume: Personally, I would use the tool for inspiration. For a film or an album cover, for example. You enter keywords, play around with them a bit, and then use the result as a base to work from. If concept artists have to draw scenes for a film like The Lord of the Rings and want to capture a certain atmosphere, they can experiment with this app endlessly.

Are you working on such an application yourself? The Midjourney research lab is also doing wonderful things in this area.

Ume: I’ve used Midjourney a few times to experiment. I don’t know whether it uses the same model as Dall-E, but it is similar. We are not working on such a tool ourselves, though; we specialize in hyper-realistic deepfakes of people. We are doing something completely different and yet somewhat similar, but with synthetic people: we can create people who don’t exist, based on keywords or parameters.

We are now mainly focused on face replacement, which lets you change one face into someone else’s. And we reproduce voices so that they sound exactly the same. Tom Cruise could, for example, deliver an advertisement for Douwe Egberts in Dutch, Chinese or Spanish, with his mouth moving in perfect sync.
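
Metaphysic’s production pipeline is proprietary, but the face-replacement idea Umé describes is often explained with the classic deepfake autoencoder scheme: one shared encoder learns a common representation of faces, each identity gets its own decoder, and at swap time a frame of person A is encoded and then decoded with person B’s decoder, so B’s face takes on A’s pose and expression. The PyTorch sketch below is a simplified illustration of that general technique, not Metaphysic’s actual system; all class names, layer sizes and tensors are assumptions.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: maps a 64x64 face crop to a compact latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),    # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.1),  # 32 -> 16
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.LeakyReLU(0.1), # 16 -> 8
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs a face from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.1),  # 8 -> 16
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),   # 16 -> 32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),           # 32 -> 64
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 256, 8, 8)
        return self.net(x)

# One shared encoder, one decoder per identity (e.g. source actor and target actor).
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Training (sketch): each decoder learns to reconstruct its own identity from the
# shared latent space, e.g. loss = MSE(decoder_a(encoder(face_a)), face_a).

# Swapping: encode a frame of person A, then decode it with B's decoder so that
# B's face appears with A's pose and expression.
frame_a = torch.rand(1, 3, 64, 64)      # stand-in for a cropped, aligned face
swapped = decoder_b(encoder(frame_a))   # shape (1, 3, 64, 64)
```

In a real pipeline the swapped face is then blended back into the original video frame, and the voice is handled by a separate speech-synthesis model, which is what makes the multilingual, lip-synced advertisement example possible.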

In line with those synthetic humans, we are also working on avatars aimed at the metaverse. The metaverse is still very vague to a lot of people; our idea is that you can create an avatar to which you own the rights.

Maybe first explain what the metaverse is.

Ume: The term is used to describe the virtual world made up of various websites, social media platforms, and games where people can connect with each other online. It’s like cyberspace, but much more advanced and expansive.

Now, about those rights to your avatar: we are working on a platform where companies can buy or rent avatars. If a company like Nike is looking for an actor for an ad, there will soon be an alternative. The company can buy an avatar and insert a synthetic actor, and you won’t see any difference from a real actor. It opens up a lot of possibilities: you can tweak the age and appearance a bit, but the person’s identity remains.

Right now it’s mainly face replacement, but what about the body?

Ume: I dream of having breakfast with my grandparents in Belgium in X years’ time while I’m in Bangkok. We would each wear a kind of contact lenses: I sit at my table with my lenses and they sit at theirs with their lenses. That way they see me realistically at their table and we can talk to each other as if we were together. If we can extend the data we’re now using for faces and full heads to the whole body, I can step completely into the metaverse and, just as we are now talking to each other via WhatsApp, have a digital breakfast with my grandparents. And if you go a little further: in 20 years I will be able to revisit that memory with my children, because everything is digital, in the cloud.


It sounds fantastic, but very far off.

Ume: But the technology is moving fast. Just look at our company Metaphysic. I was working for South Park when the Deep Tom Cruise videos went viral in March of last year. My mailbox exploded and offers poured in. In May 2021 I decided to quit and move into hyper-realistic synthetic media. That was only last year; today I work with about 38 people. My point is that it doesn’t stop, it’s absurd how fast it’s going. An example: we can bring people back. Take Elvis. We “train” a model on his body and his voice using real footage, and then we can stage a live concert of him at his peak. It’s a bit like the Abba holograms, but super realistic: Abba is rendered in 3D, whereas a hyper-realistic Elvis would be based on real images of him.

In the meantime, you have amazed the world with your deepfake act on America’s Got Talent, in which you projected jury member Simon Cowell singing, twenty years younger.

Ume: America’s Got Talent seemed to us the biggest public platform on which to present our technology. If you combine this technology with live entertainment, it becomes very interesting for the future. At the same time, we want to make people aware of the possibilities the technology offers and where we are headed, because there are always pros and cons. We want to show what the creative possibilities are. In a month we will be in the semi-finals. (Laughs) If you had told me last year, before I had even made the Tom Cruise deepfake, that I would be on the America’s Got Talent stage, I would have declared you crazy. But it’s a great publicity stunt.

As you say, there are also downsides to deepfakes. You yourself received a lot of criticism after the Tom Cruise videos.

Ume: The criticism mostly stems from ignorance. “It’s illegal, it should be banned”, things like that. I’m now in touch with universities that are researching detection methods, and we’ve created a platform, Synthetic Futures, where we poll different panels every few months. These panels are made up of lawyers, experts in AI detection and abuse, and many other subject-matter experts. We try to guide people toward the future we are working on and how we can approach it ethically. It is a very difficult problem, because who, what body or which country will officially be able to act against this abuse?

This is yet another tool that can be abused, but that applies to all technologies. I think we need to teach people to be skeptical and not take everything at face value. For journalists, that means checking everything. The media will have to provide very solid evidence, and that will be quite a task. But the technology can no longer be stopped.

Chris Ume

– Born in 1990 in Paal, Limburg

– Studied camera, editing and special effects at Syntra

– Produced films and animations for Qmusic, then for VTM

– Became world famous in March 2021 with his Tom Cruise deepfake videos

– Founded his company Metaphysic in May 2021

– Competes in the semi-finals of America’s Got Talent this summer with a deepfake act

– Lives in Bangkok
