Arnold Metahuman + DeepFake (113,000 iterations)

Video Link: https://www.youtube.com/watch?v=XBRHdmY9WlQ
Duration: 0:48 · 529 views · 2,810 subscribers


Here is the Terminator Metahuman + DeepFake test. This result is after 113,000 iterations (though it will keep running overnight). While the quality will improve as I let it run, I'm realizing I'll need higher-quality source video (2K+) that covers a full range of motion. Finding 2K+ source footage is tricky, since I'm trying to capture Arnie in his youth, and most films from that era are below 2K.

A lot of this source footage is from Terminator, which doesn't give me much blinking or looking down (he really keeps his head up a lot). You can see the result when he looks down: the face morphs a bit.

I think it might be time to start working on the Metahuman version of him rendered in Unreal Engine. For that, I'll be using Xsens and MANUS™ motion capture technology, along with Faceware software plugged into Unreal Engine. I'll put his clothes together using CG assets (like a leather jacket) conformed to his body. Once his final facial movements are tweaked and done in Unreal Engine, I'll apply another DeepFake pass to the final output, giving a very realistic result.







Tags:
software
quality
technology
ai
amd
radeon
6900XT
machinelearning
unrealengine
ue5
metahumans
deepfake
hardware
pc
ryzen
project
manus
xsens
arnold
terminator
schwarzenegger
metahuman
creator