Adobe Character Animator Uses Facial Recognition to Animate in Real Time
If you have been watching the Late Show with Stephen Colbert, you might have seen a cartoon Donald Trump get interviewed from time to time. What if I told you the software they use is available for only $49.99 a month?
Adobe has been beta testing the new Character Animator with anyone who has a Creative Cloud subscription. Character Animator lets you import a Photoshop or Illustrator image and bring it to life. Using facial recognition, you control the character you make in real time. I got to check out the program at NAB this week.
You can talk through the character the way Terry Fator talks through his puppets. This opens up a new level of video production: audio podcasters can now create videos without having a camera on them – well, sort of.
Other animation options let you move the character's arms and legs for a full-body experience.
With NewTek NDI support, you can bring these characters into programs like a TriCaster or Wirecast 7.6 (see yesterday's interview).
Twitch creators have been animating themselves to match the video games they stream. Parents are bringing their kids' artwork to life: one parent drew his child as a character, then recorded a voice track to import into the program.
A character's voice can also be pre-recorded and imported into the program. And since the face tracking runs off a webcam, your voice actor doesn't even need to be in the same country as the animation.
If you want to try this program, you will need a Creative Cloud subscription. Getting the full suite is recommended so you can create characters in Illustrator or Photoshop, then use Adobe Media Encoder to export pre-recorded content.