Alongside upgrades to the likes of After Effects and Premiere Pro, Adobe has revealed a new application called Character Animator.
Based on the same facial capture technology that underpins AE's new Face Tracker and Premiere Pro's new Morph Cut, Character Animator takes your facial expressions and applies them to rigged characters. It can also lip-sync mouth movements to audio, animate a character's body using your mouse, and create physics-based animations.
When will Adobe Character Animator be released?
Even when it is made available - and Adobe hasn't yet said when that will be - Character Animator will be an early beta, so it might be more for fun than for professional-level animation at this stage.
Read on to learn how to create animated characters with Character Animator.
Adobe Character Animator: how to create characters
To create a character, you design it in Illustrator or Photoshop with each body part on a separate layer - or you can buy parts from the Creative Cloud Market. You need to name each layer to match the body part it contains, and Adobe says the character will be automatically rigged when brought into Character Animator.
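Adobe hasn't published the full naming convention yet, so the structure below is only a hypothetical sketch of how a character file might be organised in Illustrator or Photoshop, with each body part on its own named layer or group:

```
Character
├── Head
│   ├── Left Eyebrow
│   ├── Right Eyebrow
│   ├── Left Eye
│   ├── Right Eye
│   └── Mouth
└── Body
    ├── Left Arm
    └── Right Arm
```

Because rigging is driven by these names, a consistently labelled file like this should import with its parts already mapped, rather than needing to be rigged by hand.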
There's a live link between Illustrator/Photoshop and Character Animator, so updates made to characters in the former applications will be automatically updated in the latter.
Adobe Character Animator: how to animate characters
To animate a character's face, you can capture expressions on your face using a webcam, or record your voice to generate mouth movements. You can also manually trigger actions such as blinking using keyboard shortcuts.
Animating your character's body is done using the mouse.
After you've recorded your animations, you can trim and arrange the clips on a timeline - and export the final animation to After Effects (with animation data intact) or to Premiere Pro for editing.
Other tools include physics-based animation for elements such as hair, automated motions such as breathing, and control over which parts of a character can be deformed and which stay fixed.