Get to Know Character Animator, by Adobe

Hello guys! Do you know what Character Animator does?

Well, Character Animator (CA) is a new application by Adobe that allows you to easily animate characters using a webcam, a microphone, and a set of simple-to-use tools. It is a wonderful program for creating character-driven short animations, explainer videos, presentations, and much more.

What do you need to use it?

An Adobe Creative Cloud account:

To work with Character Animator you need Adobe Creative Cloud, a service that gives you access to different Adobe apps depending on your Creative Cloud plan. Besides Character Animator itself, your plan must also include either Adobe Illustrator or Adobe Photoshop. Follow this link to learn more about the service or to choose a plan that fits your needs:

Although it runs as a standalone program, Character Animator generally comes as part of Adobe After Effects CC 2015 or later. That means if you have After Effects, you already have Character Animator. To open it from After Effects, click File -> Open Character Animator.

A character:

Creating a working, properly set-up character is a complex process, which is why we will dedicate the following tutorials in this series to it in detail. For now, you need to know that Character Animator works only with artwork created in Illustrator and Photoshop.

The way it works is simple: once you have imported your character into CA, any changes you make to your character's Illustrator (or Photoshop) file are automatically adopted by Character Animator as soon as you click save. This way you can change anything on your character (colors, size, whole body parts) and the changes will instantly appear in CA.

A webcam and a microphone:

A webcam and a microphone are essential for working with CA, because the program uses your face and voice to generate the movement of your character. The way it works is simple! If your character artwork in Illustrator or Photoshop is properly set up, then when you import it into Character Animator, the program recognizes and automatically tags your character's facial features. These are then rigged to correspond to your own facial features. The automatically recognizable features are: pupils, eyebrows, eyelids, blinking, face movements, and the different mouth shapes. Their movement is generated by your performance in front of the webcam; that means when you blink, your character blinks too. The one exception is the lip sync (the different mouth shapes changing according to sound), which is generated by your voice in the microphone. Alternatively, instead of recording your voice performance, you can import an MP3 file and generate lip sync from it.

Want more filmmaking content? Join Shortfundly, the creative filmmakers' platform.