Limb IK controls the bend direction and stretching of legs and arms. For example, you can pin a hand in place while moving the rest of the body, or make a character’s feet stick to the ground more realistically as it squats. Use this behavior to control the bending of elbows, knees, and ankles and make movements look natural. IK stands for Inverse Kinematics, a method for calculating, for example, an entire arm’s position by specifying only the hand position.
In the Properties panel, under the Limb IK behavior, use the Apply To drop-down to choose whether the behavior applies to the arms, the legs, or both.
For each arm, add handles at the shoulder joint, elbow, and wrist. For each leg, add handles at the hip, knee, heel, and toe.
For Arms: apply the Left Shoulder, Left Elbow, and Left Wrist tags to the left arm’s handles, and the Right Shoulder, Right Elbow, and Right Wrist tags for the right arm. The handle with the Wrist tag is usually the same one that will be moved, such as with a Draggable tag. The Shoulder tags are best placed in the parent group of the arms.
If Left Shoulder and Right Shoulder handles are not present, the Neck-tagged handle will be used instead.
The location of the Elbow handle relative to the line between the Shoulder and Wrist handles will determine the direction that the arm can bend. The placement of the Elbow handle is important if the arm was drawn straight (for example, arms in an A- or T-pose).
For Legs: apply the Left Hip, Left Knee, Left Heel and Left Toe tags to the left leg’s handles, and the Right Hip, Right Knee, Right Heel, and Right Toe tags for the right leg.
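The way Inverse Kinematics fills in an elbow position from only the shoulder and wrist (with the Elbow handle's side of the shoulder-wrist line fixing the bend direction) can be sketched with the standard two-bone IK solution. This is an illustration of the technique, not Character Animator's implementation; the function name and coordinate convention are assumptions.

```python
import math

def solve_elbow(shoulder, wrist, upper_len, lower_len, bend_sign=+1):
    """Two-bone IK: find the elbow position given shoulder and wrist.

    bend_sign chooses which side of the shoulder-wrist line the elbow
    falls on, mirroring how the Elbow handle's placement relative to
    that line fixes the bend direction.
    """
    sx, sy = shoulder
    wx, wy = wrist
    dx, dy = wx - sx, wy - sy
    dist = math.hypot(dx, dy)
    # Clamp so the limb straightens when the wrist is pulled too far,
    # and never folds past its shortest reachable configuration.
    dist = min(dist, upper_len + lower_len - 1e-9)
    dist = max(dist, abs(upper_len - lower_len) + 1e-9)
    # Law of cosines: distance along the shoulder->wrist line to the
    # foot of the perpendicular dropped from the elbow.
    a = (upper_len**2 - lower_len**2 + dist**2) / (2 * dist)
    h = math.sqrt(max(upper_len**2 - a**2, 0.0))
    ux, uy = dx / dist, dy / dist               # unit vector shoulder->wrist
    px, py = -uy * bend_sign, ux * bend_sign    # perpendicular on the bend side
    return (sx + a * ux + h * px, sy + a * uy + h * py)

# A straight horizontal arm of two 2-unit bones: the elbow
# lands midway along the line and offset to the bend side.
elbow = solve_elbow((0, 0), (3, 0), upper_len=2, lower_len=2)
```

Flipping `bend_sign` mirrors the elbow to the other side of the line, which is why an arm drawn straight (A- or T-pose) needs the Elbow handle placed deliberately.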
Limb IK and Walk behaviors on the same puppet might cause issues because both want to control the same handles. If you do have both behaviors applied, consider turning off Limb IK when animating walk cycles, and turning off Walk when doing general body animation. Limb IK should work fine in side and 3/4 views.
This behavior allows you to drag a region of a puppet away from the rest of it (for example, to wave its arm). It is applied by default for new puppets, but works only if a location on the puppet is set up for control via mouse or touchscreen.
Assign the Draggable tag to a specific handle, or to a group to affect its origin. If imported artwork has a guide or a layer with the word "Draggable" in its name, the Draggable tag is applied to the corresponding handle automatically.
The Dragger tool in the Puppet panel can create Draggable-tagged handles without needing to modify the original artwork file.
Drag near the handle’s location in the Scene panel. The nearest Draggable handle on the puppet moves to match the relative changes in mouse position while dragging.
The Dragger behavior records each Draggable handle that you move as a separate take group, so multiple performances for a specific handle compose together and don’t affect performances for other handles that you drag. Because Dragger takes are grouped by the dragged handle, you don’t need multiple Dragger behaviors to capture multiple dragged handles. The Timeline panel shows Dragger takes grouped by handle name as "Handle (handle-name)".
The Dragger behavior can restrict control of Draggable-tagged handles to drags that start within a specified distance of them. This optional limit helps when you want to control Draggable handles only when very close to them, as opposed to whichever one is nearest. It is also useful when you have more than one Dragger behavior applied to different parts of the puppet, to avoid both activating on a single drag.
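The nearest-handle and optional distance-limit rules above can be sketched as follows; the handle names and data layout are hypothetical, for illustration only.

```python
import math

def pick_handle(drag_point, handles, max_distance=None):
    """Choose which Draggable handle a drag controls: the nearest one,
    optionally only if it lies within max_distance (the optional range
    limit described above). Returns None when nothing qualifies."""
    best, best_d = None, float("inf")
    for name, (hx, hy) in handles.items():
        d = math.hypot(hx - drag_point[0], hy - drag_point[1])
        if d < best_d:
            best, best_d = name, d
    if max_distance is not None and best_d > max_distance:
        return None
    return best

# Hypothetical handle positions on a puppet.
handles = {"LeftHand": (0.0, 0.0), "RightHand": (10.0, 0.0)}
pick_handle((1.0, 1.0), handles)                    # nearest wins
pick_handle((50.0, 50.0), handles, max_distance=5)  # too far: no handle
```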
If you have a touch-enabled display, you can control Draggable handles by touching the display. Multiple handles can be controlled at the same time. The following actions can be performed at the handle location:
This behavior has the following parameters:
Each Draggable handle can have different After Move and Return Duration settings. For example, you can use Hold in Place to pose one draggable hand of a character, switch to Return to Rest and drag the other hand, then increase the Return Duration before dragging a necklace.
This behavior uses the webcam, mouse, arrow keys, or touch-enabled display to control the movement of a puppet’s pupils for finer control and recording of eye gaze. You can dart the puppet’s pupils directly to the nine common positions shown below.
You can smooth and pause eye gaze and blend recorded eye gaze takes.
Organize and tag layers as you would for the Face behavior; however, only the Head, Left Eye, Right Eye, Left Pupil, Right Pupil, Left Pupil Range, and Right Pupil Range tags are used.
You can control eye gaze with the webcam, mouse, arrow keys, or touch-enabled display.
If you are using camera input to control eye gaze, make sure the Camera & Microphone panel is open, your webcam is on, and the Camera Input button is not disabled. The panel shows what your webcam sees.
Select the puppet to control in the Timeline panel.
To use the webcam, arm the Camera Input parameter in the Properties panel, then look around in front of the webcam.
To use the mouse or touch-enabled display, arm the Mouse & Touch Input parameter. As you move your pupils or drag with a mouse or finger, the puppet’s pupils should follow.
To use arrow keys, arm the Keyboard Input parameter in the Properties panel.
To pause pupil movements (when the Camera Input parameter is armed), hold down the semicolon (;) key.
Use this capability, for example, to have the character glance from side to side, holding the stare at each side. When you release the key, the pupils move smoothly to the currently tracked pupil positions. To slow down the transition, increase Smoothness.
If you want to only use one of the input types to control the eye gaze, disarm the other two. If you have two or more enabled, you can use the Strength parameters to control how the various inputs are blended.
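When two or more inputs are armed, blending by the Strength parameters behaves roughly like a weighted average of the inputs. The sketch below is an illustration under that assumption; the input names and the normalization rule are not taken from the product.

```python
def blend_gaze(inputs):
    """Weighted average of gaze offsets from several armed inputs.

    `inputs` maps an input name to (gaze_offset, strength); strength
    plays the role of the behavior's per-input Strength parameters.
    """
    total = sum(strength for _, strength in inputs.values())
    if total == 0:
        return (0.0, 0.0)   # nothing armed: pupils stay at rest
    x = sum(off[0] * s for off, s in inputs.values()) / total
    y = sum(off[1] * s for off, s in inputs.values()) / total
    return (x, y)

# Camera pulls right at full strength, mouse pulls up at half strength:
# the result leans toward the stronger input.
blend_gaze({"camera": ((1.0, 0.0), 100), "mouse": ((0.0, 1.0), 50)})
```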
This behavior has the following parameters:
100% is the default transformation for each parameter, but you can decrease it to 0% to dampen the transforms or increase above 100% to exaggerate them.
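The effect of such a percentage parameter can be pictured as scaling the tracked motion about its rest value. A minimal sketch, assuming linear scaling (the actual mapping is not documented):

```python
def apply_strength(rest, tracked, strength_pct):
    """Scale a tracked transform about its rest value.

    100% reproduces the tracked motion, 0% freezes the transform at
    rest, and values above 100% exaggerate it, matching the parameter
    range described above.
    """
    return rest + (tracked - rest) * (strength_pct / 100.0)

apply_strength(0.0, 10.0, 150)   # exaggerates a 10-unit move to 15 units
```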
Tip: Temporarily disarm behaviors controllable via the mouse (for example, Dragger or Particles) if you want to control eye gaze with the mouse.
You can also blend together Eye gaze recordings (takes).
When the Camera Input and Microphone Input buttons in the Camera & Microphone panel are disabled, they override (disarm) the Arm for Record setting for Eye Gaze behavior. This allows you to view, while stopped, the results for a selected puppet with recorded takes for these behaviors without needing to deselect the puppet first.
This behavior uses the face-tracking results from your webcam to control the position, scale, and rotation of named facial features in your puppets.
Specify the facial features to control
Organize and tag layers as described in Body features.
A puppet's eyes can blink in two ways: swapping to different artwork for a closed eye, or sliding opposing eyelid layers together. The former gives you more control over the look of a closed eye, especially if it cannot be represented as two adjacent layers, and is easier to set up; however, no scaling occurs for partially closed eyes. The latter gives you continuous movement and scaling of the eyelids but requires more setup.
Use unique closed eye artwork
Assign separate Left Blink and Right Blink layer tags within the Left Eye and Right Eye tagged layers. When the Face behavior detects a closed eye, the Eye artwork is swapped for the Blink ones.
Use separate eyelid artwork
Assign tags for Left Eyelid Top, Left Eyelid Bottom, Right Eyelid Top, and Right Eyelid Bottom layers inside the respective Left Eye and Right Eye layers.
Create a handle at the bottom edge of each Eyelid Top layer, and another handle at the top edge of each Eyelid Bottom layer.
The vertical distance between these handles determines how far to close and open the eyelids (i.e., Character Animator tries to move the top/bottom eyelid layers together to simulate closing of the eye).
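The closing motion these handles drive can be pictured as moving the two eyelid edges toward each other. The sketch below assumes a linear meet-in-the-middle motion; that curve is an illustration, not the documented behavior.

```python
def eyelid_positions(top_rest, bottom_rest, closed_amount):
    """Move the Eyelid Top and Bottom handle positions toward each other.

    closed_amount in [0, 1]: 0 is fully open (rest pose), 1 is fully
    closed (both edges meet midway). Positions are vertical coordinates.
    """
    mid = (top_rest + bottom_rest) / 2.0
    top = top_rest + (mid - top_rest) * closed_amount
    bottom = bottom_rest + (mid - bottom_rest) * closed_amount
    return top, bottom

# Half-closed eye: each lid travels half the distance to the midpoint.
eyelid_positions(10.0, 30.0, 0.5)
```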
As you raise or lower your puppet’s eyebrows, you can also tilt them for more expressiveness. You can tilt an eyebrow inward at its low point to emphasize a scowl, or tilt it outward at its high point to make your puppet look surprised.
To adjust the intensity of eyebrow tilt when raised or lowered, adjust the Raised Eyebrow Tilt and Lowered Eyebrow Tilt under Face behavior.
Note: Both these options are relative to the drawn orientation of the eyebrow at the rest pose, with 100% or -100% tilting the eyebrows vertically.
The Move Eyebrows Together option under the Face behavior is enabled by default in Character Animator, so the eyebrows move in sync with each other. Deselect the option for independent control of each eyebrow.
This behavior captures your facial expressions from your webcam and animates the puppet based on your facial movements.
On some configurations with both internal and external webcams, the internal camera might not be the first (default) camera, or the intended external webcam might not be the next video input, so you might need to switch to the intended webcam. You might also occasionally need to reset or retrain face tracking to the current position and orientation of your face so that the puppet’s initial appearance is as intended.
Mirror Camera Input
Use the Mirror Camera Input option in the Camera & Microphone panel menu to control whether the camera image is flipped horizontally before being used. The option is checked by default.
Choose a specific webcam (video source)
Choose Switch to Next Camera from the Camera and Microphone panel until the intended webcam is active in the panel, or choose Switch to Default Camera to reset to the first webcam.
If you have multiple webcams or your webcam isn’t the first video source found (for example, if you have a video capture device), cycle through available video sources to choose the intended one.
If the number of video sources changes during or between sessions, you might need to reselect the intended source.
Improve tracking accuracy of your facial performance
Calibrate for your face
Look at the center of the Scene panel, place your face in what you consider a default, rest pose, then click Calibrate in the Camera and Microphone panel or press Cmd/Ctrl+P.
Recalibrate the red face and blue pupil tracking points
If the tracking points no longer follow your facial features, try moving your head within the camera’s field of view, double-clicking in the Camera & Microphone panel, or pushing the red dots toward your face with your hand.
Smooth facial movements
If your facial movements in front of the webcam are uneven, or lighting conditions cause facial tracking points to move unexpectedly, you can compensate by smoothing the captured camera information over time. To smooth facial movements, increase the Smoothing parameter’s value in the Face behavior. The default value applies some smoothing; decrease it if you prefer the puppet to react instantly to quick motions, including rapid eye blinks. Mouth replacements are not affected.
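One common way to smooth tracked positions over time is an exponential moving average. The sketch below illustrates the trade-off described above (damped jitter versus lag on quick motions); the filter Character Animator actually uses is not documented.

```python
def smooth_points(samples, smoothing=0.5):
    """Exponentially smooth a stream of tracked positions.

    smoothing=0 passes raw samples through (instant reaction);
    values near 1 heavily damp jitter but lag behind quick motions
    such as rapid blinks.
    """
    out, prev = [], None
    for s in samples:
        # Blend each new sample with the previous smoothed value.
        prev = s if prev is None else smoothing * prev + (1 - smoothing) * s
        out.append(prev)
    return out

# A tracked point that jumps from 0 to 1 approaches 1 gradually.
smooth_points([0.0, 1.0, 1.0, 1.0], smoothing=0.5)
```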
Capture the initial frame of the performance
To ensure that a punched-in performance (starting a recording during playback) for the Face or Lip Sync behaviors captures the initial frame of the performance, deselect the Pause During Playback option in the Camera & Microphone panel menu.
A character can have multiple Head-tagged groups, each with its own set of views (and Head Turner behavior). For example, you might have one set of views by default, but then use a keyboard trigger to switch to a different set of views.
Set up switchable sets of Head Turner views
Create a Head group (with Head tag) that contains the different views (Frontal, Left Profile, etc. tags), and add the Head Turner behavior to the Head group.
Repeat step 1 for the other sets of views, and assign a key trigger to each of these other Head groups, with Hide Others in Group checked.
Make sure the Face behavior is on a parent puppet of these Head groups.
As you press the key trigger to show a head, you can then turn your head to trigger the different views.
See Wendigo in the Character Animator Examples download for a working example that you can modify.
You can automatically produce pose-based movement from the webcam, emphasizing key poses you make with your head and facial features using the Face behavior. Adjust the Pose-to-Pose Movement parameter to control how much the head and face motion pauses: the higher the value, the longer key poses are held; lower values let key poses change more frequently. This parameter does not affect lip sync. Use the Minimum Pose Duration parameter to specify the minimum amount of time that a key pose is held; it has an effect only when Pose-to-Pose Movement is greater than 0%.
Tip: Increase Smoothing to 60% or more to ease the transition between key poses. Lower smoothing values can produce jarring jumps between poses.
Make sure the Camera & Microphone panel is open, your webcam is on, and the Camera Input button is not disabled. The panel shows what your webcam sees.
Place the selected puppet in a scene by clicking Add to New Scene or choosing Scene > Add to New Scene. A scene named after the puppet is created, and the scene is opened in the Scene panel.
As you move or rotate your head or make different facial expressions (smile, mouth wide open, etc.), the puppet in the scene should follow.
Audio from the microphone can be used to show different visual representations of your voice, via the Lip Sync behavior (described in Lip Sync: Control a puppet’s mouth with your voice).
Generate face data from prerecorded video
In After Effects, extract face measurements data from the video, as follows:
a. Import the video footage into a composition.
b. Draw a closed mask around the face, open the Tracker panel, then set Method to Face Tracking (Detailed Features).
c. Click the Analyze buttons to track the mask to the face.
d. Set the current time indicator to the frame representing the rest pose for the face, then click Set Rest Pose.
e. Click Extract & Copy Face Measurements.
In Character Animator, select the puppet with both the Face behavior and its Camera Input parameters armed for record.
Place the current time indicator (playhead) at the frame matching the first Face Measurements keyframe in After Effects.
Choose Edit > Paste.
The face measurements data on the system clipboard is converted to a Camera Input take on the selected puppet.
Pause head movements but still allow lip sync
Hold down the semicolon (;) key. You can use this capability, for example, to keep one character stationary while it talks to another character that is moving in the scene. When you release the key, the puppet’s head smoothly moves to the currently tracked head position and orientation. To slow down the transition, increase Smoothing.
The semicolon key is usable when the Face behavior’s Camera Input is armed.
This behavior has the following parameters:
This parameter affects only shape-based mouth expressions. Visemes controlled by the Lip Sync behavior are not scaled or affected.
When the Camera Input and Microphone Input buttons in the Camera & Microphone panel are disabled, they override (disarm) the Arm for Record setting for Face behavior. This allows you to view, while stopped, the results for a selected puppet with recorded takes for these behaviors without needing to deselect the puppet first.
Powered by Adobe Sensei, Body Tracker automatically detects human body movement using a webcam and applies it to your character in real time to create animation. For example, you can track your arms, torso, and legs automatically.
When Body Tracker is enabled, the countdown before recording is set to 5 seconds by default so that you can move back from the camera to get your entire body into view before you start recording. To change the countdown to 10 seconds or 20 seconds, choose Timeline > Body Tracker Countdown.
After importing your artwork into Character Animator, follow these steps to rig the puppet so that the Body Tracker can control it:
Create tagged handles for the different parts of the body’s artwork so that the Body behavior can control their movements:
Arms: Move the origin handle to the shoulder area, from where the arm should rotate, but do not tag it as “Shoulder” directly on the arm. Tag your shoulders and hips one level higher in the puppet hierarchy (commonly the Body folder or the view folder, such as Frontal) to keep the limbs attached during body tracking.
Now, add handles at the locations of the elbow and wrist and apply the Left/Right Elbow and Left/Right Wrist tags to the handles.
Legs: Like the arms, move the origin handle of the leg to the hip area, but do not tag it as “Hip” directly on the leg. Add tagged handles for Left Knee, Left Heel, and Left Toe to the left leg and foot, and the Right Knee, Right Heel, and Right Toe tags to the handles on the right leg and foot.
Shoulders & Hips: The Shoulder and Hip handles should be on the body, not on the limbs. Select the parent group containing your limbs and add handles in the same shoulder and hip positions as the arm and leg origin handles. You’ll see green dots from the limbs where you can place them, or you can copy/paste the handles directly from the limbs to position them exactly. It should work fine either way. Tag these handles as Right Shoulder, Left Shoulder, Right Hip, and Left Hip.
Body and Walk behaviors on the same puppet might cause issues because both the behaviors want to control the same handles. If you do have both behaviors applied, consider turning off the Body behavior when animating walk cycles by setting the Body behavior's strength parameter to 0. The Body behavior should work fine in side and 3/4 views.
This behavior has the following parameters:
This behavior switches between groups, for example, different views like the front, quarter, and side/profile of a character, as you swivel your head or body left or right.
Specify the controllable views by tagging the layers with at least two of the following:
The number of provided views determines how far your head or body needs to swivel. If you provide Left Profile, Front, and Right Profile, Front is triggered when you look straight at the camera, and the profile views are triggered when you look to either side. If all five views are provided, you need to turn farther to the sides to trigger the profile views.
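The relationship between the number of provided views and how far you must swivel can be pictured as splitting the turn range into equal bands, one per view. The normalized turn value and equal band widths are assumptions for illustration; the behavior's actual thresholds are not documented.

```python
def pick_view(turn, views):
    """Pick the active view from a head-turn amount in [-1, +1].

    The swivel range is split into equal bands, one per provided view:
    with three views the profiles trigger sooner than with five.
    """
    n = len(views)
    band = 2.0 / n
    index = min(int((turn + 1.0) / band), n - 1)
    return views[index]

three = ["Left Profile", "Frontal", "Right Profile"]
five = ["Left Profile", "Left Quarter", "Frontal",
        "Right Quarter", "Right Profile"]
pick_view(-0.5, three)   # already Left Profile with three views
pick_view(-0.5, five)    # only Left Quarter with five views
```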
This behavior has the following parameters:
As you turn your head, face tracking accuracy for your eyes, nose, and mouth decreases, so you might want to increase Sensitivity to still have good control over facial features, or reduce Eye Gaze Strength for the Face behavior applied to the profile views. Similarly, increase the Sensitivity while using Body Tracker to have improved tracking accuracy.
You can create multiple Head-tagged and Body-tagged groups, each with its own set of views. For example, you might have one set of views by default, but then use a keyboard trigger to switch to a different set of views.
Create a Head group (with Head tag) or Body group that contains the different views (Frontal, Left Profile, etc. tags), and add the Head and Body Turner behavior to the group.
Repeat for the other sets of views, and assign a key trigger to each of these other Head or Body groups, with Hide Others in Group checked. Make sure the Face and Body Tracker behaviors are on a parent puppet of these groups.
As you press the key trigger to show head or body, you can then turn your head or body to trigger the different views.
This behavior produces lip-synced animation if the original artwork contains visemes (visual representations of your mouth as you make different sounds) and you talk into the microphone. You can also process audio in the scene to generate lip sync data for a puppet.
The Lip Sync behavior has a Keyboard Input parameter that, when armed, allows you to display specific visemes by pressing the first letter of the viseme layer’s name (for example, A for the Aa viseme, D for D, W for W-Oo, and so on). You do not have to add keyboard triggers manually to those layer names.
Use Lip Sync preferences to control how the lip sync engine detects and generates visemes. To open them, select Preferences > Lip Sync. The Lip Sync preferences window allows you to control the following:
A viseme is a generic facial image that can be used to indicate a particular sound. A viseme is the visual equivalent of a phoneme or unit of sound in spoken language. Visemes and phonemes do not necessarily share a one-to-one correspondence. Often several phonemes correspond to a single viseme, as several phonemes look the same on the face.
Within Character Animator, there are three mouth shapes (Neutral, Smile, and Surprised) that are determined by the shape of your mouth in the webcam. These show up only when no audio is detected (no one is talking). Neutral is the most common and should be your default "rest" mouth.
The other 11 mouth shapes, called Visemes, are determined by audio. Visemes are visualizations of key mouth positions when saying common phonetic sounds. Character Animator listens for 60+ specific sounds and translates them into visemes.
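The many-to-one relationship between sounds and visemes can be sketched as a lookup table. The example entries below group sounds the way this article describes (for example, n, th, and g share the D viseme), but the actual table the lip sync engine uses is internal and much larger.

```python
# Illustrative phoneme-to-viseme entries; not the engine's real table.
PHONEME_TO_VISEME = {
    "n": "D", "th": "D", "g": "D",
    "b": "M", "m": "M", "p": "M",
    "oo": "W-Oo", "w": "W-Oo",
    "aa": "Aa", "ah": "Aa",
}

def visemes_for(phonemes):
    """Collapse a phoneme sequence into a viseme track, merging
    consecutive repeats so each bar appears once."""
    track = []
    for p in phonemes:
        v = PHONEME_TO_VISEME.get(p, "Neutral")
        if not track or track[-1] != v:
            track.append(v)
    return track

visemes_for(["b", "oo", "oo", "n"])   # ["M", "W-Oo", "D"]
```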
Name your mouth shapes
If you name and structure your Mouth group like this, Character Animator automatically recognizes and tags these mouth shapes upon import.
Tips to create custom mouth shapes
• Lock the top jaw. Keeping the top row of teeth in a consistent place helps things look smoother.
• Cycle Layers behavior can be added to a mouth group to add a few frames of transitional animation when that sound is picked up. The mouth opening to Aa or W-Oo is a common application.
• You can add additional mouth shapes (sad, yell, etc.) into your Mouth group and show them through keyboard triggers.
• These are examples of a frontal view, but quarter and profile views can follow the same general guidelines.
Improve tracking accuracy of your speaking performance
Try boosting the microphone input level in your operating system’s Sound control panel.
Try making a "boooooo" sound to see if the mouth reliably stays on the "W-Oo" viseme, and a "la-la-la-la-la" sound to see if the viseme with the tongue appears (assuming your artwork included it).
Make sure the Camera & Microphone panel is open and the Microphone Input button is not disabled.
Place the selected puppet in a scene by clicking Add to New Scene or choosing Scene > Add to New Scene.
A scene named after the puppet is created, and the scene is opened in the Scene panel.
You can change the audio hardware preferences by selecting Edit > Preferences > Audio Hardware.
As you talk, the audio signal is analyzed and a matching viseme for your mouth is displayed. When no sound can be detected or the microphone is disabled, control falls back to the Face behavior (if present) analyzing the video signal (your mouth expressions captured by the webcam) to possibly trigger the Smile or Surprised mouth shapes.
Either import an AIFF or WAV file into the project and then add it into the scene, or record audio using your microphone (while the Microphone Input is enabled).
Add a puppet containing the Lip Sync behavior to the scene, and select the puppet’s track item in the Timeline panel.
Make sure both the Lip Sync behavior and its Audio Input parameter are armed for recording, which they are by default.
Choose Timeline > Compute Lip Sync from Scene Audio.
The Compute Lip Sync from Scene Audio command analyzes the scene audio and creates a Lip Sync take only where the scene audio overlaps the selected puppet track items. Muted tracks are not computed. Visemes are automatically generated for the audio and are displayed below the Lip Sync take bar.
Computing Lip Sync from scene audio may take time depending on the duration of your audio.
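The rule that a take is created only where the scene audio overlaps the selected puppet track items is, conceptually, an interval intersection. A minimal sketch, with spans as hypothetical (start, end) pairs in seconds:

```python
def overlap(audio_span, track_span):
    """Interval where scene audio overlaps a puppet track item; the
    Lip Sync take would cover only this span. Returns None when the
    spans don't intersect (no take for that item)."""
    start = max(audio_span[0], track_span[0])
    end = min(audio_span[1], track_span[1])
    return (start, end) if start < end else None

overlap((0.0, 10.0), (4.0, 20.0))   # take covers (4.0, 10.0)
overlap((0.0, 3.0), (5.0, 8.0))     # no overlap: no take
```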
This behavior has the following parameter:
When the Camera Input and Microphone Input buttons in the Camera & Microphone panel are disabled, they override (disarm) the Arm for Record setting for Lip Sync behavior. This allows you to view, while stopped, the results for a selected puppet with recorded takes for these behaviors without needing to deselect the puppet first.
You can create Lip Sync takes for behaviors that have Audio Input parameters (Nutcracker Jaw and Layer Picker), both of which can use audio amplitude to control them. The Timeline > Compute Lip Sync from Scene Audio menu command references the behavior with an armed Audio Input parameter:
If there are multiple armed Audio Input parameters, for multiple selected puppets or multiple behaviors, the menu command will mention the number of takes that will be created. If you only want to record a take for a specific behavior, make sure the others are disabled or disarmed.
You can insert, select, trim, delete, or replace visemes.
Visemes are represented as adjacent bars below the Audio Input track bar; the gaps between these bars are moments of silence in the audio. Each bar represents a separate viseme and has the viseme name displayed on it for easy recognition. Zoom in to view these names.
To zoom into the timeline, do any of the following steps:
If the Timeline panel is zoomed too far out to see the Lip Sync viseme bars, they change to a diagonal-lined pattern to convey that viseme information can be edited if you zoom in.
To select visemes or silences, do any of the following steps:
Adjust the timing of visemes or silences
To adjust the timing of a viseme or silence, do any of the following steps:
The left edge of the viseme or silence moves earlier or later in time. You can drag the left edge of the viseme bars or silences across other visemes to replace them.
To delete visemes or silences:
When you delete a viseme bar or silence, the viseme bar or silence to the left extends to the next viseme or end.
You can edit visemes in multiple ways: from the viseme context menu, by dragging visemes manually, or with keyboard shortcuts when visemes are selected.
Note: Deleting a selected viseme automatically selects the next viseme or span of silence in time, if one exists.
Right-click the viseme to be replaced and choose a new viseme from the context menu. To replace a viseme with a silence, choose Silence from the context menu.
The letters in parentheses are sounds. For example, use D viseme for sounds like n, th, and g.
Tip: You can play back or scrub through time, or deselect the puppet track item, to view the result. Alternatively, disable microphone input in the Camera & Microphone panel so you can change a viseme and see the result on the character immediately.
Depending on placement, the inserted viseme leaves silence after it. To make the inserted viseme fill the rest of the span of silence, press Alt/Option and left-click to open the Visemes pop-up menu.
To split a viseme, do any of the following:
You can cut or copy and paste lip sync takes from one puppet or project and use it in another by following these steps:
Note: If you copy multiple Lip Sync takes, they are pasted in order of their selection.
Instead of copying the entire Audio Input take of visemes or Trigger take of triggers, you can selectively cut, copy, and paste viseme and trigger bars to reuse just the recordings you need. For more information, see Reuse (cut, copy, and paste) viseme and trigger bars.
The Lip Sync behavior can move a Jaw handle up and down automatically based on the height of the current viseme, specifically the offset of the bottom edge of the viseme relative to the bottom edge of the Neutral mouth shape. With the Jaw handle along the chin of the face, as different visemes are displayed, the bottom of the face can warp to simulate the chin moving up and down.
Note: If a viseme is a cycle (i.e., Cycle Layers applied to a group of layers showing an animated viseme), the vertical offset is based on the tallest layer in the group. If there are multiple mouth groups, the average height of all mouths with that viseme’s tag is used.
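Conceptually, the vertical offset could be computed as in the minimal Python sketch below. The layer records and field names are hypothetical, not Character Animator's internal representation.

```python
def jaw_offset(viseme_layers, neutral_bottom):
    """Vertical Jaw offset from the current viseme's artwork.

    Uses the offset of the viseme's bottom edge relative to the
    Neutral mouth's bottom edge; for a cycled (animated) viseme, it
    takes the layer whose bottom edge reaches furthest down, standing
    in for the "tallest layer" rule described above.
    """
    bottom = max(layer["bottom_y"] for layer in viseme_layers)
    return bottom - neutral_bottom

# An Aa viseme drawn 12 px lower than Neutral drops the jaw 12 px.
jaw_offset([{"bottom_y": 112}], neutral_bottom=100)   # 12
```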
The behavior has the following parameters to control Jaw movement:
You can take Character Animator visemes into After Effects for use on different characters. For information on the steps to follow, see Lip Sync: Take export for After Effects.
With the audio selected in the Project panel, the Properties panel shows the Transcript text area where you can import or type-in the text matching the spoken words and phrases in the audio.
To import a file, click Import in the Properties panel, then select the corresponding text file.
Add timecodes by typing them manually, or use a transcription program to generate an SRT file with timecodes. For an .srt file, change its extension to .txt to select it for import, or copy and paste the text directly into the Transcript text area in the Properties panel.
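If you have an .srt file, a small script can strip the cue numbers while keeping the timecodes and text, producing plain text to paste into the Transcript area. The exact format the panel expects is an assumption here, so verify the result after pasting.

```python
def srt_to_transcript(srt_text):
    """Drop SRT cue numbers but keep each cue's timecode line and text.

    SRT cues are blank-line-separated blocks: a numeric cue index, a
    timecode line, then one or more text lines.
    """
    lines = []
    for block in srt_text.strip().split("\n\n"):
        parts = block.splitlines()
        if len(parts) >= 2 and parts[0].strip().isdigit():
            parts = parts[1:]          # drop the cue number
        lines.extend(parts)
    return "\n".join(lines)

srt = ("1\n00:00:01,000 --> 00:00:02,500\nHello there\n\n"
       "2\n00:00:03,000 --> 00:00:04,000\nWorld")
print(srt_to_transcript(srt))
```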
The audio file’s icon in the Project panel changes to indicate that it has a text transcript associated with it. The Type column in the Project panel shows Audio+Transcript for this file.
You can generate an SRT file using Adobe Premiere Pro. For more information, see Export captions.
This behavior moves the lower part of the puppet’s mouth as you open and close your mouth in front of the webcam or talk into the microphone. This can be a simpler way of making a puppet talk without needing to specify separate artwork for the different mouth shapes and visemes used by the Face and Lip Sync behaviors. If both the Face and Nutcracker Jaw behaviors are on a puppet, the Face behavior can still control the rest of the face, while Nutcracker Jaw controls just the mouth.
This behavior isn’t applied by default to puppets, so add it first to see its effect on a puppet.
In the Puppet panel, select the layer for the lower jaw.
In the Properties panel, click Tags and select the Jaw handle tag from the Miscellaneous section.
You can rotate the jaw of the puppet. To rotate the jaw, follow these steps:
Select the jaw group in the Puppet panel.
Add a handle and tag it as Jaw.
If imported artwork has a guide or layer with the word "Jaw" in the name, the Jaw tag is applied to the corresponding handle automatically.
See Nutcracker Annie.psd in the Character Animator Examples download for a working example that you can modify.
In the Properties panel, make sure the Camera & Microphone panel is open, your webcam is on, and the Camera Input button (for visual control) is not disabled.
Open and close your mouth in front of the webcam.
In the Properties panel, make sure the Camera & Microphone panel is open, your audio input is on, and the Microphone Input button (for audio control) is not disabled.
Talk into the microphone. Try to speak loudly to see how it affects the intensity of the jaw movement.
This behavior has the following parameters:
The Nutcracker Jaw behavior can also open and close your character’s mouth as you talk into the microphone. Just make sure the Audio Input parameter is armed and the microphone input is enabled.
In addition, the movement of the tagged Jaw layer can be rotational (clockwise or counterclockwise) as well as vertical. For rotational movement, create a handle tagged with Jaw as the pivot location; a Jaw-tagged origin handle is not currently supported.
This behavior has the following parameters:
When the Camera Input and Microphone Input buttons in the Camera & Microphone panel are disabled, they override (disarm) the Arm for Record setting for Nutcracker Jaw. This allows you to view, while stopped, the results for a selected puppet with recorded takes for these behaviors without needing to deselect the puppet first.
The Lip Sync behavior can vertically offset a Jaw handle automatically based on the height of the current viseme. For more information, see Move the jaw based on the current viseme.
The Walk behavior allows a puppet to walk across the scene by controlling the legs, arms, and body of the puppet. The behavior simulates some common walking styles, such as strut and prance.
You can create an animated walk cycle with the behavior. A walk cycle is a series of poses played on loop, which creates an illusion of a walking puppet. Walk cycles can convey different moods and emotions to enhance your animation. For example, long bouncy steps represent a happy walk.
This behavior assumes that the character is two-legged and two-armed, and is viewed in profile. However, it is possible to apply it to additional legs.
Upon applying the behavior to the puppet, the legs move through a looping series of poses to complete a walk cycle. The feet are planted at ground level, and the arms swing in the opposite direction to the legs while walking. The Walk behavior can also adapt to long and short legs for smooth walking motion.
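The looping poses and the opposite-phase arm swing described above can be sketched with a simple sinusoidal model. This is an illustrative approximation, not the Walk behavior's internal math; the function and its cycle_duration and amplitude_deg parameters are assumptions.

```python
import math

def limb_swing(t, cycle_duration=1.0, amplitude_deg=30.0):
    """Return (left_leg, right_leg, left_arm, right_arm) swing angles in
    degrees at time t for a looping walk cycle.

    Assumed model: legs swing sinusoidally, the two legs are half a cycle
    out of phase, and each arm swings opposite the leg on the same side.
    """
    # Normalize time into one cycle, then convert to a phase angle
    phase = 2 * math.pi * (t % cycle_duration) / cycle_duration
    left_leg = amplitude_deg * math.sin(phase)
    right_leg = amplitude_deg * math.sin(phase + math.pi)  # opposite leg
    left_arm = -left_leg    # arm swings opposite the same-side leg
    right_arm = -right_leg
    return left_leg, right_leg, left_arm, right_arm
```

Because the phase wraps every cycle_duration seconds, the poses repeat indefinitely, which is what makes a walk cycle loop.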
To give your puppet more realistic arm and leg motion, use the Left Shoulder, Right Shoulder, Left Hip, and Right Hip handle tags on characters drawn in three-quarter perspective. Instead of swinging arms from the neck and legs from a single hip location, separate left and right shoulder and hip locations can improve the appearance of these swinging limbs.
Note that you can tag left-moving views as Left Quarter or Left Profile. Similarly, tag right-moving views as Right Quarter or Right Profile.
Tip: The Home screen includes a template puppet, Walkbot, which was created using the Walk behavior. You can use this template puppet to view the behavior settings and modify them to customize the puppet.
Set up a profile-view puppet with the legs and arms as independent groups to avoid unwanted overlapping. All independent parts staple automatically to the parent puppet (the profile-view puppet) to prevent limbs from detaching. For best results, use Hinge or Weld attach styles for the limbs. Next, you can add sticks to the legs and arms to keep them straight and prevent bending.
This behavior is not applied by default to puppets, so add it first to see its effect.
The puppet is made of a number of independent parts attached to the parent puppet. In the Puppet panel, identify the following locations on each leg and on the body of the puppet.
Use the correspondingly named handle tags in the Properties panel to identify the locations. To find the handle tags, follow these steps:
In the Properties panel, click the triangle icon next to Tags.
Under the Body section, you can view the available handle tags. Hover over each tag to view the description.
You can create a basic walk cycle with a minimum set of tags. Before you create tags, you need to create handles, which the tags are associated with.
To create a handle, follow these steps:
Select the Handle tool and click the part of the puppet where you want to add a handle.
In the Properties panel, click Tags.
Under the Body section, select the tag for the corresponding puppet part. You can view the tag description in its tooltip.
For foot movement, use either the Ankle or Heel tag; for leg movement, use either the Waist or Hip tag. If you have a left-facing character that walks to the left by default, also add the Knee tag. For the leg and arm handles, be sure to set the same ones on both left-facing and right-facing puppets, for example, Left Ankle and Right Ankle (from the puppet's perspective).
Separate left-facing and right-facing puppets
If you have separate left-facing and right-facing views, make sure to tag them correctly.
To add tags, follow these steps:
After you tag the puppets, switch to the Scene panel to see them respond to the changes. With the Start parameter set to Left & Right Arrow Keys, pressing the Right Arrow key displays the right-facing puppet, and pressing the Left Arrow key displays the left-facing puppet. If you tag only one profile puppet (or none) and press the opposite arrow key, the puppet walks backwards. For example, if you tag only the left-facing puppet, it walks backwards when you press the Right Arrow key.
By default, the puppet's legs walk in place so that you can preview the walk cycle and make changes. You can make the puppet move using the Walk parameters.
To make the puppet move, do any of the following:
Tip: You can apply a draggable handle to the puppet’s top-level origin handle if you need to reposition it within the scene.
Walk behavior has the following parameters:
Tip: To ease in and out of a walk, keyframe the Position parameter using Ease. When the Position parameter’s change over time (that is, the speed of movement) drops below 20% of the normal walking pace, the puppet’s handles are moved with reduced strength, dropping to zero strength when the Position stops changing completely.
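The strength falloff described in this tip can be sketched as a simple blend weight. The falloff shape below (linear under the threshold) is an assumption for illustration, not the documented formula; the function name and parameters are hypothetical.

```python
def walk_strength(speed, normal_pace, threshold=0.2):
    """Blend weight applied to the Walk behavior's handle movement.

    Assumed falloff: full strength at or above 20% of the normal walking
    pace (threshold), dropping linearly to zero as the Position stops
    changing.
    """
    if normal_pace <= 0:
        return 0.0
    ratio = abs(speed) / normal_pace
    if ratio >= threshold:
        return 1.0   # normal walking: handles moved at full strength
    return ratio / threshold  # easing in/out: reduced strength

walk_strength(1.0, 1.0)  # full pace: full strength
walk_strength(0.1, 1.0)  # 10% of pace, below the 20% threshold: reduced
walk_strength(0.0, 1.0)  # Position not changing: zero strength
```

A weight like this lets the Walk behavior hand control back smoothly (for example, to Limb IK or dragging) as the puppet slows to a stop, instead of snapping between walking and standing poses.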
Tip: Add multiple Walk behaviors with different walk styles, speeds, and phases to produce more complex motion.
The Walk behavior supports characters drawn in quarter view. Also, a specific part of the walk cycle can be emphasized more than others, and shoulder and hip motion can be added, to produce livelier movement.
The Walk behavior still moves the character laterally, even if drawn in quarter view.
Biped characters drawn in three-quarter perspective produce pleasing arm and leg motion when the Left Shoulder, Right Shoulder, Left Hip, and Right Hip handle tags are used. Instead of swinging arms from the neck and legs from a single hip location, separate left and right shoulder and hip locations can improve the appearance of these swinging limbs.
The Shoulder tags are best placed in the parent group of the arms. Existing puppets with a Hip handle will get Left Hip and Right Hip handle tags associated with that single handle, but you can reassign the tags to separate handles.
You can now tag left-moving views as Left Quarter or Left Profile. Similarly, tag right-moving views as Right Quarter or Right Profile.