Calibrating and capturing live data
To get better results, the ARKit capture process needs to acquire data that takes the performer's individual characteristics into account. As you can see from Figure 8.13, the app has done a good job of capturing the performer's facial proportions, clearly outlining the eyes, nose, mouth, and jaw. However, providing the app with baseline information (such as a neutral pose) helps refine the overall outcome considerably.
Figure 8.13: Calibrating the Live Link Face app
To understand how and why baselines and neutral poses work, let's take the eyebrows as an example, using an arbitrary unit of measurement. If the eyebrows sit at 0 in the resting or neutral position, drop to –10 in a frown, and rise to +15 in a surprised expression, then knowing the neutral pose gives the app a reliable reference point from which to judge how far the tracked eyebrows have moved toward a frown or a surprised expression...
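As a rough illustration of this idea (not the Live Link Face app's actual implementation), the sketch below shows how a raw tracked value could be remapped against a recorded neutral pose. The type and function names are hypothetical, and the –10/0/+15 scale simply mirrors the arbitrary units used above; ARKit itself reports blend shape values in a 0 to 1 range.

```swift
// Hypothetical sketch: remapping a raw eyebrow reading against a recorded
// neutral pose, using the arbitrary units from the text.

/// Eyebrow readings captured during a calibration pass, in arbitrary units.
struct EyebrowCalibration {
    let neutral: Double    // e.g. 0  - performer at rest
    let frown: Double      // e.g. -10 - strongest frown
    let surprised: Double  // e.g. +15 - eyebrows fully raised
}

/// Converts a live reading into a normalized value:
/// -1 = full frown, 0 = neutral, +1 = fully surprised.
func normalizedBrow(_ raw: Double, using cal: EyebrowCalibration) -> Double {
    let offset = raw - cal.neutral
    if offset < 0 {
        // Moving toward the frown extreme; clamp at -1.
        return max(offset / abs(cal.frown - cal.neutral), -1.0)
    } else {
        // Moving toward the surprised extreme; clamp at +1.
        return min(offset / (cal.surprised - cal.neutral), 1.0)
    }
}

// Example: on this performer, a reading of +7.5 is halfway to "surprised".
let cal = EyebrowCalibration(neutral: 0, frown: -10, surprised: 15)
print(normalizedBrow(7.5, using: cal))   // 0.5
print(normalizedBrow(-10.0, using: cal)) // -1.0
```

Without the calibrated neutral value, the same +7.5 reading could just as easily be interpreted as a resting pose on a performer whose brows naturally sit higher, which is why recording the baseline matters.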