Using eye tracking (ARKit)
For eye tracking, as you might expect, ARKit gives you the pose transforms for each eye, which you can use to drive your own "eyeball" GameObjects. I'll show you how this feature works but leave the details of integrating it into the project up to you. At present, eye tracking requires an iOS device with a TrueDepth camera.
To learn more about eye tracking with AR Foundation, take a look at the EyeLasers scene in the AR Foundation sample assets (we installed these in the Assets/ARF-samples/ folder).
The Face Prefab in the scene's AR Face Manager is the AR Eye Laser Visualizer prefab. As you would expect, it has an AR Face component, plus an Eye Pose Visualizer. The visualizer script, in turn, is given an eyeball prefab; in this scene, that is the Eye Laser Prefab, which simply contains a long, thin cylinder rendered to look like a laser beam. In summary, these dependencies could be depicted as follows:
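The same pattern can be sketched in a script of your own. The following is a minimal, untested sketch (the class and field names are mine, not those of the sample's Eye Pose Visualizer) showing how a component on the face prefab can parent an eyeball prefab to the eye transforms that ARFace exposes:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: attach to the same prefab referenced by
// the AR Face Manager's Face Prefab slot, alongside ARFace.
[RequireComponent(typeof(ARFace))]
public class SimpleEyePoseVisualizer : MonoBehaviour
{
    // Prefab for each eye, e.g. a laser-beam cylinder (assumed; assigned in the Inspector)
    [SerializeField] GameObject eyePrefab;

    ARFace face;
    GameObject leftEyeInstance;
    GameObject rightEyeInstance;

    void Awake() => face = GetComponent<ARFace>();

    void OnEnable() => face.updated += OnFaceUpdated;
    void OnDisable() => face.updated -= OnFaceUpdated;

    void OnFaceUpdated(ARFaceUpdatedEventArgs args)
    {
        // ARFace.leftEye / rightEye are non-null only once the provider
        // (here, ARKit with a TrueDepth camera) reports eye-tracking data.
        if (leftEyeInstance == null && face.leftEye != null)
        {
            // Parenting each instance to its eye transform keeps it
            // posed with the eye automatically, frame by frame.
            leftEyeInstance = Instantiate(eyePrefab, face.leftEye);
            rightEyeInstance = Instantiate(eyePrefab, face.rightEye);
        }
    }
}
```

Because the instances are children of the eye transforms, there is no per-frame copying of positions and rotations to do; Unity's transform hierarchy keeps them in sync.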