Understanding AR interaction flow
In an Augmented Reality (AR) application, one of the first things the user must do is scan the environment with the device camera, slowly moving the device around until the software detects geometry it can track. This might be horizontal planes (floor, tabletop), vertical planes (walls), a human face, or other objects. A simplified user flow, common to many example scenes, is shown in the following diagram:
As shown in the preceding diagram, the app starts by checking for AR support, requesting the user's permission to access the device camera, and performing other initialization. Then, the app asks the user to scan the environment for trackable features, and may need to report scanning problems, such as the room being too dark or surfaces lacking enough texture for feature detection. Once tracking is achieved, the user is prompted to tap the screen to place a virtual object in the scene.
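This flow can also be sketched in code. The following is a minimal, hypothetical example assuming a Unity AR Foundation project with an AR Session and an AR Session Origin already set up in the scene; the PlaceObjectFlow class name and its serialized fields are illustrative, not part of any particular example scene:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative controller covering the three stages of the flow:
// check AR support, wait for tracking, then place an object on tap.
public class PlaceObjectFlow : MonoBehaviour
{
    [SerializeField] ARSession session;           // the scene's AR Session (start it disabled)
    [SerializeField] ARRaycastManager raycaster;  // on the AR Session Origin
    [SerializeField] GameObject objectPrefab;     // virtual object to place

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    IEnumerator Start()
    {
        // 1. Check whether this device supports AR at all.
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            yield return ARSession.CheckAvailability();
        }
        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
            yield break;
        }
        // Enabling the session starts initialization, which typically
        // also triggers the camera permission prompt on first run.
        session.enabled = true;
    }

    void Update()
    {
        // 2. While scanning, keep prompting the user until tracking is achieved.
        //    session.notTrackingReason can explain why tracking hasn't started
        //    (for example, insufficient light or insufficient features).
        if (ARSession.state != ARSessionState.SessionTracking)
        {
            // e.g., show a "move your device slowly" UI prompt here
            return;
        }

        // 3. Once tracking, a screen tap places the object on a detected plane.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            Vector2 touchPosition = Input.GetTouch(0).position;
            if (raycaster.Raycast(touchPosition, hits, TrackableType.PlaneWithinPolygon))
            {
                // Hits are sorted by distance; use the nearest plane.
                Pose hitPose = hits[0].pose;
                Instantiate(objectPrefab, hitPose.position, hitPose.rotation);
            }
        }
    }
}
```

Keeping the support check, the scanning prompt, and the tap handler together like this mirrors the three stages of the preceding diagram, one per step of the user flow.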
This is...