Example projects that use AR Foundation 5.0 and demonstrate its functionality with sample assets and components.

This set of samples relies on three Unity packages:

- AR Foundation
- Google ARCore XR Plug-in (documentation)
- Apple ARKit XR Plug-in (documentation)

ARFoundation is built on "subsystems" and depends on subsystems defined in the UnityEngine.XR.ARSubsystems namespace. This namespace defines an interface, and the platform-specific implementations are in the Google ARCore and Apple ARKit packages. ARFoundation turns the AR data provided by ARSubsystems into Unity GameObjects and MonoBehaviours.
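For example, when plane detection is enabled, each plane reported by the platform's plane subsystem is surfaced as an ARPlane MonoBehaviour on its own GameObject. The sketch below illustrates that pattern by logging newly detected planes; the class name, serialized field, and log message are illustrative and not part of the samples.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    // Assumes an ARPlaneManager exists in the scene (e.g., on the XR Origin).
    [SerializeField] ARPlaneManager m_PlaneManager;

    void OnEnable()  => m_PlaneManager.planesChanged += OnPlanesChanged;
    void OnDisable() => m_PlaneManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Each detected plane arrives as an ARPlane component on a new GameObject.
        foreach (var plane in args.added)
            Debug.Log($"Plane {plane.trackableId} added, center = {plane.center}");
    }
}
```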
The main branch is compatible with Unity 2021.2 and later. For earlier versions, see the table above.

## Instructions for installing AR Foundation

1. Download the latest version of Unity 2021.2 or later.
2. Open Unity, and load the project at the root of the arfoundation-samples repository.

See the AR Foundation Documentation for usage instructions and more information.
## SimpleAR

This is a good starting sample that enables point cloud visualization and plane detection. There are buttons on screen that let you pause, resume, reset, and reload the ARSession.

When a plane is detected, you can tap on the detected plane to place a cube on it. This uses the ARRaycastManager to perform a raycast against the plane. If the plane is in TrackingState.Limited, it will highlight red. In the case of ARCore, this means that raycasting will not be available until the plane is in TrackingState.Tracking again.
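A minimal sketch of this tap-to-place flow, assuming a cube prefab is assigned in the Inspector (the class and field names are illustrative, not the sample's own script):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class PlaceCubeOnPlane : MonoBehaviour
{
    [SerializeField] GameObject m_CubePrefab;  // assumed prefab reference

    ARRaycastManager m_RaycastManager;
    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Awake() => m_RaycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Raycast against detected planes; hits are sorted by distance,
        // so the first hit is the closest plane under the touch.
        if (m_RaycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            var hitPose = s_Hits[0].pose;
            Instantiate(m_CubePrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```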
Each button maps to an operation on the ARSession:

- Pause: Pauses the ARSession, meaning device tracking and trackable detection (e.g., plane detection) is temporarily paused. While paused, the ARSession does not consume CPU resources.
- Resume: Resumes a paused ARSession. The device will attempt to relocalize, and previously detected objects may shift around as tracking is reestablished.
- Reset: Clears all detected trackables and effectively begins a new ARSession.
- Reload: Completely destroys the ARSession GameObject and re-instantiates it. This simulates the behavior you might experience during scene switching.
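One way these actions can be wired up, as a rough sketch (it assumes a serialized ARSession reference and a prefab containing an ARSession to re-instantiate; it is not the sample's own controller script):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class SessionControls : MonoBehaviour
{
    [SerializeField] ARSession m_Session;        // assumed scene reference
    [SerializeField] GameObject m_SessionPrefab; // assumed prefab with an ARSession component

    // Disabling the ARSession component pauses tracking and trackable detection.
    public void Pause() => m_Session.enabled = false;

    // Re-enabling it resumes the session; the device relocalizes as tracking returns.
    public void Resume() => m_Session.enabled = true;

    // Reset clears all trackables and effectively starts a new session.
    public void ResetSession() => m_Session.Reset();

    // Reload destroys the ARSession GameObject and re-creates it,
    // similar to what happens when switching scenes.
    public void Reload()
    {
        Destroy(m_Session.gameObject);
        m_Session = Instantiate(m_SessionPrefab).GetComponent<ARSession>();
    }
}
```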
## Check Support

Demonstrates checking for AR support and logs the results to the screen. The relevant script is SupportChecker.cs.

## LightEstimation

### BasicLightEstimation

Demonstrates basic light estimation information from the camera frame. You should see values for "Ambient Intensity" and "Ambient Color" on screen. The relevant script is BasicLightEstimation.cs.

### HDRLightEstimation

This sample attempts to read HDR lighting information. You should see values for "Ambient Intensity", "Ambient Color", "Main Light Direction", "Main Light Intensity Lumens", "Main Light Color", and "Spherical Harmonics". Most devices only support a subset of these six, so some will be listed as "Unavailable". The relevant script is HDRLightEstimation.cs.

On iOS, this is only available when face tracking is enabled and requires a device that supports face tracking (such as an iPhone X, XS, or 11).

When using HDRLightEstimation, the sample automatically picks the supported camera facing direction for you (for example, World on Android and User on iOS), so it does not matter which facing direction you select in the ARCameraManager component. The virtual light direction is also updated so that virtual content appears to be lit from the direction of the real light source. When available, a virtual arrow appears in front of the camera to indicate the estimated main light direction.
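Both light estimation samples follow the same pattern: subscribe to ARCameraManager.frameReceived and read the optional fields of the frame's lightEstimation data, each of which is null when the platform does not provide it. The sketch below applies a few of those fields to a directional light; it is a simplified illustration, not the BasicLightEstimation.cs or HDRLightEstimation.cs script.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(Light))]
public class LightEstimationExample : MonoBehaviour
{
    [SerializeField] ARCameraManager m_CameraManager;

    Light m_Light;

    void Awake()     => m_Light = GetComponent<Light>();
    void OnEnable()  => m_CameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => m_CameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        var estimate = args.lightEstimation;

        // Basic estimates: ambient intensity and color.
        if (estimate.averageBrightness.HasValue)
            m_Light.intensity = estimate.averageBrightness.Value;
        if (estimate.colorCorrection.HasValue)
            m_Light.color = estimate.colorCorrection.Value;

        // HDR estimates: main light direction, color, and ambient spherical harmonics.
        if (estimate.mainLightDirection.HasValue)
            m_Light.transform.rotation = Quaternion.LookRotation(estimate.mainLightDirection.Value);
        if (estimate.mainLightColor.HasValue)
            m_Light.color = estimate.mainLightColor.Value;
        if (estimate.ambientSphericalHarmonics.HasValue)
        {
            RenderSettings.ambientMode = AmbientMode.Skybox;
            RenderSettings.ambientProbe = estimate.ambientSphericalHarmonics.Value;
        }
    }
}
```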
## CpuImages

This sample shows how to acquire and manipulate textures obtained from AR Foundation on the CPU. Most textures in ARFoundation (e.g., the pass-through video supplied by the ARCameraManager, and the human depth and human stencil buffers provided by the AROcclusionManager) are GPU textures.
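A minimal sketch of acquiring the camera image on the CPU and converting it into a readable Texture2D (requires "Allow unsafe code"; the class and field names are illustrative, not the sample's own script):

```csharp
using System;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuImageExample : MonoBehaviour
{
    [SerializeField] ARCameraManager m_CameraManager;

    Texture2D m_Texture;

    void OnEnable()  => m_CameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => m_CameraManager.frameReceived -= OnFrameReceived;

    unsafe void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Ask for a CPU-side copy of the latest camera image.
        if (!m_CameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        try
        {
            // Convert the full image to RGBA32, mirrored about the y-axis
            // so it appears upright when displayed on screen.
            var conversionParams = new XRCpuImage.ConversionParams(
                image, TextureFormat.RGBA32, XRCpuImage.Transformation.MirrorY);

            if (m_Texture == null ||
                m_Texture.width != image.width || m_Texture.height != image.height)
            {
                m_Texture = new Texture2D(image.width, image.height, TextureFormat.RGBA32, false);
            }

            // Write the converted pixels directly into the texture's buffer.
            var buffer = m_Texture.GetRawTextureData<byte>();
            image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
            m_Texture.Apply();
        }
        finally
        {
            // XRCpuImage wraps a native resource and must always be disposed.
            image.Dispose();
        }
    }
}
```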