This article will guide you through the implementation steps involved in basic AR application development using the Unity AR Foundation framework.
Before reading this document, you should have done some basic research and have a fair understanding of what Augmented Reality is and its potential use cases.
AR applications can be either native apps or web apps, but we will restrict ourselves to native AR apps built with the Unity AR Foundation framework.
What You Will Learn:
- Augmented Reality Application Development
- Setup for AR
- Frequently Asked Questions
Augmented Reality Application Development
Unity is a game engine predominantly used for gaming, high-end animation, VFX, and simulation. In recent years, it has played a vital role as a platform in developing Virtual Reality, Augmented Reality, and Mixed reality applications.
Developing AR apps using Unity involves two steps:
- Download and Install Unity HUB from the Unity site.
- Install the required version of Unity via Unity Hub.
Open your preferred browser and key in this link, which will guide you through the steps mentioned earlier. Creating a user account is a necessary prerequisite to downloading and using Unity, so create one and sign in with it wherever appropriate.
The above picture highlights the typical content of Unity Hub download.
After creating a Unity account, you can edit the account profile, as shown below.
Once Unity Hub is installed, select the “Install” menu in the Unity Hub interface. This tab lists the “official releases” and “pre-releases” of Unity that are ready to be downloaded and installed.
Remember to select modules like Android Build Support and iOS Build Support to be part of the Unity IDE, so that the AR application can run on those platforms, as shown below.
Package Manager is one of the features of Unity IDE to import any external plugins and repositories into its ecosystem.
To develop augmented reality applications that will run on Android or iOS, it is mandatory to import plugins like the XR Interaction Toolkit, XR Plugin Management, AR Foundation, ARCore XR Plugin, and ARKit XR Plugin (preview versions, under the AR packages section).
When you open the Package Manager in Unity editor via Menu->Window->Package Manager option, under packages you will have options like
- Unity Registry
- In Project
- My Assets
- Built In
Unity Registry lists the plugins available from Unity, along with the versions that can be downloaded into the project.
In Project displays the packages that are part of the currently open project.
My Assets provides details on the plugins downloaded from the Unity Asset Store that will be part of the project. Normally this will be empty and gets populated as you download from the Asset Store.
Built In lists the packages that come with the project by default.
Setup for AR
A Directional Light component is added to the scene by default, along with the Main Camera component. We will make a few modifications to this directional light to support AR.
Select the Directional Light component and, under the Inspector panel, change Light -> Mode from Mixed to Realtime.
Change the Shadow Type from Soft Shadows to Hard Shadows for better performance.
In order to create an AR scene in an AR application, we need to add the ARSession game object as part of this process. The game object is responsible for
- Handling the lifecycle of an AR application as a session and AR system state.
- Start and Stop an AR session.
- Enabling the AR app to receive camera frames, so that it can superimpose images according to the device’s pose.
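The lifecycle responsibilities above can be sketched in a small script. This is a minimal sketch, not code from the article; the SessionController class name and its field wiring are assumptions, but the ARSession component's enabled flag and Reset() method are part of the AR Foundation API.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical helper that pauses and resumes the AR session,
// e.g. when the app loses or regains focus.
public class SessionController : MonoBehaviour
{
    [SerializeField] ARSession arSession;   // assign the ARSession game object here

    void OnApplicationPause(bool paused)
    {
        // Disabling the ARSession component pauses tracking;
        // re-enabling it resumes the session where it left off.
        arSession.enabled = !paused;
    }

    public void RestartSession()
    {
        // Destroys the existing session state and starts a new session.
        arSession.Reset();
    }
}
```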
Right-click on the “Hierarchy” panel to display the menu popup and select XR, under the subdivision select ARSession to add to the project’s scene.
The ARSessionOrigin game object deals with transforming AR coordinates into Unity world coordinates. ARSessionOrigin comes with an AR Camera as its child in the hierarchy.
- ARSessionOrigin is responsible for placing virtual objects in the proper position in correlation with the AR environment.
- It updates the position of the virtual object along with the ARCamera movement and stays aligned with the real-world environment.
- It is mandatory to place all the game objects under the ARSessionOrigin hierarchy to be part of the AR scene.
The AR Camera component comes with many configuration options in the Inspector panel.
Set Camera -> Clear Flags to Solid Color and Background to Black to capture the video coming from the device camera and set it as the background of our world scene.
Camera -> Field of View is deliberately set to 60 so that the view in our project matches the one from the device, making it easy to emulate the physical world in the virtual one.
Camera -> Clipping Planes determine which objects are processed for display in the 3D scene. By default, the far value, which covers objects distant from the camera, is set to 20 m, and the near value is set to 10 cm. These are good values for optimal performance; increasing the far value for better coverage comes with a performance hit.
Camera -> MSAA: turn this OFF to give the AR application a performance boost. MSAA stands for Multi-Sample Anti-Aliasing.
Camera -> AR Pose Driver represents a virtual camera that correlates with the real-world one.
Camera -> AR Camera Manager deals with managing the camera and its features, like light estimation and facing direction.
Camera -> AR Camera Background makes the object render the video feed coming from the camera into the game scene.
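As a quick illustration of the light estimation feature mentioned above, a script can subscribe to the AR Camera Manager's frameReceived event. This is a hedged sketch, not code from the article; the LightEstimationReader name is hypothetical, and averageBrightness is only populated when light estimation is enabled on the AR Camera Manager and supported by the device.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical reader that logs the light estimation data exposed
// by the AR Camera Manager on every camera frame.
public class LightEstimationReader : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;  // the AR Camera's manager component

    void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // averageBrightness is a nullable value; it is absent when the
        // device or session does not provide light estimation.
        if (args.lightEstimation.averageBrightness.HasValue)
            Debug.Log($"Scene brightness: {args.lightEstimation.averageBrightness.Value}");
    }
}
```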
Placement of graphical objects like 3D models or 2D images on a live camera feed, in accordance with Unity’s world space, is achieved in AR applications using a technique called Trackables. Drilling down, this technique subdivides into point cloud tracking and plane tracking.
Point clouds are basically sets of points mapped to the real world, built from data received across multiple frames of the live camera feed; they help in placing graphical objects in AR space.
Unity provides the option to view the point cloud via the Point cloud component, which gets added as part of ARSessionOrigin to track world space and place the 3D objects in the AR Scene.
Select the ARSessionOrigin component from the Scene hierarchy panel and click Add Component under the Inspector panel to open the menu and then select the AR Point Cloud Manager component.
Another important trackable feature is Plane Detection. With the help of Point Clouds, this feature detects and identifies the planes to place the 3D object in an AR scene.
ARSessionOrigin object provides the option to add an AR Plane Manager from the Inspector panel.
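Once the AR Plane Manager is attached, a script can react to plane detection through the manager's planesChanged event. The sketch below is illustrative only; the PlaneLogger name is an assumption, but the event and its arguments are part of the AR Foundation API.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical listener that logs planes as the AR Plane Manager
// detects, updates, or removes them.
public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;  // on the ARSessionOrigin object

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also exposes .updated and .removed lists.
        foreach (ARPlane plane in args.added)
            Debug.Log($"New plane {plane.trackableId}, alignment: {plane.alignment}");
    }
}
```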
Rendering Point Clouds and Planes
We currently have point clouds and planes added as components to the ARSessionOrigin, but to view them on the camera feed, the points and planes need to be rendered on the camera user interface. This rendering is made possible by adding the AR Default Point Cloud and AR Default Plane game objects as prefabs into the ARSessionOrigin component.
In the Hierarchy panel, under “sample scene”, right-click to open the pop-up menu, then select XR -> AR Default Point Cloud to add this object.
Follow the same steps to add the AR Default Plane to the Hierarchy panel.
AR Default Point Cloud and AR Default Plane Prefab conversion
If you look at the AR Default Point Cloud game object’s properties in the “Inspector” panel, you are free to update values in the “Particle System” component, such as how long the points are displayed, their size, start color, the maximum number of particles, etc.
To keep the project structure maintainable, create a folder named Content under the Assets folder, which is under the Project panel tab.
Move the AR Default Point Cloud and AR Default Plane game objects inside the Content folder by dragging and dropping, so that each game object turns into a prefab. The blue highlight on the object is an indication that it is a prefab.
Remove the prefabs from the “hierarchy” panel, and drag and drop the prefabs from the Assets-> Content folder into ARSessionOrigin’s AR Point Cloud Manager and AR Plane Manager components.
Adding a 3D model
After setting up the AR scene with ARSession and ARSessionOrigin components with prefabs, it’s time to add a 3D model to the scene to view it in a room-scale environment.
Unity supports and works best with the .fbx file format for 3D models. You are free to download 3D models from Sketchfab, Free3D, or CGTrader. For this tutorial article, a jet engine 3D model is used as a reference.
Import any 3D model file into the asset folder, then drag and drop the 3D model as a child to the sample scene under the hierarchy panel. Wrap the 3D model inside two empty game objects to handle collision and scaling factors.
Integrating C# scripts in the AR app
Now it’s time to add some interaction with 3D objects in the AR scene, and this can be achieved via C# scripts. By integrating the Microsoft Visual Studio Code editor with the Unity engine, any C# code change is compiled instantly.
Create a folder named “Scripts” under Assets, then right-click and choose Create -> C# Script for script development.
Double-click on the ARManager.cs script to open it in the Visual Studio editor. This script programmatically connects with the ARSessionOrigin and its trackable features (AR point cloud and AR plane), along with the 3D assets present in the AR scene.
The ideal way to bridge the script with AR scene objects is to create a game object and attach the script to it. In this case, the ARManager script is attached to an ARManager game object created in SampleScene under the Hierarchy panel.
In the ARManager.cs script, we exposed the ARSessionOrigin, ARPointCloudManager, ARPlaneManager, and a game object as public fields. Because of this, these objects appear in the Inspector panel of the ARManager game object, as captured in the below illustration.
Drag and drop the ARSessionOrigin object into the relevant fields of the ARManager (Script) under the Inspector panel: Session Origin, Point Cloud Manager, and Plane Manager. To add a 3D object to the game object property in the Inspector, it is advisable to convert it to a prefab (blue in color) by moving the 3D object into the Content folder under Assets.
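The article shows the ARManager script only as a screenshot, so here is a hedged reconstruction of what its field declarations might look like. Declaring the fields public on a MonoBehaviour is what makes them appear in the Inspector for drag-and-drop wiring; the exact field names below are assumptions.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical reconstruction of the ARManager script's public fields.
// Public fields on a MonoBehaviour appear in the Inspector, where the
// ARSessionOrigin, the trackable managers, and the 3D model prefab
// can be wired up via drag and drop.
public class ARManager : MonoBehaviour
{
    public ARSessionOrigin sessionOrigin;
    public ARPointCloudManager pointCloudManager;
    public ARPlaneManager planeManager;
    public GameObject objectToPlace;   // the 3D model prefab to spawn
}
```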
Adding Event Handler Via Script
In an AR application, the obvious interaction is placing a 3D graphical object in the AR scene after identifying and detecting a horizontal or vertical plane surface via the point cloud and plane managers. You can do that by adding an event handler in the C# script, which handles the events for plane detection and for environment detection by the point cloud.
The success of an AR application highly depends on the naturalistic appearance of 3D objects or graphic elements in the real-world environment. This is possible with a technique called Raycasting.
Ideally, we need 3D graphical objects on the floor, in parallel with real-world objects. AR foundation provides a feature called Raycasting, in this technique, an invisible line or ray will be drawn from the finger touch point on the mobile screen to the plane surface identified in the real-world environment by the AR Plane Manager component.
In the above screen illustration of the script, the RaycastDetectAndPlace() method takes the responsibility of detecting the plane surface and placing the 3D object in an AR Scene.
At the touch of the mobile screen, an invisible ray is cast from the device camera; this is handled in code line 41 with the ARSessionOrigin object. The raycast will hit the plane surface and respond with metadata on position and transformation; this is covered in code line 43 via the ARPlaneManager component.
Once the coordinates are received, it’s time to place the 3D object. This is handled in code line 47.
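Since the placement script appears only as a screenshot, here is a hedged sketch of what a touch-to-place raycast method might look like. Newer AR Foundation versions expose raycasting through an ARRaycastManager component (which also sits on the ARSessionOrigin object); the article's version may instead call a Raycast method on the ARSessionOrigin directly. The PlacementController name and its fields are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical touch-to-place controller using ARRaycastManager.
public class PlacementController : MonoBehaviour
{
    public ARRaycastManager raycastManager;   // on the ARSessionOrigin object
    public GameObject objectToPlace;          // prefab of the 3D model

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast an invisible ray from the touch point onto detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // The first hit is the closest plane; use its pose to place the model.
            Pose hitPose = hits[0].pose;
            Instantiate(objectToPlace, hitPose.position, hitPose.rotation);
        }
    }
}
```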
This article will focus on deploying the AR application on the Android platform; compared to iOS, the publishing steps on Android are much simpler. In the Package Manager, check that all the AR-related packages relevant to the Android platform have been imported into the project.
It’s time to build and run the application on the Android platform, but before that, we need to do some configuration in the build settings. Click File -> Build Settings to open a panel that allows you to add all the relevant AR scenes under the “Scenes In Build” section. Select Android under the “Platform” section, and under “Run Device”, select the device that is connected to the Unity interface as the development target.
Under Build Settings, scroll down and select the Player Settings button to open the Project Settings panel with the Player option selected by default, along with its configuration settings.
Auto Graphics API
Uncheck this feature so that we can select the relevant graphics API ourselves: either Vulkan or OpenGLES3. Depending on your device’s graphics capability you can choose either one, but the ideal choice is OpenGLES3, so remove Vulkan from the list.
It is best to uncheck the multi-threaded rendering option for a mobile-based AR solution to boost performance in rendering complex 3D objects on the AR scene.
Android Minimum API level
Choose the minimum API level for Android as version 7.1, Android Nougat, and the target API level to be the highest.
Scripting Backend and Target Architecture
Change the Scripting Backend value to IL2CPP for better performance; this will target devices with the ARM64 and ARMv7 architectures.
XR Plug-in Management
Head over to Project Settings (via the Player Settings button) and scroll down to the end for the XR Plug-in Management feature. Here, select the relevant plug-in for each target platform: the ARCore plugin for Android, ARKit for iOS, etc.
Upon opening Build Settings and selecting the desired platform, such as Android, iOS, or Windows, the “Switch Platform” button becomes enabled. Upon hitting this button, all packages and libraries essential for the relevant platform build are downloaded into the project workspace.
For the Android platform, selecting the Build option opens a file dialog in the project workspace to generate the .apk file for the AR application. Just provide a file name for the apk and save it to the chosen location.
Build and Run
As opposed to the Build option, Build and Run installs the generated apk directly onto the connected device and runs the AR application after storing the apk in the project workspace. As a prerequisite, the device must be plugged in via a USB cable and detected by the Unity interface.
AR Scene on Android Mobile
When you install the AR application on your Android device, this is how it looks on your mobile. The orange dots represent the point cloud that identifies the surface, and the yellow semi-transparent plane represents plane detection and identification. The jet engine is the 3D model displayed on the camera feed.
Frequently Asked Questions
Q #1) What is Unity AR architecture?
Answer: Augmented Reality application development is supported by the Unity game engine with AR Foundation framework architecture as illustrated below.
In the illustration, the Unity engine provides a general SDK with XR features to integrate with different XR device platform providers; for example, ARCore XR is for Android and ARKit XR is for iOS.
XR Subsystems take care of XR-related features and provide them as plugins, namely Raycast, Environment, Planes, Image tracking, and Object tracking.
AR Foundation is a developer tool to develop AR and MR-related applications and XRToolkit handles interactions for both AR and VR apps.
Q #2) What is AR Foundation Architecture?
Answer: The Unity game engine uses the AR Foundation plugin to provide AR features for an application. From a coding point of view, the below illustration shows the AR Foundation architecture.
Q #3) Which Software is best for AR?
Answer: The answer to this question depends on the business requirement. If your business demands advertising and marketing with minimal graphics, it is advisable to go for a web-based AR solution with software like A-Frame, three.js, or Babylon.js.
Realistic graphics and high-end fluid interaction require native AR applications built with game engines like Unity, Unreal, Godot, etc.
Q #4) Why is it better for AR Unity or Unreal?
Answer: If the goal is realistic graphics rendering in AR for advertising and entertainment, the ideal choice would be Unreal. If the AR application requires gameplay logic with a fair amount of graphical elements, then the choice would be Unity.
Q #5) Is Unity completely free?
Answer: Unity is free only for the Personal edition; other plans require a paid license.
Q #6) Does AR cost money?
Answer: Augmented reality applications are processor-intensive: the camera feed coming from the device is set as the background of the AR scene, and the video feed is then analyzed at runtime for spatial data in order to detect and identify the environment.
It is advisable to consider the performance optimization measure with the below steps:
- Try to avoid transparent geometry and textures where possible, as transparency is expensive to render.
- Validate the alpha values of geometry.
- The default frames-per-second value is 60; for performance’s sake, try to bring it down to 30.
- Keep the near and far clipping values of the AR Camera as tight as possible for better performance.
- Make the best possible use of Unity profiler for memory, graphic rendering, and animation timeline optimization.
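The frame-rate tip from the list above can be applied with a couple of lines at startup. This is a generic Unity sketch, not code from the article; the PerformanceSettings name is an assumption.

```csharp
using UnityEngine;

// Hypothetical startup script applying the 30 fps cap suggested above.
public class PerformanceSettings : MonoBehaviour
{
    void Start()
    {
        // Disable vSync so the target frame rate below takes effect...
        QualitySettings.vSyncCount = 0;
        // ...then cap the application at 30 frames per second.
        Application.targetFrameRate = 30;
    }
}
```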
As a concluding note, this tutorial article started with a general explanation of Augmented Reality and the Unity engine, followed by the configuration setup for creating an AR application, with detailed step-by-step explanations and illustrations.
If you follow this step-by-step guide patiently and diligently until the end, you will end up creating a basic AR application that can run on an Android mobile phone. Congratulations, you have stepped into the realm of AR, welcome aboard.