
ARCore 101: How to Create a Mobile AR Application in Unity, Part 4 (Enabling Surface Detection)

May 11, 2018 10:58 PM
Carpet with a blue geometric pattern on a textured surface.

One of the primary factors that separates an augmented reality device from a standard heads-up display such as Google Glass is dimensional depth perception. This depth data can come from RGB cameras, infrared depth cameras, or both, depending on the level of accuracy you're aiming for.

This understanding of depth allows our augmented reality devices to know where usable surfaces are. As a result, we can place virtual objects on top of real objects in the real world, or at least create a convincing illusion of doing so.

In the previous tutorial on building an AR mobile app for Android, we created our AppController to handle the overall processing and flow of the application. In this lesson, we will use point cloud data generated by the camera to create planes that show the user where viable surfaces are.

We will take what we learned in lesson two on prefabs and build on that, creating a prefab that starts small. As more of the real-world surface is detected, it will instantiate and attach more copies of the prefab to make the usable virtual surface larger. While that may sound a bit complex, Google and Unity have both done a good bit to make the system easy to implement.

Create a Flat Plane

For our purposes, when a real-world surface is detected, we need a way to represent it in the virtual space. Here, we will create a flat plane. While this solution will not work in every situation, since many surfaces have objects on them, it will cover the majority of cases.

Click on the "Create" menu, select "3D Object," and then click "Plane." Name the object VisualTrackedPlane or something you will remember.

3D modeling software interface showing options for creating 3D objects.

Next, look at the Transform component of the object in the Inspector, and make sure the Position of the object is 0, 0, 0. Also, check that the Scale is 1, 1, 1. If there is an offset on this object, every copy instantiated from the prefab will carry that offset as well.

3D modeling workspace in Unity showing a simple landscape with properties panel.

Apply the Texture

Now we need to select a texture so the virtual surface is viewable by the user. In the Mesh Renderer component, click on the arrow next to Materials.


After it expands, click the small donut-shaped icon next to the first material slot (Element 0) to bring up the Select Material window.

3D material selection interface with various texture options displayed.

In the Select Material window, choose the "PlaneGrid" material.

Material selection interface with various texture options in a 3D software application.

Add the Plane Renderer Class to the Object

Now, let's get to the heart of this learning experiment. We will use the provided "Tracked Plane Visualizer" class on our prefab in order to see the surface as it is detected and then extended. Click on the "Add Component" button, then type "Tracked" into the search box, and then click on "Tracked Plane Visualizer."


Turn Our Object into a Prefab

To create a prefab, select the "VisualTrackedPlane" in the Hierarchy window, then drag it to the "Prefabs" folder in your Projects window.

Unity interface with highlighted elements for 3D object manipulation.

Now, delete the "VisualTrackedPlane" from the Hierarchy view.


Update the AppController

Now that we have our prefab created, we need to update our AppController class to work with it. The code for the updated version can be copied and pasted from Pastebin. If you look through the code, you will notice three spots marked with "//Lesson 4." These are the sections of new code added in this part of the tutorial. We will walk through each of these changes below.

The TrackedPlaneVisualizer class we added in Step 3 comes from the HelloAR example that ships with ARCore. Because of this, we need to add the namespace it belongs to so we can use it.

Code snippet demonstrating Unity and Google ARCore usage.
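The using directives at the top of AppController likely look something like the sketch below, based on the HelloAR example in the ARCore SDK for Unity of this era (the exact namespace containing TrackedPlaneVisualizer varies between SDK versions, so check the one in your project):

```csharp
using System.Collections.Generic;
using GoogleARCore;
// TrackedPlaneVisualizer lives in the HelloAR example's namespace in this SDK version.
using GoogleARCore.HelloAR;
using UnityEngine;
```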

Next, as you can see in the example below, we need to add a public GameObject declaration so that we can create a link between our AppController and the prefab we previously created.

Code snippet showing a public gameObject declaration in a programming context.
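The declaration probably resembles the following sketch. The field name TrackedPlanePrefab is an assumption here; whatever you call it, it must be public (or serialized) so the prefab can be dragged onto it in the Inspector:

```csharp
public class AppController : MonoBehaviour
{
    // Drag the VisualTrackedPlane prefab onto this field in the Inspector.
    public GameObject TrackedPlanePrefab;

    // Reused each frame to hold the planes ARCore has detected since the last query.
    private List<TrackedPlane> m_NewPlanes = new List<TrackedPlane>();
}
```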

Finally, we have a block of code that is doing the heavy lifting.

First, on line 40, we see the Session class call the GetTrackables function with the generic type TrackedPlane. This function takes a list to fill and a query filter; with the filter set to New, it returns only planes detected since the last query. This line of code will find any newly created trackable planes and add them to the m_NewPlanes list.

Next, for each of the new planes counted by m_NewPlanes.Count, we instantiate a copy of our prefab. Finally, we get the TrackedPlaneVisualizer component on each new instance and run its Initialize function, which sets the color and rotation of the prefab.

Code snippet showing a function for handling user input and event binding in programming.
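Put together, that block is probably close to the following sketch, modeled on the HelloAR example (names not mentioned in the article, such as planeObject, are assumptions):

```csharp
void Update()
{
    // Ask ARCore only for planes detected since the previous query.
    Session.GetTrackables<TrackedPlane>(m_NewPlanes, TrackableQueryFilter.New);

    for (int i = 0; i < m_NewPlanes.Count; i++)
    {
        // One prefab copy per new plane; the visualizer reshapes it as the plane grows.
        GameObject planeObject = Instantiate(
            TrackedPlanePrefab, Vector3.zero, Quaternion.identity, transform);
        planeObject.GetComponent<TrackedPlaneVisualizer>().Initialize(m_NewPlanes[i]);
    }
}
```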

Run Our Test on Our Mobile Device

We are ready to build our app and test it out. Press Ctrl+Shift+B to bring up the Build Settings. Click on the "Add Open Scenes" button to add the scene we are in. All the other settings should be ready to go from a few lessons ago. Click "Build And Run."


When the Build Android window appears, type NRARCore (or another name you will remember) into the field for your APK file, then click the "Save" button.

File saving dialog in a software application.

Now Unity will go through the build process; follow the green progress bar.

Loading progress bar in a software application.

When it is finished, it will run the application on your phone automatically. The first time it runs, it may ask for permission to use the camera. Select "Yes." Otherwise, it will not work, as it requires the use of the camera.

If you point the camera at the floor or other surfaces around you, you will see a random-colored diamond-shaped grid appear. This is the ARCore system letting us know that the surface we are looking at is usable.

Exciting stuff, huh? And it gets even better the more you learn.

Textured floor with blue spots.

In this part of the tutorial, we created a mostly transparent prefab to represent our surfaces. We then added code to our AppController that updates the planes as new ones come into existence. This allows the virtual surface to grow as more area is discovered, giving our users plenty of space to put virtual objects in the world around them. That is, once we add the ability to place digital objects in the world. And that, along with scaling our objects, is what we will tackle in the next lesson.

Cover image and screenshots by Jason Odom/Next Reality
