ARCore 101: How to Create a Mobile AR Application in Unity


Now that ARCore is out of its developer preview, it's time to get cracking on building augmented reality apps for the supported selection of Android phones available. Since Google's ARCore 1.0 is fairly new, there's not a lot of information out there for developers yet — but we're about to alleviate that.

Both Google's (ARCore) and Apple's (ARKit) augmented reality solutions are for their supported existing devices — without requiring additional hardware. This is a major leap forward for AR developers since there are millions of these devices already in the public's hands. So with the ability to create software that can be used by the masses "now," the race to the AR mobile market has accelerated considerably.

If you're looking for help on developing for Apple devices, check out Next Reality's ARKit 101 collection. Otherwise, let's get started with developing mobile AR apps for Android devices using Unity. However, before we dive right into the lessons, let's talk a little bit about the capabilities of ARCore.

1. Motion Tracking

Much like ARKit, ARCore can track the position and rotation of the device as it moves through the real world, which is what allows a virtual object to stay anchored in place. This is known as motion tracking.

This effect is accomplished through a collection of accelerometers and gyroscopes built into what is called an inertial measurement unit, or IMU. By combining these sensor readings, an IMU can estimate the relative position and rotation of the device it is built into.

Ever since my very first smartphone back in 2008, the HTC G1, these devices have had fairly sophisticated sensors built into them. While the G1 lacked a gyroscope, its accelerometer, GPS receiver, and digital compass were enough to make the Layar browser a cool use of AR, even back then.

Fortunately, at this point, motion tracking is more or less a solved problem, and we as developers don't really have to deal with the raw input and output of an IMU. Microsoft, Apple, and Google each expose a simple interface in Unity, and it just works.
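To give a feel for how simple that interface is, here is a minimal sketch using the ARCore SDK for Unity's `Frame` and `Session` classes. The component name `PoseLogger` is hypothetical; attach something like it to any GameObject in an ARCore scene.

```csharp
using GoogleARCore;
using UnityEngine;

// Hypothetical example component: logs the device's tracked pose each frame.
public class PoseLogger : MonoBehaviour
{
    void Update()
    {
        // Only read the pose while ARCore is actively tracking.
        if (Session.Status != SessionStatus.Tracking)
            return;

        // Frame.Pose is the device's position and rotation in world space,
        // derived from the camera feed and the IMU working together.
        Pose devicePose = Frame.Pose;
        Debug.Log("Position: " + devicePose.position +
                  ", Rotation: " + devicePose.rotation.eulerAngles);
    }
}
```

Notice that nothing here touches raw sensor data — the SDK hands us a ready-made pose.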

2. Environmental Understanding

While motion tracking is more or less solved, environmental understanding — Google's term for plane finding, i.e., the ability of a standard RGB camera to detect surfaces such as walls and floors — is still fairly new. It builds on a great deal of robotics research aimed at helping machines understand the spaces they move through, and its value to augmented reality is undeniable.

Combining motion tracking with environmental understanding lets you place a virtual object on a table, then move your device around while the object accurately stays put — a fairly believable illusion.
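In code, environmental understanding surfaces as trackables you can query. A minimal sketch, assuming the ARCore SDK for Unity (the `PlaneWatcher` name is hypothetical): it asks the session for planes first detected in the current frame and logs their size and position.

```csharp
using System.Collections.Generic;
using GoogleARCore;
using UnityEngine;

// Hypothetical example component: reports planes ARCore finds each frame.
public class PlaneWatcher : MonoBehaviour
{
    private readonly List<DetectedPlane> _newPlanes = new List<DetectedPlane>();

    void Update()
    {
        if (Session.Status != SessionStatus.Tracking)
            return;

        // Fill the list with planes first detected in the current frame.
        Session.GetTrackables<DetectedPlane>(_newPlanes, TrackableQueryFilter.New);
        foreach (DetectedPlane plane in _newPlanes)
        {
            Debug.Log("New plane at " + plane.CenterPose.position +
                      ", extent " + plane.ExtentX + " x " + plane.ExtentZ);
        }
    }
}
```

We will cover surface detection properly later in the series; this is just to show how little code the basic query takes.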

3. Light Estimation

The final tentpole of this technology, and the one that really helps sell the illusion of virtual presence, is known as light estimation. Using ARCore, the smartphone's camera can approximate the lighting in the area and replicate it in the application, so your virtual object's highlights and shadows appear to match the space around it.
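Here is a minimal sketch of what that looks like with the ARCore SDK for Unity: `Frame.LightEstimate` reports an estimated pixel intensity for the camera image, which you can feed into a scene light. The `LightAdjuster` component name and the one-to-one intensity mapping are assumptions for illustration.

```csharp
using GoogleARCore;
using UnityEngine;

// Hypothetical example component: matches a scene light to the real world.
public class LightAdjuster : MonoBehaviour
{
    // A directional light assigned in the Unity Inspector.
    public Light SceneLight;

    void Update()
    {
        LightEstimate estimate = Frame.LightEstimate;

        // Skip frames where ARCore could not produce a valid estimate.
        if (estimate.State != LightEstimateState.Valid)
            return;

        // PixelIntensity is an average brightness for the camera image;
        // here we map it directly onto the light's intensity.
        SceneLight.intensity = estimate.PixelIntensity;
    }
}
```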

Fortunately, the ARCore SDK for Unity handles these technologies for us automatically. So what we need to do is get you through the basics of harnessing these tools and let you start experimenting with AR via ARCore, which is exactly what we aim to do with this series.

What You Will Learn in This Tutorial Series

This series will be aimed at getting new ARCore developers started on the process of creating AR apps. We will begin with the software installation and setup process, then we will build a Unity scene and set up the Android ARCore SDK framework in that scene. We will follow that with a breakdown of the scene controller.

Next on the docket will be surface detection, which will give us the ability to interpret the world around us in AR. And finally, we will cover the topic of the plane prefab, which facilitates our ability to position virtual objects in our real world.

As we move forward, you can refer to this post and the list below as a handy reference guide, with live links appearing on each installment in the series as we publish them.

  1. Setting Up the Software
  2. Setting Up the Framework
  3. Setting Up the App Controller
  4. Enabling Surface Detection
  5. Plane Prefab & Detection (link coming)

Of course, in the future, we can cover other, more specific things related to ARCore. Just let us know what else you would like to learn, and we'll try to accommodate you after this series has been completed. While developing augmented reality apps, you'll surely experience a few hiccups here and there, and we'll be here to help you figure them out!


Cover image by Jason Odom/Next Reality
