Building Next Reality: The Terms You Need to Know to Work in ARKit

Jun 30, 2017 01:28 AM

In the first part of this series, we looked at the surface detection provided by ARKit. We covered how it works, along with some of the tools that can help us determine what's wrong when it isn't working. Now let's take this to the next step.

Before we can delve deep into the actual code itself, which we will do in the next part of this series, we need to learn a bit about some of the terms and data structures that ARKit uses. Otherwise, it will be tough to decipher what we are looking at in the code.

SceneKit — A high-level development framework designed to simplify and speed up 3D app creation by using scene descriptions to add particle effects, animations, physics simulation, and physically based rendering.

SCNView — A view for displaying 3D SceneKit content.

ARSession — This component handles the major processes needed to make the device work with ARKit: reading data from the sensors, controlling the camera, and analyzing the images. This keeps a good portion of the work invisible to developers.

ARSCNView — The view used for displaying AR content from ARKit as 3D SceneKit content. This view automatically renders the camera feed as the background and keeps SceneKit's world coordinates in unison with the AR world coordinates. It takes a lot of work off your plate by automatically using the iPhone's built-in sensors to match your movement to the movement of the world around you.
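To make that concrete, here is a minimal sketch of wiring an ARSCNView to an ARSession. The class name, outlet, and the choice of horizontal plane detection are my own illustrative assumptions, not from the article:

```swift
import UIKit
import ARKit

// Illustrative view controller hosting an ARSCNView (e.g. via storyboard).
class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Receive callbacks such as renderer(_:didAdd:for:) when anchors appear.
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Track the device's position and orientation, and detect
        // horizontal surfaces (tables, floors, and so on).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal

        // The view's ARSession handles the camera and sensor fusion for us.
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Note that running the session is all it takes to get the camera background and world tracking; SceneKit content added to `sceneView.scene` is then positioned in the shared AR coordinate space.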

ARAnchor — This data type represents a position and orientation in the real world. It is similar to what other systems, like the HoloLens and Unity, use to tie the digital world to the physical one.

ARPlaneAnchor — A class derived from ARAnchor. When a surface is detected, ARPlaneAnchor stores its alignment, center, and extent. The center point, in this case, would be considered its position.
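When plane detection is enabled, ARKit delivers these anchors through the ARSCNViewDelegate. The sketch below visualizes a detected plane using its center and extent; the semi-transparent blue material and the `ViewController` extension are illustrative choices, not from the article:

```swift
import UIKit
import ARKit
import SceneKit

extension ViewController {
    // Called by ARKit when a new anchor (and its SceneKit node) is added.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        // We only care about detected surfaces here.
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Build a flat plane matching the anchor's extent
        // (x = width, z = length on a horizontal surface).
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.blue.withAlphaComponent(0.3)

        let planeNode = SCNNode(geometry: plane)
        // Position the visualization at the anchor's center,
        // rotated to lie flat on the detected surface.
        planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
        planeNode.eulerAngles.x = -.pi / 2
        node.addChildNode(planeNode)
    }
}
```

A fuller implementation would also update the node in `renderer(_:didUpdate:for:)`, since ARKit refines a plane's center and extent as it sees more of the surface.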

hitTest(_:types:) — A method on ARSCNView used to search for real-world objects, surfaces, and ARAnchors at a given point in the current camera view.

ARHitTestResult — When a surface is found, this is the information returned for a single point.

ARHitTestResult.ResultType — The possible responses from the hit test.

  • featurePoint — A point that ARKit considers part of a continuous surface, with no anchor currently attached.
  • estimatedHorizontalPlane — A planar surface with no anchor that is perpendicular to gravity.
  • existingPlane — A known plane in the scene, independent of the plane's size.
  • existingPlaneUsingExtent — A known plane in the scene, with a known size.
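Putting the hit-test pieces together, here is a sketch of handling a screen tap by searching for a detected plane, falling back to feature points. The gesture handler name and the `sceneView` property are assumed for illustration:

```swift
import UIKit
import ARKit
import SceneKit

extension ViewController {
    // Illustrative tap handler, registered with a UITapGestureRecognizer.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let location = gesture.location(in: sceneView)

        // Prefer a plane whose extent contains the tap; otherwise
        // fall back to any nearby feature point.
        let results = sceneView.hitTest(location,
                                        types: [.existingPlaneUsingExtent, .featurePoint])

        if let result = results.first {
            // The last column of worldTransform holds the hit position
            // in world coordinates.
            let t = result.worldTransform
            let position = SCNVector3(t.columns.3.x, t.columns.3.y, t.columns.3.z)
            print("Hit at \(position)")
        }
    }
}
```

Results come back sorted nearest-first, which is why taking `results.first` is usually the right choice for placing content.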

In the next part of the series, having some idea of what these components and data structures do will go a long way toward helping us understand what is happening as we develop the Building Next Reality app. See you next week when we get this party started. Until then, this is Jason signing off.


Building Next Reality is an AR development series. In this first section of the series, we will be building our very own ARKit application for iPhone and iPad devices. Leave comments to let us know what you like, what you don't, and what you would like to see.

