In 2017, major breakthroughs in smartphone-based simultaneous localization and mapping (SLAM) opened new doors for developers and users of both Apple and Android phones. Unfortunately for Android users, the solution that Google is previewing, ARCore, currently only works on three Android smartphones. But Silicon Valley start-up uSens is stepping in to fix that with its new engine called uSensAR.
The announcement was made on Monday at CES 2018 in Las Vegas by uSens, a company known for some amazing articulated hand-tracking solutions. Like Google's ARCore, uSensAR uses the standard RGB camera and the inertial measurement unit (IMU) that come with an Android smartphone to produce a window into augmented reality. But while ARCore currently supports only the Google Pixel, Google Pixel 2, and the Samsung Galaxy S8 (but not the S8 Plus or Note 8), uSensAR aims to support any Android device.
Fragmentation is a word that gets used when talking about computers and smart devices, especially when Android is in the conversation. An open, adaptable operating system means that any hardware manufacturer can easily fit the OS to its hardware.
The result is the existence of over 24,000 different Android devices with vastly different hardware configurations and capabilities, ranging from ultra-budget models to the upper echelon of high-end flagships. In many cases, these devices are never updated to the newest versions of the OS. This wide separation of the user base is known as fragmentation, and it can severely limit the ability to make sweeping changes across the entire ecosystem, among other issues.
This is in complete contrast to Apple's business model, which is closed: only Apple-made hardware runs the company's iOS. So when Apple released ARKit, it supported a limited range of Apple devices that all share the same foundational functionality, with very little fragmentation to contend with, making the chance of failure significantly smaller.
The goal of uSensAR is to take the significant breakthrough that came with the release of both Apple's ARKit and Google's ARCore to make it accessible to everyone in the Android ecosystem. Up to that point, smartphone-based SLAM was mostly relegated to using markers or special sensors like Occipital's Structure or Google's Tango. Taking the need for external hardware out of the picture and just using the camera built into the devices opens up AR to billions of users with the devices already in their pockets.
"ARCore currently only serves about 30 million Android phones, which is just 5% of the entire Android smartphone ecosystem. For the AR industry to thrive, it is essential that as many people as possible have access to AR ready devices, which in turn will entice more AR content to be developed. With the release of uSensAR, we are allowing developers, smartphone manufacturers and developers to create those AR experiences not just for iPhone, Pixel, and Samsung S8 users, which ARkit and ARcore only serve, but the entire Android ecosystem."
In addition to the uSensAR announcement, uSens also revealed its partnership with mobile system on a chip (SoC) creator Spreadtrum Communications, taking uSensAR a step further by making additional AR functionality available to devices using the Spreadtrum SC9853 chipset.
The uSensAR feature list includes mobile inside-out motion tracking; visual-inertial odometry, which fuses IMU and camera data for tracking; low CPU and battery consumption; advanced plane detection; environmental understanding via SLAM; millimeter-level position accuracy; and support for Unity, C++, and Java.
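The visual-inertial odometry mentioned in that feature list works, in broad strokes, by dead-reckoning motion from high-rate IMU readings and periodically correcting the accumulated drift with slower camera-based pose estimates. The toy sketch below illustrates that fusion idea in one dimension; it is a generic illustration of the concept, not uSens's actual algorithm, and the `ToyVIO` class, its blend factor, and the 1-D state are all simplifications invented here.

```python
# Toy illustration of visual-inertial odometry (VIO): integrate IMU
# acceleration at high rate (which drifts), then nudge the estimate
# toward occasional camera-derived position fixes. Hypothetical code;
# real engines track full 6-DoF pose with far more sophisticated filters.

class ToyVIO:
    def __init__(self, blend=0.2):
        self.pos = 0.0      # 1-D position in metres (real systems: 6-DoF pose)
        self.vel = 0.0      # velocity in m/s
        self.blend = blend  # how strongly camera fixes pull the IMU estimate

    def imu_step(self, accel, dt):
        """High-rate update: integrate acceleration; error accumulates."""
        self.vel += accel * dt
        self.pos += self.vel * dt

    def camera_fix(self, cam_pos):
        """Low-rate update: blend the drifting estimate toward the camera's."""
        self.pos += self.blend * (cam_pos - self.pos)

vio = ToyVIO()
for _ in range(100):              # 100 IMU samples at 100 Hz, constant 1 m/s^2
    vio.imu_step(accel=1.0, dt=0.01)
vio.camera_fix(cam_pos=0.5)       # camera pose estimate corrects the drift
```

The key design point, and the reason VIO suits phones, is that the IMU supplies smooth low-latency motion between camera frames, while the camera keeps the IMU's integration drift bounded.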
Currently only available to partners, uSensAR isn't accessible in an SDK form just yet. However, uSens co-founder and CTO, Dr. Yue Fei, told me that uSens is "open for collaboration," so interested parties can head to the uSens website and contact them for more info. (The release window for a uSensAR SDK has been set for Q1 2018.)
With its aim of opening AR up to the 95% of the Android user base that ARCore leaves out, including low-end devices, uSensAR could be seen as a direct ARCore competitor. Of course, as we learned a few weeks ago, ARCore is moving out of preview and into version 1.0 "in coming months." The goal of ARCore version 1.0 is to open AR up to some 100 million Android users.
It should also be pointed out that Wikitude, which has been providing computer vision-based, markerless SLAM solutions for smart devices for some time now, will likely be in the conversation about competition in this space.
Regardless, as an Android user with a two-year-old phone who was ready for an upgrade, seeing the announcement of ARCore pushed me to trade my Samsung Note 5 for a Note 8. Sadly, as I discovered a few hours later: just because ARCore works on the Samsung S8 doesn't mean it works on the Note 8 (ever) or even the S8 Plus (most of the time).
With that in mind, I, for one, welcome any competition that makes AR on my phone more accessible.