Augmented Reality Gone Wild: a razor-sharp glance into the future of AR

Augmented reality demo "AR Horse" running on a Google Pixel phone

Augmented Reality promises to be the next major computing platform. It will change how we interact with data and with the physical world forever. Microsoft, Magic Leap and dozens of smaller companies are creating the wearable devices of the future. It will be a number of years before these devices are truly consumer friendly, but the technological foundation is being laid today. When Apple released ARKit, and Google quickly followed with ARCore, app developers gained an essential tool for creating true AR / mixed reality applications. Mobile AR is now widely available on consumer phones, down to the iPhone 6s. With mobile AR, we can anchor virtual objects to the physical world and view these blended worlds through our smartphones and tablets. However, to blend these worlds believably, the virtual objects need the same level of detail as the physical world. This is extremely challenging: it takes a tremendous amount of artistic skill and technology to create these virtual objects, and it takes technology that can actually handle and render them.

Since mobile devices are relatively limited in computing power, we wondered: can we seamlessly blend the virtual and the physical world? Below is a video capture of AR Horse, the app we just released on the Google Play Store.

Note: AR Horse requires Google ARCore Preview and a compatible Android device. ARCore Preview is experimental software and is not supported on all devices. If the app crashes on startup, check that you have installed Google ARCore Preview via this link.

At Graphine, we’ve been focusing on data compression and streaming to render extremely detailed virtual objects in interactive video game engines. It’s been interesting to see that VFX companies – companies that create photorealistic special effects for blockbuster movies – have started using game engines for VR and animation work over the last couple of years. The level of detail in a digital character for a movie is hundreds of times higher than that of a typical video game character. Leading VFX companies from every corner of the world have used Granite SDK, our texture streaming middleware, to automatically optimize their texture assets for an interactive 3D engine without sacrificing the detail they spent so much time creating.

Most AR content so far has consisted of miniature objects. We wanted a huge, life-sized object. This was going to be challenging: the larger the object, the more detail you need, or you end up with blurry (and definitely not realistic) objects. So we would need to port our streaming technology to mobile devices to stream in all that detail.
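To put rough numbers on that relationship (our own illustrative figures, not measurements from the project): keeping the same perceived sharpness while scaling an object up means the texture area grows with the square of the scale factor, as this minimal sketch shows.

```csharp
using UnityEngine;

// Back-of-the-envelope only: texture data needed at a fixed texel density
// grows with the square of object size. All figures here are illustrative.
public static class TexelBudget
{
    public static void Compare()
    {
        const float texelsPerMeter = 4096f;        // assumed target density
        float miniature = 0.2f, lifeSized = 2.4f;  // object lengths in meters

        float miniTexels = Mathf.Pow(miniature * texelsPerMeter, 2f);
        float horseTexels = Mathf.Pow(lifeSized * texelsPerMeter, 2f);

        // 12x the size -> 144x the texels at the same perceived sharpness.
        Debug.Log("Texel scale factor: " + (horseTexels / miniTexels) + "x");
    }
}
```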

We’re an engineering company solving hard technical problems. We needed a great partner that knows how to create photorealistic virtual objects. Weta Digital, a leading VFX studio, has always been at the forefront of technology and did some of the pioneering work in VR as well. They immediately saw the potential of what we wanted to achieve and got on board.

Part of having a believable character is having realistic motion. Weta Digital has developed a muscle, skin and fat simulation system called “Tissue” that helps them create highly believable character animations. They have used this system for blockbuster movies like Avatar and War for the Planet of the Apes. Ambitious as always, their aim was to bring this animation over to AR.

At Graphine, we specialize in rendering highly detailed surfaces, so we focused on the texture detail and the AR mechanics. Weta Digital has extremely high standards and we love solving technical challenges, so the result is something we are extremely proud of. We hope we have set the bar for visual quality in AR. We see a lot of opportunities for training, virtual stores, games and marketing experiences.

Want to try this for yourself? You can go to the Google Play Store and download the app here. You’ll need a Google Pixel or Pixel 2 Android phone to run it.

The rest of this post goes into more detail on the technical challenges and solutions we encountered while making the AR Horse app.

We decided to use Unity 3D for the project. This game engine has great support for mobile platforms, we know it well, and our texture streaming plugin has been available for multiple years on PC, Xbox One and PS4. We had done a previous project on Android, so most of our plugin was already ported over. We settled on using ARCore to detect and track the real-world surfaces. So guess what: Granite for Unity now fully supports Android! Obviously, iOS is now also a major AR platform, so we’ll be adding support for that on request.
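As a simplified illustration of the surface-anchoring step, here is a minimal tap-to-place sketch. It uses Unity’s generic physics raycast as a stand-in for ARCore’s tracked-plane hit test (whose exact API changed between Preview releases), and the prefab reference is hypothetical:

```csharp
using UnityEngine;

// Simplified placement sketch: a tap raycasts into the scene and drops the
// model on whatever surface is hit. In the real app the ray test goes
// through ARCore's tracked-plane hit testing, not Unity physics colliders.
public class TapToPlace : MonoBehaviour
{
    public GameObject horsePrefab;  // hypothetical prefab slot

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit))
        {
            // Align the model with the surface normal so it stands upright.
            Instantiate(horsePrefab, hit.point,
                        Quaternion.FromToRotation(Vector3.up, hit.normal));
        }
    }
}
```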

A key technical challenge was bringing the animation output simulated by Tissue to real-time mobile rendering. Weta initially planned on using an Alembic export of the simulation. However, this proved problematic on the Android platform. First, it meant we would have to port the existing Alembic Unity plugin to Android; second, there were performance concerns about dynamically streaming data from the Alembic file format on Android. (The Alembic format is mainly designed to be a generic and extensible animation format rather than a fast format to stream geometry and animation from.)
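For context, a blend shape (morph target) stores per-vertex offsets from a base mesh, and the animated mesh is simply the base plus a weighted sum of those offsets. That is why a fixed set of shapes plus small per-frame weight curves is far more compact than a per-frame vertex cache. A conceptual sketch (the names are ours, not an actual engine API):

```csharp
using UnityEngine;

// Conceptual blend shape evaluation: v = base + sum(weight_i * delta_i).
// A per-frame vertex cache stores every vertex every frame; blend shapes
// store a handful of delta sets plus tiny per-frame weight arrays.
public static class MorphTarget
{
    public static Vector3[] Evaluate(Vector3[] baseVerts,
                                     Vector3[][] deltas,  // [shape][vertex]
                                     float[] weights)     // one per shape
    {
        var result = (Vector3[])baseVerts.Clone();
        for (int s = 0; s < deltas.Length; s++)
        {
            if (Mathf.Approximately(weights[s], 0f)) continue;  // skip idle shapes
            for (int v = 0; v < result.Length; v++)
                result[v] += weights[s] * deltas[s][v];  // negative weights subtract
        }
        return result;
    }
}
```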

In the end we settled on a blend-shape-based approach. Weta was initially concerned that the fidelity of the blend shapes would not be up to their animation quality standards, but after some tweaking we ended up with a high quality blend shape animation that was only a fraction of the size of the Alembic approach. We then ran into a problem where the baking process was outputting blend shapes with negative weights. We worked closely with Unity to add support for these, as it’s a feature in development, and the results were great! (Otherwise we would have had to duplicate the whole blend shape set to emulate the negatives.) The final main horse model we shipped uses 80 blend shapes and 10,700 vertices (13,625 triangles). In total, the asset is 21k vertices with over 150 blend shapes and some bone-based animations for certain details like the mane and tail. The animation processing is the heaviest cost on the Android device, taking 8–14 ms per frame.
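To make the Unity side concrete, here is a minimal sketch of driving a blend shape from script. SetBlendShapeWeight is the standard Unity call; the shape index and the sine-based weight curve are purely illustrative, and a weight below zero is exactly the kind of negative weight that needed the in-development Unity support mentioned above:

```csharp
using UnityEngine;

// Minimal sketch: drive one blend shape from script. The index and the
// sine-based weight curve are illustrative, not the shipped animation.
[RequireComponent(typeof(SkinnedMeshRenderer))]
public class BlendShapeDriver : MonoBehaviour
{
    public int shapeIndex = 0;      // which blend shape to animate
    public float amplitude = 100f;  // Unity weights conventionally run 0..100

    private SkinnedMeshRenderer smr;

    void Start()
    {
        smr = GetComponent<SkinnedMeshRenderer>();
        Debug.Log("Blend shapes on mesh: " + smr.sharedMesh.blendShapeCount);
    }

    void Update()
    {
        // Dips below zero are "negative weights"; without the Unity
        // support discussed above these would have to be emulated by
        // duplicating each shape with negated deltas.
        float w = amplitude * Mathf.Sin(Time.time);
        smr.SetBlendShapeWeight(shapeIndex, w);
    }
}
```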

Besides the high quality animation, the horse also features high resolution textures. These consist of 58 4096x4096 patches per layer, authored using the UDIM patch layout convention. There are three UDIM layers used by the horse’s anisotropic material: diffuse, normal and specular. This leads to a total of 174 4096x4096 patches for the main textures. (For the version downloadable from the Google Play Store we reduced some of the 4K textures to 2K resolution to make the app quicker to download.) In addition, we have some lower-resolution textures using a traditional UV layout to control anisotropy and ambient occlusion. The animation also drives up to 4 wrinkle normal maps to add localized normal map detail, such as skin wrinkles appearing. To visualize the UDIM data within the limited memory budget (128 MB of GPU memory) and without reworking the UDIM UVs, we used our Granite texture streaming system. Textures are stored in our optimized streaming format and compressed in memory using ASTC 4x4 texture compression.
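For readers unfamiliar with UDIM: each 0-1 UV tile gets a number starting at 1001, increasing by 1 per tile along U (ten columns) and by 10 per row along V. A small helper makes the convention explicit (the class is ours; the numbering is the standard UDIM scheme). It also hints at why streaming is unavoidable here: at ASTC 4x4’s 8 bits per pixel, 174 4096x4096 tiles come to roughly 2.8 GB before mipmaps, against a 128 MB budget.

```csharp
using UnityEngine;

public static class UdimUtil
{
    // Standard UDIM convention: tile 1001 covers UV (0..1, 0..1).
    // U advances by 1 per tile (10 columns max), V advances by 10 per row,
    // so UV (3.2, 1.7) lands in tile 1001 + 3 + 10 * 1 = 1014.
    public static int TileFromUV(Vector2 uv)
    {
        int uTile = Mathf.FloorToInt(uv.x);  // column 0..9
        int vTile = Mathf.FloorToInt(uv.y);  // row
        return 1001 + uTile + 10 * vTile;
    }
}
```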

In the end we managed to achieve a solid 30 fps framerate on a first-generation Pixel phone. The average 33 ms frame is almost equally split between the Unity engine and rendering, including the texture streaming (10 ms), animation updates (12 ms), and ARCore image processing (11 ms).
