Designing Interactions for Augmented Reality - Part 2


This is the second part of a series about the intricacies of interaction design for Augmented Reality - the kind of details that are really easy to miss unless you're knee-deep in the topic.


In the previous post we talked a bit about the future of contextual awareness regarding 3D content in Augmented Reality. We also shed light on some of the improvements we made in the latest version of our Umbra Composit Viewer.

Here we'll dig a bit deeper into some of the not-so-obvious details of interacting with 3D content in AR using familiar touch gestures on an iOS device. If you want to read more about similar quirks of interaction design, but for VR, I highly recommend checking out the keynote from GDC 2017 about the user experience in Google Earth VR.

The Target Pixel

Perhaps a bit of a technicality, but important nonetheless. Previously, you weren't really interacting with the 3D model: your touch was projected onto the chosen AR tracking surface instead. This led to unexpected behavior - you might be trying to interact with something very near to you, while the input was actually registered somewhere far off near the horizon.

Without going further into the reasons why we don't [yet] provide collision meshes as part of our optimization, we ended up using the depth buffer to detect which pixel you're trying to hit with your finger.
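To make the idea concrete, here's a minimal sketch of what depth-buffer picking can look like in Swift. This is an illustration rather than our actual implementation: `sampleDepth` stands in for however the renderer exposes the depth value at a pixel, and a Metal-style [0, 1] depth range is assumed.

```swift
import CoreGraphics
import simd

// Sketch: turn a touch location into a world-space point by reading the depth
// buffer at that pixel and unprojecting it. Hypothetical helper, not Umbra's API.
func worldPoint(forTouch touch: CGPoint,
                viewSize: CGSize,
                sampleDepth: (CGPoint) -> Float,          // depth in [0, 1] at the pixel
                inverseViewProjection: simd_float4x4) -> SIMD3<Float> {
    // Touch location -> normalized device coordinates (-1...1); Y is flipped
    // because screen coordinates grow downward.
    let ndcX = Float(touch.x / viewSize.width) * 2 - 1
    let ndcY = 1 - Float(touch.y / viewSize.height) * 2

    // Depth the renderer wrote for this pixel (assumed Metal-style [0, 1] range).
    let depth = sampleDepth(touch)

    // Unproject the clip-space point back into world space.
    let clip = SIMD4<Float>(ndcX, ndcY, depth, 1)
    let world = inverseViewProjection * clip
    return SIMD3<Float>(world.x, world.y, world.z) / world.w
}
```

The drag is then applied relative to that world-space point instead of a point on the tracking plane, which is what makes the gesture feel anchored to whatever is actually under your finger.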

As you can see below, the same gesture now produces a predictable result regardless of how far away the part of the model under your finger is.

[Animations: nearby drag vs. distant drag]

Above or Below the Horizon

When you're pointing your device's camera below the horizon, it makes sense that a swipe down on the screen brings the model closer to you. But when you hold the camera above the horizon, the situation is inverted: it's more natural for a swipe down to move the 3D content further away from you.

[Animations: below the horizon vs. above the horizon]
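As a rough sketch of that sign flip (the helper and the convention that a positive screen delta means the finger moved down are illustrative, not our actual code), this is essentially all there is to it:

```swift
import simd

// Sketch: flip the drag-to-distance mapping depending on whether the camera is
// looking above or below the horizon. Illustrative helper, not Umbra's API.
// `screenDeltaY` is positive when the finger moves down the screen;
// the return value is positive when the model should move further away.
func distanceDelta(screenDeltaY: Float, cameraForward: SIMD3<Float>) -> Float {
    // A world-space forward vector with negative Y means the camera is
    // pointed below the horizon.
    let lookingBelowHorizon = cameraForward.y < 0

    // Below the horizon: swiping down pulls the model closer (negative delta).
    // Above the horizon: swiping down pushes it further away (positive delta).
    return lookingBelowHorizon ? -screenDeltaY : screenDeltaY
}
```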

Through the Walls

As we added the target-pixel recognition described above, it brought about an interesting dilemma when you want to navigate through walls. Standing next to a wall, your swipe would now always hit the wall, which made it impossible to pass through it - unless, of course, you physically moved forward in the real world.

We came up with the concept of a "touch radius": if the picked point is too close to the camera, we ignore the raw result and move the target a bit further away. So essentially, your first drag takes you up to the wall, and the second swipe carries you neatly through it.

[Animation: passing through a wall]
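In code, the touch radius boils down to something like the sketch below; the 1.5-meter radius and the helper name are illustrative, not the values we actually ship.

```swift
import simd

// Sketch: if the picked point is too close to the camera (e.g. you're standing
// right next to a wall), push the target out to a minimum radius so the next
// drag carries you past the surface. Illustrative helper, not Umbra's API.
func adjustedTarget(hit: SIMD3<Float>,
                    cameraPosition: SIMD3<Float>,
                    minimumRadius: Float = 1.5) -> SIMD3<Float> {
    let toHit = hit - cameraPosition
    let distance = simd_length(toHit)

    // Far enough away (or degenerate): use the picked point as-is.
    guard distance > 0, distance < minimumRadius else { return hit }

    // Too close: keep the direction of the touch but move the target out
    // to the minimum radius.
    return cameraPosition + simd_normalize(toHit) * minimumRadius
}
```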

While there would be quite a bit more to write about custom-shaped tracking planes, let's leave some goodies for next time. But indeed, it's all these kinds of small details together that will make Augmented Reality as familiar a platform as the current generation of mobile devices, with their "flat interfaces" and touch gestures, is for us today. If you read this far, go ahead, download our iOS application and let us know how we did!

Download the iOS App



Check out our AR tutorial to see these features in action!

AR Tutorial
