In our previous blog post we discussed the wider context of emerging digital technologies, and how these technologies will contribute to the development of ‘Zero UI’ based interaction. In this post we’ll look more closely at the role AR will play in this, some of the associated hardware technologies and the differences in the resultant user experience.
At present, AR is most commonly introduced for both enterprise and consumer applications via smart phone or tablet devices. Such devices, because of their widespread use and technological maturity, are the obvious choice for hosting AR software and apps. Indeed, Liv Systems’ current internal project, looking at the use of AR as a tool to aid navigation in built environments, is based on a prototype app to be hosted on smart phones or tablets. However, whilst these devices can host AR reasonably effectively, it is likely that the most powerful and effective AR experiences will be delivered by smart glasses-based systems.
AR using smart glasses is starting to be trialled and used by many enterprises for a variety of applications. These include collaborative working on designs, presented as virtual objects placed in the real world, and maintenance activities in which digital information is overlaid onto real world objects. For instance, the wiring within an electrical cabinet could be overlaid onto the user’s view of the cabinet whilst it remains closed, or digital labelling and explanatory information could be overlaid onto the wiring itself.
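The registration step behind such overlays, placing a virtual label so that it appears attached to a real object, ultimately reduces to projecting the label’s 3D anchor point into the camera image. Below is a minimal sketch of that projection in Python, assuming an idealised pinhole camera and a point already expressed in camera coordinates; the names are illustrative only and do not come from any real AR SDK:

```python
def project_to_screen(point_cam, fx, fy, cx, cy):
    """Project a 3D point (already in camera coordinates, metres) onto
    the image plane of an idealised pinhole camera.

    fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
    Returns pixel coordinates (u, v), or None if the point lies behind
    the camera and so has nothing to draw.
    """
    x, y, z = point_cam
    if z <= 0:               # behind the camera: skip rendering
        return None
    u = fx * x / z + cx      # perspective divide, then shift to pixel grid
    v = fy * y / z + cy
    return (u, v)

# A wiring label anchored 2 m straight ahead lands on the image centre:
print(project_to_screen((0.0, 0.0, 2.0), fx=800, fy=800, cx=320, cy=240))
# (320.0, 240.0)
```

Real AR frameworks also have to estimate the camera pose continuously (the tracking problem); the projection shown here is only the final rendering step.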
Perhaps the most widely used AR smart glasses device for enterprise use is currently the Microsoft HoloLens, which supports the Microsoft Mixed Reality platform.
Whilst smart glasses such as the HoloLens are fine for certain types of enterprise use, and potentially for home-based use, in the long term, for people to make use of smart glasses in their daily lives, we need to move towards smart glasses which are unobtrusive and look almost exactly like conventional glasses.
Google attempted to introduce smart glasses to the consumer market a few years ago with Google Glass, but it was not very successful: there were privacy concerns associated with the built-in camera, and the design was not especially elegant, so even setting privacy aside it was unlikely to appeal to consumers outside of a very niche market. Over the last few years, however, there have been efforts to design smart glasses which could appeal to a wider market, and in the last year the Canadian company ‘North’ has developed a viable consumer product in its ‘Focals’ smart glasses.
These smart glasses feature a laser-projected AR display, a built-in microphone and speaker, and an input device known as the ‘Loop’, worn as a ring with a tiny joystick. The apps supporting the glasses can display message and appointment notifications, calendar entries, directions, weather information and texts, and can be used to order an Uber. Alexa is integrated, and texting is supported by voice-to-text.
Interestingly, AR-based wayfinding assistance is one of the features already developed for the Focals smart glasses, further evidence that there is considerable interest in this as one of the most useful application areas for general consumers.
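Rendering such an in-view direction cue comes down to knowing which way the user must turn to face the next waypoint. As a minimal sketch in Python, assuming the user’s floor-plan position and compass heading are already available (the function and parameter names below are our own illustration, not taken from any product’s SDK):

```python
import math

def turn_angle(user_pos, user_heading_deg, waypoint):
    """Angle in degrees (-180..180) the user must turn to face the waypoint.

    user_pos and waypoint are (x, y) floor-plan coordinates in metres;
    user_heading_deg is measured clockwise from the +y ('north') axis,
    as a compass would report it. Negative results mean 'turn left'.
    """
    dx = waypoint[0] - user_pos[0]
    dy = waypoint[1] - user_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))  # compass bearing to waypoint
    # Normalise the difference into -180..180 so the indicated turn is minimal
    return (bearing - user_heading_deg + 180) % 360 - 180

# Facing north with the next waypoint due east: turn 90 degrees to the right.
print(turn_angle((0.0, 0.0), 0.0, (10.0, 0.0)))  # 90.0
```

A wayfinding app would recompute this angle every frame and render it as an arrow in the user’s view; the harder problems, indoor positioning and route planning, sit upstream of this step.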
Smart glasses technology is still in its early days, but as it matures, and provided the cost and aesthetic appeal prove acceptable to the public, it is not unreasonable to expect uptake of such devices to become widespread, providing the basis for general public use of AR-based software and apps.
AR hosted on smart glasses will offer a number of user experience benefits in comparison to AR provided by smart phones or tablets:
- It will be hands free
- It will provide a far greater field of view
- It will be able to exploit stereoscopic vision and hence make use of depth of field
- It will overlay digital information onto our view of the world in such a way that we need not look at a specific device to get the AR view we want; the AR information will therefore be integrated more seamlessly with our visual sensory input
- It will be more immersive and hence offer a more compelling user experience
- When combined with natural language-based user interaction, it will enable a user experience which is closer to the concept of ‘Zero UI’.
Until such time as smart glasses are widely available, it is nevertheless worthwhile to explore the potential of AR using smart phone or tablet devices. Many people may continue to prefer such devices, smart phones still offer the potential to provide interesting and effective AR experiences, and much of what we learn from developing AR for smart phones will be applicable to AR provided by smart glasses.
In our next blog post we’ll go into detail about our own research project on the use of AR for wayfinding in built environments, including different approaches to providing wayfinding assistance using AR and the potential benefits to managers of built environment infrastructure when their customers and users adopt it.
As always, please feel free to get in touch as we’re keen to discuss these topics with industry professionals and academics who are as excited about the potential of these technologies as we are.