Emerging Interaction Methods for AR Smart Glasses to Forge an AR Market Value of $100 Billion by 2024
User Interface (UI) is a critical component of designing a product that will exceed user expectations, achieve optimal User Experience (UX), and stand out from the competition. At the current stage of hardware maturity, Augmented Reality (AR) smart glasses and AR Software Development Kits (SDKs) support a greater range of interaction methods than ever before. Alongside traditional controllers and smartphones, the mix now includes rapidly maturing voice control and novel HMD-based options like gesture recognition and eye tracking. Pairing the proper input method with a well-designed UI allows users to perform tasks more efficiently, intuitively, and swiftly, leading to an enhanced user experience and, ultimately, more fully realized value. ABI Research, a global tech market advisory firm, examined the emerging interaction methods for AR smart glasses that maximize user flexibility and optimize UX in a market that promises to exceed US$100 billion in value by 2024.
“The decision around the most efficient input method and UI design strongly relies on the nature of the target task or content, the user’s potential environment, and the type of AR device,” says Eleftheria Kouri, Research Analyst at ABI Research. For instance, traditional buttons and touchpads are not always suitable in scenarios where a user must wear safety gloves or handle tools and equipment. Hands-free voice control is considered the optimal interaction method in these and similar scenarios. “With regard to AR device type, monocular devices, which are used in less complex enterprise and consumer applications, can perform tasks efficiently with traditional, user-friendly interaction methods via buttons or smartphones, with the option to add value and flexibility with more advanced input methods.”
More Intuitive and User-Friendly Interaction Will Enhance User Experience and Drive AR Smart Glasses Growth
Gaze and gesture control are among the most promising emerging interaction methods for AR smart glasses, significantly enhancing UX and immersion by allowing users to perform hands-free tasks intuitively and swiftly. While highly capable, gaze and gesture control are not suited to every user or use case, because they require high accuracy and low latency to be efficient and meet users’ expectations. Advanced AR headsets such as HoloLens 2 and Magic Leap 1 support a wider range of UI opportunities and input methods thanks to the inclusion of features like eye tracking. While these are currently the most capable AR devices available, there is still significant value in simpler devices that maximize usability and streamline the user experience with other input paradigms.
“Simple and intuitive UI and streamlined UX have mostly been ignored in the AR market thus far, but both are essential for enhancing and maximizing the value that will propel AR smart glasses growth toward the mainstream. Removing the requirement for device training and keeping users engaged with the device will maximize value for consumers and enterprises. At the same time, device features such as lower weight, capability for spatial mapping and sound, improved display quality, and better user feedback through UI and haptics also play an important role in maximizing this value,” concludes Kouri.
These findings are from ABI Research’s User Interface and User Experience in AR technology analysis report. This report is part of the company’s Augmented and Virtual Reality research service, which includes research, data, and ABI Insights. Based on extensive primary interviews, Technology Analysis reports present an in-depth analysis of key market trends and factors for a specific technology.