Facebook looks towards the next era of human-computer interaction
- Facebook has unveiled its research and 10-year vision for AR smart glasses and, most recently, some nearer-term research on a wrist-based wearable with limited contextualized AI
- Later this year, Facebook will detail some groundbreaking work in soft robotics aimed at building all-day wearable devices and give an update on its haptic glove research.
With virtual interaction on the rise since the Covid-19 pandemic, Facebook has been focusing on expanding its AI-based augmented reality (AR) initiatives. In 2020, the social media giant kicked off a three-part series on the future of human-computer interaction (HCI). All three initiatives show how Facebook could play a major role in the AR segment.
Last August, the social networking company rebranded its Oculus virtual and augmented reality division as Facebook Reality Labs. The name change doesn’t appear to signal that the company is giving up on its virtual reality push. Instead, it most likely reflects a greater focus on augmented reality technologies going forward. Partnerships with existing eyewear companies, including Ray-Ban, have been a major asset as the company readies a big push into the AR glasses space.
Facebook’s leading position in the social media space, along with its strong catalog of connectivity apps, including its namesake social media platform, Instagram, and the messaging service WhatsApp, gives it a natural advantage in the AR and VR categories. Some analysts have wondered where the company’s next big growth drivers will come from as its social media and digital advertising businesses seemingly mature, but Facebook could see dramatic catalysts if AR and VR gain ground.
Smart glasses in 2021
Facebook kickstarted its three-part series by sharing its 10-year vision of a contextually aware, AI-powered interface for AR glasses that can use the information the wearer chooses to share to infer what they want to do and when they want to do it. That is a major step for Facebook. The company has been exploring the possibilities of AR glasses for years, with CEO Mark Zuckerberg noting back in 2017 that they were still “five to seven years away” from reaching the next stage with such a device.
The AR smart glasses are set for release this year after nearly four years of hype, and the company is apparently considering incorporating facial-recognition technology, according to Facebook’s head of augmented and virtual reality, Andrew Bosworth.
“Imagine being able to teleport anywhere in the world to have shared experiences with the people who matter most in your life — no matter where they happen to be. That’s the promise of AR glasses. It’s a fusion of the real world and the virtual world in a way that fundamentally enhances daily life for the better. Rather than dragging our attention to the periphery in the palm of our hand like our mobile phones, AR glasses will see the world exactly as we see it, placing people at the center of the computing experience for the first time and bringing the digital world to us in three dimensions to help us communicate, navigate, learn, share, and take action in the world,” Bosworth said.
Compared by the Facebook Reality Labs team to the development of the mouse and the graphical user interface (GUI) in terms of impact on human-computer interaction, these AR glasses will be wearable throughout an entire day and capable of adapting to the user’s visual environment as well as sharing user-requested information in real time. The glasses, which are being built in partnership with Ray-Ban and its parent, Luxottica Group SpA, will connect to a device — though users won’t be able to overlay digital objects onto their real-world view, a foundational element of AR. They are expected to launch later this year.
Facebook is only the latest technology company to try to create a pair of smart glasses. Alphabet Inc.’s Google was early to the category with Google Glass, which never caught on as a consumer device but has found a home as a worker aid in warehouse and industrial settings. Snap Inc. has also launched multiple iterations of its smart glasses, called Spectacles, which let people record video hands-free and transfer it to their phone. Apple Inc. is also building a pair of smart glasses.
Bosworth believes the future of HCI demands an “exceptionally easy-to-use, reliable, and private interface that lets us remain completely present in the real world at all times. That interface will require many innovations in order to become the primary way we interact with the digital world. Two of the most critical elements are contextually-aware AI that understands your commands and actions as well as the context and environment around you, and technology to let you communicate with the system effortlessly — an approach we call ultra-low-friction input.”
But such a system is many years off, he reckons, which is why Facebook is taking a closer look at a version that may be possible much sooner: wrist-based input combined with usable but limited contextualized AI that dynamically adapts to you and your environment. While many other input sources are available, all of which are useful, the wrist was chosen over the alternatives; voice, for instance, is intuitive, but not private enough for the public sphere or reliable enough due to background noise.
“A separate device you could store in your pocket like a phone or a game controller adds a layer of friction between you and your environment. As we explored the possibilities, placing an input device at the wrist became the clear answer: This proximity would allow us to bring the rich control capabilities of your hands into AR, enabling intuitive, powerful, and satisfying interaction. A wrist-based wearable has the additional benefit of easily serving as a platform for computing, battery, and antennas while supporting a broad array of sensors,” he added.
Facebook claimed the missing piece was finding a clear path to rich input, and a potentially ideal solution was electromyography (EMG): sensors that translate the electrical motor nerve signals traveling through the wrist to the hand into digital commands that can control the functions of a device. These signals let you communicate crisp one-bit commands to your device, a degree of control that is highly personalizable and adaptable to many situations.
“The signals through the wrist are so clear that EMG can understand finger motion of just a millimeter. That means input can be effortless. Ultimately, it may even be possible to sense just the intention to move a finger,” Bosworth said.
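The core idea described above — turning a continuous nerve signal at the wrist into discrete one-bit commands — can be illustrated with a simplified sketch. Everything here (the threshold detector, the simulated signal, the function name `detect_clicks`) is an illustrative assumption for explanatory purposes, not Facebook Reality Labs’ actual algorithm:

```python
# Illustrative sketch: converting a stream of (simulated) wrist EMG
# amplitude samples into discrete one-bit "click" commands using a
# simple rising-edge threshold detector. The threshold and signal
# model are assumptions, not Facebook's real pipeline.

def detect_clicks(samples, threshold=0.5):
    """Emit one command per rising edge above the threshold."""
    commands = []
    above = False
    for i, amplitude in enumerate(samples):
        if amplitude >= threshold and not above:
            commands.append(i)  # rising edge -> one-bit "click"
            above = True
        elif amplitude < threshold:
            above = False
    return commands

# Simulated EMG envelope: two brief finger movements amid a quiet baseline
signal = [0.1, 0.1, 0.8, 0.9, 0.2, 0.1, 0.7, 0.1]
print(detect_clicks(signal))  # -> [2, 6]
```

In a real system, the raw multi-channel EMG would first be filtered and decoded by a learned model rather than a fixed threshold, but the end product is the same kind of discrete, low-friction command stream Bosworth describes.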
Impact on other businesses
Alongside some of the world’s most recognizable brands, businesses, and institutions, Facebook is also exploring how AR can transform everything from shopping to education. These experiences are embedded directly into the Facebook family of apps and devices to give people a new way to connect no matter where they are. Businesses that run on Facebook, Instagram, and even WhatsApp are set to benefit from this new push into AR.
Thus far, Facebook is working with museums and cultural institutions to bring art, history, science, and culture to life in the palm of our hands. The social media company is also exploring how AR can positively impact journalism, supporting The New York Times’s new AR Lab, which aims to deliver AR news stories on Instagram and inform and entertain readers in an interactive way.
In fact, with AR, people can bring the best of in-store shopping with them anywhere. Walmart, West Elm, and Anastasia Beverly Hills are just a few of the newest brands to employ AR shopping across Facebook. AR Try On has been live in Facebook ads and with a small number of Instagram Checkout partners since 2019, and with the wearables out soon, businesses can consider expanding their use of AR.