Building 8, Facebook's research team, is working on one project that would let us type up to 100 words per minute with our minds, and on another that would let us hear through our skin.
Regina Dugan, Facebook's vice president of engineering and head of Building 8, took the stage on the second day of the F8 conference to discuss progress on non-invasive brain-computer interfaces, citing two projects the team is working on: typing with our minds and hearing through our skin. Facebook's Building 8 product research and development team focuses on creating new, category-defining consumer products that are social first and advance Facebook's mission.
Michael Abrash, head of research at Oculus, had noted moments before Dugan's talk that full augmented reality will require major advances in many fields, one of them being access to our minds, in order to augment memory, interpret context, and provide other capabilities augmented reality will need.
Dugan first presented the project to type with our minds, whose goal is to let us input 100 words per minute straight from the brain, about five times faster than typing on a phone. It is not about decoding random thoughts, but about the user choosing what to share. "Think of it like this: you take many photos and choose to share only some of them. In the same way, we have many thoughts and choose to share only some of them," Dugan explained. "It's about decoding the words you have already decided to share by sending them to the speech center of your brain. It's a way to communicate with the speed and flexibility of voice and the privacy of text."
Deploying this technology at scale will require non-invasive, affordable sensors capable of sampling brain activity hundreds of times per second, "something that does not exist today." Dugan points out that optical technology is the only approach likely to read our minds without being invasive, and that the goal of 100 words/min should be reached in the coming years.
Facebook is also working on a system that lets people hear through their skin. During the talk we saw examples in which, by means of actuators on the arm, a user learned to interpret various words, such as colors, shapes, and actions. Dugan concluded that this kind of technology will let us connect with anyone in the world by conveying the semantic content of words: the day will come when we can think in Mandarin and feel in Spanish.