Developer uses brainwaves to control Google Glass

Building consumer tech that interfaces with your brain sounds a bit like science fiction, but touch and voice commands were once thought of the same way, and look at their capabilities today. Now, thanks to a new tool called MindRDR, developers have a framework for implementing thought control on top of a device more and more of us are choosing to wear: Google Glass.

Glass is somewhat limited in the way it can be controlled, and in some ways that is a good thing. Through the use of eye tracking and voice input, a great deal of the UI can be controlled without your hands. While this has been sold to us as great for doctors to use in surgery or for turn-by-turn directions while we drive, the value of this technology for someone who has lost control of part of their body can't be overlooked.

In a recent demonstration, MindRDR has been shown driving Google Glass using commercially available EEG headsets to allow for basic controls just by thinking.
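The general idea behind this kind of control is simple: NeuroSky-style headsets report a concentration ("attention") value on a 0-100 scale, and software can fire a Glass action, such as taking a photo, when that value crosses a threshold. As a hedged illustration only (the function and threshold below are assumptions for the sketch, not the actual MindRDR API), the trigger logic might look like this:

```python
# Hypothetical sketch of a thought-control loop: NeuroSky-style headsets
# report a 0-100 "attention" value, and crossing a threshold is mapped to
# an action such as capturing a photo. Names here are illustrative
# assumptions, not MindRDR's real interface.

ATTENTION_THRESHOLD = 70  # assumed trigger level on the 0-100 scale

def detect_triggers(attention_stream, threshold=ATTENTION_THRESHOLD):
    """Return indices where attention rises across the threshold.

    Only a rising edge counts, so holding concentration fires one
    action rather than one per sample.
    """
    triggers = []
    above = False
    for i, value in enumerate(attention_stream):
        if value >= threshold and not above:
            triggers.append(i)
            above = True
        elif value < threshold:
            above = False
    return triggers

# Simulated headset samples: concentration ramps up, holds, then relaxes.
samples = [20, 35, 55, 72, 80, 78, 40, 30, 65, 75, 50]
print(detect_triggers(samples))  # rising edges at indices 3 and 9
```

In a real app, each detected rising edge would be wired to a single Glass action, with the relaxation below the threshold acting as a reset.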


You aren’t likely to want to walk around town with a NeuroSky EEG headset resting just overtop of your Google Glass. That is, if you’re the type who is comfortable leaving the house with the Android computer on your face at all. But the applications for those incapable of basic motor control are significant.

With these two pieces of consumer technology combined, and of course the MindRDR software, those who would otherwise be unable to communicate effectively on their own would have the tools to do anything from replying to a text message to taking and sending a photo using just thoughts and voice.

MindRDR has been made available for free to developers by This Place, though Google has made it clear the software has not yet been reviewed or approved by its team, so it's unlikely to be consumer-ready in the immediate future.
