From Stephen Hawking to Johnny Mnemonic: the long road traveled by "mind-reading helmets"

Research into computer-based mind reading dates back to the 1970s. Many things have changed since then, and the evolution of helmets capable of interpreting "thoughts" is especially interesting.

The technology behind "reading the mind" contains a good amount of science applicable to everyday life. But have we finally got it to work? Science fiction movies are still just that, fiction, yet we are fast approaching the application of these kinds of devices in ways far more practical than anything on screen.

What is a mind-reading helmet good for?

Mind reading is a broad and complex concept, ranging from projecting our dreams to giving orders without moving a muscle: telepathy, telekinesis, information storage, memory retrieval... all of these are, in one way or another, ways of interacting through our thoughts. But which of them are real? The truth is that some applications already available today could easily be mistaken for fantasy.

Among the most interesting applications are those that solve human problems. Some time ago we told you about people with speech difficulties and how a device capable of reading the electroencephalogram (EEG) could help overcome that communication barrier. The company Emotiv has in fact been working on this type of device for quite some time.

Its idea is a helmet that reads the electrical signals from our brain and translates them into vocalized words through a device such as our phone. And if you can vocalize, why not read the movement of other parts of the body? The 2012 iBrain prototype intended just that, and was tested by Stephen Hawking himself, who suffered from severe paralysis caused by a neurodegenerative disease. If such a device reached technological maturity, could it be connected to people's bionic limbs to make them more precise and natural?
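To make that idea concrete, here is a minimal sketch of how a helmet could map brain signals to a small vocabulary. It is purely illustrative: the vocabulary, the feature sizes and the synthetic data are assumptions, not Emotiv's or iBrain's actual method, and a real system would add signal filtering and artifact removal.

```python
# Hypothetical EEG-to-word pipeline (illustration only, not a real device's code).
import numpy as np
from sklearn.linear_model import LogisticRegression

VOCAB = ["yes", "no", "help", "water"]  # assumed fixed vocabulary

rng = np.random.default_rng(0)
n_trials, n_features = 200, 64                  # e.g. 16 channels x 4 frequency bands
X = rng.normal(size=(n_trials, n_features))     # stand-in for real band-power features
y = rng.integers(0, len(VOCAB), size=n_trials)  # stand-in for word labels

# Training: learn a mapping from brain features to vocabulary entries.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Runtime: classify a new window of brain activity and hand the word to a
# speech synthesizer on the phone (here we simply print it).
new_trial = rng.normal(size=(1, n_features))
print("Decoded word:", VOCAB[clf.predict(new_trial)[0]])
```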

Well, this is already possible, although even the most advanced cases read the nerve signals of the limbs rather than the mind directly. There is an explanation for this: the difficulty of brain signals. We still don't fully understand how the "most important organ" in our body works. Wading through the sea of changes occurring in the tissue, trying to interpret the signals it emits without even being able to "enter" it, is a titanic task.

Progress, however, is reaching impressive levels. Beyond reading the mind to reproduce movements, for a few years now scientists have been working on an algorithm capable of reproducing the image we are imagining, as they explain in this Science article, thanks to neural networks and machine learning. And although a lot of work remains, it is likely that the virtual reality we see in Johnny Mnemonic, the cult movie (which, curiously, is set in 2021), would begin with work like this.

And what if, instead of reading the mind, you read the mouth...

As with the bionic limbs that are not directly connected to the brain, there are other applications that read signals from our nervous system. After all, this is another way of "reading the mind", even if it happens at the end of one of its communication channels. And it is a fairly efficient way.

By reading at the end of the neural path, we avoid the tangle of signals the devices would otherwise receive, which makes interpretation easier in most cases. This is precisely the premise behind the apparatus that accompanied Stephen Hawking until the end of his days. Unfortunately, and despite the ingenuity we have been discussing, Hawking never used a practical device capable of reading his mind.

Instead, the device attached to his chair read the muscles of his cheek, as described in this article from the technology company SwiftKey, where some of its secrets are revealed. The device was designed specifically for him, and its software learned the language he used in order to improve his communication and his limited interaction with the world. In a similar way, the MIT Media Lab project AlterEgo is working on a rather particular "mind-reading" headset.

Instead of reading the mind, this headset, worn along the lower jaw, reads the words we do not pronounce aloud. When we verbalize a message internally, even without uttering a word, our facial muscles prepare to communicate, a phenomenon known as subvocalization. By thinking verbal commands such as "turn on the computer" or "write this phrase", AlterEgo can read the muscle signals and identify the message.

Thanks once again to neural networks and machine learning, two technologies inherently linked to these advances, the device distinguishes commands and filters them, allowing us to interact with other software. In this way, this small, non-invasive headset becomes a gateway for interacting with the world through the mind, albeit by subvocalizing.
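What might that filtering step look like in practice? The sketch below is a guess at the general shape of such a system, not AlterEgo's actual code: the command set, the confidence threshold and the synthetic muscle-signal features are all assumptions made for illustration.

```python
# Hypothetical command-filtering step for a subvocalization headset
# (illustration only; not AlterEgo's actual implementation).
import numpy as np
from sklearn.neural_network import MLPClassifier

COMMANDS = ["turn_on_computer", "write_phrase", "none"]  # assumed command set
CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff; a real system would tune this per user

rng = np.random.default_rng(1)
X_train = rng.normal(size=(300, 32))                # stand-in for muscle-signal windows
y_train = rng.integers(0, len(COMMANDS), size=300)  # stand-in for command labels

# A small neural network scores each window against the known commands.
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X_train, y_train)

def dispatch(window):
    """Return a command only when the model is confident; otherwise filter it out."""
    probs = model.predict_proba(window.reshape(1, -1))[0]
    best = model.classes_[int(np.argmax(probs))]
    if probs.max() >= CONFIDENCE_THRESHOLD and COMMANDS[best] != "none":
        return COMMANDS[best]  # hand the command off to the target software
    return None                # ignored: likely ordinary muscle noise

print(dispatch(rng.normal(size=32)))
```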

The most promising inventions to date

Moving a bionic limb, helping us communicate, executing commands or transmitting an image to a screen are applications we already have among us. But what state are they in? At this point, the most advanced devices are undoubtedly the thought-controlled bionic limbs, both those directly connected to the brain and those that read muscle signals (which are the majority). There are numerous products on the market, some very advanced, such as the LUKE arm or the impressive prosthetic limbs from the Johns Hopkins Applied Physics Laboratory, which is even working on incorporating a sense of touch.

Of course, AlterEgo is also among the leading devices. Although it has been a few years since any news about the project was published, it remains at the cutting edge thanks to its applications and possibilities: being non-invasive and well integrated into the IoT environment, it could not only solve problems but also make everyday life more comfortable, like any other device.

Another example at the forefront is the device we opened with, from Emotiv. The company sells several of these headsets, which it claims not only help people communicate better but are also designed to collect all kinds of cognitive data. What interest or utility could that have? For the everyday user, probably little, although the company says the headsets let us monitor our stress levels, focus and other mental states. In any case, there is still a long way to go, even for a device that has been on the market for years.

Finally, there are the mind readers capable of displaying the image we are thinking about. The algorithm mentioned earlier can decode, with some precision, the image we were imagining. Not from scratch, of course: it needs prior examples and extensive training. But for the moment this algorithm, presented by a team from Kyoto University, has laid the foundations for the technology, as described in their paper. It is now capable of detecting simple shapes and "printing" them on the screen. There is still a long way to go before the Mnemonic universe becomes a reality, but it remains the closest approximation to that world.
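As a rough intuition for how such decoding works, here is a toy sketch: learn a map from brain-activity features to the pixels of a tiny image. The real Kyoto system uses deep neural networks trained on fMRI recordings; the simple linear model, the shapes and the synthetic data below are simplifying assumptions.

```python
# Toy image-decoding sketch (illustrative only; the real work uses deep
# networks on fMRI data, not this simplified linear model).
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
n_samples, n_voxels, side = 500, 120, 8   # decode tiny 8x8 "images"

# Synthetic data: pretend each image is a noisy linear function of brain activity.
true_map = rng.normal(size=(n_voxels, side * side))
brain = rng.normal(size=(n_samples, n_voxels))   # stand-in for brain-activity features
images = brain @ true_map + 0.1 * rng.normal(size=(n_samples, side * side))

# "Extensive training": fit a regression from brain features to every pixel.
decoder = Ridge(alpha=1.0).fit(brain, images)

# Decode a brain state into an 8x8 pixel grid and "print" it on screen.
decoded = decoder.predict(brain[:1]).reshape(side, side)
print(np.round(decoded, 1))
```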

Images | PxFuel, Unsplash, MIT, Emotiv
