Arnav Kapur Image Credit: Supplied

MIT has developed a machine that can read your thoughts. That’s what the headline said. It must be true.

For those of you who are now actively thinking of ways to block the Thought Police: relax. The science behind MIT's latest invention is pretty clever, but it doesn't read your thoughts. First, though, some background.

Last week, MIT issued a statement that it had developed a computer interface, called AlterEgo, that “can transcribe words that the user verbalizes internally but does not actually speak aloud.” That’s true, and it’s why web editors everywhere jumped to the conclusion they did. But what AlterEgo really does is read the neuromuscular signals that your brain sends to your jaw and face when you’re thinking about speaking.

Still sounds too much like mind reading? OK, let’s try this. When we all learned to read, we generally did so by internally “saying” the words in our head, a process called “subvocalizing.” This mental process is connected to the jaw, which is apparently why some people move their lips when they read, or at least that’s the theory that people studying speed-reading back in the 1960s came up with.

When we subvocalize, our brain sends impulses to our jaw in anticipation of actual speech. According to MIT, its researchers found seven spots along the jaw and face that are good for reading these impulses; an algorithm then turns those signals into words and actions.
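To get a feel for that last step, here is a minimal, entirely hypothetical sketch of how readings from seven electrode sites might be matched to a small command vocabulary. The template values, the word list, and the nearest-template classifier are my own illustrative assumptions, not MIT's actual method (which uses a trained neural network).

```python
# Hypothetical sketch: features from seven electrode sites are compared
# against per-word "templates" and the closest match wins.
# All numbers and names below are fabricated for illustration.

NUM_CHANNELS = 7  # one feature per electrode site (assumption)

# Toy templates: imagined average signal features recorded while the
# user subvocalized each command word.
WORD_TEMPLATES = {
    "up":     [0.9, 0.1, 0.3, 0.7, 0.2, 0.5, 0.4],
    "down":   [0.1, 0.8, 0.6, 0.2, 0.9, 0.3, 0.5],
    "select": [0.5, 0.5, 0.9, 0.1, 0.4, 0.8, 0.2],
}

def classify(features):
    """Return the vocabulary word whose template is closest
    (squared Euclidean distance) to the incoming feature vector."""
    def distance(word):
        template = WORD_TEMPLATES[word]
        return sum((f - t) ** 2 for f, t in zip(features, template))
    return min(WORD_TEMPLATES, key=distance)

# A noisy reading that should land nearest the "up" template.
reading = [0.85, 0.15, 0.35, 0.65, 0.25, 0.45, 0.45]
print(classify(reading))  # → up
```

The real system obviously does far more signal processing than this, but the core idea is the same: a fixed, small vocabulary and a model that maps each burst of jaw activity onto the most likely word.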

So far the technology is limited, and whatever the usable vocabulary offers is overshadowed by the fact that you look like a complete goofball when wearing the AlterEgo. While I love technology, MIT really needs to introduce a little fashion design into its curriculum. Seriously, this thing makes you look like a refugee from a dentist’s chair.

However, I digress. As I was saying, the vocabulary is limited so far to about 20 words, and MIT is boasting an accuracy rate of about 92 percent. In a video from MIT, Arnav Kapur, a graduate student at the MIT Media Lab who led the development of the new system, uses it to control a Roku streaming player.

The possible uses for subvocalization as a computer interface are as versatile as human speech itself. It could give people without a voice better control over their surroundings. On a less noble, and probably more commercially viable, note, it could also cut down on the arguments between you and Siri, Alexa and/or Google Assistant.