Brain-computer interfaces are slowly beginning to take form, and here at Neural we couldn’t be more excited! Elon Musk’s Neuralink claims it’s on the cusp of a working device and Facebook’s been developing non-invasive BCI tech for years.
If everything goes according to plan, we could be wearing doo-dads or getting chip implants that allow us to control machines with our minds in a decade or less.
That’s a pretty cool idea and there are innumerable uses for such a device, but who knows how useful they’ll actually be in the beginning.
It’s easy to get swept up in dreams of controlling entire drone swarms with our thoughts like a master conductor or conducting telepathic conversations with people around the world via the cloud.
But the current reality is that the companies working on these devices are spending hundreds of millions and, so far, we can use them to play pong.
This isn’t meant to denigrate the use of BCIs in the fields of medicine and accessibility; we’re strictly talking about recreational or personal-use gadgets. But, judging from the above video, it could be a while before we can ditch our iPhones and PS5 game pads for a seamless BCI.
In the meantime, there’s nothing wrong with a little conjecture. BCIs aren’t a new idea, but until recently they existed mostly in research labs and science fiction. The deep learning revolution of the early 2010s made consumer versions look not just possible, but viable.
Machine learning helps us miniaturize chips, refine surgical techniques, run complex software on relatively simple hardware, and pull off a dozen other computing and communications feats, a rising tide that lifts all vessels when it comes to BCIs.
While no technological advance is guaranteed, it seems like BCIs are a shoo-in to become the next big thing in tech. It’s even arguable they could become mainstream before driverless cars do.
On the other hand, it could take decades. AI isn’t a new technology: current machine learning techniques can be traced back to the 1950s, and people have been trying to connect the human brain to computers for even longer.
With everything up in the air, it can be difficult to imagine the future. But there’s another potential eureka technology rising alongside modern artificial intelligence, and it could be a game-changer for BCIs: quantum computers.
Currently there’s no reason to believe quantum computers and BCIs would have anything to do with each other. They’re apples and oranges – if the apples were both fresh and rotten at the same time and you had no way of knowing until you bit into one.
But the big deal here is that quantum machine learning could solve some potential problems with BCIs.
Let’s say Facebook debuts a wearable BCI five years from now in 2026. Instead of trying to predict everything it’ll be used for, we’ll just say it’s functionally capable of interfacing with a laptop or smartphone.
With this device, we’d be able to look at our monitors and type or move the mouse cursor with nothing but our minds.
Here’s the problem: it’ll probably be slow and glitchy. Facebook, for example, is going to have to come up with some robust algorithmic solutions to translate our noisy brain waves into digital commands using an external device.
There’s no conceivable way to translate our thoughts into pure control; a computer processor won’t directly react to our thoughts. The brain waves will have to go through a sort of digital translator to be turned into data that can either be sent to the cloud for processing or handled by an on-device chip, where they’ll be converted into communications data to be processed at the target system. And then that process has to happen in reverse as well.
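As a rough sketch of what that translator stage might look like, here’s a toy pipeline. Every name, threshold, and the fake signal window are invented for illustration; real BCI software would use trained models and far richer signal processing:

```python
# Hypothetical sketch of the translation pipeline described above.
# All names, thresholds, and data are invented, not a real BCI API.

def extract_features(samples):
    """Reduce a raw signal window to a few summary statistics."""
    mean = sum(samples) / len(samples)
    power = sum((s - mean) ** 2 for s in samples) / len(samples)
    return {"mean": mean, "power": power}

def classify(features):
    """Map features to a discrete command (stand-in for a trained model)."""
    if features["power"] > 1.0:
        return "CLICK"
    return "IDLE"

def encode(command):
    """Serialize the command for transmission to the target system."""
    return command.encode("utf-8")

window = [0.1, 2.3, -1.8, 0.9, -2.1, 1.7]  # fake "brain wave" window
print(encode(classify(extract_features(window))))  # prints b'CLICK'
```

Each stage adds latency, and the reverse path (feedback from the machine to the user) doubles it, which is part of why early devices outside the lab are likely to feel sluggish.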
That all sounds complicated, but the bottom line is that it’s incredibly likely the first few generations of these things will work about as well as any early networked technology: it’s going to be difficult to make all of this complex communication instantaneous outside of a laboratory.
Sure, in this hypothetical scenario we can assume the device runs pretty well in perfect conditions. But things are different out in the wild. We’ve had smartphones for well over a decade now, but raise your hand if the device you’re reading this on right now experiences slowdowns, freezes, or software glitches.
Quantum machine learning promises to dramatically speed up certain classical AI functions. That’s not because quantum computers can time-travel or stop time, despite the popular metaphors, but because they can exploit superposition and interference to weigh multiple potential solutions to a problem simultaneously.
If you imagine an AI that translates the thought “call mom” into a series of protocols that unlocks your phone, pulls up your contacts, finds mom, and then dials the number, it seems pretty simple. After all, we can already do this with our voices.
But think about it without baked-in software. Imagine a physical robot taking your smartphone, finding your contacts app, scrolling through it until it finds “mom,” and then dialing the number for you. That wouldn’t happen instantly. The AI would have to go through each contact listed and decide if it was “mom” or “not mom.” If you’ve got 200 contacts it could take a few seconds.
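That sequential lookup can be sketched in a few lines. The contact list and names here are made up for illustration; the point is just that the number of checks grows linearly with the size of the list:

```python
# Toy illustration of the classical one-by-one search described above.
# Contact names and numbers are invented.

contacts = [("alice", "555-0100"), ("bob", "555-0101"), ("mom", "555-0199")]

def find_number(contacts, name):
    """Classical search: examine entries one at a time until a match."""
    checks = 0
    for entry_name, number in contacts:
        checks += 1
        if entry_name == name:
            return number, checks
    return None, checks

number, checks = find_number(contacts, "mom")
print(number, checks)  # prints: 555-0199 3
```

With 200 contacts the worst case is 200 checks, which is where those few seconds of delay come from.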
That might not sound like very long, but ask any UI designer what percentage of people abandon a piece of software after a delay of three seconds or more and you might be surprised. We have very little patience for technology that takes longer to do a simple task than it would take us to just do it ourselves.
Quantum machine learning could, hypothetically, make the exact same task dramatically faster. Rather than working out which contacts were or weren’t “mom” one at a time, a BCI running a quantum search algorithm could, loosely speaking, place every entry in superposition and amplify the probability of the right answer, needing only about the square root of the number of steps a classical search would.
There’s a lot of incredibly complex and still-theoretical physics behind such a concept, but the gist is that the BCI would feel as though it had been prepared for our request ahead of time.
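That “weigh them all at once” intuition maps loosely onto Grover’s search algorithm, which boosts the amplitude of the marked answer over roughly √N iterations rather than finding it instantly. Here’s a toy classical simulation of that amplitude bookkeeping (not a real quantum runtime, and the problem size is invented):

```python
# Classical simulation of Grover-style amplitude amplification.
# This only tracks the math on a list of numbers; it is not quantum hardware.
import math

def grover_probabilities(n_items, marked, iterations):
    """Return the measurement probabilities after Grover iterations."""
    amps = [1 / math.sqrt(n_items)] * n_items  # uniform superposition
    for _ in range(iterations):
        amps[marked] = -amps[marked]           # oracle: flip the marked item's sign
        mean = sum(amps) / n_items
        amps = [2 * mean - a for a in amps]    # diffusion: invert about the mean
    return [a * a for a in amps]

N, marked = 8, 5                               # 8 "contacts", item 5 is "mom"
steps = int(math.pi / 4 * math.sqrt(N))        # ~sqrt(N) iterations, here 2
probs = grover_probabilities(N, marked, steps)
print(round(probs[marked], 3))                 # prints 0.945
```

Two iterations concentrate about 95% of the probability on the marked item out of eight, versus an average of four checks classically. It’s a quadratic speedup, not magic, but at scale it’s the kind of gain the article is gesturing at.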
This is, to us, the simplest way to conceive of BCIs and quantum computers working together. But, then again, all of this becomes moot if someone figures out a new classical AI paradigm that shortcuts problems with branching solutions.
As to what would happen if we developed a BCI specifically designed to connect to quantum computers… probably nothing. Quantum computers don’t work like classical ones. You wouldn’t be hunched over a keyboard playing quantum solitaire while you wait for quantum movies to quantum download.
Quantum computers are more like engines than interfaces. They, essentially, perform work. We use classical computers to interface with them so we can interpret that work as classical data.
It’s possible in some far away future that we’ll replace classical systems with quantum computers and be able to interface directly with them. But that’ll take a much greater understanding of the human brain, quantum physics, and the universe itself.
Right now, we haven’t built a bridge between our observed reality and the spooky stuff that happens at the quantum level.