When Brain-Computer Interfaces Go Mainstream, Will Dystopian Sci-Fi Be Our Only Guidance?


"I enter the subway. It's crowded as usual around this time, but I manage to find a vacant seat next to a Talker, a man carrying on a conversation on his phone in public. Only grandpas do that these days. I take out my ThoughtReader. I just got a new one last month, much more discreet than my old one; it fits right behind my ear. I hold up my smartwatch and open the ThoughtNotes app. I press the tiny switch behind my ear and feel a little tingle, a sign that it's connected. A small light blinks through my ear to indicate to others that I am focused. I start jotting down some ideas. It can be a little messy sometimes, especially when you are just forming the thoughts, but it's fast and I can easily clean it up on my computer at home. When I'm done, I press save and open the QuietChat app. I call my husband and we thought-chat about work and dinner a bit. I can sense he's tired. It's funny, I think afterwards to myself, I can't believe we used to have these conversations out loud…"

GIF by Zaza Zuilhof

This scenario may sound like your average sci-fi story, but there is an important difference—this scenario, like some of the better sci-fi, is grounded in real research and current technological development. More importantly, this scenario is an initial sketch for a future vision that I wouldn't mind inhabiting.

When I set out to write an article about the near future of brain-computer interfaces (BCI), I was met with a lot of shivers, 'hm, good luck's and 'oh, scary!'s. The public image of BCI is heavily shaped by dystopian scenarios as depicted in movies and series like Black Mirror. Whenever a new technological breakthrough in this field is presented, you can bet that all the doom scenarios are listed in the endless comment threads below. I understand the strong reactions to such an intimate and socially impactful piece of technology, but what about the promises?

Most BCIs were initially developed for medical applications. Some 220,000 hearing-impaired people already benefit from cochlear implants, which translate audio signals into electrical pulses sent directly to the auditory nerve. Recently Elon Musk entered the industry, announcing a $27 million investment in Neuralink, a venture with the mission to develop a BCI that improves human communication in light of AI. And Regina Dugan presented Facebook's plans for a game-changing BCI technology that would allow for more efficient digital communication.

Whether you're ready for it or not, these are all signals that brain-computer interfaces won't just stay in the realm of neuroprostheses and entertainment, but could actually go mainstream. If we accept for a moment that people will continue to work on this technology and its capabilities will continue to improve, and if we assume that no one is interested in living the doom scenario, then we can try to consider the real implications and possibilities of this technology and imagine a viable alternative. What does it mean for interactions with our devices, and more importantly, with each other? Could this be the ultimate interface, one that is invisible, seamlessly integrated into our minds?

First of all, what are the so-called brain-computer interfaces currently out there actually capable of? The answer depends on who you ask and whether or not you are willing to undergo surgery. For the purpose of this thought experiment, let's assume that healthy people will only use non-invasive BCIs, which don't require surgery. In that case, there are currently two main technologies, fMRI and EEG. The first requires a massive machine, but the second, with consumer headsets like Emotiv and Neurosky, has actually become available to a more general audience.

Emotiv's EEG Headset (Image via Emotiv)

In an impressive demonstration of the potential of so-called active BCI (where you substitute physical or voice control with 'thought-commands'), Rodrigo Hübner Mendes used Emotiv's EEG headset to drive a Formula 1 car with his mind. Erica Warp, VP of Product at Emotiv, believes there is even more potential in the use of passive BCI. "We currently interface with machines in very discrete moments, which is quite limited. Our constantly changing cognitive states, which we can access via brain-computer interfaces, open up a wealth of untapped information." Giving computers awareness of our cognitive state may allow them to adapt accordingly. Think of parameters such as focus, engagement, interest and stress. Assuming we figure out the privacy issues, I am excited by the idea of contextually aware digital companions that respond more in sync with our state of mind. Like a good co-worker, they would not disturb me with notifications if they sense I'm deeply focused. Like a good friend, they would communicate in a more soothing, relaxed tone if they notice I'm tired or stressed. And like a good teacher, they could adjust their educational approach dynamically according to my level of engagement. There are companies out there, like QNeuro, already actively exploring this direction.
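To make the co-worker/friend/teacher idea concrete, here is a minimal sketch of what such a cognitive-state-aware notification policy might look like. Everything in it is hypothetical: the `CognitiveState` fields and the 0.8 and 0.7 thresholds are illustrative assumptions, not values from any real headset API.

```python
# Hypothetical sketch: a digital companion deciding how to deliver a
# notification based on (assumed) cognitive-state readings in the 0.0-1.0 range.
from dataclasses import dataclass


@dataclass
class CognitiveState:
    focus: float       # 0.0 (idle) to 1.0 (deep focus)
    stress: float      # 0.0 (calm) to 1.0 (highly stressed)
    engagement: float  # 0.0 (bored) to 1.0 (absorbed)


def notification_policy(state: CognitiveState, urgent: bool) -> str:
    """Decide how a companion should handle an incoming notification."""
    if state.focus > 0.8 and not urgent:
        return "defer"             # like a good co-worker: don't break deep focus
    if state.stress > 0.7:
        return "deliver_soothing"  # like a good friend: soften the tone
    return "deliver_normally"


# A deeply focused user with a non-urgent message would see nothing until later:
# notification_policy(CognitiveState(focus=0.9, stress=0.2, engagement=0.8), urgent=False)
```

The point of the sketch is that the interesting design work lies in the policy, not the signal: the same three numbers could drive behaviour that feels respectful or behaviour that feels invasive.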

Although valid in specific situations, I don't feel that use cases like these would make us walk around with a headset all day. In order to imagine that scenario, we have to travel further into the future. Back to Facebook's presentation. If Facebook can pull off what it presented, a non-invasive BCI that reads the speech centre of our brain at 100 words per minute (five times faster than typing on a smartphone), things get a little more uncomfortable. Most of that discomfort comes from the sense that this is the form in which BCI might actually be adopted more broadly.

These further-out future scenarios are impossible to predict, but that makes them all the more important to imagine. This is where the average dystopian sci-fi image lives, and we should take the valid concerns exposed in those images and provide a more humane, preferable alternative.

For example, what are the terms and conditions? I rationally know that typing something on my phone might not be that different from thought-writing something to my phone, but the privacy bells start ringing in my head. What does it mean to wear a device that can read my brain? In my ideal scenario, the device is entirely mine. I pay for the device, rather than exchange personal data for the device. I go through a period of training with the device: it learns how my brain communicates certain words, and we work out the difference between angry excitement and happy excitement. After a week we are good to go. It is an amazing input device, like a keyboard, a very personal keyboard, my own brain-keyboard that I can plug into any computer I find. The keyboard only connects locally, like Bluetooth or whatever its future equivalent may be, and I only connect it when I use it.

GIF by Zaza Zuilhof

Initially, the social impact may not even be that big. It is, after all, just an input device. Say I have a coffee with a friend and I want to send a message to my colleague. I would need to activate my thought-reader and pull out a screen of some sort (a phone, watch or AR headset) and then, while doing most of the writing by thought, I would still closely watch the screen to avoid typos. And this whole ritual would most likely be considered just as rude as using my smartphone in the same scenario today. So while we think of BCIs as being highly invisible, I expect that the initial usage would still be reasonably transparent and visible to those around us.

The big social disruption likely lies even further out, but will be directly influenced by the way the first mainstream BCIs are designed. Imagine a future in which we can not only read signals from the brain but also write signals back to it. In this future, imagine Augmented Reality is pervasive: digital information can be overlaid on your visual perception at will. We can have very private conversations in public space. It might be hard to tell the difference between someone daydreaming and someone thought-writing. However, like the little light behind the ear of the protagonist in my scenario, designers might end up creating purposeful signs to show when someone is using a BCI, and so avoid rude or otherwise uncomfortable social situations. Those are exactly the kind of mundane details that could define the difference between a dystopian and a utopian future.

Although it is hard to tell where exactly these technologies will take us, people right now are working hard to make them a reality. Whether they will become mainstream is more a question of when than if, but when they do, my biggest concern will be how. Currently the biggest push comes from the medical, neuroscience and technology industries, and only a few designers have shared visions for the possibilities of BCIs outside of assistive or diagnostic medical tech. In their unique position to represent the final user and consider downstream social implications, they could add meaningfully to the creation of a positive future vision. I believe it's important that designers help shape the future of our brain-computer interactions sooner rather than later, guiding us past dystopian visions toward the promise of these technologies without their negative consequences.


Tags: Facebook, Elon Musk, Design, Tech, Ux, Consumer Electronics, Emotiv, Regina Dugan, Neuralink, Zaza Zuilhof, Neurosky, Rodrigo Hübner Mendes, Erica Warp

Source:  http://feedproxy.google.com/~r/core77/blog/~3/fGmxtDXIZdY/When-Brain-Computer-Interfaces-Go-Mainstream-Will-Dystopian-Sci-Fi-Be-Our-Only-Guidance


