Broadly speaking, brain-machine interface technologies include any technology that directly communicates commands from a human to a computing device, as well as any technology that communicates information or alerts from a computing device to a human, without the use of touch or sound.  Said another way, brain-machine interface technologies may encompass any way for a human and a computer to communicate with each other other than through typing, gestures or voice.

Brain-machine interface technologies will be a new paradigm for how we interact with machines and how machines provide services to humans.  Today, it is difficult to anticipate all of the ways our lives will change as computing devices become increasingly sensitive to our thoughts, emotions and reflexes.  The one thing we can count on is that 10 years hence, we will be surprised by how rudimentary our interactions with computers were at the start of the 21st century.

For purposes of this blog, any human-to-machine interface that depends on new sensors or signaling involving biological or physiological phenomena is encompassed within brain-machine interface technologies.  In 2019, the brain-machine interface revolution is already entering the marketplace.  Smartphone apps can sense the mood of their users based on observable factors, such as voice tone, movements, and usage patterns.  These apps use pattern recognition to guess the mood of the user and can adjust the way the app interfaces with the user, such as by changing background colors, changing voice tone, or suggesting text for messages.  Wearable devices configured to sense the physiological parameters of their users are now very common, including watches that measure pulse and activity, and, in the Apple Watch Series 4, even the user’s heart activity by performing a single-lead electrocardiogram, better known by its acronym “EKG.”  Wearables can also provide feedback to users, such as tactile vibrations, audio tones and displays, to prompt users to stand, exercise, relax, and so on.  These new ways in which machines sense human conditions and offer suggestions to users are quickly becoming part of our daily lives and hint at the future that awaits us all.
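To make the idea concrete, here is a minimal sketch of how such a mood-sensing app might work.  Everything in it is an assumption for illustration only: the signal names, thresholds, mood labels, and color choices are invented, and no real app's method is being described.

```python
# Hypothetical sketch of a mood-sensing app: guess the user's mood from
# observable signals, then adjust the interface to match.  All feature
# names, thresholds, and labels below are illustrative assumptions.

def infer_mood(typing_speed_wpm: float, voice_pitch_hz: float,
               taps_per_minute: float) -> str:
    """Guess the user's mood from simple observable usage signals."""
    if typing_speed_wpm > 60 and taps_per_minute > 40:
        return "agitated"      # fast, intense usage
    if typing_speed_wpm < 20 and voice_pitch_hz < 120:
        return "tired"         # slow input, low voice
    return "neutral"

def background_color(mood: str) -> str:
    """Adjust the interface (here, a background color) to the mood."""
    return {"agitated": "#cce5ff",   # calming blue
            "tired": "#fff3cc",      # warm yellow
            "neutral": "#ffffff"}[mood]
```

A real app would of course use trained pattern-recognition models over many more signals, but the shape of the pipeline — observe, classify, adapt the interface — is the same.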

Another brain-machine interface technology now entering the marketplace is eye tracking.  In this technology, a camera images a user’s eyes and determines where the user is looking.  When the user is looking at a computer display, eye tracking information can be used as a direct interface to the machine.  Such technologies are just beginning to enter the marketplace, but the concepts have been patented for a few years.  There are patents on ways to enhance graphical user interfaces by magnifying the portion of the display at which the user is looking at any given instant.  There are also patents on eye tracking concepts that include determining whether a user is looking at a display and activating a function just as if the person had touched a touchscreen or clicked on a graphical user interface icon.  Eye tracking may also be combined with other physiological sensors (e.g., EEG) in an application of brain-machine interface technologies.
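The gaze-activation concept described above is often implemented as a "dwell click": if the gaze stays within an on-screen element long enough, the system treats it as a click.  The sketch below is a toy illustration of that idea, not any vendor's actual API; the dwell threshold and sampling rate are assumptions.

```python
# Illustrative dwell-click logic: if the user's gaze remains inside an
# icon's bounding box for a dwell threshold, treat it as activation.
# The 0.8-second threshold and 0.1-second sample period are assumed.

DWELL_SECONDS = 0.8

def dwell_click(gaze_samples, icon_rect, sample_period=0.1):
    """gaze_samples: list of (x, y) gaze points from the eye tracker.
    icon_rect: (x, y, width, height) of the on-screen icon.
    Returns True if the gaze dwelt on the icon long enough to 'click'."""
    x0, y0, w, h = icon_rect
    dwell = 0.0
    for gx, gy in gaze_samples:
        if x0 <= gx <= x0 + w and y0 <= gy <= y0 + h:
            dwell += sample_period
            if dwell >= DWELL_SECONDS:
                return True
        else:
            dwell = 0.0  # gaze left the icon; reset the timer
    return False
```

Real systems add smoothing, calibration, and visual feedback (such as a shrinking ring around the icon), but dwell time is the basic trigger.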

A likely mainstream approach to brain-machine interface technologies is the capture and analysis of brainwaves and neural signals.  Systems for analyzing brainwaves and neural signals have been used for decades for diagnostic and patient monitoring purposes.  The sensors and amplifying equipment necessary to capture brainwaves are well developed.  Thus, we can expect numerous products that will leverage brainwave analysis to turn thoughts and other mental activities into actionable determinations and machine actions.

Human brainwaves are weak signals with complex patterns that are highly variable from person to person and difficult to correlate to specific thoughts or brain activities.  Brainwave patterns have, however, been correlated to certain generalized brain activity states, such as concentration, relaxation, and different phases of sleep.  Technologies for correlating brainwaves and nerve signals to practical applications are becoming a reality in research laboratories.
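The correlation to generalized states is typically done by comparing the relative power in standard EEG frequency bands.  The band names below are standard EEG conventions (alpha, roughly 8-12 Hz, dominates in relaxation; beta, roughly 13-30 Hz, in focused concentration; delta, roughly 0.5-4 Hz, in deep sleep), but the thresholds and decision rule are invented for illustration.

```python
# Toy illustration of mapping relative EEG band power to a generalized
# mental state.  Band meanings are standard conventions; the thresholds
# and the simple decision rule are assumptions for illustration only.

def classify_state(alpha_power: float, beta_power: float,
                   delta_power: float) -> str:
    """Classify a gross mental state from relative EEG band power."""
    total = alpha_power + beta_power + delta_power
    if delta_power / total > 0.5:
        return "deep sleep"       # slow-wave activity dominates
    if beta_power > alpha_power:
        return "concentrating"    # fast activity outweighs alpha
    return "relaxed"              # alpha rhythm dominates
```

Real systems estimate band power from the raw signal (e.g., via Fourier analysis) and use trained classifiers rather than fixed thresholds, but the underlying idea — band power correlates with gross state — is the same.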

Brain-machine interface technologies may not be limited to interfaces with the brain, as thoughts, emotions, and actions are communicated through nerves and other physiological aspects of the human body.  Sensing and acting upon neural signals is a well-established technology in the fields of cardiac pacemakers and implantable defibrillators.  Similar technologies may be applied for other purposes related to controlling computer or medical devices.  For example, neural sensors on or implanted within limbs could be used to control prosthetic actuators or exoskeleton devices and systems, enabling users to control such devices as they would their own hands or fingers.  For an excellent introduction to these technologies and potential applications, check out Elise Hu’s Future You segment on NPR at https://jwp.io/s/ohqXcQeV.
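At its simplest, the prosthetic-control idea above amounts to mapping the strength of a sensed nerve or muscle signal to an actuator command.  The sketch below is purely hypothetical: the signal range, thresholds, and command names are assumptions, not any real device's protocol.

```python
# Hypothetical sketch of mapping a sensed nerve/muscle signal level to
# a prosthetic hand command.  The microvolt thresholds and command
# names are invented for illustration, not a real device's protocol.

def grip_command(signal_microvolts: float) -> str:
    """Map a rectified neural signal amplitude to an actuator command."""
    if signal_microvolts < 50:
        return "open"        # weak signal: relax and open the hand
    if signal_microvolts < 200:
        return "soft_grip"   # moderate signal: gentle grasp
    return "firm_grip"       # strong signal: firm grasp
```

Real prosthetic controllers decode patterns across many electrodes to distinguish intended movements, but proportional mapping from signal strength to actuation is the foundational concept.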

Researchers are focusing on new and more challenging applications for brain-machine interface technologies. One area with promise involves using brain-machine interface technologies to provide or enable thought-controlled prosthetics that can behave more like the limbs that they replace. Another application is controlling machines that require quick reactions and lightning reflexes, like aerial drones. Can you imagine racing drone pilots putting down their controllers and just thinking their drones through the race course?

Another future of brain-machine interface technology involves microelectronic medical implants or “brain chips” that can directly interface with portions of the brain. In 2017, Elon Musk co-founded Neuralink, a startup venture working to combine sensing of the human brain with artificial intelligence. According to recent news reports, Neuralink is developing ultra-high bandwidth brain-machine interfaces to connect humans and computers, has developed a robot for implanting such interfaces in the human brain, and is planning to begin animal trials by 2020.

Brain-machine interface technologies may not be the only way to interface with a computing device. Keyboards, gesture recognition and voice recognition are not going anywhere, at least not for some time. Most likely, brain-machine interface technologies will augment other interface technologies to better enable humans to communicate with machines. Yet, there may be applications for brain-machine interface technologies that enable machines and systems for which no other form of human-to-machine interface will work. These will be applications, devices and machines we can hardly imagine today.

Brain-machine technologies will raise new legal issues and challenges for inventors, entrepreneurs, and companies working in this area. In future articles, we will explore unique challenges that will be faced in obtaining patent protections for inventions in these technologies. Also, we will explore the legal implications of machines recording, processing and acting upon our thoughts, such as personal privacy; ownership of your brainwave patterns; liability for machines acting on thoughts; and government licensing and regulations.