Brain-machine interface

The power of thoughts

Controlling machines with our minds: it seems like the stuff of science fiction, right? You might be surprised to learn that in the research world, it is a very real possibility. In fact, controlling machines using electroencephalograms (EEG, i.e. brain signals) was already accomplished in the 1980s [19]. Of course, the first applications were simple and prone to errors, but they showed enough promise to make brain-machine interfacing a research field of its own. In assistive research, interfacing directly with the mind of the user is an extremely convenient idea because, contrary to body movements, brain signals of some form are available from absolutely everybody, no matter their disabilities or physical impairments.

Brain-machine interfaces for users with severe disabilities

At first, brain-computer interfaces were very invasive, as in pieces of hardware surgically implanted in your skull invasive.

However, for some people with severe disabilities, it was still an acceptable trade-off for the opportunity to interact more with their environment. People were gradually able to activate signals, move a computer cursor, drive an electric-powered wheelchair and eventually even control robots with 7 degrees of freedom using nothing but their minds [20, 21].

Figure 9: Thought control example with the Kinova Gen3.

For obvious reasons, most people were still reluctant to use this kind of technology. This drove researchers to work on what is called surface EEG, i.e. brain signal measurements taken by sensors outside of the body. This is accomplished by wearing a "hat" that contains an array of sensors spread out around the head. The big issue with this method is that the actual signals originate from inside the brain, not at its surface. So not only are the signals weak once they reach the surface, but it is also hard to tell where they come from in order to identify them properly. Nevertheless, with the advances in AI in recent years enabling clearer and clearer signal identification, researchers have been able to infer increasingly complex information from brain signals.
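To give a rough idea of what that identification can look like in practice, here is a minimal Python sketch of a classic surface-EEG decoding pipeline: band-pass filtering, band-power feature extraction and a linear classifier. The sampling rate, channel count, frequency bands and the use of SciPy and scikit-learn are illustrative assumptions, not a description of any specific system cited here.

    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    FS = 250          # assumed sampling rate (Hz)
    N_CHANNELS = 8    # assumed number of surface electrodes

    def bandpass(eeg, low, high, fs=FS, order=4):
        """Band-pass filter each channel of an (n_samples x n_channels) window."""
        b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
        return filtfilt(b, a, eeg, axis=0)

    def band_power_features(eeg):
        """Log band power per channel in the mu (8-12 Hz) and beta (13-30 Hz) bands."""
        feats = []
        for low, high in [(8, 12), (13, 30)]:
            feats.append(np.log(np.var(bandpass(eeg, low, high), axis=0)))
        return np.concatenate(feats)          # 2 * N_CHANNELS features

    # Toy training data: one 2-second window per trial with a known label
    # (e.g. 0 = "rest", 1 = "imagined hand movement"); real data would be recorded.
    rng = np.random.default_rng(0)
    trials = [rng.standard_normal((2 * FS, N_CHANNELS)) for _ in range(40)]
    labels = rng.integers(0, 2, size=40)

    X = np.array([band_power_features(t) for t in trials])
    clf = LinearDiscriminantAnalysis().fit(X, labels)

    # At run time, every new window goes through the same features and classifier.
    new_window = rng.standard_normal((2 * FS, N_CHANNELS))
    print("decoded command:", clf.predict([band_power_features(new_window)])[0])

Real systems typically add artifact rejection, spatial filtering and far more training data, but the overall structure (filter, extract features, classify) remains the same.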

In assistive research, surface EEG first enabled users to make binary choices, then to select objects or targets, and most recently to control a robotic manipulator for simple reaching tasks and even to imitate human motion [22-25].
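As an illustration of the target-selection idea (for instance the P300-based interface of reference 23), the sketch below averages the EEG epochs recorded after each candidate object is flashed and selects the object whose averaged response is strongest in the typical P300 time window. The window bounds, single-channel data shape and toy signals are simplifying assumptions made for illustration only.

    import numpy as np

    FS = 250                      # assumed sampling rate (Hz)
    P300_WINDOW = (0.25, 0.50)    # assumed post-flash window where the P300 peaks (s)

    def select_target(epochs_per_item):
        """Pick the flashed item that evoked the strongest average response.

        epochs_per_item maps item name -> array (n_flashes, n_samples) of the EEG
        at one parietal channel, time-locked to each flash of that item.
        """
        start, stop = int(P300_WINDOW[0] * FS), int(P300_WINDOW[1] * FS)
        scores = {
            item: epochs.mean(axis=0)[start:stop].mean()   # average evoked amplitude
            for item, epochs in epochs_per_item.items()
        }
        return max(scores, key=scores.get)

    # Toy example: the "cup" epochs carry a simulated P300-like bump, the others do not.
    rng = np.random.default_rng(1)
    noise = lambda: rng.standard_normal((10, FS))          # 10 flashes, 1-second epochs
    cup = noise()
    cup[:, int(0.3 * FS):int(0.45 * FS)] += 0.8            # simulated evoked deflection
    print("selected target:", select_target({"cup": cup, "book": noise(), "phone": noise()}))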

BMI: The key to intuitively control non-intuitive machines

It is clear that brain-machine interfaces are not quite ready to be integrated into professional and industrial environments. However, it is also hard to dispute that they could become ready in the near future.

With a brain-machine interface, it becomes possible to intuitively teleoperate machines that cannot be sensibly controlled via other HMI methods such as eye-tracking or body-machine interfaces.

Even the good old manual controller is inconvenient for devices that have more than a couple of degrees of freedom. One kind of device that is common in industrial contexts, has several degrees of freedom and cannot be mapped intuitively to either eye-tracking or body-machine interfaces is the parallel robot, such as the Stewart platform (a common type of 6-DoF parallel robot). Such machines would be prime candidates for teleoperation via EEG signals, as sketched below.
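To make the Stewart platform example more concrete, here is a minimal sketch of how a small vocabulary of decoded mental commands could be mapped onto incremental pose changes of the platform's moving plate. The command names, step sizes and pose representation are purely illustrative assumptions and do not describe an existing interface.

    from dataclasses import dataclass

    @dataclass
    class PlatformPose:
        """6-DoF pose of the Stewart platform's moving plate (mm and degrees)."""
        x: float = 0.0
        y: float = 0.0
        z: float = 0.0
        roll: float = 0.0
        pitch: float = 0.0
        yaw: float = 0.0

    # Each decoded mental command nudges one degree of freedom by a fixed step.
    STEP_MM, STEP_DEG = 2.0, 1.0
    COMMAND_MAP = {
        "imagine_left":  ("x", -STEP_MM),
        "imagine_right": ("x", +STEP_MM),
        "imagine_push":  ("z", -STEP_MM),
        "imagine_pull":  ("z", +STEP_MM),
        "imagine_feet":  ("pitch", +STEP_DEG),
        "rest":          (None, 0.0),        # no motion
    }

    def apply_command(pose: PlatformPose, command: str) -> PlatformPose:
        """Apply one decoded command as an incremental pose change."""
        axis, step = COMMAND_MAP.get(command, (None, 0.0))
        if axis is not None:
            setattr(pose, axis, getattr(pose, axis) + step)
        return pose

    pose = PlatformPose()
    for cmd in ["imagine_right", "imagine_right", "imagine_pull", "rest"]:
        pose = apply_command(pose, cmd)
    print(pose)   # the commanded pose would then feed the platform's inverse kinematics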

Machines that literally read your mind are still only possible in science fiction, but it is fair to expect, in the near future, further improvements in signal processing and classification from the AI algorithms behind brain-machine interfaces. Combined with the continuously decreasing cost of EEG technology, BMI could become democratized enough that anyone might stumble upon an opportunity to interact with a device using their mind, and that, if nothing else, is pretty cool.


Intro & Part I

For your convenience, we split this article into three parts: Introduction, Part I and Part II. Please click on the tiles below to access the different parts.

References

1. Andrew T Duchowski. A breadth-first survey of eye-tracking applications. Behavior Research Methods, Instruments, & Computers, 34(4):455–470, 2002.

2. Martin Leroux, Maxime Raison, T Adadja, and Sofiane Achiche. Combination of eyetracking and computer vision for robotics control. In 2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA), pages 1–6. IEEE, 2015.

3. Martin Leroux, Sofiane Achiche, and Maxime Raison. Assessment of accuracy for target detection in 3D-space using eye tracking and computer vision. PeerJ Preprints, 5:e2718v1, 2017.

4. Qiyun Huang, Yang Chen, Zhijun Zhang, Shenghong He, Rui Zhang, Jun Liu, Yuandong Zhang, Ming Shao, and Yuanqing Li. An EOG-based wheelchair robotic arm system for assisting patients with severe spinal cord injuries. Journal of Neural Engineering, 16(2):026021, 2019.

5. Yann-Seing Law-Kam Cio, Maxime Raison, Cédric Leblond Ménard, and Sofiane Achiche. Proof of concept of an assistive robotic arm control using artificial stereovision and eye-tracking. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 27(12):2344–2352, 2019.

6. Reuben M Aronson and Henny Admoni. Eye gaze for assistive manipulation. In Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, pages 552–554, 2020.

7. Gheorghe-Daniel Voinea and Razvan Boboc. Towards hybrid multimodal brain computer interface for robotic arm command. In Augmented Cognition: 13th International Conference, AC 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Orlando, FL, USA, July 26–31, 2019, Proceedings, volume 11580, page 461. Springer, 2019.

8. Chien-Ming Huang. Human-robot joint action: Coordinating attention, communication, and actions. PhD thesis, The University of Wisconsin-Madison, 2015.

9. Sanders Aspelund, Priya Patel, Mei-Hua Lee, Florian Kagerer, Rajiv Ranganathan, and Ranjan Mukherjee. Controlling a robotic arm for functional tasks using a wireless head-joystick: A case study of a child with congenital absence of upper and lower limbs. bioRxiv, page 850123, 2019.

10. Hairong Jiang, Juan P Wachs, and Bradley S Duerstock. Integrated vision-based robotic arm interface for operators with upper limb mobility impairments. In 2013 IEEE 13th International Conference on Rehabilitation Robotics (ICORR), pages 1–6. IEEE, 2013.

11. Martin Leroux. Design d’un manipulateur robotique à architecture anthropomorphique. PhD thesis, École Polytechnique de Montréal, 2017.

12. Anand Kumar Mukhopadhyay and Suman Samui. An experimental study on upper limb position invariant EMG signal classification based on deep neural network. Biomedical Signal Processing and Control, 55:101669, 2020.

13. François Nougarou, Alexandre Campeau-Lecours, Daniel Massicotte, Mounir Boukadoum, Clément Gosselin, and Benoit Gosselin. Pattern recognition based on HD-sEMG spatial features extraction for an efficient proportional control of a robotic arm. Biomedical Signal Processing and Control, 53:101550, 2019.

14. Florin Gîrbacia, Cristian Postelnicu, and Gheorghe-Daniel Voinea. Towards using natural user interfaces for robotic arm manipulation. In International Conference on Robotics in Alpe-Adria Danube Region, pages 188–193. Springer, 2019.

15. Taizo Yoshikawa, Viktor Losing, and Emel Demircan. Machine learning for human movement understanding. Advanced Robotics, 34(13):828–844, 2020.

16. Guilherme N DeSouza, Hairong Jiang, Juan P Wachs, and Bradley S Duerstock. Integrated vision-based system for efficient, semi-automated control of a robotic manipulator. International Journal of Intelligent Computing and Cybernetics, 2014.

17. Tommaso Lisini Baldi, Giovanni Spagnoletti, Mihai Dragusanu, and Domenico Prattichizzo. Design of a wearable interface for lightweight robotic arm for people with mobility impairments. In 2017 International Conference on Rehabilitation Robotics (ICORR), pages 1567–1573. IEEE, 2017.

18. Tommaso Lisini Baldi. Human Guidance: Wearable Technologies, Methods, and Experiments. PhD thesis, Istituto Italiano di Tecnologia.

19. Stevo Bozinovski, Mihail Sestakov, and Liljana Bozinovska. Using EEG alpha rhythm to control a mobile robot. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pages 1515–1516. IEEE, 1988.

20. Jennifer L Collinger, Brian Wodlinger, John E Downey, Wei Wang, Elizabeth C Tyler-Kabara, Douglas J Weber, Angus JC McMorland, Meel Velliste, Michael L Boninger, and Andrew B Schwartz. High-performance neuroprosthetic control by an individual with tetraplegia. The Lancet, 381(9866):557–564, 2013.

21. Dorian Goueytes, Aamir Abbasi, Henri Lassagne, Daniel E Shulz, Luc Estebanez, and Valérie Ego-Stengel. Control of a robotic prosthesis simulation by a closed-loop intracortical brain-machine interface. In 2019 9th International IEEE/EMBS Conference on Neural Engineering (NER), pages 183–186. IEEE, 2019.

22. Fred Achic, Jhon Montero, Christian Penaloza, and Francisco Cuellar. Hybrid BCI system to operate an electric wheelchair and a robotic arm for navigation and manipulation tasks. In 2016 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO), pages 249–254. IEEE, 2016.

23. David Achanccaray, Juan M Chau, Jairo Pirca, Francisco Sepulveda, and Mitsuhiro Hayashibe. Assistive robot arm controlled by a P300-based brain machine interface for daily activities. In 2019 9th International IEEE/EMBS Conference on Neural Engineering (NER), pages 1171–1174. IEEE, 2019.

24. Valeria Mondini, Reinmar Josef Kobler, Andreea Ioana Sburlea, and Gernot R Müller-Putz. Continuous low-frequency EEG decoding of arm movement for closed-loop, natural control of a robotic arm. Journal of Neural Engineering, 2020.

25. Yuanqing Li, Qiyun Huang, Zhijun Zhang, Tianyou Yu, and Shenghong He. An EEG-/EOG-based hybrid brain-computer interface: Application on controlling an integrated wheelchair robotic arm system. Frontiers in Neuroscience, 13:1243, 2019.

26. Siddarth Jain and Brenna Argall. Probabilistic human intent recognition for shared autonomy in assistive robotics. ACM Transactions on Human-Robot Interaction (THRI), 9(1):1–23, 2019.