Introduction

How could existing assistive human-machine interfaces change the way we interact with industrial robots?

Human-machine interfaces for assistive technologies

One of the original challenges of assistive robotics was to take a technology generally viewed as advanced and complicated, often operated only by highly specialized people in the automation industry, and convince the physically impaired that they could use it on a day-to-day basis. To this day, human-machine and human-robot interfaces (HMI/HRI) are still vibrant fields of research. Of course, attempts at creating assistive technologies did not start by trying to put a robotic manipulator in the hands of patients. In fact, many of the adapted control interfaces were first developed to control electric-powered wheelchairs or computers, common items that are key enablers of social interaction and general quality of life.

The OBI assistive eating device for people with reduced upper-body mobility, featured at the Kinova booth

The key to human-machine interfacing for assistive technologies is adapting to the user. The disabilities and physical limitations that lead people to use these technologies vary widely, so the solutions must vary as well. As the field matured, new human-machine interface methods were developed to make assistive tools available to an ever-larger clientele. Researchers are also refining existing methods to the point where they no longer feel like better-than-nothing fixes, but rather complete and intuitive solutions, making them more accessible than ever.

A man with reduced upper-body mobility in an electric wheelchair helps a little girl with a puzzle using the Kinova JACO arm

Meanwhile, in the industrial world…

The industrial world is seeing a shift in philosophy regarding robots. Where robots used to be fully automated, potentially dangerous machines, collaborative robots - often shortened to cobots - are now making their way onto factory floors. These robots are designed to be safe and to be used by everyday factory workers rather than highly specialized engineers.

The classic HRIs of industrial robots, such as teach pendants and manual controllers, are being reworked into new, more accessible, and more intuitive versions. However, the underlying technology of these interfaces has not evolved much.

In this series, we go over some of the creative HMIs developed by the assistive robotics research community and discuss how this technology could be transferred to an industrial or professional context to provide benefits and transform the way we teach and teleoperate robots.

Part I & Part II

For your convenience, we have split this article into three parts: Introduction, Part I, and Part II. Please click on the tiles below to access the different parts.
