CC BY-NC-ND 4.0 · Laryngorhinootologie 2022; 101(S 01): S186-S193
DOI: 10.1055/a-1663-0803
Review

Human-Robot Interaction: Networked, Adaptive Machines in Medicine

Article in several languages: German | English
Hamid Sadeghian 1, Abdeldjallil Naceri 1, Sami Haddadin 1

1 Munich Institute of Robotics and Machine Intelligence (MIRMI), Technische Universität München

Abstract

The application of robotic and intelligent technologies in healthcare is increasing dramatically. The next generation of lightweight, tactile robots offers a great opportunity for a wide range of applications, from medical examination, diagnosis and therapeutic procedures to rehabilitation and assistive robotics. These systems can potentially outperform current medical procedures by exploiting the complementary strengths of humans and computer-based technologies. In this article, the importance of human-robot interaction is discussed, and the technological requirements and challenges in building human-centered robot platforms for medical applications are addressed.



1 Introduction

Medical robotics covers a wide range of applications, such as surgical and interventional procedures, rehabilitation, assistance and diagnosis. The key fact common to all of these applications is that the robot shares its workspace with humans (patient and/or doctor). This is not yet regularly the case for industrial robots, which are usually placed in structured workspaces separated from the human operator. Introducing a robot into the human environment requires extra precautions and anticipation, mainly for safety reasons. The robot is usually installed in the same room as the patient and the doctor. However, this is not necessarily the case in tele-medicine applications, where the robot may be installed at a significant distance from the doctor (from several meters to hundreds of kilometers). The robot can be controlled autonomously based on a predefined program, semi-autonomously by direct guidance of the doctor or patient with compliant interaction, or in tele-operation or even tele-presence mode through mechanical interfaces.

By 2050, the percentage of people over 60 in many European societies will exceed 30% [1]. The growth in the elderly population is so dramatic that it has been called a 'silent revolution' [2]. Age-related disabilities and diseases will therefore lead to a worldwide crisis in the near future. Over the past ten years, national programs for developing smart healthcare facilities such as assistive and rehabilitation robots, surgical robots and tele-medicine systems have been accelerated. This trend seems set to continue even faster in the current decade, with the increasing capabilities of healthcare technologies allowing people to live longer at a higher quality of life [3]. At the same time, centralized and tele-medicine-based healthcare systems enable patients to access medical services regardless of where they live [4]. Evidence also shows that robot-assisted tele-medicine is appealing from an economic perspective [5]. However, to realize the full potential of such models, both technology and infrastructure must be prepared.

Recently, and particularly since the COVID-19 pandemic, tele-medicine has gained particular attention. Studies have revealed that infections among medical staff need urgent attention in order to protect them and prevent the spread of viruses [6]. As is to be expected, healthcare staff can be infected by patients and in turn increase the risk of spreading the infection to other patients and medical staff. To address this issue, the company Franka Emika, in collaboration with the university hospital Klinikum rechts der Isar in Munich and our own group, developed a tele-diagnostic station based on the Panda tactile robot arm that comprises a naso-pharyngeal swab and the tools needed to inspect the oral cavity. The physician is connected through a leader robot to control the robot at the diagnostic station [7]. Similar efforts have been reported in [8] for ultrasound scanning.

Many existing tele-medicine approaches do not yet include a mechanical robot and focus mainly on digital and Internet-of-Things (IoT) technologies, enabling networked health information, electronic medical records and audio/video streams [9]. For medical scenarios that involve physical interaction, new robotics-based technologies, infrastructures, and doctor-patient physical interaction paradigms are necessary. For instance, in [10] a dual doctor-patient twin paradigm is introduced which involves two robotic twins: one representing the doctor (on the patient side) and one representing the patient (on the doctor side). Each robotic twin serves as a multi-modal sensor as well as a physical avatar of its human counterpart, and a bidirectional tele-medicine approach enables natural physical interaction between the doctor and the patient.

Overall, the advantages offered by medical and assistive robots may be grouped into four main areas:

  • Improve technical capabilities to perform procedures by exploiting the complementary strengths of humans and robots, as illustrated in [Table 1]

Table 1 Complementary strengths of humans and robots (originally given for a surgical task [11]).

Human strengths:
  • Excellent judgment
  • Excellent hand–eye coordination
  • Excellent dexterity (at natural human scale)
  • Able to integrate and act on multiple information sources
  • Easily trained
  • Versatile and able to improvise

Robot strengths:
  • Excellent geometric accuracy
  • Untiring and stable
  • Immune to ionizing radiation
  • Can be designed to operate at many different scales of motion and payload
  • Able to integrate multiple sources of numerical and sensor data

Human limitations:
  • Prone to fatigue and inattention
  • Limited fine motion control due to tremor
  • Limited manipulation ability and dexterity outside the natural scale
  • Bulky end-effectors (hands)
  • Limited geometric accuracy
  • Affected by radiation and infection

Robot limitations:
  • Poor judgment
  • Limited hand–eye coordination
  • Limited dexterity
  • Hard to adapt to new situations
  • Limited haptic sensing (today)

  • Improve safety by combining technical performance with active assistance (for instance through virtual walls, tremor reduction, etc.)

  • Integrate online information from different sources and make the procedure evidence-based by recording sensory data

  • Enable the implementation of medical procedures over distance through tele-medicine

Apart from the advantages expected of medical and assistive robots, employing a robot in a human environment requires many precautions and considerations in advance. In the rest of this article, we discuss the considerations that are mainly relevant for Human-Robot Interaction (HRI) and Human-Robot Collaboration (HRC) scenarios.



2 The Significance of HRI/HRC

Service robots employed in human environments need to interact with people directly. This may happen side by side, through sharing the same workspace, or via integration with the human body, for instance in prostheses or exoskeletons. The interaction takes place on both a cognitive and a physical level. At the cognitive level, the robot has to be able to communicate with the human through audio and video, gestures, facial expressions, etc. These features partly exist in many computer and smartphone applications and can therefore also be integrated into robots. The robot must be able to perceive, interpret and respond appropriately. Such features already exist on many robotic platforms. For instance, the GARMI robot ([Fig. 1]) is able to recognize some verbal commands or react to facial expressions [10]. These social features enable the robot to interact in human-centric terms and are mainly obtained by processing the data perceived by cameras and microphones with machine learning approaches.

Fig. 1 GARMI, a service robotics platform with an appropriate level of intelligence for daily living and healthcare applications.

On the other hand, one of the most revolutionary and challenging features of service robots is their increasing ability to physically interact with humans through their bodies. Clearly, physical Human-Robot Interaction (pHRI) imposes requirements different from those of industrial applications. Unlike industrial robots, which are heavy and stiff in order to guarantee high precision, robots used in anthropic environments must be designed to be lightweight and highly compliant. This is especially true for applications requiring physical interaction, not only because of unexpected impacts between robot and human, but also for the execution of collaborative tasks that require the intentional exchange of forces along the whole body of the robot. For instance, in many human-robot coexistence applications it is absolutely necessary to move the robot end-effector or the robot body kinesthetically. To ensure human safety, extensive studies were carried out in [12] [13] to evaluate the risk of injury during physical interaction and to provide a systematic evaluation of safety in human-robot interaction.

Besides the above considerations in HRI, the capability of the robot to perform collaborative tasks with humans is essential. When human and robot share the same workspace, they may interact as a pair working toward the same goal. In this case, in order to keep the human at the center of this collaboration, the robot must perceive and anticipate human actions and act in a complementary fashion to ensure joint action and prevent conflicting movements or interactions.



3 Technological Requirements

In the previous section, the importance of HRI/HRC was discussed. Above all, ensuring safety is one of the main technological requirements that must be built in to make a robot suitable for near-human applications. The safety of this interaction can be guaranteed by combining different strategies. In general, the technological requirements can be pursued from two points of view: mechanical design considerations, and sensing and control paradigms.

3.1 Mechanical Design Considerations

A mechanical robot arm is an essential element in many service robotics applications. It must be designed in such a way that it can easily be adapted to any task by mounting appropriate tools to its end-effector. Moreover, it must be human-friendly, with a high payload-to-weight ratio and enough degrees of freedom for the given tasks. The inertia and friction of the robot are crucial parameters that determine the mechanical bandwidth of the system. These parameters cannot easily be altered through active control, which means that it is almost impossible to obtain safety from a rigid and heavy robot. Instead, safety demands tactile lightweight arms with highly integrated joints that include motor, transmission, brake, joint position and torque sensors, and power electronics. The system must also provide reliable high-bandwidth torque control with a low response time. These characteristics cannot be achieved easily and require state-of-the-art mechatronics. Specifically, the tactile sense is very important and relies on high-resolution torque sensors on the link side of each joint. This high resolution and accuracy allows the robot to dynamically sense the surrounding environment and respond properly to physical interactions. As an example, [Table 2] lists the main sensing and interaction specifications of the Panda robot arm from Franka Emika [14].

Apart from the above safety considerations, the mechanical precision and the impedance characteristics of the robot are important factors that depend on the specific application. For instance, robots with high precision and stiffness are suitable for needle placement and eye surgery. Rehabilitation robots, on the other hand, need low stiffness and backdrivability because their task is to augment the human body. In sum, the mechanical design of a robot depends on its intended application; the main design factors are:

  • Safety and human-friendly features

  • Integrated design and compactness

  • Mechanical precision, repeatability and stiffness

  • Kinematic redundancy and dexterity

  • Backdrivability



3.2 Sensing and Control Paradigms

Medical and assistive robots are supposed to work near the patient or doctor, so many accidental and intentional interactions may occur. Hence, appropriate collision monitoring and reaction strategies must be embedded. Suitable algorithms can be used to estimate and observe collision forces from joint positions or torques; for an extensive survey on robot collision detection, isolation, and identification, the reader is referred to [15]. Besides these sensing capabilities, the compliance of the robot must also be increased in order to handle interaction forces. Compliance can be introduced intrinsically into the mechanical structure of the robot (passive compliance) by means of an elastic decoupling between the actuator and the driven link with fixed or variable joint stiffness (for example in [16]). However, this introduces underactuation into the system and makes the control of the robot more challenging. Alternatively, compliance can be achieved by relying on fast control loops through force and impedance control [17]. This active compliance is an important semi-autonomous feature and is already embedded in some service robot arms.
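As a minimal illustration of such sensor-based collision monitoring, the following Python sketch implements a generalized-momentum observer of the kind surveyed in [15]. The dynamic model terms (mass matrix M, Coriolis matrix C, gravity vector g), the observer gain K_O and the threshold are placeholders that would come from the specific robot model; this is a sketch of the idea, not the implementation of any particular product.

```python
import numpy as np

def momentum_observer_step(r, p_hat, q_dot, tau_motor, M, C, g, K_O, dt):
    """One update of a generalized-momentum disturbance observer.

    r        : current residual estimate of external joint torques (n,)
    p_hat    : current estimate of the generalized momentum (n,)
    q_dot    : measured joint velocities (n,)
    tau_motor: commanded/measured motor torques (n,)
    M, C, g  : dynamic model terms evaluated at the current state
    K_O      : diagonal observer gain matrix (n, n)
    dt       : control period, e.g. 1e-3 s for a 1 kHz loop
    """
    p = M @ q_dot                              # generalized momentum from the model
    # momentum dynamics: p_dot = tau_motor + C^T q_dot - g + tau_ext
    p_hat_dot = tau_motor + C.T @ q_dot - g + r
    p_hat = p_hat + p_hat_dot * dt             # integrate the momentum estimate
    r = K_O @ (p - p_hat)                      # residual converges toward tau_ext
    return r, p_hat

def collision_detected(r, threshold):
    """Flag a collision when any residual exceeds its joint-wise threshold."""
    return bool(np.any(np.abs(r) > threshold))
```

In practice, the residual r approximates the external joint torques and can be thresholded to trigger a compliant reaction, for example switching the arm to gravity-compensated floating.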

For human-robot collaboration applications, the robot must be equipped with online human state monitoring and a high level of reasoning and perception in order to estimate the intention of and anticipate its human counterpart. This anticipation is bilateral: the robot must act in such a way that its behavior can in turn be perceived and anticipated by the human counterpart, as is normally the case between humans. This is the ultimate goal of all Human-Robot Collaboration (HRC) algorithms. However, most of the proposed approaches are based on monitoring sensory information, such as the forces exchanged in the task space, as well as monitoring the human environment, without reasoning about or understanding the collaboration scenario. For instance, in [18] [19] the robot uses a whole-body dynamic model and the posture of the human in order to optimize the position of the co-manipulation task in the workspace and provide a more ergonomic configuration for the human. A neural network is employed in [20] to estimate the human motion intention in a human-robot collaboration scenario. Furthermore, game theory is used in [21] to let the robot adjust its own role according to the human's intention to lead or follow. This adaptation is inferred from the measured interaction force, and the control is shared between human and robot through an optimization approach. It is worth mentioning that each of the proposed approaches is limited in its application and that no general formalism exists.
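The following much-simplified Python sketch conveys the flavor of such force-based role adaptation; it is not the optimization- or game-theory-based scheme of [21], and the threshold, adaptation rate and blending logic are assumptions introduced here only for illustration.

```python
import numpy as np

def update_role(alpha, f_interaction, f_thresh=5.0, rate=0.02):
    """Shift the robot's role between leader (alpha=1) and follower (alpha=0).

    alpha         : current robot authority in [0, 1]
    f_interaction : measured human-robot interaction force along the task [N]
    f_thresh      : force above which the human is assumed to want the lead
    rate          : adaptation step per control cycle
    """
    if abs(f_interaction) > f_thresh:
        alpha = max(0.0, alpha - rate)   # human resists -> yield authority
    else:
        alpha = min(1.0, alpha + rate)   # low forces -> robot may lead again
    return alpha

def blended_reference(alpha, x_robot_plan, x_human_guide):
    """Blend the robot's planned reference with the human-guided one."""
    return alpha * np.asarray(x_robot_plan) + (1.0 - alpha) * np.asarray(x_human_guide)
```

Here alpha smoothly hands authority to whichever partner currently insists, mimicking the lead/follow negotiation described above.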

In HRC, control of the robot is usually shared in part with the human. In other words, the robot acts based on the commands from its local controller and on the guiding forces of the human user. Shared control approaches are appealing for many healthcare and medical applications such as rehabilitation [22], assistive exoskeleton systems [23] [24], tele-operation [25] [26] and robotic surgery [27] [28]. As illustrated in [Table 1], humans are better in terms of cognitive abilities, such as situational awareness and decision-making skills, while robots are often better in physical abilities, such as precision and strength. In particular, the robot can autonomously follow a desired trajectory based on prior rough knowledge about the task and the environment, while the human provides corrective actions, fine-tuning control and situational guidance. However, as in human-human collaboration, intuitive and successful joint collaboration requires knowledge and experience about the specific joint task. Moreover, it requires online verbal/gestural communication as well as human-like skills and reasoning. Communication can be achieved based on gesture and speech recognition, and skills can be encoded as combinations of primitives [29]. However, reasoning and decision making are supreme human capabilities and cannot easily be substituted by machines. Therefore, current shared control policies are mainly planned with the human as leader and the robot as follower.
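As a minimal sketch of such a leader-follower shared control policy, the robot below tracks a pre-planned Cartesian trajectory while a simple admittance law turns the human's guiding force into a corrective offset. The class name, virtual mass and damping values are illustrative assumptions, not parameters of any system cited above.

```python
import numpy as np

class SharedController:
    """Blend a pre-planned robot trajectory with human force corrections.

    The robot follows its nominal Cartesian trajectory autonomously; the
    measured guiding force of the human is fed through a mass-damper
    admittance to produce a corrective displacement.
    """

    def __init__(self, mass=2.0, damping=20.0, dt=1e-3):
        self.m, self.d, self.dt = mass, damping, dt
        self.offset = np.zeros(3)        # human-induced correction [m]
        self.offset_vel = np.zeros(3)    # its velocity [m/s]

    def step(self, x_nominal, f_human):
        """Return the commanded pose given the planned pose and human force."""
        # admittance dynamics: m * a + d * v = f_human
        acc = (np.asarray(f_human) - self.d * self.offset_vel) / self.m
        self.offset_vel += acc * self.dt
        self.offset += self.offset_vel * self.dt
        return np.asarray(x_nominal) + self.offset
```

Tuning the virtual mass and damping trades responsiveness to the human's corrections against the smoothness of the resulting motion.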

In sum, robots are controlled through one or a combination of the following modalities:

  • Autonomous or semi-autonomous mode, in which the robot performs an assigned task without direct control by the user.

  • Tele-operation mode, in which the robot is under the direct control of the human through (haptic) interfaces ([Fig. 2]).

  • Cooperative and shared mode, in which the robot's strength and precision are combined with the human's intelligence and skills toward a common goal ([Fig. 3]).

Fig. 2 Examples of bidirectional tele-diagnosis (top) and tele-rehabilitation (bottom) concepts: in both cases the robot arm on the patient side is controlled over distance through a robot arm on the doctor side, relying on precise haptic feedback.
Fig. 3 Snapshot of a semi-autonomous needle-based medical intervention based on 3D-reconstructed CT-scan images on a dummy phantom: a compliant control algorithm enables the surgeon to move the needle guide along the target direction. The final insertion is performed by the surgeon. For more information, please refer to [28].

For instance, the GARMI robot illustrated in [Fig. 1] uses the autonomous mode to grasp the auscultation or ultrasound device, which is then used for the remote examination of a patient by a doctor. The same robot combines the second and third modalities for upper-limb tele-rehabilitation through a shared control framework [30]. Moreover, by combining the first and second modalities it is possible to keep the end-effector of the robot within a specific zone or direction through so-called virtual constraints. This feature may increase the safety and trustworthiness of the tele-operation procedure.
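To make the idea of a virtual constraint concrete, the following sketch projects a commanded end-effector motion onto an allowed direction and clamps it to a spherical safety zone. The function and its geometry are purely illustrative assumptions, not the constraint formulation of any cited system.

```python
import numpy as np

def apply_virtual_constraint(x_current, x_command, direction, zone_center, zone_radius):
    """Restrict a commanded end-effector motion to a virtual guide and zone.

    x_current   : current end-effector position (3,)
    x_command   : position commanded by the (remote) operator (3,)
    direction   : unit vector of the allowed motion direction, e.g. a needle axis
    zone_center : center of the allowed spherical zone (3,)
    zone_radius : radius of the allowed zone [m]
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    delta = np.asarray(x_command, dtype=float) - np.asarray(x_current, dtype=float)
    # keep only the motion component along the allowed direction ("virtual guide")
    x_new = np.asarray(x_current, dtype=float) + np.dot(delta, d) * d
    # clamp the result to the allowed zone ("virtual wall")
    offset = x_new - np.asarray(zone_center, dtype=float)
    dist = np.linalg.norm(offset)
    if dist > zone_radius:
        x_new = np.asarray(zone_center, dtype=float) + offset / dist * zone_radius
    return x_new
```

In tele-operation, applying such a constraint on the patient-side robot prevents operator commands from leaving the admissible region even under imperfect haptic feedback.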



4 Technological Challenges

As mentioned in the previous sections, medical and assistive procedures almost always involve some form of physical interaction between the patient and a medical tool. This can be considered the main source of challenges in applying robots in human environments, and it brings further considerations that can be summarized as follows:

  • Mechatronic integration

  • Stability and safety of physical interaction

  • Transparency

  • Communication quality

The demand for efficient and lightweight robot arms requires a high level of mechatronic integration. The payload-to-weight ratio of a robot is a very important factor. For instance, the current technology in the Panda (from Franka Emika) and iiwa R800 (from KUKA) lightweight robot arms has reached ratios of 3 kg/18 kg and 7 kg/23 kg, respectively. For the sake of comparison, this ratio is almost 4 kg/4 kg for the average human arm [31]. This mismatch is still a limiting factor in making assistive robots sufficiently agile and safe.

For systems that physically interact with humans, stability must always be preserved. The stability of interaction controllers is usually analysed based on passivity approaches. When a robot assists a procedure in the cooperative modality (for instance in exoskeletons), controlling the interaction toward the same goal as the human counterpart is crucial. This is again a challenging issue in tele-operation, in which true and real-time feedback of the interaction is critically important. This is especially vital in robotic tele-surgery scenarios, where any mismatch or incorrect tactile information may produce unnecessarily large tissue forces. There are, of course, always technical limitations in transferring a transparent and robust tactile sense. Nevertheless, in situations where such information is not perceived correctly on the operator side, additional visual cues or warnings can be integrated to compensate.
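One common way of enforcing such stability is a time-domain passivity observer/controller. The sketch below illustrates only the basic idea under simplifying assumptions (a single scalar interaction port, ideal sampling, illustrative gains); it is not the specific scheme of any system cited here.

```python
class PassivityController:
    """Time-domain passivity observer/controller for one interaction port.

    The observer integrates the power flowing into the port (force times
    velocity). If the accumulated energy would become negative, the port is
    generating energy and a variable damper dissipates the excess.
    """

    def __init__(self, dt=1e-3):
        self.dt = dt
        self.energy = 0.0   # observed energy at the port [J]

    def step(self, force, velocity):
        """Return a possibly corrected force that keeps the port passive."""
        self.energy += force * velocity * self.dt
        if self.energy < 0.0 and abs(velocity) > 1e-6:
            # inject just enough damping to dissipate the energy deficit
            damping = -self.energy / (velocity ** 2 * self.dt)
            force_corrected = force + damping * velocity
            self.energy = 0.0
            return force_corrected
        return force
```

Keeping the observed port energy non-negative guarantees that the controller does not inject energy into the interaction, which is the essence of the passivity argument.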

High transparency has always been a critical requirement in tele-robotics. It describes how accurately a remote environment is reflected to the human user and can be considered at different levels. Mechanical transparency refers to the mismatch between the impedance of the environment and the impedance perceived by the operator. In a fully transparent system, the user would feel the same as when working directly on the environment; equivalently, no external dynamics are felt by the user during free movements. A fully transparent system is practically impossible, and the experience of interacting through the haptic console always differs from the real feeling of the environment on the remote side. Mechanical transparency is also relevant, for instance, for exoskeletons and assistive systems: full transparency is achieved when the system follows the motion of the user exactly, so that the user feels neither its inertia nor any resisting forces. Typically, however, stability and transparency are conflicting objectives and a trade-off has to be made [32]. Besides mechanical transparency, the design of multi-modal interfaces (for example, by including vision or a virtual model of the environment) may improve situation awareness and reduce human errors. Transparency can also be described as the opposite of unpredictability [33]: when the behavior of the system is predictable and observable to the human user, it is considered to be more transparent. The level of autonomy greatly affects the transparency of the system [34], and both very high and very low levels of autonomy jeopardize it. If the system acts without significant user intervention, its state is not well observable to the human user, and the user feels that part of the system is not under his or her control. In contrast, when the human operator is involved in performing almost all of the tasks, the system state is felt better, but the workload on the user increases, which in turn decreases awareness and transparency. An appropriate level of autonomy is therefore particularly important in medical robotics.
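In the bilateral tele-operation setting of [32], mechanical transparency can be stated compactly. Using the common notation, introduced here only for illustration, in which the operator applies force F_h to the master device moving with velocity V_m, and Z_e denotes the environment impedance at the remote side:

```latex
% Impedance perceived by the operator at the master device
Z_t(s) = \frac{F_h(s)}{V_m(s)}
% Ideal (full) mechanical transparency: the operator feels exactly the
% remote environment; in free motion (Z_e = 0) nothing should be felt.
Z_t(s) = Z_e(s)
```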

Table 2 Main sensing and interaction specifications of the Panda robot arm [14].

Sensing:
  • Force resolution: <0.05 N
  • Relative force accuracy: 0.8 N
  • Force repeatability: <0.05 N
  • Torque resolution: <0.02 Nm

Interaction control:
  • Torque control frequency: 1 kHz
  • Minimum controllable force: 0.05 N
  • Force controller bandwidth: 10 Hz
  • Guiding force: 2 N
  • Collision detection time: <2 ms
  • Nominal collision reaction time: <50 ms

Communication delay is another challenge that is particularly important for networked and tele-operation systems. Both stability and transparency are dramatically affected by delay and packet loss in the communication channel. All of the interaction control algorithms mentioned above routinely rely on high-rate (1 kHz) feedback control and tolerate only very small delays. This is unproblematic as long as the controller works on local sensory feedback. However, in geographically distributed tele-operation systems such as tele-surgery, a bidirectional channel for haptic signals is established, so the local controller on each side needs information from the other side. Even high-bandwidth, ultra-low-latency protocols such as 5G sometimes fail to provide reliable data transfer. The communication delay depends on the distance and the infrastructure and may range from a few milliseconds up to several hundred milliseconds. In addition, for most scenarios the communication channel must have enough bandwidth to transfer high-quality video and audio streams in real time. This is a classical challenge, and all the solutions proposed in the literature sacrifice part of the system's transparency to handle it.
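A classical remedy, shown here only as an illustrative sketch, is the scattering/wave-variable transformation, which keeps the delayed bilateral channel passive for arbitrary constant delays. The wave impedance b and the function names below are assumptions for illustration, not part of any system described above.

```python
import math

def encode_wave(velocity, force, b=10.0):
    """Encode velocity/force at one port into wave variables (u, v).

    b is the wave impedance, a tuning parameter that trades force fidelity
    against velocity fidelity over the delayed channel.
    """
    u = (b * velocity + force) / math.sqrt(2.0 * b)   # outgoing wave
    v = (b * velocity - force) / math.sqrt(2.0 * b)   # returning wave
    return u, v

def decode_wave(u, v, b=10.0):
    """Recover velocity and force from the wave variables at the other port."""
    velocity = (u + v) / math.sqrt(2.0 * b)
    force = (u - v) * math.sqrt(b / 2.0)
    return velocity, force
```

Transmitting the wave variables u and v instead of raw force and velocity preserves passivity of the channel, at the price of some distortion of the perceived impedance, which is exactly the transparency trade-off mentioned above.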



5 Conclusion and Future Directions

It is apparent that medical robotics, and computer-integrated medicine in general, is inevitably changing our clinical experience and routines. New applications are regularly proposed with the aim of transcending human limitations. However, of the many research efforts and proposed applications in interventional medical robotics, only a few have been commercialized and deployed broadly to assist doctors and patients. The situation appears to be the same for assistive and rehabilitation robotics. Apart from technological limitations, the cost of the products, their ease of use and the level of acceptance in society are other major factors that affect the spread of healthcare robotic technologies. The outbreak of the corona pandemic in spring 2020 demonstrated in particular the importance of digitalization and artificial intelligence for maintaining public life. It has also become clearer how technology can help to improve the quality of medical care and reduce the workload and the risk of infection for healthcare staff. As with other technologies, human needs will play a major role in defining what the future of healthcare robotics will bring. Our past experience shows that humans may even adapt their behavior and environment to robots when they consider that change advantageous.




Note

This article was changed in accordance with the erratum of 11 July 2022.



Erratum

In the article named above, the English title was stated incorrectly. The correct English title is "Human-Robot Interaction: Networked, Adaptive Machines in Medicine".



Conflict of Interest

Sami Haddadin has a conflict of interest as a shareholder of Franka Emika GmbH.

  • References

  • 1 Robotics E. Strategic research agenda for robotics in Europe 2014–2020. IEEE Robot. Autom. Mag 2014; 24: 171
  • 2 Walker A, Gemeinschaften GBE. Age and attitudes: main results from a Eurobarometer survey. Commission of the European Communities. 1993
  • 3 Siciliano B, Khatib O, Kröger T. Springer handbook of robotics. Springer; 2008. 200.
  • 4 Ostermann M, Vincent J-L. How much centralization of critical care services in the era of telemedicine?. Critical Care 2019; 23
  • 5 Jang SM, Lee K, Hong Y-J, Kim J, Kim S. Economic evaluation of robot- based telemedicine consultation services. Telemedicine and e-Health 2020; 26: 1134-1140
  • 6 Chang D, Xu H, Rebaza A, Sharma L, Cruz CSD. Protecting health-care workers from subclinical coronavirus infection. The Lancet Respiratory Medicine 2020; 8: e13
  • 7 Fuchtmann J, Krumpholz R, Berlet M, Ostler D, Feussner H, Haddadin S, Wilhelm D. Covid-19 and beyond: development of a comprehensive telemedical diagnostic framework. International Journal of Computer Assisted Radiology and Surgery 2021; 1-10
  • 8 Akbari M, Carriere J, Meyer T, Sloboda R, Husain S, Usmani N, Tavakoli M. Robotic ultrasound scanning with real-time image-based force adjustment: Quick response for enabling physical distancing during the COVID-19 pandemic. Frontiers in Robotics and AI 2021; 8: 62
  • 9 Becker CD, Dandy K, Gaujean M, Fusaro M, Scurlock C. Legal perspectives on telemedicine part 2: telemedicine in the intensive care unit and medicolegal risk. The Permanente Journal 2019; 23
  • 10 Tröbinger M, Jähne C, Qu Z, Elsner J, Reindl A, Getz S, Goll T, Loinger B, Loibl T, Kugler C, Calafell C, Sabaghian M, Ende T, Wahrmann D, Parusel S, Haddadin S, Haddadin S. Introducing GARMI – A service robotics platform to support the elderly at home: Design philosophy, system overview and first results. IEEE Robotics and Automation Letters 2021; 6: 5857-5864
  • 11 Taylor R, Joskowicz L. Computer-integrated surgery and medical robotics in: Standard handbook of biomedical engineering & design. McGraw-Hill Education; 2003
  • 12 Haddadin S, Albu-Schäffer A, Strohmayr M, Frommberger M, Hirzinger G. Injury evaluation of human-robot impacts in 2008 IEEE International Conference on Robotics and Automation. IEEE; 2008: 2203-2204
  • 13 Haddadin S, Albu-Schäffer A, Hirzinger G. Requirements for safe robots: Measurements, analysis and new insights. The International Journal of Robotics Research 2009; 28: 1507-1527
  • 14 Franka Emika https://www.franka.de/
  • 15 Haddadin S, De Luca A, Albu-Schäffer A. Robot collisions: A survey on detection, isolation, and identification. IEEE Transactions on Robotics 2017; 33: 1292-1312
  • 16 Grebenstein M, Albu-Schäffer A, Bahls T, Chalon M, Eiberger O, Friedl W, Gruber R, Haddadin S, Hagn U, Haslinger R. et al The DLR hand arm system in 2011 IEEE International Conference on Robotics and Automation. IEEE; 2011: 3175-3182
  • 17 Sadeghian H, Villani L, Keshmiri M, Siciliano B. Task-space control of robot manipulators with null-space compliance. IEEE Transactions on Robotics 2013; 30: 493-506
  • 18 Peternel L, Kim W, Babič J, Ajoudani A. Towards ergonomic control of human-robot co-manipulation and handover in 2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids). IEEE; 2017: 55-60
  • 19 Kim W, Lorenzini M, Balatti P, Nguyen PD, Pattacini U, Tikhanoff V, Peternel L, Fantacci C, Natale L, Metta G. et al. Adaptable workstations for human-robot collaboration: A reconfigurable framework for improving worker ergonomics and productivity. IEEE Robotics & Automation Magazine 2019; 26: 14-26
  • 20 Li Y, Ge SS. Human-robot collaboration based on motion intention estimation. IEEE/ASME Transactions on Mechatronics 2013; 19: 1007-1014
  • 21 Li Y, Tee KP, Chan WL, Yan R, Chua Y, Limbu DK. Continuous role adaptation for human–robot shared control. IEEE Transactions on Robotics 2015; 31: 672-681
  • 22 Riener R, Duschau-Wicke A, König A, Bolliger M, Wieser M, Vallery H. Automation in rehabilitation: How to include the human into the loop in World Congress on Medical Physics and Biomedical Engineering, September 7-12, 2009. Munich, Germany: Springer; 2009: 180-183
  • 23 Quere G, Hagengruber A, Iskandar M, Bustamante S, Leidner D, Stulp F, Vogel J. Shared control templates for assistive robotics in 2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE; 2020: 1956-1962
  • 24 Aguirre-Ollinger G, Colgate JE, Peshkin MA, Goswami A. Active-impedance control of a lower-limb assistive exoskeleton in 2007 IEEE 10th International Conference on Rehabilitation Robotics. IEEE; 2007: 188-195
  • 25 Hirche S, Buss M. Human-oriented control for haptic teleoperation. Proceedings of the IEEE 2012; 100: 623-647
  • 26 Tonin L, Leeb R, Tavella M, Perdikis S, Millán J.d.R. The role of shared-control in BCI-based telepresence in 2010 IEEE International Conference on Systems, Man and Cybernetics. IEEE; 2010: 1462-1466
  • 27 Tobergte A, Konietschke R, Hirzinger G. Planning and control of a teleoperation system for research in minimally invasive robotic surgery in 2009 IEEE International Conference on Robotics and Automation. IEEE; 2009: 4225-4232
  • 28 Sadeghian H, Barkhordari M, Kamranian Z, Jafarpisheh MS. Robotic needle positioning based on ct-scan images: Constrained admittance realization. Journal of Medical Robotics Research 2020; 5: 2150001
  • 29 Johannsmeier L, Gerchow M, Haddadin S. A framework for robot manipulation: Skill formalism, meta learning and adaptive control in 2019 International Conference on Robotics and Automation (ICRA). IEEE; 2019: 5844-5850
  • 30 Tröbinger M, Costinescu A, Xing H, Elsner J, Hu T, Naceri A, Figueredo L, Jensen E, Burschka D, Haddadin S. A dual doctor-patient twin paradigm for transparent remote examination, diagnosis, and rehabilitation. IEEE/RSJ International Conference on Intelligent Robots and Systems 2021; 6: 5857-5864
  • 31 Gealy DV, McKinley S, Yi B, Wu P, Downey PR, Balke G, Zhao A, Guo M, Thomasson R, Sinclair A, Cuellar P, McCarthy Z, Abbeel P. Quasi-direct drive for low-cost compliant robotic manipulation in 2019 International Conference on Robotics and Automation (ICRA). IEEE; 2019: 437-443
  • 32 Lawrence DA. Stability and transparency in bilateral teleoperation. IEEE transactions on robotics and automation 1993; 9: 624-637
  • 33 Alonso V, De La Puente P. System transparency in shared autonomy: A mini review. Frontiers in neurorobotics 2018; 12: 83
  • 34 Miller CA. The risks of discretization: what is lost in (even good) levels-of-automation schemes. Journal of Cognitive Engineering and Decision Making 2018; 12: 74-76

Correspondence Address

Hamid Sadeghian

Publication History

Article published online:
23 May 2022

© 2022. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonDerivative-NonCommercial-License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/).

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

