The Spy who Loved and Nursed Me

Assistance robot Robear by Riken. Image: screenshot from a Riken video

Robots and AI Systems in Healthcare from the Perspective of Information Ethics


Robots in the health sector are important, valuable innovations and supplements. As therapy and nursing robots, they take care of us and come close to us. In addition, other service robots are widespread in nursing and retirement homes and hospitals. With the help of their sensors, all of them are able to recognize us, to examine and classify us, and to evaluate our behavior and appearance. Some of these robots will pass on our personal data to humans and machines. They invade our privacy and challenge our informational autonomy. This is a problem that the institutions and the people involved need to solve.

1. Introduction

Robots are spreading in healthcare (Becker et al. 2013). Surgical robots such as the Da Vinci Surgical System are omnipresent, and therapy robots are common. Numerous prototypes of nursing robots exist, and one model or another is already being used experimentally (Bendel 2018a). In addition, service robots of all kinds are penetrating care, therapy and assisted living.

This article first presents the types of robots in the health sector. Surgical robots are a special case - they are telerobots and are therefore omitted. Also omitted are exoskeletons and high-tech prostheses, because they are machines that are attached to humans and controlled by them, and sex robots, because they have little importance in this context (Bendel 2018c; Levy 2007). The focus is, therefore, on more or less autonomous machines.

For selected examples, the author presents their technical possibilities, e.g., the extension of robots with the help of sensors and artificial intelligence. Based on this description, questions from information ethics are highlighted, especially with regard to privacy (Calo 2011) and informational autonomy (Kuhlen 2004), whereby the author refers back to the types and thus to the areas of application. For this purpose, he uses findings from the literature.

Furthermore, the author adds his own considerations, especially using the dialectical method (the application of ethical terms to existing practice). Finally, possible solutions from different areas are proposed and a summary of the results is provided.

2. The Perspective of Information Ethics

Ethics is a millennia-old discipline of philosophy. It goes back to Aristotle and was heavily influenced by Immanuel Kant and Jeremy Bentham, who stand for two of the most well-known normative models, deontology and consequentialism, more specifically utilitarianism (Pieper 2007).

Practical or applied ethics comprises the specific areas of ethics that specialize in selected fields. Information ethics is concerned with the morality of the information society (Kuhlen 2004; Floridi 2015; Bendel 2016). It examines how we behave or should behave in moral terms when offering and using information and communication technologies (ICT), information systems and new media. In a sense, it is at the centre of the other specific fields of applied ethics, whose representatives must communicate with it (Bendel 2016; Bendel 2013b).

Medical ethics deals with morality in medicine (Hope and Dunn 2018). It examines moral thinking and behavior in relation to the treatment of human diseases and the promotion of health, and asks what is morally desirable and wanted when dealing with human diseases and health as well as with aging and the dying. Because the health sector is integrating more and more information technology, medical ethics is required to cooperate closely with information ethics (Bendel 2013b).

One can argue similarly with regard to business ethics (including corporate ethics and consumer ethics) (Göbel 2010) when it is confronted with the phenomenon of digitalization, which transforms the economy and society. The purpose of this article is to argue, above all, from the perspective of information ethics and to use its terms and methods. It has to be taken into account that, in more intensive studies, medical ethics and business ethics must also be involved, as well as technology ethics with its very wide field of research. There are several publications on robots and care and therapy assistance systems that consider these different perspectives (Becker et al. 2013; Kollek 2013; Salichs et al. 2016; Santoni de Sio and van Wynsberghe 2016).

Ethics has been dealing with privacy for a long time. Together with other disciplines, it shows how privacy, or its concept, changes over time and what it means for human beings and being human. It asks how the loss and gain of privacy can be classified in a moral sense. The sacrifice of privacy seems to be a problem of the information society, considering what some people divulge in social networks and communication services and how they make their private life more or less public in virtual space (Bendel 2016). This brings us back to the discipline of information ethics.

Informational autonomy (Kuhlen 2004; Bendel 2016) is a central concept of ethics. It is related to the concept of informational self-determination, which is also a legal term. Examples of informational autonomy are being able to decide on the creation and dissemination of one's personal data, and being able to view and change it or have it deleted. Informational autonomy also applies when the user understands how personal data is collected, processed and interpreted and can take appropriate action. Overall, there are relations with the freedom of access to information and the right to be forgotten (Mayer-Schönberger and Ramge 2017).

The use of robots can be a matter of privacy (Calo 2011; van Wynsberghe 2016; Subramanian 2017) and informational autonomy alike. The system (to which a caregiver, a therapist or some other interested person may be connected) penetrates physically into the private sphere, occupies space, takes up attention and resources, collects and processes data, and calls into question the informational autonomy of the person concerned.

Privacy is a broad concept, and informational autonomy is no less so; their points of contact concerning the invasion of privacy lie where a data subject and a data problem exist.

3. Types of Robots in the Health Sector

Below, three different types that are relevant in the health sector are presented; the technologies already mentioned (like exoskeletons) are ignored. Therapy robots and nursing robots belong to the category of specialized service robots and are designed and programmed for their respective application areas. Interestingly, service robots that actually have nothing to do with healthcare are also being used, some of them extended through programming. These are referred to here as assistant robots, where this term also designates a subset of service robots and can even include therapy and nursing robots. In each category, examples are singled out and their technical possibilities are presented. The list is not complete, but the examples are among the best known in this area, as a literature analysis has shown (Becker et al. 2013; Bendel 2018a).

3.1 Therapy Robots

Therapy robots support therapeutic measures or apply them themselves, often as autonomous machines (Sedenberg 2016; Bendel 2018a; Bendel 2015; Bendel 2013a). They do exercises with the injured and paralyzed, entertain the elderly and challenge demented or autistic persons with questions and games. Some have facial, gestural and linguistic abilities and are - to a certain extent - capable of thinking and learning (if one wants to apply these terms to computer systems). Advantages include possible savings and reusability; disadvantages are possible adverse effects in therapy and a lack of acceptance among relatives.

The artificial seal Paro, which has been in use for years in therapy facilities and nursing homes around the world and was designed especially for people with dementia, is well known even among non-affected groups (Bendel 2018; Glende et al. 2016). It understands its name (or the name you give it), remembers how well or badly it is treated and how often it has been stroked, and expresses its feelings (which it does not really have, of course) through sounds and movements. It can whimper, prop itself up on its flippers and raise its head. Paro is not mobile and has sensors for brightness, temperature and sound.

Also well known is Keepon by BeatBots, a yellow robot that is supposed to monitor and improve the social interaction of autistic children. It is now available on the mass market - probably because it looks funny, likes to be tickled and can dance. Keepon is very small, not mobile, and equipped with a microphone, cameras and touch sensors.

The humanoid Milo, a variant of Zeno (a product of Hanson Robotics), is also aimed at autistic children. It is as big as a toddler, can walk and talk, and has facial and gestural skills. A touchscreen and a camera sit in its chest. It has eight microphones that enable it to determine, through localization, who exactly is addressing it.

3.2 Nursing Robots

Care and nursing robots complement or substitute human caregivers (Bendel 2018a; Bendel 2013a; van Wynsberghe 2016). They supply those in need of care with the necessary medicines, food and garments, help them to lie down and sit up, and transfer them to another bed. They entertain patients and provide auditory and visual interfaces to human carers. Some of them have natural language skills that can in principle cover several languages, which may be relevant in this context, and are, to a certain extent, intelligent, able to memorize or capable of learning. The advantages are continuous applicability and consistent quality of service. Disadvantages are cost intensity and a lack of human contact.

Care-O-bot 4 from Fraunhofer IPA is a mobile system that can fetch and take away things (Bendel 2015). Originally conceived for the health sector, it can now often be seen (without arms and under the name Paul) in shopping malls in Switzerland and Germany. Care-O-bot 4 has a loudspeaker, microphones for speech recognition and cameras for face and gesture recognition. Via its display, it can show different moods, for example through facial expressions in a virtual face. It is a networked system.

Robear (previous versions RIBA and RIBA-II) by Riken looks like a bear, as its name promises, and is almost as heavy, large and strong as such an animal. It works in tandem with the caregiver and assists in rebedding and lifting patients. It can roll around and features torque sensors and tactile sensors, plus cameras and a microphone.

F&P Robotics has developed P-Rob. It has an arm and a hand with two fingers, like classical cooperation and collaboration robots in industry (Bendel 2018b). The gripper has sensors. It can fulfil its function in both nursing and therapy. Lio, another product of this company, has a camera for person and object recognition, a sensitive casing with touch sensors and an interactive display for communication.

3.3 Assistant Robots

Assistant robots have support, information, navigation, entertainment and game functions. Just as Siri, Cortana and Google Assistant help their owners or users, these robots help with all kinds of tasks. The difference is that they are physically present and partially mobile (Siri is also mobile in a certain way, because she provides her services on the smartphone, but, as a virtual assistant, she does not herself move through the physical world). In the given context, assistant robots inform patients, entertain them and show them movements. One advantage is that they are generalists, which in turn is a disadvantage, because certain tasks can only be handled by specialists.

A robot of this type that is used all over the world is Pepper by SoftBank. It is a companion robot that lives in households in Japan. It is, as its name suggests, a family member, friend and companion. But it is not limited to this role: Pepper appears increasingly in schools and universities, as well as in nursing and retirement homes and hospitals, where it informs and entertains patients and elderly people.

Pepper is as big as a child, rolls around, and has two arms and two hands with five highly movable fingers. It speaks with a cute robot voice. A touchscreen is located in its chest. Pepper has four microphones, two HD cameras and a distance sensor, and is WiFi capable. It can identify certain persons with the help of face recognition, and it masters the analysis of facial expressions and gestures as well as voice analysis. Thus, it is able to detect emotions and respond accordingly.

4. Ethical Questions

Below, an ethical discussion is conducted, starting from the concrete technical equipment, drawing on a literature analysis (Denning et al. 2009; Calo 2011; van Wynsberghe 2016; Subramanian 2017), on the terms listed above and on heightened awareness, in a dialectical process (it is tested whether the practice falls under a term or violates the concept). The focus is thus on reality, and arbitrary speculation can be avoided. Several robots have cameras, partly connected with face recognition and the recognition of facial expressions and gestures; others have microphones, sometimes associated with speech and voice recognition. Different systems are networked, connected with the cloud or with internal and external systems. In addition, a few are mobile. Thus, four technical categories can be identified, which go hand in hand with ethical challenges.

4.1 Optical Detection and Analysis

Cameras allow still pictures and moving images of the visible world. 2D and 3D techniques are available, and with HD (in the future with UHD), the result is high-resolution images, in which one can zoom in and in which people, along with their external features, and objects can be seen clearly. Paro's optical sensors, at least, allow a distinction between light and dark areas. Thermal imaging cameras could open up more opportunities in the future.

Nursing and retirement homes, hospitals and private homes are associated with buildings that have walls and ceilings, which protect against environmental influences but also against foreign, prying eyes. One's own home is the actual place of privacy; in the care and health facility, privacy is already inevitably disrupted, which makes it necessary to knock on the door and call out to residents and patients to warn and prepare them, so that one does not get a certain image of them. Thus, the presence of cameras of this type is per se an invasion of privacy.

The cameras are mainly intended to let the robot move safely through the area and to recognize the right people and objects, and recognize them again, for pragmatic purposes. The cameras can also be connected to nurses, therapists and doctors who want to get an idea of the condition of the patient (Sedenberg 2016). This may be absolutely desirable, and a virtual knocking is possible, so that the persons concerned can prepare themselves appropriately. On the other hand, it is possible that an authorized or an unauthorized person activates the cameras at an awkward moment and watches people. If these persons are naked or engaged in intimate activities, their privacy is threatened.

Face recognition allows the identification of persons. As mentioned earlier, this can be important for care and treatment. Face recognition also helps determine age and gender - this does not seem problematic in the given context - and in principle health insights can be gained that allow medical judgements to be made in automated form. One can also fathom the emotional state - this is emotion recognition, in this case particularly the recognition of facial expressions and gestures. This, in turn, means an invasion of privacy, because it is precisely within one's own four walls that one throws off the mask one wears in public, and a sick person or a person undergoing treatment may have special emotions such as hatred or sadness, which they would like to hide. Informational autonomy is threatened not only by the processing, analyzing and disseminating of data (Subramanian 2017), but also by the fact that interactions, transactions and algorithms are at work here which remain unknown or whose functions and consequences a layperson can hardly understand.
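To make tangible how much derived personal data even a simple vision pipeline produces, a minimal Python sketch is added here. It is purely illustrative and does not reproduce the software of Paro, Pepper or any other product mentioned above; the matching routine, the threshold and the attribute names are assumptions. The point is that every inferred field becomes personal data in the sense discussed above as soon as it is linked to an identified person.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional
import math

@dataclass
class DerivedProfile:
    """What a hypothetical on-robot vision pipeline infers from one frame."""
    person_id: Optional[str]      # result of face recognition
    estimated_age: Optional[int]  # inferred, possibly wrong
    emotion: Optional[str]        # result of facial-expression analysis

def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def identify(embedding: List[float],
             face_db: Dict[str, List[float]],
             threshold: float = 0.8) -> Optional[str]:
    """Match a face embedding against a local database of known residents."""
    best_id, best_score = None, 0.0
    for person_id, reference in face_db.items():
        score = cosine(embedding, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

# Usage: the robot recognizes a resident and attaches inferred attributes.
face_db = {"resident_042": [0.10, 0.90, 0.30]}
profile = DerivedProfile(person_id=identify([0.12, 0.88, 0.31], face_db),
                         estimated_age=83, emotion="sad")
print(profile)
```

Even this toy pipeline links an identity to inferred attributes such as age and emotional state, which is exactly the combination that threatens informational autonomy.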

It is clear that cameras support the mobility of the robot, increase the safety of the operation and of the supervised persons, and help ensure their health (to the extent possible). The price is a loss of informational autonomy.

4.2 Auditory Detection and Analysis

Microphones allow "snapshots" and process logs in the world of voices and sounds (Bendel 2018a; Bendel 2018c). Some technologies, known from virtual assistants such as Alexa, enable the perception of distant or very quiet sounds (far-field technology). A microphone is also important so that the robot can receive commands, and in conjunction with the loudspeakers and a synthetic voice, it is the basis for communication with the supervised person, the nurse and the therapist. The interaction can also have a therapeutic purpose.

If the devices are used in situations and environments that are classified as trustworthy and protected, there are risks to privacy and data protection (Subramanian 2017; Sedenberg 2016; Bendel 2018a). In non-public or semi-public spaces, one speaks differently, one communicates differently, one has private and trade secrets, and the disparity between the ability of devices to permanently record, store and share and the need for privacy, protection of personality and secrecy is great.

When personal data or personal information (from which concrete individuals can be identified) is extracted, the voice, diction and content (and possibly the language) are equally implicated (Subramanian 2017). At the level of content, life data, opinions and ideologies are collected using speech recognition. If the operator evaluates the data or discloses them to others (e.g., a mediator), informational autonomy is at risk (Bendel 2014).

The area of lost health, existing disease or advancing age is particularly sensitive. One is no longer the person one once was; one behaves differently, speaks differently, makes noises that bother oneself or others. The area of sexuality can also be regarded as serious - e.g., moaning on the vocal level or, on the semantic level, words of endearment or dirty talk, combined with the special situation or limited awareness of the person concerned (which is probably recognizable through sound, voice or speech).

Voice recognition, which is possible with the help of the microphones and special software, is again an invasion of privacy and a threat to informational autonomy. It serves to recognize a voice and hence to identify a person, or to analyze the characteristics of a person's voice and diction. With its help, as with face recognition, emotion recognition is possible. All in all, one can obtain comprehensive information, e.g., about age, gender, health status and emotional state. If, in addition, the manner of speech is parsed, one can find out about educational background, social skills, state of mind, etc.

Under the constant pressure that one feels when one is in a corresponding mental and emotional state, the customs and habits of all those affected change. One lowers one's voice, omits details, avoids punch lines and barbs, private language and words of endearment, and belching and farting. One might eventually avoid having contacts and discussions with people, knowing well that no exchange remains hidden.

Some devices can detect whether someone is in a given place, what position the people are in, whether they are sitting or lying down, and even where exactly they are (a peculiarity of Milo). Thus, further inferences about behaviors and attitudes are possible, and adjustments are likely to be made by the persons concerned.
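How a robot can locate a speaker acoustically deserves a brief illustration. The sketch below uses the textbook far-field approximation for the time difference of arrival between two microphones; it is a generic example, not Milo's actual localization method, and the numbers are made up.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second at room temperature

def direction_of_arrival(delay_seconds: float, mic_distance_m: float) -> float:
    """Estimate the angle (in degrees, relative to the array's broadside)
    from which a sound arrives, given the time delay between two microphones.
    Far-field approximation: delay = mic_distance * sin(angle) / c."""
    ratio = SPEED_OF_SOUND * delay_seconds / mic_distance_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))

# Example: a 0.2 ms delay across microphones 10 cm apart
print(direction_of_arrival(0.0002, 0.10))  # roughly 43 degrees
```

With several microphone pairs, such angle estimates can be intersected, which is why an array of microphones suffices to tell not only that someone is speaking but roughly where that person is.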

Auditory systems are obviously necessary for controlling the robots and for simple, successful communication. Privacy and informational autonomy are nevertheless at risk.

4.3 Networking

Certain possibilities and problems arise from the networking of the robot, its connection with the cloud, with other systems such as databases, search engines and AI systems, and with other robots. The robot is often not a local system that keeps secrets, but one that propagates them to known and unknown persons, groups and servers. There is a danger that these data are processed, distributed and stored by other systems (Subramanian 2017). Last but not least, attackers - humans and machines - can access the robots through the network. Malware can lodge itself in the robot and pull further or different personally identifiable data.

Networking may bring the world into one's own home. It is important to make the robot powerful and to connect it with people who can help in an emergency, but this is not without risk, because one does not know what is behind the possibly familiar creature. Through networking, privacy is, so to speak, disassembled and hollowed out from within.

Personal autonomy gains from networking, whereas informational autonomy loses. Networking carries the risk that one has little chance of finding the personal data and information about oneself at all in order to delete them, because they are in multiple locations, or that one cannot delete them at all, for reasons of competence or logistics. Here again, one has to bear in mind that the people concerned are old, weak and sick.

4.4 Mobility

More opportunities and challenges result from mobility (Hertzberg et al. 2012). Paro moves its flippers and its eyelids, but not itself. P-Rob and Lio have a limited range of motion and cannot leave their place; in their case, too, one cannot speak of mobility. But, after all, they can take different perspectives and can probably contemplate the patient from back to front and from top to bottom. Care-O-bot, on the other hand, rolls around autonomously, and Milo walks around slowly. This means that they can follow a person and suddenly come to meet them.

Privacy is shrinking even more, and one can only save it by shutting the door behind one's back. But first one has to be able to do that, and a bedridden person is, in this respect, at the mercy of the mobile robot. In general, there may be a mismatch between the mobility of the robot and the mobility of the patient.

Here, it becomes very clear that personal autonomy can be strengthened while informational autonomy can be weakened. The robot becomes the extended arm, the mobile factor of an immobile existence, but it is, at the same time, the ubiquitous monster one must live with.

5. Solution Approaches

Until now, starting from the technical conditions, the ethical perspective has been consistently taken. The focus was on elaborating the risks, not on solutions (opportunities were mentioned only marginally, since privacy and informational autonomy are the concern here). Solutions can arise from an ethical perspective, but must then lead to technical, organizational and legal implementations. These are hinted at below.

An ethical approach to solving the problems lies in the discursive method. The various parties (such as the robot manufacturer, the IT service provider, the hospital, the nursing or retirement home, the patient and his or her relatives) are brought together; their representatives disclose their interests and all work together on a solution in which privacy is better protected and acceptance is increased (Glende et al. 2016). One can undertake a specific information-ethical discourse and take the corresponding steps together (Kuhlen 2004).

The terms and concepts of information ethics, and of ethics in general, can be discussed, negotiated and adjusted, which leads to the dialectical method (Pieper 2007). In their normativity, they are subject to the spirit of the times and to social change. Accordingly, a new understanding is not determined by individual commissions and groups, but by an overall societal discourse. This discourse needs to be established on several levels and in different forums.

One can also take a closer look at the models of normative ethics. As mentioned at the beginning, deontology and consequentialism are two common models that have different influence in different cultures (in the health sector, for example, Great Britain favors teleological approaches, whereas Germany prefers deontological approaches). The question is whether in this area one should pay more attention to duties or focus more on the consequences. Is a directive on data protection to be seen, ethically (not legally) speaking, in an absolute manner, or do the consequences have to be looked at case by case?

Hospitals, nursing homes and homes for the elderly can be conceived as places of retreat, which implies organizational and spatial solutions. A robot does not always have to be present; it can be turned off or locked out, or robot-free zones can be created so that patients can retire if they are physically able to do so. Even a robot quota could be considered. In the case of assisted living, especially in one's own home, there is, on the one hand, a higher authority, and on the other hand, there is often less space.

Jurisprudence, computer science and robotics should work on standards, norms and laws, together with standardization organizations and politics, with the aim of guaranteeing not only operational and personal safety but also privacy and data protection. Under current U.S. law, personal data stored by robots is accessible to government agents under different standards and processes depending on how it is legally classified (Sedenberg 2016). Certainly, adjustments need to be considered here in order to better protect the patient. In Europe, there was an important change in 2018: the General Data Protection Regulation (GDPR) entered into force, granting citizens a right to be forgotten and more freedom of access to information (Voigt and von dem Bussche 2018). Although the new law is discussed primarily in terms of websites and corporate and government data processing, it also has implications for robotics and AI.
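What the right to be forgotten could mean at the level of a single robot is sketched below. The example assumes a simple local store of recordings and derived profiles, keyed by the data subject; the class and field names are illustrative and not taken from any real product.

```python
from datetime import datetime, timezone

class LocalDataStore:
    """Toy in-memory store standing in for a robot's local recordings
    and derived profiles, keyed by the data subject's identifier."""
    def __init__(self):
        self._records = {}      # subject_id -> list of stored records
        self._erasure_log = []  # minimal proof that a request was honored

    def add(self, subject_id: str, record: dict) -> None:
        self._records.setdefault(subject_id, []).append(record)

    def erase_subject(self, subject_id: str) -> int:
        """Handle an erasure request (GDPR Art. 17): delete everything
        stored about the subject and log only the fact of deletion."""
        removed = len(self._records.pop(subject_id, []))
        self._erasure_log.append({
            "subject_id": subject_id,
            "records_removed": removed,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
        return removed

store = LocalDataStore()
store.add("resident_042", {"type": "audio", "emotion": "sad"})
print(store.erase_subject("resident_042"))  # 1 record removed
```

The hard part in practice is not this local deletion but, as discussed above, tracking down the copies that networking has already scattered across other systems.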

Starting from machine ethics and robotics, further proposals may be submitted (Anderson and Anderson 2011; Bendel 2015; Bendel 2014). In this way, the robot can make it clear that it is a machine and can point out the risks that emanate from it. Through its available forms of communication, the audio system and the tablet or touchscreen, it indicates dangers, possibly in big letters, in simple language and with easily understandable icons. The models of normative ethics can be made fruitful for the implementation of morality in the machine (which thereby turns into a moral agent) - in addition to deontology and consequentialism, maybe virtue ethics as well.

Furthermore, privacy-preserving data-sharing mechanisms are needed, so that data can be collected and distributed ethically (Sedenberg 2016). Privacy by design is an approach that can be particularly effective for robots in the health sector (Schaar 2010). On the one hand, it must be ensured that the robot itself is data-protection friendly. Privacy by default is a central term here. Alternative sensors must also be taken into account, such as thermal imaging cameras instead of normal cameras. On the other hand, it must be possible for certain interest groups to adjust the settings. For example, it may be necessary to protect persons of public interest in particular.
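Privacy by default can be expressed quite literally in configuration. The following sketch shows one conceivable set of shipping defaults in which every data-hungry feature is opt-in; the setting names are invented for illustration and do not correspond to any existing robot.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PrivacySettings:
    """Conservative shipping defaults: every data-hungry feature is opt-in."""
    face_recognition_enabled: bool = False
    emotion_analysis_enabled: bool = False
    audio_streaming_to_staff: bool = False
    cloud_upload_enabled: bool = False
    retention_days: int = 0                      # 0 = keep no recordings at all
    robot_free_hours: Tuple[int, int] = (22, 6)  # robot stays out 22:00-06:00

def relax(settings: PrivacySettings, **overrides) -> PrivacySettings:
    """Conscious, named opt-in: only explicitly listed settings change."""
    for key, value in overrides.items():
        if not hasattr(settings, key):
            raise AttributeError(f"unknown setting: {key}")
        setattr(settings, key, value)
    return settings

# A caregiver enables face recognition for identification, nothing else.
cfg = relax(PrivacySettings(), face_recognition_enabled=True)
print(cfg)
```

The design choice is that relaxing a default requires an explicit, named action, which also makes it documentable who agreed to what.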

In principle, a high level of IT security and data security is important in this area. Encryption and deletion of data, as well as the preference for or rejection of certain server locations, are options. In addition, a fundamental questioning of cloud solutions and networked systems is appropriate. Especially in therapy, there is often no need for such approaches. However, if the patients are in a very poor condition and need to be protected against themselves or against the environment, an alternative is difficult. Admittedly, this is not only a problem of robotics: smartwatches, intelligent bracelets and other gadgets that monitor one's health, determine one's whereabouts and, if necessary, raise the alarm should also be taken into account in this problem area.
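Encrypting care data before it leaves the robot is one of the technical options mentioned above. The sketch below uses the third-party Python package cryptography (its Fernet interface) and is a minimal example, not a complete security concept; in particular, real key management is omitted.

```python
# Requires the third-party package "cryptography" (pip install cryptography).
from cryptography.fernet import Fernet

# The key remains with the robot or the facility's key management,
# never with the cloud service that may store the ciphertext.
key = Fernet.generate_key()
cipher = Fernet(key)

note = b"resident_042: refused medication, seemed agitated"
ciphertext = cipher.encrypt(note)           # safe to store or transmit
print(cipher.decrypt(ciphertext).decode())  # readable only with the local key
```

Deleting the key afterwards renders all stored ciphertext unreadable, which is one way to implement the deletion of data mentioned above (sometimes called crypto-shredding).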

In the end, special attention must be paid to the role of the patient. He or she is the interest group that matters most. With an advance directive, he or she can exclude certain robots and certain uses of robots. He or she can also provide information about the desired use of the data. Perhaps the patient would like to make his or her data available to research and the general public (Sedenberg 2016). This decision must be respected and supported - it just cannot be made a general duty. It is also important for him or her to understand what the sensors are capable of. This requires a transparent design (Schafer and Edwards 2017) and sufficient information. The patient's condition and background must also be taken into account.
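A patient's wishes regarding robots and data use could also be recorded in machine-readable form, so that the robot and the institution can check them automatically. The structure below is one illustrative encoding, not a standardized format; all field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PatientDataDirective:
    """Illustrative machine-readable declaration of a patient's wishes."""
    patient_id: str
    allow_nursing_robot: bool = False
    allow_camera_in_room: bool = False
    share_with_research: bool = False        # explicit opt-in, never a duty
    excluded_robot_types: List[str] = field(default_factory=list)

directive = PatientDataDirective(
    patient_id="resident_042",
    allow_nursing_robot=True,
    excluded_robot_types=["assistant_robot_with_camera"],
)
print(directive)
```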

6. Summary and Outlook

Robots in the health sector are important, valuable additions. As therapy and nursing or care robots, they come close to us. Assistant robots are also spreading in nursing and retirement homes, hospitals and homes for the elderly, as well as in private households. Using their sensors, they can recognize us, examine us, classify us, and evaluate our behavior and appearance. Some pass the data on to humans and machines. They invade our privacy and call our informational autonomy into question.

Such robots can improve personal autonomy: the persons concerned can have things brought and taken away, they can be fed and turned in bed, and they are no longer dependent on their fellow humans in all situations. However, informational autonomy is at risk, and this in an area where privacy has a special meaning. In addition, we may already know certain robots, such as Pepper, and may have put our confidence in them, only to be disappointed with some likelihood.

Finally, possible solutions were outlined at an ethical, legal, technical and organizational level. Some quite interesting approaches were found that can be used in practice. Ultimately, however, the use of robotics in the health sector poses a dilemma that is difficult to solve.

References

Anderson, M.; and Anderson, S. L. eds. 2011. Machine Ethics. Cambridge: Cambridge University Press.

Becker, H.; Scheermesser, M.; and Früh, M. et al. 2013. Robotik in Betreuung und Gesundheitsversorgung. TA-SWISS 58/2013. Zürich: vdf Hochschulverlag.

Bendel, O. 2018a. Roboter im Gesundheitsbereich: Operations-, Therapie- und Pflegeroboter aus ethischer Sicht. In Bendel, O. ed. 2018. Pflegeroboter. Wiesbaden: Springer Gabler.

Bendel, O. 2018b. Co-robots from an Ethical Perspective. In Dornberger, R. ed. 2018. Information Systems and Technology 4.0: New Trends in the Age of Digital Change. Cham: Springer International Publishing. 275 - 288.

Bendel, O. 2018c. SSML for Sex Robots. In Cheok, A. D.; and Levy, D. eds. 2018. Love and Sex with Robots. Third International Conference, LSR 2017, London, UK, December 19-20, 2017, Revised Selected Papers. Cham: Springer International Publishing. 1 - 11.

Bendel, O. 2016. 300 Keywords Informationsethik: Grundwissen aus Computer-, Netz- und Neue-Medien-Ethik sowie Maschinenethik. Wiesbaden: Springer Gabler.

Bendel, O. 2015. Surgical, Therapeutic, Nursing and Sex Robots in Machine and Information Ethics. In van Rysewyk, S. P.; and Pontier, M. eds. 2015. Machine Medical Ethics. Series: Intelligent Systems, Control and Automation: Science and Engineering. Berlin, New York: Springer. 17 - 32.

Bendel, O. 2013a. Dr. Robot entdeckt die Moral: Maschinen- und Menschenethik im Gesundheitsbereich. IT for Health, 02/2013, 2 - 4.

Bendel, O. 2013b. Die Medizinethik in der Informationsgesellschaft: Überlegungen zur Stellung der Informationsethik. Informatik-Spektrum, 36 (2013) 6, 530 - 535.

Calo, M. R. 2011. Robots and Privacy. In Lin, P.; Abney, K.; and Bekey, G. A. eds. 2011. Robot Ethics: The Ethical and Social Implications of Robotics. Cambridge: MIT Press. 187 - 202.

Denning, T.; Matuszek, C.; and Koscher, K. et al. 2009. A spotlight on security and privacy risks with future household robots: attacks and lessons. In Proceedings of the 11th international conference on ubiquitous computing. New York: ACM. 105 - 114.

Floridi, L. 2015. The Ethics of Information. Oxford: Oxford University Press.

Glende, S.; Conrad, I.; and Krezdorn, L. et al. 2016. Increasing the Acceptance of Assistive Robots for Older People Through Marketing Strategies Based on Stakeholder Needs. International Journal of Social Robotics (2016) 8, 355 - 369.

Göbel, E. 2010. Unternehmensethik: Grundlagen und praktische Umsetzung. Stuttgart: Lucius & Lucius.

Hertzberg, J.; Lingemann, K.; and Nüchter, A. 2012. Mobile Roboter: Eine Einführung aus Sicht der Informatik. Berlin and Heidelberg: Springer.

Hope, T.; and Dunn, M. 2018. Medical Ethics: A Very Short Introduction. Oxford: Oxford University Press.

Kollek, R. 2013. Ethik der Technikfolgenabschätzung in Medizin und Gesundheitswesen: Herausforderungen für Theorie und Praxis. In Bogner, A. ed. 2013. Ethisierung der Technik - Technisierung der Ethik: Der Ethik-Boom im Lichte der Wissenschafts- und Technikforschung, Baden-Baden: Nomos. 199 - 214.

Kuhlen, R. 2004. Informationsethik: Umgang mit Wissen und Informationen in elektronischen Räumen. Konstanz: UVK.

Levy, D. 2007. Love and Sex with Robots: The Evolution of Human-Robot Relationships. New York: Harper Perennial.

Mayer-Schönberger, V.; and Ramge, T. 2017. Das Digital: Markt, Wertschöpfung und Gerechtigkeit im Datenkapitalismus. Berlin: Econ.

Pieper, A. 2007. Einführung in die Ethik, 6. ed. Tübingen and Basel: A. Francke Verlag.

Salichs, M. A.; Encinar, I. P.; and Salichs, E. et al. 2016. Study of Scenarios and Technical Requirements of a Social Assistive Robot for Alzheimer’s Disease Patients and Their Caregivers. International Journal of Social Robotics (2016) 8, 85 - 102.

Santoni de Sio, F.; and van Wynsberghe, A. 2016. When Should We Use Care Robots? The Nature-of-Activities Approach. Science and Engineering Ethics (2016) 22 (6), 1745 - 1760.

Schaar, P. 2010. Privacy by design. Identity in the Information Society, Vol. 3 No. 2, 267 - 274.

Schafer, B.; and Edwards, L. 2017. "I spy, with my little sensor": fair data handling practices for robots between privacy, copyright and security. Connection Science, 2017 Vol. 29, No. 3, 200 - 209.

Sedenberg, E.; Chuang, J.; and Mulligan, D. 2016. Designing Commercial Therapeutic Robots for Privacy Preserving Systems and Ethical Research Practices Within the Home. International Journal of Social Robotics (2016) 8, 575 - 587.

Subramanian, R. 2017. Emergent AI, social robots and the law: Security, privacy and policy issues. Journal of International Technology and Information Management, 26 (3), 81 - 105.

van Wynsberghe, A. 2016. Service robots, care ethics, and design. Ethics and Information Technology (2016) 18, 311 - 321.

Voigt, P.; and von dem Bussche, A. 2018. EU-Datenschutz-Grundverordnung (DSGVO): Praktikerhandbuch. Berlin: Springer.