
Ethically Sound Technology? Guidelines for Interactive Ethical Assessment of Personal Health Monitoring

Elin Palm, Anders Nordgren, Marcel Verweij and Göran Collste

Linköping University Post Print

N.B.: When citing this work, cite the original article.

Original Publication:

Elin Palm, Anders Nordgren, Marcel Verweij and Göran Collste, Ethically Sound Technology? Guidelines for Interactive Ethical Assessment of Personal Health Monitoring, 2013, Interdisciplinary Assessment of Personal Health Monitoring, 105-114.

http://dx.doi.org/10.3233/978-1-61499-256-1-105

Copyright: IOS Press

http://ebooks.iospress.nl/

Postprint available at: Linköping University Electronic Press


Ethically Sound Technology? Guidelines for interactive ethical assessment of personal health monitoring

Elin Palm (a), Anders Nordgren (a), Marcel Verweij (b) and Göran Collste (a)

(a) Linköping University
(b) Utrecht University

Introduction

The starting point for an interactive ethical assessment of novel Personal Health Monitoring (PHM) technologies is the question of how these technologies influence the realization of values and ethical principles central to health care. Checklists of values and ethical principles for novel technologies have been developed (cf. Palm and Hansson, 2005) and provide important input. However, in order to increase the relevance of the ethical analysis and obtain a nuanced picture of how PHM-technologies influence different stakeholders, the proposed interactive ethical assessment model supplements a traditional top-down approach with a bottom-up approach. The bottom-up approach includes interviews with relevant stakeholders – in particular health care personnel and patients.

The interactive ethical assessment methodology presupposes a social constructivist theory of technology. In contrast to technological determinism, social constructivists insist that new technology is the result of social interests, forces, and choices. The theory is both descriptive and constructive. It informs us that technologies are not neutral, but instead serve the interests of some institutions and social groups. The model carries constructive and normative implications. If we are aware of the fact that a technology is not set but may be shaped according to our needs and values, technological development becomes an ethical challenge. The process of technological construction is intimately connected to questions of what constitutes a good life and which values we ought to realize and why.

Technological development is typically difficult to predict – especially its long-term consequences (Palm and Hansson, 2005). Hence, anticipation must be organized as a regular activity, and according to this methodology the stakeholders are involved in the assessment on a regular basis. An ethical assessment raises questions concerning values, moral principles and norms such as autonomy and privacy. The assessment is constructive in the sense that it is integrated in the design and development of a new technology, and interactive in the sense that it involves different categories of stakeholders (Schot, 2001; Reuzel et al., 2001).

Within the PHM-project, three pilot investigations have been performed: (1) Careousel, a smart medicine management device (Palm, 2010), (2) Robot Giraff, an interactive and mobile communication device originally developed as a tool for video-conferencing (Palm, 2010; Palm, 2011), and (3) I-Care, care software that combines an alarm system with a register system. Three small-scale interview studies carried out within the PHM-project illustrate the bottom-up approach. However, the investigations conducted merely pilot an interactive methodology. A full-scale interactive ethical assessment would involve a broader representation of stakeholder categories, recurrent interviews and continuous influence on the technology development.

In brief, the interactive ethical assessment model is a diptych model that combines a bottom-up with a top-down perspective and thereby provides a richer understanding of the impact of PHM-technologies on ethical values than a traditional top-down model.


1. Objectives

An interactive ethical technology assessment aims to evaluate emerging technologies and to influence the development of technology at an early stage – preferably before the artifacts have reached the market. If aspects in need of modification are pointed out at the prototype stage, technology developers are more likely to alter the design than later on in the developmental process when a change is more inconvenient and costly (Palm and Hansson, 2006).

Conducted at an early stage of technology development or implementation, an interactive ethical assessment can identify concerned parties’ needs, interests and opinions. Based on such results, technology developers and those responsible for the introduction of certain technical solutions can adjust the technology, promote positive aspects of the technology and avoid negative ones as far as possible. An interactive ethical assessment can show who is affected by a technology and in what way, and may contribute to a fair distribution of benefits and burdens, as well as to allocating responsibility for dealing with risks and burdens. First and foremost, the aim of interactive ethical assessment of health technologies is to reveal implications and integrate health care values in emerging technologies. However, not only users benefit from interactive ethical technology assessments, but technology developers and care providers as well. Negative impact may be costly in many respects, not only from an economic perspective. Users’ well-being is central to any technology developer – and perhaps particularly so regarding health care technology.

A core idea in all forms of Technology Assessment (TA) is that technology should not be developed in isolation from the users. This model includes a broad range of stakeholders and is interactive in the sense that the stakeholders’ opinions should be considered throughout the development and implementation of technology that concerns them. Ideally, an interactive ethical assessment would be part of the whole chain of technology development, and all concerned parties should have their say and access to the opinions of others.

Importantly, the interactive empirical part of the ethical assessment by no means replaces the ethical analysis of the new technology. First, an ethicist will identify relevant values against the background of discussions in, for example, medical ethics and ICT-ethics. Values and principles that have proven to be relevant in relation to similar technologies are most likely of relevance for PHM-technologies as well. For instance, privacy, autonomy, liberty and justice have been identified as such core values in relation to ICT (Brey, 2000). Having identified these “classical” values and principles, the interactive part can inform the ethicist of how stakeholders interpret the different values and principles and how they see their relevance for the particular context.

The interactive part can provide the ethicist with information about which values are more and which are less important for the specific assessment. Responses from interviews can also give the ethicist new perspectives and insights for her ethical analysis. Furthermore, the empirical part will inform the ethicist of possible value conflicts related to the introduction of a new PHM-technology. The task for the ethicist is then to analyse the conflict and – when feasible – suggest a possible way out.

2. Development process

Guidelines suggested for interactive ethical assessment contain three parts: (1) a battery of ethical questions, (2) an ethical matrix and (3) ethical reflection. In the first part a battery of questions is presented. These questions cover different ethical aspects and possible ethical implications of new technologies for personal health monitoring. They are anchored in values and principles that have proven to be of relevance to other – similar – technologies, such as autonomy, privacy, dignity and justice.

They can be directed at different stakeholders: technicians, politicians, health care personnel and patients. The second part is an ethical matrix. In this matrix the Y-axis lists values of relevance for the ethical assessment and the X-axis lists possible stakeholders. The third part consists of a general ethical reflection.

New technologies are developed through different stages. In order to maximally influence the construction of a new technology, the interactive ethical assessment should ideally start already at the design and prototype stages (Palm and Hansson, 2005).

3. Users

Ideally, an interactive ethical assessment should involve all relevant stakeholders. Regarding health care technologies this includes patients, relatives, health care personnel and technology developers. It is vital that a person with ethical expertise is involved, preferably working together with the technology developers and health care personnel concerned.

4. Methods

The methods used must be adjusted to the specific technology under assessment, and the same holds for the stakeholders involved. Both questionnaires and interviews have been used within this project. Below, examples of questions from the three small-scale assessments conducted within the PHM-project are described. Notably, the questions vary in relevance for different stakeholders.
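As an illustration only (not part of the chapter’s guidelines), a question battery of this kind can be kept as data, so that the subset of questions relevant for one stakeholder group can be selected. In the sketch below the theme names and questions come from Guidelines 1, but the stakeholder tags are invented examples:

```python
# Hypothetical sketch: the question battery as data, so the subset of
# questions relevant for one stakeholder group can be selected.
# The stakeholder tags attached to each question are invented examples.

battery = {
    "need and functionality": [
        ("What is the aim of the technology?", {"technicians", "politicians"}),
        ("What need does the user have?", {"health care personnel", "patients"}),
    ],
    "autonomy": [
        ("Do you feel a need to adjust to the technology?", {"patients"}),
        ("Do you know what the system is doing and why?",
         {"patients", "health care personnel"}),
    ],
}

def questions_for(battery, stakeholder):
    """Collect the questions tagged as relevant for one stakeholder group."""
    return [q for theme in battery.values()
            for q, groups in theme if stakeholder in groups]

for q in questions_for(battery, "patients"):
    print(q)
```

A full version would cover every theme in Guidelines 1 and let the assessors decide, per assessment, which stakeholder groups each question targets.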

5. Guidelines 1: Questionnaire for ethical assessment

Questions related to need and functionality

• What is the aim of the technology?
• What problem does technology X solve?
• Is the aim laudable/reasonable?
• Is the technology a good/the best means given that aim?
• What need does the technology correspond to?
• What need does the user have?
• Does the technology correspond to this need?
• How should the technology be designed/implemented in order to correspond to the need?
• How is the technology marketed/launched?

Issues related to concepts and definitions

Questions pertaining to autonomy

• Have you been involved in the development and/or implementation of the technology? If so, have you been able to influence the design/usage of the technology? (sufficiently so?) (surveillance theory)
• Do you feel a need to adjust to the technology? Do you alter your behaviour when under surveillance?
• Do you know what the system is doing and why?
• Do you know how to control the system in different usage situations?
• Does the technology enhance independence? If so, is this a desirable development?

Questions pertaining to privacy

• What types of information, if any, do you consider intrusive? Why?
• Under what circumstances and why?
• Are certain ways of using the technology/applications considered privacy sensitive/invasive? If so, what ways/applications and why?
• Are you aware of how your data is processed?
• Are you aware of which persons/institutions have access to your data?
• Who has/should have access to the information generated by technology X and why?
• Are there any aspects of using technology X that you consider embarrassing?

Questions pertaining to freedom of choice

• To what extent do you perceive the usage of technology X as voluntary?
• What alternatives do you have to using technology X?
• How do you regard the alternatives to technology X?
• Does the technology have any impact on your freedom of choice with regard to activities? (enable/restrict/enhance)
• Does the technology raise demands on you to be “accessible at all times”?

Questions pertaining to informed consent

• How is the technology presented?
• In what way, if at all, did you consent to technology X?
• Were you sufficiently informed prior to your consent?
• Are you aware of how the technology functions?
• Are you aware of the consequences of your acceptance?
• Are there aspects of the technology/system that you are uncertain of or would like more information about?
• Would you like reminders regarding a ubiquitous/continuously active system?
• Do you have the chance to opt out?
• Do you feel that you are forced to consent?

Questions pertaining to quality of life and quality of care

• Does the technology increase the quality of life?
• Does the technology enhance or diminish your sense of control?
• Does the technology enhance or diminish your sense of involvement in the health care process?
• Does the technology enhance or diminish the quality of care?
• Does technology X reduce/enhance bad conscience among kin persons/informal caregivers?
• Does technology X reduce/enhance stress among care personnel, kin persons/informal caregivers?

Questions pertaining to trust, safety and security

• Do you feel safer with the technology than before it was implemented?
• Does the perceived safety correspond with the actual security/reliability?
• Is the reliability sufficient for the aim the technology should fulfil?

Questions pertaining to human contact patterns

• Does technology X make you feel isolated?
• Does the technology imply more frequent contacts (real/virtual)?
• Does the technology facilitate (real/virtual) contacts?
• What impact, if any, does the technology have on the relation care giver – care recipient? (enhanced/diminished closeness/presence/isolation/trust)
• What kind of contact do you prefer – real/virtual?
• Does the technology increase isolation?

Questions pertaining to justice

• Can all citizens be granted equal access to technology X?
• To what extent is the technology distributed fairly?
• To what extent can the technology be fairly distributed?
• Does it imply a “digital divide”/“information divide” – a gap between haves and have-nots?
• Can e-Inclusion be ensured in other than public services?
• Should the technology be used to even out inequalities stemming from “the natural lottery”?

Questions pertaining to assumptions

• Would you prefer independent/social living?
• Would you prefer invasive/non-invasive technology?

6. Guidelines 2: An ethical matrix

Distinguishing different steps in a technology project clarifies the role of ethics within the project. A project consists of four steps:

(1) investigation of the perceived needs of the target group of users by an interview study,

(2) investigation of the technological resources available to meet these needs,

(3) investigation of the perceived needs and the technological resources from an ethical point of view in order to find ways to avoid, accommodate or “solve” ethical problems and ethical dilemmas (or “risks”),

(4) development of a technical product appropriate for the target group, taking the results of the previous three steps into consideration.

The ethical matrix consists of values to be realised (for example, satisfaction of needs and preferences, privacy, independence, social contact, cost-effectiveness) and stakeholders (for example, the patient/client, health care professionals, the county council, the municipality, and technology developers). However, the matrix is only a tool for identifying ethical problems. To be useful in clarifying, accommodating or solving ethical problems the values outlined in the matrix must be specified and balanced regarding specific stakeholders in specific contexts.

Table 1: An ethical matrix

Stakeholders (columns): Patient/client, Relatives, Friends, Health care professionals, Social care professionals, County council, Municipality, Private care provider, Technology developers.

Values to be realised (rows):
• Satisfaction of individual needs and preferences (utility)
• Privacy (of data)
• Privacy by confidential human handling of technically transferred data
• Privacy by respect for a private sphere (local, physical, mental)
• Security and safety
• Autonomy/Freedom of choice
• Informed consent
• Social relations: patients–personnel, other
• A good balance of independence and social contact
• Robustness
• Easy to use
• Affordable price
• Cost-effectiveness
• Ownership of data
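The matrix in Table 1 is essentially a grid of value–stakeholder cells. A minimal sketch (hypothetical, not from the chapter) of how such a grid could be kept as a data structure during an assessment, with cells filled in as concerns are identified:

```python
# Hypothetical sketch: the ethical matrix as a value-by-stakeholder grid.
# The row and column labels follow Table 1; the recorded concerns are
# invented examples, not findings from the PHM-project.

values = [
    "Satisfaction of individual needs and preferences (utility)",
    "Privacy (of data)",
    "Autonomy/Freedom of choice",
]
stakeholders = ["Patient/client", "Health care professionals",
                "Technology developers"]

# Each cell holds the context-specific concerns for one value-stakeholder
# pair; cells start empty and are filled in during the assessment.
matrix = {v: {s: [] for s in stakeholders} for v in values}

matrix["Privacy (of data)"]["Patient/client"].append(
    "Who may access the monitoring data, and under what circumstances?"
)
matrix["Autonomy/Freedom of choice"]["Patient/client"].append(
    "Is continued use of the device genuinely voluntary?"
)

def open_issues(matrix):
    """Return (value, stakeholder) pairs with at least one recorded concern."""
    return [(v, s) for v, row in matrix.items()
            for s, cell in row.items() if cell]

for v, s in open_issues(matrix):
    print(f"- {v} / {s}")
```

The point of the structure is simply that every concern is indexed by both a value and a stakeholder, which supports the specification and balancing of values for specific stakeholders in specific contexts described above.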

7. Guidelines 3: General reflection

In this way, the matrix offers input for an overall concluding ethical reflection, which can take various forms. One traditional format is that an ethicist writes an advisory report in which the technology is discussed, applying both the normative framework (top-down) and the matrix (bottom-up) to specify ethical arguments and problems. Alternatively, or complementarily, reflection can take the form of a structured discussion among stakeholders, prepared by an ethicist. If problems are well-defined, it is often the stakeholders themselves who are best positioned to decide how the problems can best be avoided or accommodated.

8. Areas of application

Preferably, interactive ethical technology assessments of PHM-technologies should be conducted where the technologies are to be used – in the course of the patients’ everyday life. This may mean in their homes or dwelling places, or during certain activities outside the home. Living labs have been established all over Europe, and communities such as the European Network of Living Labs (ENoLL) have been formed with the ambition to, as far as possible, test novel ideas and prototypes in natural user environments.1 For example, ExCITE is a three-year European project recently established to evaluate socially interactive robotics, in particular robots developed for elderly care. Central ambitions within this project are to involve users and integrate their perspectives on the technology, hence accommodating user needs.2 Despite the many living labs and test beds around Europe, however, few of them currently conduct tests on PHM-technologies.

9. Conclusions

Technology has a strong potential to shape society and our lives. In many ways technology can empower us and facilitate our lives, but it may also alter conditions and behaviours in unforeseen and unwanted ways. Health care technology may alter the conditions of a rather vulnerable group of people – individuals with care needs. Thus, health care technology deserves particular attention. What is an ethically sound technology? A possible answer is a technology that realizes important values and norms and fulfils the aims of the involved stakeholders more efficiently than alternative technologies. In order to achieve this goal, emergent technologies should be ethically assessed and the stakeholders’ views and values must be known. Health care is basically a moral practice and thus, values of health care should be embedded in health care technologies.

1 http://www.openlivinglabs.eu/
2 http://www.oru.se/excite

This chapter has described some ways to assess emergent PHM-technologies from an ethical point of view. Interviewing stakeholders and reflecting on different stakeholders’ values and interests are some ways to put an ethical assessment into practice. The results provide a basis for an ethical evaluation of PHM-technologies.

References (extended list)

[1] R. Bemelmans, J.G. Gelderblom, P. Jonker & L. De Witte, Socially Assistive Robots in Elderly Care: A Systematic Review into Effects and Effectiveness, Journal of the American Medical Directors Association (2010).

[2] J. Broekens, M. Heerink & H. Rosendal, Assistive social robots in elderly care: a review, Gerontology 8(2) (2009), 94-103.

[3] E. Burns & E. Haslinger-Haufmann, Evaluation of the nursing diagnosis “social isolation” and the use of evidence-based nursing, Pflege 21(1) (2008), 25-30.

[4] S. Carretero, J. Garces, F. Rodenas & V. Sanjose, The informal caregivers’ burden of dependent people: theory and empirical review, Arch Gerontol Geriatr 49 (2009), 74-79.

[5] G. Collste, Under my Skin: The Ethics of Ambient Computing for Personal Health Monitoring, in S. Nagy Hesse-Biber (ed.), The Handbook of Emergent Technologies in Social Research, Oxford University Press, Oxford, 2011.

[6] G. Collste & M. Verweij, Personal Health Monitoring and Human Interaction, The American Journal of Bioethics 12(9) (2012), 47-48.

[7] M. Decker, Caregiving robots and ethical reflection: the perspective of interdisciplinary technology assessment, AI & Society 22 (2008), 315-330.

[8] H. Draper & T. Sorell, Telecare, remote monitoring and care, Bioethics (2012, advance publication online), DOI: 10.1111/j.1467-8519.2012.01961.x.

[9] A. Dunér & M. Nordström, The roles and functions of the informal support networks of older people who receive formal support: A Swedish qualitative study, Ageing and Society 27 (2007), 67-86.

[10] A. Essén, The two facets of electronic care surveillance: An exploration of the views of older people who live with monitoring devices, Social Science & Medicine 67(1) (2008), 128-136.

[11] A. Fex, From Novice Towards Self-Care Expert – Studies of self-care among persons using advanced medical technology at home, Doctoral thesis, Division of Nursing Science, Department of Medical and Health Sciences, Faculty of Health Sciences, Linköping University, Sweden, 2010.

[12] P.B. Koff, R.H. Jones, J.M. Cashman, N.F. Voelkel & R.W. Vandivier, Proactive integrated care improves quality of life in patients with COPD, European Respiratory Journal 33 (2009), 1031-1038.

[13] L. Kuokkanen & H. Leino-Kilpi, Power and empowerment in nursing: three theoretical approaches, Journal of Advanced Nursing 31(1) (2000), 235-241.

[14] M.W. Martin & R. Schinzinger, Ethics in Engineering (4th ed.), McGraw-Hill, Boston, 2005.

[15] H. Nissenbaum, Privacy as contextual integrity, Washington Law Review 79(1) (2004), 119-158.

[16] A. Nordgren, The web-rhetoric of companies offering home-based personal health monitoring, Health Care Analysis 20(2) (2012), 103-118.

[17] A. Nordgren, Remote Monitoring or Close Encounters? Ethical Considerations in Priority Setting Regarding Telecare, Health Care Analysis (2012), DOI: 10.1007/s10728-012-0218-z.

[18] E. Palm, När vården flyttar hem till dig – den mobila vårdens etik [When care moves into your home – the ethics of mobile care], Etikk i praksis 2/2010, 71-92.

[19] E. Palm, Who cares? Moral obligations in formal and informal care provision in the light of ICT-based home care, Health Care Analysis (2012), http://www.springerlink.com/content/nk4602n376811025/.

[20] E. Palm, An interactive ethical assessment of surveillance-capable software within the home-help service sector, Journal of Information, Communication & Ethics in Society (forthcoming).

[21] E. Palm & S.O. Hansson, The Case for Ethical Technology Assessment (eTA), Technological Forecasting and Social Change, 2005.

[22] R. Reuzel, G. van der Wilt, H. ten Have & P. de Vries Robbe, Interactive Technology Assessment and Wide Reflective Equilibrium, Journal of Medicine & Philosophy 26(3) (2001), 245.

[23] J. Schot, Constructive Technology Assessment as Reflexive Technology Politics, in Technology and Ethics: A European Quest for Responsible Engineering, Peeters, Leuven, 2001.

[24] A. Sharkey & N. Sharkey, Granny and the robots: ethical issues in robot care for the elderly, Ethics and Information Technology, 2010 (online).
