
Inthorn, J., Seising, R., & Tabacchi, M. (2015). Having the final say: Machine support of ethical decisions of doctors. In S. van Rysewyk & M. Pontier (Eds.), Machine Medical Ethics (pp. 181-206). Springer Verlag. [10.1007/978-3-319-08108-3]

Having the final say: Machine support of ethical decisions of doctors

TABACCHI, Marco
2015

Abstract

Machines that support highly complex decisions of doctors have been a reality for almost half a century. In the 1950s, computer-supported medical diagnostic systems started with "punched cards in a shoe box". In the 1960s and 1970s medicine was, to a certain extent, transformed into a quantitative science by intensive interdisciplinary research collaborations of experts from medicine, mathematics and electrical engineering. This was followed by a second shift in research on machine support of medical decisions, from numerical probabilistic to knowledge-based approaches. Solutions of the latter form came to be known as (medical) expert systems, knowledge-based systems research or Artificial Intelligence in Medicine. With the growing complexity of machines, physician-patient interaction can be supported in various ways. This includes not only diagnosis and therapy options but could also include ethical problems like end-of-life decisions. Here, questions of shared responsibility need to be answered: should machine or human have the last say? This chapter explores the question of shared responsibility mainly in ethical decision making in medicine. After addressing the historical development of decision support systems in medicine, the demands of users on such systems are analyzed. Then the special structure of ethical dilemmas is explored. Finally, this chapter discusses the question of how decision support systems c
Field INF/01 - Computer Science
Field M-FIL/02 - Logic and Philosophy of Science
Files in this record:
File: 10.1007%2F978-3-319-08108-3_12.pdf
Size: 400.38 kB
Format: Adobe PDF
Access: not available (copy on request)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/10447/96518
Citations
  • PMC: not available
  • Scopus: 6
  • Web of Science: 5