New technologies are not inherently ethically good or bad; they can either enhance or violate fundamental rights and values, and their use can have far-reaching consequences. This is especially apparent in the medical and health sectors.
The rapid development of new technologies deployed in this sector has significantly improved our ability to diagnose and treat medical conditions and to advance the health and general well-being of people across varying contexts, including in the hospital and at home.
Accompanying the design and use of these technologies are critical ethical and privacy concerns, as well as opportunities, all of which call for accurate identification and thorough understanding.
AI tools to enhance medical practice
The ethical challenges facing the medical and health sectors do not remain on an abstract philosophical level. For example, some studies show that AI tools can diagnose certain cancers more accurately than human doctors.
This is a remarkable medical and technological breakthrough that can save lives and improve treatment. However, the black-box problem in AI (not fully understanding which features of the data an algorithm has used in its decision-making process) means that a deep learning algorithm that predicts the optimal treatment for a patient does not provide the reasoning behind that prediction. As a result, doctors may be wary of trusting its conclusions.
Furthermore, such systems could cause extensive harm; a radiologist who misreads a scan may harm one patient, but a flawed AI system in widespread use could harm many. Additionally, it is now well documented that machine learning algorithms may develop racial, socioeconomic, or gender biases if they have not been trained on a diverse set of data.
For example, IBM’s AI-based clinical decision support system, Watson for Oncology, was trained on a dataset consisting of US residents; however, it is principally used in Asia.
As a consequence, its agreement with experts’ treatment recommendations ranges from 49% to 83% across Korea, China, Thailand, and India.
Health tracking systems and privacy
Ethical and privacy issues for the medical and health sectors are not limited to clinical or hospital settings, and they are increasingly entangled with business, political, cultural, and, of course, legal matters. The recent development and deployment of tracking measures for positive Covid-19 cases raise immediate concerns about threats to individuals’ autonomy, privacy, and dignity.
Although tracking positive cases might help scientists understand the spread of the virus and warn healthy individuals that they may have come into contact with infected people, tracking individuals’ locations could be used for less benign purposes by governments or other organisations after the pandemic subsides.
Conceptual puzzles immediately surface even if we begin an assessment of an ethical issue in the medical or health sector with the four basic principles of biomedical ethics:
- respect for autonomy
- beneficence
- non-maleficence
- justice
Respect for autonomy
Is it ethically permissible to perform a blood transfusion on a child whose parents refuse such treatments on religious grounds?
Beneficence
Is everyone morally obligated to donate a kidney or bone marrow to a person in need? Or are these actions of generosity or charity—a sign of a virtuous character perhaps, but not a duty or obligation?
Beneficence might seem uncontroversial, but scholars disagree on how this principle generates moral duties that are obligatory for everyone.
Justice
Concepts of justice raise questions precisely because there is no single principle of justice. Does justice entail equality? Fairness? Must it be deserved? Are a country’s laws of justice applicable only to its own residents, or to all humans?
Improving health with technology
Despite such significant concerns, new technologies afford key ethical opportunities as well. The use of elder-care robots in Japan, as assistants or companions, addresses a critical shortfall of the workers needed to provide necessary care. Importantly, many elderly people who have interacted with care robots describe their experiences positively. In rural Pakistan, chatbots have been deployed to provide reproductive health and hygiene information to young women who otherwise lack access to such information and are often culturally shamed for seeking it out.
Trilateral offers expertise in ethics and privacy in the medical and health sectors. Contact our team for more information in this area.