Research in Cardiovascular Medicine

LETTER TO EDITOR
Year: 2022  |  Volume: 11  |  Issue: 1  |  Page: 36-37

Clinical decision-making and personality traits; Achilles' heel of artificial intelligence


Ehsan Khalilipur1, Majid Chinikar2, Mehdi Mehrani3, Armin Elahifar1,  
1 Cardiovascular Intervention Research Center, Rajaie Cardiovascular Medical and Research Center, Iran University of Medical Sciences, Tehran, Iran
2 Cardiology Department, Arya General Hospital, Rasht, Iran
3 Tehran Heart Center, Tehran University of Medical Sciences, Tehran, Iran

Correspondence Address:
Dr. Armin Elahifar
2nd Floor, Research Building, Shahid Rajaei Research and Training Hospital, Next to Mellat Park, Vali-Asr Ave, Tehran
Iran




How to cite this article:
Khalilipur E, Chinikar M, Mehrani M, Elahifar A. Clinical decision-making and personality traits; Achilles' heel of artificial intelligence.Res Cardiovasc Med 2022;11:36-37


Available from: https://www.rcvmonline.com/text.asp?2022/11/1/36/341263





Dear Editor,

Personality is a critical part of human existence: it shapes all human behavior in both personal and social settings, can create significant conflict with a person's environment when incompatible traits are present, and can also lead to the best decision in a critical situation.[1]

Personality is defined as the individual differences in characteristic patterns of emotion, thought, and behavior. One of its important features is stability over time and across the situations each individual encounters. Indeed, personality traits have an important effect on academic performance, decision-making, and the activities of individuals.[2]

Personality traits are divided into “extraversion, openness to experience, agreeableness, conscientiousness, and neuroticism,” based on the five-factor model.[3]

As previous studies have demonstrated, conscientiousness and interpersonal motivation predict a rational processing style. Interpersonal motivation also shows mixed effects, with data indicating a correlation between conscientiousness and both experiential and rational processing styles.[3]

While it is said that “one must be born” into a certain medical specialization, many aspects of personality can be cultivated through adequate training. Increased levels of conscientiousness, agreeableness, and openness, together with decreased levels of neuroticism, may represent the optimal combination for professional satisfaction in the medical profession. Empathy promotes good patient–physician communication by increasing patient trust, compliance, and satisfaction. Emotional intelligence – the capacity to recognize and respond to one's own and others' emotions – has been shown to act synergistically with empathy to enhance empathic ability. Clinical communication skills are critical for medical students to develop in order to provide the best possible patient care, and they are undoubtedly related to and/or influenced by empathy, interprofessional collaboration abilities, emotional intelligence, and, most significantly, personality traits.[1]

While artificial intelligence is an excellent tool for improving treatment and diagnosis in medicine, relying on it alone can result in significant misdiagnosis and mismanagement of patients. Biases exist in artificial intelligence systems for two reasons:

Cognitive biases are unintentional errors of thought that impair individuals' judgments and decisions. These biases develop as a result of the brain's attempt to simplify the processing of environmental information. Psychologists have recognized and categorized over 180 human biases; one of them is the unrecognized influence of personality traits. Cognitive biases may infiltrate machine learning systems in one of two ways:

1. Designers introduce them into the model unintentionally.
2. The training set itself incorporates such biases.

Incomplete data: Incomplete data may not be representative and hence may contain bias. For instance, the majority of clinical research papers involve data from a specific region and a subset of the population that does not reflect the whole. Heart attack, for example, is overwhelmingly misdiagnosed in women, yet prediction models for cardiovascular disease that claim to predict heart attacks 5 years before they happen are trained on predominantly male datasets. As cardiovascular disease presents differently in men than in women, an algorithm trained predominantly on data samples from men may not be as accurate in diagnosing women.
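The mechanism described above can be illustrated with a minimal, purely synthetic sketch (all numbers and group sizes are invented for illustration, not drawn from any real cohort). A single-threshold "model" is fit on a 95% male training set in which disease shifts a biomarker strongly in men but weakly in women; the learned threshold then classifies the underrepresented group less accurately:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, disease_shift):
    """Synthetic patients: one biomarker whose disease-related shift
    differs between groups (mimicking sex-specific presentation)."""
    y = rng.integers(0, 2, n)                       # 0 = healthy, 1 = disease
    x = rng.normal(0.0, 1.0, n) + disease_shift * y
    return x, y

# Disease raises the biomarker strongly in "male" patients but weakly
# in "female" patients (hypothetical effect sizes).
x_m, y_m = make_group(5000, disease_shift=2.0)
x_f, y_f = make_group(5000, disease_shift=0.8)

# Training cohort is 95% male, mimicking a male-dominated dataset.
x_train = np.concatenate([x_m[:4750], x_f[:250]])
y_train = np.concatenate([y_m[:4750], y_f[:250]])

# "Model": the single cutoff that maximizes training accuracy.
thresholds = np.linspace(-2, 4, 601)
accs = [((x_train > t).astype(int) == y_train).mean() for t in thresholds]
t_best = thresholds[int(np.argmax(accs))]

# Evaluate the same cutoff separately on each group.
acc_male = ((x_m > t_best).astype(int) == y_m).mean()
acc_female = ((x_f > t_best).astype(int) == y_f).mean()
print(f"cutoff={t_best:.2f}  male acc={acc_male:.2f}  female acc={acc_female:.2f}")
```

Because the cutoff is tuned almost entirely to the male distribution, accuracy for the female group falls well below that for the male group, even though both groups are equally represented at evaluation time.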

It is crucial that bias is mitigated when developing and deploying artificial intelligence systems in medicine to prevent health-care inequality for particular patient groups and to ensure functionality that is safe for all patients.[4],[5]

As a result, we believe that thorough study of physicians' personality traits, and training informed by them, in the age of artificial intelligence might aid researchers in developing algorithms with less inherent bias for diagnosing and managing patients properly.

Ethical clearance

Nil.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

References

1. Suciu N, Meliț LE, Mărginean CO. A holistic approach of personality traits in medical students: An integrative review. Int J Environ Res Public Health 2021;18:12822.
2. El Othman R, El Othman R, Hallit R, Obeid S, Hallit S. Personality traits, emotional intelligence and decision-making styles in Lebanese universities medical students. BMC Psychol 2020;8:46.
3. Parker-Tomlin M, Boschen M, Glendon I, Morrissey S. Factors influencing health practitioners' cognitive processing and decision-making style. J Interprof Care 2019;33:546-57.
4. Norori N, Hu Q, Aellen FM, Faraci FD, Tzovara A. Addressing bias in big data and AI for health care: A call for open science. Patterns (N Y) 2021;2:100347.
5. Vokinger KN, Feuerriegel S, Kesselheim AS. Mitigating bias in machine learning for medicine. Commun Med (Lond) 2021;1:25.