Article, 2024
A scientometric analysis of fairness in health AI literature
PLOS Global Public Health, ISSN 2767-3375, Volume 4, Issue 1, Page e0002513
DOI: 10.1371/journal.pgph.0002513
Contributors
Alberto, Isabelle Rose I
0000-0002-7206-4770
[1]
Alberto, Nicole Rose I
0000-0001-9166-8134
[1]
Altinel, Yuksel
[2]
Blacker, Sarah
0000-0002-6146-8972
[3]
Binotti, William Warr
0000-0001-6761-8807
[4]
Celi, Leo Anthony G
0000-0001-6712-6626
[5]
[6]
[7]
Chua, Tiffany
0000-0003-0664-6606
[8]
Fiske, Amelia
0000-0001-7207-6897
[9]
Griffin, Molly Elizabeth
0000-0002-1615-2645
[7]
Karaca, Gulce
0000-0001-5211-0680
[10]
Mokolo, Nkiruka
[11]
Naawu, David Kojo N
[11]
Patscheider, Jonathan
(Corresponding author)
[12]
Petushkov, Anton
0000-0003-3661-236X
[13]
Quion, Justin Michael
0009-0009-6844-3047
(Corresponding author)
[14]
Senteio, Charles R
0000-0002-0254-3127
[15]
Taisbak, Simon
[16]
Tirnova, İsmail
0000-0003-4488-1607
[17]
Tokashiki, Harumi
0000-0002-7307-1884
[18]
Velasquez, Adrian
[18]
[19]
Yaghy, Antonio N
0000-0002-5054-495X
[4]
Yap, Keagan
[6]
Affiliations
- [1]
University of the Philippines Manila
[NORA names:
Philippines; Asia, South];
- [2]
Sağlık Bilimleri Üniversitesi (University of Health Sciences)
[NORA names:
Turkey; Asia, Middle East; OECD];
- [3]
York University
[NORA names:
Canada; America, North; OECD];
- [4]
Tufts Medical Center
[NORA names:
United States; America, North; OECD];
- [5]
Massachusetts Institute of Technology
[NORA names:
United States; America, North; OECD];
- [6]
Harvard University
[NORA names:
United States; America, North; OECD];
- [7]
Beth Israel Deaconess Medical Center
[NORA names:
United States; America, North; OECD];
- [8]
University of San Francisco
[NORA names:
United States; America, North; OECD];
- [9]
Technical University of Munich
[NORA names:
Germany; Europe, EU; OECD];
- [10]
Massachusetts General Hospital
[NORA names:
United States; America, North; OECD];
- [11]
Meharry Medical College
[NORA names:
United States; America, North; OECD];
- [12]
Trust Stamp Denmark, Copenhagen, Denmark
[NORA names:
Denmark; Europe, EU; Nordic; OECD];
- [13]
University of Michigan–Ann Arbor
[NORA names:
United States; America, North; OECD];
- [14]
University of the East Ramon Magsaysay Memorial Medical Center
[NORA names:
Philippines; Asia, South];
- [15]
Rutgers, The State University of New Jersey
[NORA names:
United States; America, North; OECD];
- [16]
Inviso by Devoteam, Aarhus, Denmark
[NORA names:
Denmark; Europe, EU; Nordic; OECD];
- [17]
Başkent University
[NORA names:
Turkey; Asia, Middle East; OECD];
- [18]
Carney Hospital
[NORA names:
United States; America, North; OECD];
- [19]
Brown University
[NORA names:
United States; America, North; OECD]
Abstract
Artificial intelligence (AI) and machine learning are central components of today's medical environment. The fairness of AI, i.e., its ability to be free from bias, has repeatedly come into question. This study investigates the diversity of the members of academia whose scholarship poses questions about the fairness of AI. Articles combining the topics of fairness, artificial intelligence, and medicine were selected from PubMed, Google Scholar, and Embase using keywords. Eligibility screening and data extraction were done manually and cross-checked by another author for accuracy. Articles selected for further analysis were cleaned and organized in Microsoft Excel; spatial diagrams were generated using Tableau Public, and additional graphs were generated using Matplotlib and Seaborn. Linear and logistic regressions were conducted in Python to measure the relationships between funding status, number of citations, and the gender demographics of the authorship team. We identified 375 eligible publications, including research and review articles concerning AI and fairness in healthcare. Analysis of the bibliographic data revealed an overrepresentation of authors who are white, male, and from high-income countries, especially in the roles of first and last author. Papers whose authors were based in higher-income countries also tended to be cited more often and published in higher-impact journals. These findings highlight the lack of diversity among the authors in the AI fairness community whose work gains the largest readership, potentially compromising the very impartiality that the AI fairness community is working towards.
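The analytic workflow summarized in the abstract (a cleaned spreadsheet of eligible articles, linear and logistic regressions in Python, and descriptive plots in the Matplotlib/Seaborn style) can be illustrated with a minimal sketch. The file name and column names below (fairness_ai_articles.csv, prop_female_authors, funded, citation_count) are hypothetical placeholders rather than the study's actual variables; this is a sketch of the kind of analysis described, not the authors' code.

# Minimal sketch of the regression and plotting steps described in the abstract.
# All file and column names are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
import statsmodels.api as sm

# Hypothetical cleaned extraction sheet (one row per eligible article).
df = pd.read_csv("fairness_ai_articles.csv")

# Example predictor: share of authors coded as women on each paper.
X = sm.add_constant(df[["prop_female_authors"]])

# Logistic regression: authorship gender composition vs. funding status (0/1).
logit_fit = sm.Logit(df["funded"], X).fit()
print(logit_fit.summary())

# Linear regression: authorship gender composition vs. citation counts.
ols_fit = sm.OLS(df["citation_count"], X).fit()
print(ols_fit.summary())

# Descriptive Seaborn plot comparing citation counts by funding status.
sns.boxplot(data=df, x="funded", y="citation_count")
plt.xlabel("Funded (0 = no, 1 = yes)")
plt.ylabel("Number of citations")
plt.tight_layout()
plt.show()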
Keywords
AI literature,
Embase,
AI fairness community,
Google,
Google Scholar,
Matplotlib,
Microsoft,
Microsoft Excel,
PubMed,
Python,
academia,
accuracy,
analysis,
analysis of fairness,
article,
artificial intelligence,
authors,
authorship,
authorship team,
bias,
bibliographic data,
central component,
citations,
community,
countries,
cross-check,
data,
data extraction,
demographics,
diagram,
diversity,
eligibility,
eligible publications,
environment,
excellence,
extraction,
fairness,
findings,
funding,
funding status,
gender,
gender demographics,
graph,
health,
healthcare,
high-income countries,
higher-income countries,
impact,
impact journals,
impartiality,
intelligence,
journals,
keywords,
lack,
lack of diversity,
learning,
literature,
logistic regression,
machine,
machine learning,
male,
medical environment,
medicine,
members of academia,
overrepresentation,
paper,
publications,
questions,
readership,
regression,
relationship,
research,
review,
review article,
scholars,
scholarship,
scientometric analysis,
Seaborn,
spatial diagrams,
status,
study,
Tableau Public,
team,
today's medical environment,
topics
Data Provider: Digital Science