Table 7 Ethical issues identified (n = 218)

From: Ethical implications related to processing of personal data and artificial intelligence in humanitarian crises: a scoping review

| Ethical issues identified | Count (%) |
| --- | --- |
| Autonomy | |
| Lack of consent: Data is collected without informed consent | 91 (42%) |
| Data agency: People do not have the right to control, access, or delete their data | 50 (23%) |
| Participation: People/communities are not involved in decisions to use new/experimental technologies for collecting data | 50 (23%) |
| Undisclosed use: Data may be used beyond the purposes for which they were collected | 40 (18%) |
| Lack of respect: People/communities are not treated with respect | 37 (17%) |
| Autonomy: Unwillingness to share data leads to disadvantages (e.g., exclusion from assistance or protection) | 35 (16%) |
| Lack of group agency: Processed information is not available to affected communities | 8 (4%) |
| Any implication related to Autonomy | 146 (67%) |
| Beneficence | |
| Unreliability: Processed data is inaccurate and does not sufficiently reflect reality to inform assistance | 108 (50%) |
| Dependence: Data is processed with the assistance of a political, economic, or military entity | 107 (49%) |
| Non-neutrality: Data is processed in a way that benefits, or appears to benefit, one side of the conflict over the other | 67 (31%) |
| Ineffective or inefficient: Not producing the expected result; unmet expectations | 56 (26%) |
| Lack of action: Processed data is not utilized to inform assistance to the affected person/community | 42 (19%) |
| Any implication related to Beneficence | 191 (88%) |
| Non-maleficence | |
| Privacy: Personal/sensitive data is shared with third parties | 134 (61%) |
| Harm: People suffer physical or psychological harm as a result of data processing | 106 (49%) |
| Data security: Personal/sensitive data is not protected against malicious actors | 103 (47%) |
| Power imbalance: Data processing reinforces or worsens a lack of power of affected people | 93 (43%) |
| Excess: More data was collected than necessary | 28 (13%) |
| Redress/rectification: People do not have the ability to correct wrong information about them or receive compensation | 16 (7%) |
| Any implication related to Non-maleficence | 199 (91%) |
| Justice | |
| Bias: Data is processed in a way that may (dis)advantage some people disproportionately to their humanitarian needs | 122 (56%) |
| Unequal access to technology / exclusion from data collection | 75 (34%) |
| Lack of accountability: Endangering (or not protecting) rights; absolving responsibility | 52 (24%) |
| Unfair distribution of risks and benefits | 37 (17%) |
| Any implication related to Justice | 174 (80%) |