Using AI to Fight Human Trafficking: Ethical Considerations

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This study examines the extent to which current information systems (IS) research has investigated artificial intelligence (AI) applications to fight human trafficking, the theories and models employed to study these applications, the ethical concerns that these applications pose, and possible technical and managerial solutions to address such concerns. The investigation will be conducted through a systematic literature review and a semantic text similarity analysis. The results of our review will contribute to IS research by synthesizing previous human trafficking studies and proposing an agenda and future directions that orient IS studies toward information technologies that contribute to "a better world." This study will also help law enforcement authorities and organizations that develop and provide AI services by suggesting technical and managerial solutions that can minimize ethical concerns related to biases and discriminatory outcomes generated by AI used to combat human trafficking.

Original language: English
Title of host publication: 30th Americas Conference on Information Systems, AMCIS 2024
Publisher: Association for Information Systems
ISBN (Electronic): 9798331307066
State: Published - 2024
Event: 30th Americas Conference on Information Systems, AMCIS 2024 - Salt Lake City, United States
Duration: 15 Aug 2024 - 17 Aug 2024

Publication series

Name: 30th Americas Conference on Information Systems, AMCIS 2024

Conference

Conference: 30th Americas Conference on Information Systems, AMCIS 2024
Country/Territory: United States
City: Salt Lake City
Period: 15/08/24 - 17/08/24

Keywords

  • artificial intelligence
  • ethics
  • human trafficking
