Abstract

With new technologies presenting ever more ways to encounter war, this article asks about one in particular: representations of veterans powered by artificial intelligence. Specifically, I focus on the Virtual Veterans chatbot – named Charlie – produced by Anzac Square, the state war memorial of Queensland, Australia. Shared on the social media platform X, this feature has garnered significant public attention. I argue that AI representations of this nature delimit the boundaries of war by discussing certain topics and not others. They possess a martial ontology that locates certain subjects and issues outside of ‘war’ and others within it. Further, they produce an epistemology of war that presents a state-sanctioned history as ‘first hand’, lending it significant authority and engendering vicarious militarism. In so doing, these chatbots reproduce existing ideas of nation, gender, racial identity and more.

