Abstract

Given the controversies over the efficacy of algorithm transparency amid political conflicts among agencies, this study examined whether there are discrepancies between a user's perceived and received transparency and trust in Google's algorithm-based search engine. Drawing on the concepts of perceived transparency, actual transparency, and trust, this study conducted a moderated mediation analysis with bootstrapping to investigate (1) whether there are gaps between perceived transparency in Google Search and actual understanding of the system, i.e., received transparency, and (2) how such transparency gaps affect the extent of user trust in search engines. The main findings suggest that both received and perceived transparency are important for improving user trust in the search system. In particular, our results support a knowledge-based trust model in which greater trust is achieved through a process linking the perception of transparency with actual understanding of the system. This result challenges the industrial sector's aggressive logic of algorithm secrecy by implying the benefits of greater openness, thereby shedding light on a reciprocal relationship between industrial actors and end users.
