Abstract

Due to the enormous amount of information carried over online systems today, no user can access all of it. To help users, all major online platforms deploy information retrieval (content recommendation, search, or ranking) systems to surface important information. Current information retrieval systems have to make certain design choices. For example, news recommendation systems need to decide how to assess the quality of recommended news stories, and how much emphasis to give to a story's long-term importance over its recency or freshness. Similarly, recommendation systems over user-generated content (e.g., in social media platforms like Facebook and Twitter) need to take into account content posted by heterogeneous user groups. However, such design choices can introduce unintended biases in the content presented to users. For example, the recommended content may have poor quality or low news value, or the news discourse may get hijacked by hyper-active demographic groups. In this thesis, we aim to systematically measure the effect of such design choices in content recommendation systems, and to build alternative recommendation systems that mitigate the biases in the recommendation output.
