Abstract

At first glance, search-engine services seem useful and faultless, but closer examination reveals weaknesses in their results. One such weakness is that the result pages a search-engine offers are sometimes empty of content and sometimes irrelevant to the field the user had in mind, while many quality pages get no place in the results at all. This paper advises search-engines to hand the job of judging the content of web sites over to users, because humans have a far lower error rate and can judge the usefulness of a website more fairly. In the proposed algorithm, which is based on fuzzy logic, we use parameters such as mouse-movement speed, scrolling speed, the standard deviation of the horizontal mouse position, and the time the user spends on each page to evaluate the extent of the user's satisfaction with the page content. This paper describes the surveys conducted and then analyzes the fuzzy variables, fuzzy sets, and membership functions. Finally, it discusses the benefits of the proposed algorithm.
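The combination described above can be sketched as a small fuzzy evaluation. The membership breakpoints, units, and weights below are illustrative assumptions for the sketch, not the paper's actual values; a full Mamdani-style rule base is replaced here by a weighted average of membership degrees.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def satisfaction(mouse_speed, scroll_speed, horiz_std, time_on_page):
    """Combine the four behavioral inputs into a satisfaction score in [0, 1]."""
    # Degree to which each input suggests attentive reading (assumed ranges).
    slow_mouse  = tri(mouse_speed, 0, 50, 200)     # px/s: slow movement = attention
    slow_scroll = tri(scroll_speed, 0, 30, 150)    # px/s: slow scrolling = reading
    wide_sweep  = tri(horiz_std, 50, 200, 400)     # px: horizontal sweeps at eye level
    long_visit  = tri(time_on_page, 10, 120, 600)  # s: longer dwell = more relevance
    # Weighted average stands in for full fuzzy inference and defuzzification.
    w = (0.25, 0.2, 0.25, 0.3)
    return w[0]*slow_mouse + w[1]*slow_scroll + w[2]*wide_sweep + w[3]*long_visit
```

A page whose signals all fall near the assumed "attentive reading" peaks scores close to 1, while a page the user abandons immediately scores near 0.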

Highlights

  • Links within web pages have connected them like a spider's web

  • Relying on the fact that web pages on related subjects are usually connected by hyperlinks, search-engines start from one page and follow its links

  • In the part where users were expected to find what they had in mind, horizontal mouse movement was slower, which shows the user was reading that part more attentively. Another interesting finding was that as the text became more related to the subject, the standard deviation of the horizontal mouse position grew higher, meaning users moved the cursor horizontally at eye level while reading the text
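The two signals named in the highlights can be computed from a raw stream of mouse samples. The sketch below assumes each sample is a `(timestamp_s, x_px, y_px)` tuple; the sampling format is an assumption for illustration.

```python
import math

def mouse_metrics(samples):
    """Return (mean cursor speed in px/s, std dev of horizontal position in px)
    from a list of (timestamp_s, x_px, y_px) mouse samples."""
    xs = [x for _, x, _ in samples]
    mean_x = sum(xs) / len(xs)
    # Population standard deviation of the horizontal cursor position.
    horiz_std = math.sqrt(sum((x - mean_x) ** 2 for x in xs) / len(xs))
    # Mean speed over consecutive sample pairs.
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt > 0:
            speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    mean_speed = sum(speeds) / len(speeds) if speeds else 0.0
    return mean_speed, horiz_std
```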



Introduction

Links within web pages have connected them like a spider's web. Relying on the fact that web pages on related subjects are usually connected by hyperlinks, search-engines start from one page and follow its links. Because many web pages sit in the dark internet, the spider is unable to find them (Gaston L'Huillier, 2011). Another source of weakness in search-engine results is that their databases are not up to date. This paper advises search-engines to hand the job of judging the content of web sites over to users, because humans have a far lower error rate and can judge the usefulness of a website more fairly. This way, search-engines maintain user feedback instead of page analysis. By analyzing human eye movement, we can determine whether or not the user is reading the page and which part of the page has attracted the user most, in other words, where the user has found the answer to his question (Figure 1).
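The link-following crawl described above can be sketched as a breadth-first traversal from a seed page. `fetch_links` is a hypothetical stand-in for the real HTTP-fetch and link-extraction step; pages that no crawled page links to (the "dark" pages mentioned above) are never reached.

```python
from collections import deque

def crawl(seed, fetch_links, max_pages=100):
    """Visit pages breadth-first starting from seed; return visited URLs in order."""
    seen = {seed}
    queue = deque([seed])
    visited = []
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        visited.append(url)
        for link in fetch_links(url):  # outgoing hyperlinks of this page
            if link not in seen:       # pages never linked to stay invisible
                seen.add(link)
                queue.append(link)
    return visited
```

Running this on a toy link graph shows the limitation directly: a page with no incoming links from the crawled region is simply never visited.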

Background and Related Work
The Proposed Algorithm
Recording User Behavior
Results
Fuzzy Logic
Inference
Benefits of the Proposed Algorithm
Conclusion
References
