Abstract
This chapter emphasizes that to neutralize the risk of the Web being invaded, it must be nurtured and preserved. Viewing the Web as a living ecosystem that supplies information, a number of serious problems emerge from deliberate attacks: proxies and crawlers, which account for a fair proportion of overall Web traffic, and parasites or parasitic computing, which demonstrate both the Internet's vulnerability and its resilience to damage. Broken links, hardware failures in the Internet's communication infrastructure, and software bugs are examples of natural damage that affects the Web ecosystem. One way in which search engines can purify their results rests on the simple observation that reputable sites do not link to disreputable ones unless they have actually been infiltrated, which is rare. Search engines devise shields against the spammers' weapons. For example, in 2004 Google proposed a simple way to defeat guest-book spam: a new kind of link, marked "nofollow", that anyone can use. These links work in the normal way, except that search engines do not interpret them as endorsing the page they lead to. However, good sites can occasionally be deceived into linking to bad ones, so it is advantageous to keep inference chains short; if an error is discovered, it can be corrected immediately by hand-labeling the page.
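To make the short-chain idea concrete, here is a minimal Python sketch of trust propagation over a link graph, assuming the heuristic described above: trust flows outward from hand-labeled reputable seed sites, is attenuated at every hop, and is cut off after a few hops so that long inference chains, which are more likely to pass through a deceived site, carry little weight. The graph, seed set, decay factor, and hop limit are all illustrative assumptions, not any search engine's actual parameters.

```python
# A sketch of seed-based trust propagation with deliberately short chains.
from collections import deque

def propagate_trust(links, seeds, decay=0.5, max_hops=2):
    """Breadth-first trust propagation, truncated after max_hops links."""
    trust = {site: 1.0 for site in seeds}   # hand-labeled reputable seeds
    frontier = deque((site, 0) for site in seeds)
    while frontier:
        site, hops = frontier.popleft()
        if hops == max_hops:
            continue  # keep inference chains short
        for target in links.get(site, []):
            inherited = trust[site] * decay  # trust fades at each hop
            if inherited > trust.get(target, 0.0):
                trust[target] = inherited
                frontier.append((target, hops + 1))
    return trust

links = {
    "news.example": ["blog.example", "shop.example"],
    "blog.example": ["spamfarm.example"],  # a good site deceived into linking
}
print(propagate_trust(links, seeds={"news.example"}))
# spamfarm.example inherits only 0.25, being two hops from the seed;
# a discovered error is fixed by simply hand-labeling that page.
```

Because trust decays geometrically with distance, a deceived link near the end of a chain does little damage, and a single hand-applied label overrides anything the propagation inferred.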
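The nofollow mechanism can likewise be sketched from the search engine's side. The following is a hypothetical Python example of how a link extractor might separate ordinary links, which count as endorsements, from links marked rel="nofollow", which do not; the class name and the sample page are invented for illustration.

```python
# Sketch of a crawler-side extractor that honors rel="nofollow":
# such links still work normally for readers, but are excluded
# from the endorsement graph used for ranking.
from html.parser import HTMLParser

class EndorsementLinkExtractor(HTMLParser):
    """Collects outgoing links, separating endorsements from nofollow links."""

    def __init__(self):
        super().__init__()
        self.endorsed = []   # links that count as votes for the target page
        self.nofollow = []   # links explicitly marked as non-endorsements

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        # rel may hold several space-separated tokens, e.g. "nofollow noopener"
        rel_tokens = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel_tokens:
            self.nofollow.append(href)
        else:
            self.endorsed.append(href)

page = '''<p>See <a href="http://good.example/">a trusted source</a> and
<a href="http://spam.example/" rel="nofollow">a guest-book comment</a>.</p>'''

extractor = EndorsementLinkExtractor()
extractor.feed(page)
print("counted as endorsements:", extractor.endorsed)
print("ignored for ranking:", extractor.nofollow)
```

A site owner who marks every visitor-supplied link as nofollow removes the incentive for guest-book spam: the spammer's link still exists, but it no longer transfers any reputation.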