Abstract

This paper describes a web content filtering system that aims to block offensive material using distributed agents. The proposed system uses the Fuzzy C-Means (FCM) algorithm together with page features (title, metadata, and warning message) to classify candidate websites into two types: white, considered acceptable, and black, containing harmful material, taking English pornographic websites as a case study.
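As a rough illustration of the clustering step described above (not the authors' implementation), the sketch below applies Fuzzy C-Means to simple numeric page features and derives a white/black assignment from the memberships. The feature vectors, keyword counts, and all names here are hypothetical assumptions.

    # Minimal FCM sketch, assuming each page is reduced to a few numeric
    # features (e.g. keyword hits in title, metadata, and a warning-message flag).
    import numpy as np

    def fuzzy_c_means(X, c=2, m=2.0, max_iter=100, tol=1e-5, seed=0):
        """Cluster the rows of X into c fuzzy clusters; returns (centers, memberships)."""
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        U = rng.random((n, c))
        U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
        for _ in range(max_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            dist = np.fmax(dist, 1e-10)              # avoid division by zero
            inv = dist ** (-2.0 / (m - 1.0))
            U_new = inv / inv.sum(axis=1, keepdims=True)
            if np.linalg.norm(U_new - U) < tol:
                U = U_new
                break
            U = U_new
        return centers, U

    # Hypothetical feature vectors per page: [title hits, metadata hits, warning flag]
    pages = np.array([
        [0.0, 1.0, 0.0],
        [5.0, 7.0, 1.0],
        [6.0, 8.0, 1.0],
        [0.0, 0.0, 0.0],
    ])
    centers, U = fuzzy_c_means(pages, c=2)
    labels = U.argmax(axis=1)                        # crisp white/black assignment
    print(labels)

Which cluster index corresponds to "black" would be decided by inspecting the cluster centers (e.g. the cluster with higher keyword counts).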

Highlights

  • In recent years the World Wide Web (WWW) has become an almost infinite information repository, and people have become more dependent on the Internet in their daily lives, whether for searching information, communication, e-commerce, e-mail, etc.

  • There are drawbacks, such as the offensive material that can be found on websites; more than 4,200,000 such websites existed in the world in 2013, i.e. 12% of the websites in the world.

  • To address the problem of web content filtering, several strategies have been used. Some use a packet-filtering approach, which is concerned with IP addresses; however, an IP address represents a particular host, and that host can serve more than one site, some of which are acceptable, so blocking the IP blocks all the acceptable sites as well [3]. In addition, the IP access control list is generated manually, which requires great human effort. Others use white/black lists of resources, classifying sites as white or black, respectively.


Summary

INTRODUCTION

In recent years the World Wide Web (WWW) has become an almost infinite information repository, and people have become more dependent on the Internet in their daily lives, whether for searching information, communication, e-commerce, e-mail, etc. The words "sex" and "porn" rank fourth and sixth among the top ten most popular search terms [2], and this number may increase over the years. Such sites are considered harmful for children and even for adults and can cause side effects, so a system that can filter these websites is necessary, especially in homes, schools, universities, etc. To address the problem of web content filtering, several strategies have been used. Some use a packet-filtering approach, which is concerned with IP addresses; however, an IP address represents a particular host, and that host can serve more than one site, some of which are acceptable, so blocking the IP blocks all the acceptable sites as well [3]. In addition, the IP access control list is generated manually, which requires great human effort. Others use white/black lists of resources, classifying sites as white or black, respectively (a sketch contrasting the two approaches follows this introduction). This paper is organized as follows: the introduction, a section on page features, a section explaining the proposed system, a section explaining the implementation, and a section on conclusions and experimental results.
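The sketch below illustrates the over-blocking problem mentioned above: blocking at the IP level removes every site sharing that host, while a URL-level white/black list decides per site. This is only an illustration, not the authors' system; the host-to-site mapping, list contents, and names are hypothetical.

    # Hedged comparison of IP-level blocking vs. white/black-list filtering.
    from urllib.parse import urlparse

    blacklist = {"badsite.example"}       # manually curated black list (hypothetical)
    whitelist = {"school.example"}        # manually curated white list (hypothetical)
    blocked_ips = {"203.0.113.10"}        # packet-filtering style: block the whole host

    sites_on_host = {                     # one IP can serve many sites (virtual hosting)
        "203.0.113.10": ["badsite.example", "school.example", "news.example"],
    }

    def allowed_by_ip(host_ip, site):
        # IP-level blocking: every site on a blocked IP is lost, acceptable or not.
        return host_ip not in blocked_ips

    def allowed_by_lists(url):
        # URL-level lists: per-site decisions, but the lists must be maintained by hand.
        host = urlparse(url).hostname
        if host in whitelist:
            return True
        return host not in blacklist

    for site in sites_on_host["203.0.113.10"]:
        print(site,
              allowed_by_ip("203.0.113.10", site),
              allowed_by_lists(f"http://{site}/"))

Running this shows every site on the blocked IP rejected by the IP rule, while the list-based check rejects only the black-listed site, which is the trade-off the paper's feature-based classification tries to improve on.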

PAGE FEATURE FOR CLASSIFICATION
THE PROPOSED SYSTEM
THE FUZZY C-MEANS
Classifier Agent
Administrator Agent
Updating Agent
Filtering Agent
IMPLEMENTATION
Findings
CONCLUSION AND EXPERIMENTAL RESULTS
