Abstract

Machine learning is now applied in many areas: artificial intelligence, financial applications, bioinformatics, intellectual games, speech and text recognition, natural language processing, medical diagnostics, technical diagnostics, and text search and categorization. Machine learning is the field of scientific knowledge concerned with algorithms capable of learning. Machine learning methods are used because, for most complex intelligent tasks (for example, speech recognition), it is practically impossible to devise an explicit algorithm for solving them; a computer can, however, be taught to learn to solve such problems. In this article, we propose a model based on the Random Forest machine learning algorithm that recognizes bots from their HTTP sessions. The chosen algorithm offers several advantages: non-iterative learning; high quality of the resulting models (comparable to neural networks and ensembles of neural networks); a small number of tunable parameters; robustness to missing data (it retains good accuracy); an internal estimate of the model's generalization ability; and the ability to work with raw data without preprocessing. The algorithm was trained on a dataset of more than 5000 sessions. The prospects of this direction are clear, since robotic traffic accounts for more than 40% of total Internet traffic.
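As an illustration of the approach described above, the following is a minimal sketch of training a Random Forest to label HTTP sessions as bot or human. The feature names (request rate, inter-request gap, distinct URLs, error rate), thresholds, and synthetic data are assumptions for illustration, not the paper's actual dataset or feature set.

```python
# Hypothetical sketch: Random Forest bot detection from HTTP session features.
# Features and data distributions are illustrative, not from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 500

# Synthetic session features per row:
# [requests_per_minute, avg_inter_request_gap_s, distinct_urls, error_rate]
humans = np.column_stack([
    rng.normal(5, 2, n),       # a few requests per minute
    rng.normal(12, 4, n),      # long, irregular gaps between requests
    rng.normal(8, 3, n),       # few distinct URLs visited
    rng.normal(0.02, 0.01, n), # low error rate
])
bots = np.column_stack([
    rng.normal(60, 15, n),     # high request rate
    rng.normal(1, 0.3, n),     # short, regular gaps
    rng.normal(40, 10, n),     # crawls many URLs
    rng.normal(0.15, 0.05, n), # higher error rate
])
X = np.vstack([humans, bots])
y = np.array([0] * n + [1] * n)  # 0 = human, 1 = bot

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Classify one fast, regular session: such a profile resembles a bot.
session = [[55.0, 1.1, 35.0, 0.12]]
print(clf.predict(session)[0])
```

In practice, the model would be trained on features extracted from real HTTP session logs rather than synthetic data.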

Highlights

  • The new digital world assumes a gradual transition of business to the online space

  • Has activity in the online space only increased for people? After reviewing traffic data, Imperva's Threat Research Lab [1] produced a report for 2020 showing an increase in network activity from automated means of interacting with content on websites

  • The final classifier is a(x) = (1/N) ∑_{i=1}^{N} b_i(x); in simple terms, for a classification problem the decision is made by majority vote, and for a regression problem by averaging [9]
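The aggregation rule a(x) = (1/N) ∑_{i=1}^{N} b_i(x) from the highlight above can be sketched as follows; the individual tree outputs b_i(x) below are illustrative placeholders, not outputs of a trained model.

```python
# Sketch of Random Forest aggregation: majority vote over tree labels
# for classification, mean of tree outputs for regression.
from collections import Counter

def aggregate_classification(tree_votes):
    # Majority vote over the N base classifiers' predicted labels.
    return Counter(tree_votes).most_common(1)[0][0]

def aggregate_regression(tree_preds):
    # Mean of the N base regressors' outputs: (1/N) * sum of b_i(x).
    return sum(tree_preds) / len(tree_preds)

print(aggregate_classification(["bot", "human", "bot"]))  # majority label: "bot"
print(aggregate_regression([0.25, 0.5, 0.75]))            # mean: 0.5
```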


Introduction

The new digital world assumes a gradual transition of business to the online space, and the 2020 quarantine measures accelerated this process. After reviewing traffic data, Imperva's Threat Research Lab [1] produced a report for 2020 showing an increase in network activity from automated means of interacting with content on websites. Human activity accounted for 62.8% of traffic, good bots for 13.1%, and bad bots for 24.1% [4,5,6]. The damage is done to both businesses and individuals. In this regard, it is imperative to develop processes for filtering unwanted activity on websites [10, 11].

