Abstract

As more and more digital machines are consolidated onto fewer and fewer cloud servers, the software running on those servers needs stronger safety guarantees. Virtualization introduces unique software defects that must be detected and prevented through software requirements designed specifically for virtualization environments. In this research thesis, software virtualization technology was used to transparently record the allocation and release of memory resources tied to a database connection on a virtual machine in the cloud, and these records provided the information needed to detect memory leaks hiding in the code. Memory leaks undermine a self-adaptive cloud computing structure when only ordinary static and dynamic memory leak analysis tools are used, and most of the available defect detection tools do not provide consistent memory leak prevention. The main aim of the research was to develop a self-adaptive virtualization model for software defect detection and prevention of memory leaks using deep learning and machine learning methods. Data sampling was code-based sampling built on Low-Density Parity Checks, which avoided overestimating false positives for the variables used. The study had a total population of 35 variables, of which seven variables were selected as the sample. The sampled objects, classes, and class loader accesses for the four-database test connection used a minimum 0.1% sampling rate, with four database connection references out of every seven variables used. The approach achieved a 98% accuracy (security) rate, compared with existing methods such as Long Short-Term Memory at 82.3%, Self-Organizing Maps at 85.5%, and the Boltzmann approach at 93.5%.
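The abstract describes recording the allocation and release of memory resources tied to a database connection and flagging unmatched allocations as leak candidates. The sketch below illustrates that record-and-match idea only; it is not the authors' implementation, and the class, method, and resource names (AllocationLedger, recordAllocation, recordRelease, "conn-1/...") are hypothetical placeholders.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical ledger that records allocation and release events for
// connection-scoped resources; entries that are never released are
// reported as potential memory-leak candidates.
public class AllocationLedger {

    // Maps a resource identifier to the time it was allocated.
    private final Map<String, Long> openAllocations = new ConcurrentHashMap<>();

    // Record that a resource (e.g. a buffer tied to a database connection)
    // has been allocated.
    public void recordAllocation(String resourceId) {
        openAllocations.put(resourceId, System.currentTimeMillis());
    }

    // Record that the resource has been released; a matched pair is
    // removed from the ledger and is no longer a leak candidate.
    public void recordRelease(String resourceId) {
        openAllocations.remove(resourceId);
    }

    // Any allocation still open after the connection is closed is a
    // leak candidate that could be fed to a downstream classifier.
    public Map<String, Long> leakCandidates() {
        return Map.copyOf(openAllocations);
    }

    public static void main(String[] args) {
        AllocationLedger ledger = new AllocationLedger();
        ledger.recordAllocation("conn-1/resultSetBuffer");
        ledger.recordAllocation("conn-1/statementCache");
        ledger.recordRelease("conn-1/resultSetBuffer");
        // "conn-1/statementCache" was never released, so it is reported.
        System.out.println("Leak candidates: " + ledger.leakCandidates().keySet());
    }
}
```

In the approach summarized above, records of this kind would be collected transparently by the virtualization layer rather than by the application itself; the ledger here only shows how matching allocations against releases exposes the unreleased resources that the study treats as leak candidates.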
