Abstract

Artificial Neural Networks (ANNs) are a family of powerful machine learning techniques used to solve many real-world problems. Their applications can be broadly grouped into classification or pattern recognition, prediction, and modeling. As with other machine learning techniques, ANNs are gaining momentum in the Big Data era for analysing large data sets and making predictions from them. ANNs bring new opportunities for extracting accurate information from Big Data, yet they also face several challenges not encountered with traditional data sets. Indeed, the success of learning and modeling Big Data with ANNs varies with training sample size and depends on data dimensionality, complex data formats, data variety, and so on. In particular, ANN performance is directly influenced by data size, which drives memory requirements. Since a data set may no longer fit into main memory, it is interesting to investigate how ANNs perform when training data is read from main memory versus from disk. This study evaluates the performance of an Artificial Neural Network (ANN) with multiple hidden layers when training data is read from memory or from disk, and it also shows the trade-offs between processing time and data size when using ANNs.
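The memory-versus-disk trade-off the abstract describes can be illustrated with a minimal micro-benchmark. The sketch below is an assumption-laden illustration, not the paper's actual experimental setup: it times sequential reads of synthetic "training records" from an in-memory buffer and from a file on disk, mirroring the two data sources the study compares. Record size, record count, and all names are invented for illustration.

```python
import io
import os
import tempfile
import time

# Illustrative parameters (assumptions, not the paper's configuration).
RECORD_SIZE = 4096   # bytes per synthetic "training sample"
NUM_RECORDS = 2000   # samples in the synthetic data set

def read_all(stream):
    """Read every record sequentially, returning total bytes consumed."""
    total = 0
    while True:
        chunk = stream.read(RECORD_SIZE)
        if not chunk:
            return total
        total += len(chunk)

# Build one synthetic data set shared by both runs.
blob = os.urandom(RECORD_SIZE) * NUM_RECORDS

# Case 1: the whole data set fits in main memory.
t0 = time.perf_counter()
mem_bytes = read_all(io.BytesIO(blob))
mem_time = time.perf_counter() - t0

# Case 2: the data set is streamed from a file on disk.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(blob)
    path = f.name
t0 = time.perf_counter()
with open(path, "rb") as f:
    disk_bytes = read_all(f)
disk_time = time.perf_counter() - t0
os.remove(path)

assert mem_bytes == disk_bytes == RECORD_SIZE * NUM_RECORDS
print(f"memory: {mem_time:.4f}s  disk: {disk_time:.4f}s")
```

In a real ANN training loop, this read cost is paid once per epoch, so as data size grows past available RAM the disk path increasingly dominates total processing time; note that on a single small run the OS page cache can blur the difference.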
