Abstract

The Hadoop framework has been widely used in the animation industry to build large-scale, high-performance parallel rendering systems. However, the Hadoop Distributed File System (HDFS) and the MapReduce programming model are designed to manage large files and suffer a performance penalty when rendering and storing small files in a rendering system. Therefore, a method that merges small files based on two intelligent algorithms is proposed to solve this problem. The method uses Particle Swarm Optimization (PSO) to select the optimal merge values for multiple sets of scenes and then uses a Support Vector Machine (SVM) to train a general model that predicts the optimal merge value for any scene, mainly considering rendering time, memory limits, and other indicators. The method then exploits frame-to-frame coherence to merge files in the same scene in an interval-based way using the optimal merge value. Finally, the proposed method is compared with the naive method on three different rendering scenes. Experimental results show that the proposed method significantly reduces the number of small files and render tasks and improves both storage and computing efficiency. Copyright © 2016 John Wiley & Sons, Ltd.
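To make the three-stage pipeline concrete, the Python sketch below illustrates the flow the abstract describes: a PSO search for per-scene optimal merge values, an SVM regression model that generalizes those values to unseen scenes, and an interval-based grouping of consecutive frames. Note that the cost model (render_cost), the scene feature vector (frame count, per-frame memory, memory limit), the merge-value bounds, and the PSO coefficients are all invented placeholders for illustration, not the paper's actual formulation.

    # Minimal sketch of the PSO -> SVM -> interval-merge pipeline.
    # The cost model and scene features below are hypothetical stand-ins
    # for the paper's indicators (rendering time, memory limit, etc.).
    import numpy as np
    from sklearn.svm import SVR

    def render_cost(merge_value, scene):
        # Hypothetical cost: per-task overhead favors large merges,
        # while a memory penalty favors small merges.
        task_overhead = scene["frames"] / merge_value
        memory_penalty = max(0.0, merge_value * scene["frame_mem"]
                             - scene["mem_limit"]) ** 2
        return task_overhead + memory_penalty

    def pso_optimal_merge(scene, n_particles=20, iters=50):
        # Plain one-dimensional PSO over the merge value.
        rng = np.random.default_rng(0)
        pos = rng.uniform(1, 64, n_particles)   # candidate merge values
        vel = np.zeros(n_particles)
        pbest = pos.copy()
        pbest_cost = np.array([render_cost(p, scene) for p in pos])
        gbest = pbest[pbest_cost.argmin()]
        for _ in range(iters):
            r1, r2 = rng.random(n_particles), rng.random(n_particles)
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, 1, 64)
            cost = np.array([render_cost(p, scene) for p in pos])
            improved = cost < pbest_cost
            pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
            gbest = pbest[pbest_cost.argmin()]
        return gbest

    # Stage 1: PSO finds the optimal merge value for several training scenes.
    scenes = [{"frames": 240, "frame_mem": 0.5, "mem_limit": 16.0},
              {"frames": 480, "frame_mem": 1.2, "mem_limit": 16.0},
              {"frames": 120, "frame_mem": 2.0, "mem_limit": 32.0}]
    X = np.array([[s["frames"], s["frame_mem"], s["mem_limit"]] for s in scenes])
    y = np.array([pso_optimal_merge(s) for s in scenes])

    # Stage 2: an SVM regression model generalizes to unseen scenes.
    model = SVR().fit(X, y)

    # Stage 3: interval-based merge -- consecutive frames of a new scene
    # are grouped by the predicted merge value, exploiting frame-to-frame
    # coherence within the scene.
    new_scene = np.array([[360, 0.8, 16.0]])
    m = max(1, int(round(model.predict(new_scene)[0])))
    frames = [f"frame_{i:04d}.exr" for i in range(360)]
    merged = [frames[i:i + m] for i in range(0, len(frames), m)]
    print(f"merge value {m}: {len(merged)} merged files instead of {len(frames)}")

In the paper's setting, the PSO objective would presumably be evaluated against measured rendering time and memory usage on the cluster rather than the closed-form stand-in used here.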
