The Comparison of Performance According to Initialization Methods of Deep Neural Network for Malware Dataset
Young-Man Kwon1, Yong-woo Kwon2, Dong-Keun Chung3, Myung-Jae Lim4
1Young-Man Kwon, Department of Medical IT, Eulji University, Republic of Korea.
2Yong-woo Kwon, Department of Medical IT, Eulji University, Republic of Korea.
3Dong-Keun Chung, Department of Medical IT, Eulji University, Republic of Korea.
4Myung-Jae Lim, Department of Medical IT, Eulji University, Republic of Korea.
Manuscript received on 05 March 2019 | Revised Manuscript received on 12 March 2019 | Manuscript Published on 20 March 2019 | PP: 57-62 | Volume-8 Issue-4S2 March 2019 | Retrieval Number: D1S0014028419/2019©BEIESP
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open-access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Abstract: When training a deep neural network (Deep NN), proper initialization of the weights leads to good performance. Commonly used initialization methods either limit the variance of the weight values or re-use unsupervised pre-trained weights. In this paper, we propose a new algorithm in which some of the weights are pre-trained by the contrastive divergence (CD) method of an unsupervised Deep Belief Network (DBN), while the remaining weights are initialized with the Xavier or He initialization method. We call these methods DBNnX and DBNnHe. We compare the performance of several DBNnX and DBNnHe variants against existing methods, evaluating and visualizing the results with the AUC score and box plots. The experiments show that DBN2X and DBN2He perform best.
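The hybrid scheme described in the abstract can be illustrated with a minimal NumPy sketch. The `hybrid_init` helper below is a hypothetical illustration (the paper's exact layer split and CD pre-training are not reproduced here): layers for which DBN pre-trained weights exist keep them, and the remaining layers fall back to Xavier or He initialization.

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    """Xavier/Glorot uniform initialization: Var(W) = 2 / (fan_in + fan_out)."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    """He initialization (suited to ReLU units): Var(W) = 2 / fan_in."""
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def hybrid_init(layer_sizes, pretrained, method=he_init):
    """Hypothetical hybrid scheme: re-use DBN pre-trained weight matrices
    for the first layers where available, initialize the rest randomly."""
    weights = []
    for i, (fi, fo) in enumerate(zip(layer_sizes[:-1], layer_sizes[1:])):
        if i < len(pretrained):
            weights.append(pretrained[i])   # from unsupervised CD pre-training
        else:
            weights.append(method(fi, fo))  # Xavier- or He-initialized remainder
    return weights

# e.g. a 4-layer net where only the first layer was pre-trained
sizes = [64, 32, 16, 2]
pre = [rng.normal(0.0, 0.1, size=(64, 32))]  # stand-in for CD pre-trained weights
ws = hybrid_init(sizes, pre)
print([w.shape for w in ws])
```

The number of pre-trained layers plays the role of the subscript n in DBNnX/DBNnHe; the sketch above corresponds to n = 1 with He initialization for the remaining layers.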
Keywords: Weight Initialization, DBN, Malware Dataset, AUC, Deep NN.
Scope of the Article: Information Retrieval