Comparison The Performance of The Recurrent Neural Network in Reducing Training Parameters for Convolution Neural Network

مقارنة أداء الشبكات العصبونية التكرارية في تخفيض بارامترات التدريب لشبكات التلافيف العصبية

Authors

  • Mohamad Ahmad Mounir Batikh, Mohamad Ayman Nael, Amer Boushi

Keywords:

Recurrent Neural Networks
Convolutional Neural Networks
LWCNN
LSTM
GRU

Abstract

The study aims to reduce the number of parameters in the Convolutional Neural Network (CNN), one of the most effective techniques for extracting and classifying behavioral features in video files. These networks are very large, and their parameters are distributed across the deep layers, especially the final layers responsible for classification. Because the parameter values are updated at every stage of training, the network consumes a great deal of memory and requires a very large memory space. In this research we reduce the number of parameters by using a lightweight Convolutional Neural Network (LWCNN). We chose the AlexNet architecture but modified it: we decreased the number of filters in the convolution layers and replaced the last layers of the network with one of the most important types of Recurrent Neural Network (RNN), testing both Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). The work was evaluated on a dataset of 960 videos of typically developing children and children on the autism spectrum, recorded at the Center for Psychosocial Support for People with Special Needs. The experimental results show that linking the lightweight network to a recurrent network reduces the number of parameters in the system by 84%, and that Long Short-Term Memory (LSTM) gives better results than the Gated Recurrent Unit (GRU) in terms of accuracy and loss value.
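To make the described architecture concrete, the sketch below illustrates the general idea from the abstract: an AlexNet-style convolutional feature extractor with a reduced number of filters is applied to each video frame, and the heavy fully connected layers at the end are replaced by a single recurrent layer (LSTM or GRU) followed by the classifier. This is a minimal Keras sketch, not the authors' implementation; the frame count, input size, filter counts, recurrent width, and the two-class output are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the authors' code) of a
# lightweight AlexNet-style CNN whose final fully connected layers are
# replaced by a recurrent layer (LSTM or GRU) for video classification.
from tensorflow.keras import layers, models

NUM_FRAMES, HEIGHT, WIDTH, CHANNELS = 16, 227, 227, 3   # assumed clip shape
NUM_CLASSES = 2                                          # e.g. typical vs. autism spectrum

def lightweight_cnn():
    """AlexNet-like extractor with a reduced number of filters per conv layer."""
    return models.Sequential([
        layers.Conv2D(32, 11, strides=4, activation="relu",
                      input_shape=(HEIGHT, WIDTH, CHANNELS)),
        layers.MaxPooling2D(3, strides=2),
        layers.Conv2D(64, 5, padding="same", activation="relu"),
        layers.MaxPooling2D(3, strides=2),
        layers.Conv2D(96, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(3, strides=2),
        layers.GlobalAveragePooling2D(),   # replaces the large fully connected layers
    ])

def build_model(recurrent="lstm"):
    """Apply the CNN to every frame, then classify the sequence with LSTM or GRU."""
    rnn = layers.LSTM(128) if recurrent == "lstm" else layers.GRU(128)
    return models.Sequential([
        layers.TimeDistributed(lightweight_cnn(),
                               input_shape=(NUM_FRAMES, HEIGHT, WIDTH, CHANNELS)),
        rnn,
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

model = build_model("lstm")
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()   # compare parameter counts for build_model("lstm") vs. build_model("gru")
```

Printing the summaries of build_model("lstm") and build_model("gru") shows the point of the design: almost all trainable parameters sit in the small convolutional stack and the recurrent layer rather than in large fully connected layers.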

Author Biography

Mohamad Ahmad Mounir Batikh, Mohamad Ayman Nael, Amer Boushi


Faculty of Electrical and Electronic Engineering || Aleppo University || Syria

Published

2022-03-27

How to Cite

1.
Comparison The Performance of The Recurrent Neural Network in Reducing Training Parameters for Convolution Neural Network: مقارنة أداء الشبكات العصبونية التكرارية في تخفيض بارامترات التدريب لشبكات التلافيف العصبية. JESIT [Internet]. 2022 Mar 27 [cited 2024 Nov 22];6(1):1-14. Available from: https://journals.ajsrp.com/index.php/jesit/article/view/4840

Issue

Section

Articles
