
Changes in Artificial Neural Network Learning Parameters and Their Impact on Modeling Error Reduction
Journal of Algorithms and Computation
Article 10, Volume 50, Issue 2, Esfand 2018, Pages 141-155. Full Text (285.67 K)
Article Type: Research Paper
DOI: 10.22059/jac.2018.70904
Author
Somayeh Mehrabadi* | ||
Independent scholar
Abstract
The main objective of this research is to investigate the effect of neural network architecture parameters on model behavior. Architectural factors such as the training algorithm, the number of hidden-layer neurons, and the design of the data set used during training were varied, and their effect on the model output was examined. A database was developed for modeling with a multi-layer perceptron, and three training algorithms were applied: Bayesian Regularization (BR), Scaled Conjugate Gradient (SCG), and Levenberg-Marquardt (LM). Models were selected on the basis of the lowest error and the regression fit to the data, using a trial-and-error approach. The results showed that models that greatly reduce the error tend to have lower generalizability. Among the algorithms, BR with a 15-15-70 data split (test, validation, and training sets, respectively) reduced the error more than the other algorithms but generalized poorly; in contrast, LM showed better generalizability than the other two. Data analysis also shows that, in most cases, when the shares of data assigned to testing and validation change (increase or decrease), the model requires more hidden neurons to reduce the error.
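The workflow summarized above, a fixed training/validation/test split, a trial-and-error sweep over the number of hidden neurons, and model selection on the lowest error and the regression fit, can be illustrated with a short sketch. The example below is not the author's code: it runs on synthetic placeholder data, and scikit-learn's quasi-Newton solver stands in for the BR, SCG, and LM training algorithms reported in the paper, which that library does not implement.

```python
# Minimal sketch (assumed setup, not the paper's code): 70-15-15 split,
# hidden-neuron sweep, selection by validation error, reporting of test
# error and R^2 (the "regression" criterion in the abstract).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 3))                 # placeholder inputs
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=500)

# 70 % training, 15 % validation, 15 % test, matching the cited BR configuration.
X_train, X_rest, y_train, y_rest = train_test_split(X, y, train_size=0.70, random_state=1)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.50, random_state=1)

best = None
for n_hidden in (5, 10, 15, 20, 25):                      # trial-and-error neuron sweep
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), activation="tanh",
                       solver="lbfgs", max_iter=2000, random_state=1)
    net.fit(X_train, y_train)
    val_mse = mean_squared_error(y_val, net.predict(X_val))
    if best is None or val_mse < best[1]:
        best = (n_hidden, val_mse, net)

n_hidden, val_mse, net = best
print(f"selected {n_hidden} hidden neurons, validation MSE = {val_mse:.4f}")
print(f"test MSE = {mean_squared_error(y_test, net.predict(X_test)):.4f}, "
      f"test R^2 = {r2_score(y_test, net.predict(X_test)):.3f}")
```

Comparing the gap between training and test error across candidate networks gives a rough proxy for the generalizability trade-off the abstract describes.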