Abstract—The terms “Neural Network” (NN) and “Artificial Neural Network” (ANN) usually refer to a Multilayer Perceptron (MLP) network. An MLP processes records one at a time and "learns" by comparing its prediction for each record with the known actual value. The problem of model selection is considerably important for achieving a high level of generalization capability in supervised learning. This paper presents a behavioural analysis of networks with different numbers of hidden layers and different numbers of hidden neurons. Selecting the number of hidden layers and hidden neurons is difficult; methods such as Akaike’s Information Criterion, the inverse test method, and several traditional approaches are used to determine a neural network architecture. We also discuss what to do when a neural network fails to train or its error does not decrease, and how the architecture should be changed to reduce the error. These techniques are discussed along with experiments and results. To solve different problems, a neural network must be trained to perform correct classification.
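As a minimal sketch of the architecture-selection idea mentioned above, the snippet below compares two hypothetical hidden-layer configurations using Akaike’s Information Criterion under a Gaussian-error assumption (AIC ≈ n·ln(SSE/n) + 2k, where k counts weights and biases). The layer sizes, validation targets, and predictions are illustrative placeholders, not the paper’s actual experimental setup.

```python
import numpy as np

def count_parameters(layer_sizes):
    """Total weights and biases of a fully connected MLP with the
    given layer sizes, e.g. [4, 8, 1] = 4 inputs, 8 hidden, 1 output."""
    return sum((fan_in + 1) * fan_out
               for fan_in, fan_out in zip(layer_sizes[:-1], layer_sizes[1:]))

def aic(y_true, y_pred, n_params):
    """AIC under a Gaussian error assumption: n*ln(SSE/n) + 2k.
    Lower values indicate a better fit/complexity trade-off."""
    n = len(y_true)
    sse = float(np.sum((np.asarray(y_true) - np.asarray(y_pred)) ** 2))
    return n * np.log(sse / n) + 2 * n_params

# Hypothetical held-out targets and the outputs of two candidate networks.
y_val   = np.array([0.0, 1.0, 1.0, 0.0])
preds_a = np.array([0.10, 0.90, 0.80, 0.20])   # 4-8-1 network
preds_b = np.array([0.05, 0.95, 0.90, 0.10])   # 4-16-8-1 network

aic_a = aic(y_val, preds_a, count_parameters([4, 8, 1]))
aic_b = aic(y_val, preds_b, count_parameters([4, 16, 8, 1]))
print("AIC (4-8-1):    ", round(aic_a, 2))
print("AIC (4-16-8-1): ", round(aic_b, 2))
```

In this toy comparison the larger network fits slightly better but is penalized for its extra parameters, so the smaller architecture obtains the lower AIC; this is the kind of trade-off such criteria are meant to expose.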
Index Terms—Back Propagation; Neural Network; Training; Testing; Weights.
1 gaurangpanchal.ce@ecchanga.ac.in, 2 amitganatra.ce@ecchanga.ac.in, 3 ypkosta.adm@ecchanga.ac.in, 4 devyanipanchal.it@ecchanga.ac.in
Cite: Gaurang Panchal, Amit Ganatra, Y P Kosta, and Devyani Panchal, "Behaviour Analysis of Multilayer Perceptrons with Multiple Hidden Neurons and Hidden Layers," International Journal of Computer Theory and Engineering vol. 3, no. 2, pp. 332-337, 2011.