On Neural Networks: How can we improve on backpropagation?
- The system could forget old, less accurate information to improve its performance -- continuous learning. New information would be given more weight than old information; currently this often works in reverse, and newer information is almost discarded when training is not repeated on it.
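One standard way to weight recent data more heavily is exponential decay of older samples. As a minimal sketch (not the note's proposal, just an illustration of the recency-weighting idea), here is an exponential moving average, where each new sample gets a fixed share of the estimate and older samples decay geometrically:

```python
def recency_weighted_mean(stream, decay=0.9):
    """Exponential moving average: the newest sample gets weight (1 - decay),
    and each older sample's influence shrinks geometrically, so the estimate
    gradually 'forgets' old information."""
    est = stream[0]
    for x in stream[1:]:
        est = decay * est + (1 - decay) * x
    return est

# A stream that starts at 0 and ends at 10: the recent 10s dominate,
# so the estimate ends near 10, not near the overall mean of 5.
stream = [0.0] * 50 + [10.0] * 50
print(recency_weighted_mean(stream, decay=0.9))
```

The same idea appears in optimizers (momentum, Adam's moment estimates) and in replay-buffer sampling; applying it to the training data itself is one way to bias a network toward recent information.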
- The weight of a neural connection is kept the same during both the forward and the backward pass. Experiments show that propagating errors through fixed random weights during the backward pass (feedback alignment) can still give good results.
- Group the neurons into many smaller structures (modules or blocks) and train those parts individually, rather than training the whole network end to end.
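One existing form of this idea is greedy block-wise training: each block is fitted against the targets in isolation, then frozen, and the next block is trained on its outputs, so no gradient ever crosses a block boundary. A minimal sketch with closed-form least-squares fits (all names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: targets are a noisy linear function of the inputs.
X = rng.normal(size=(200, 8))
Y = X @ rng.normal(size=(8, 3)) + 0.01 * rng.normal(size=(200, 3))

def fit_block(inputs, targets):
    """Train one block in isolation: a least-squares fit of a linear layer
    against the targets. No gradient crosses block boundaries."""
    W, *_ = np.linalg.lstsq(inputs, targets, rcond=None)
    return W

# Block 1 is trained on the raw inputs, then frozen.
W1 = fit_block(X, Y)
H = np.maximum(X @ W1, 0.0)   # frozen block-1 output (ReLU)

# Block 2 is trained separately, on block 1's detached outputs.
W2 = fit_block(H, Y)

loss = np.mean((H @ W2 - Y) ** 2)
print(loss)
```

Each block's training problem is small and local, which is the appeal: parts can be trained independently, possibly in parallel, at the cost of losing the globally coordinated credit assignment that end-to-end backpropagation provides.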
Maybe this whole neural network concept has had its day, and now it's time to come up with new, better concepts.