Abstract: |
This paper proposes a general learning mechanism for ART2 neural networks that relaxes the constraint of the classic ART2 model restricting learning to the active node alone. The generalized learning allows slow forgetting through exponential decay of all long-term memory traces. This approach changes the ART2 learning rules, but introduces no additional node features or supervisory subsystems. It preserves the basic ART2 architecture and functioning, which makes the implementation straightforward. The proposed general learning mechanism releases redundant committed nodes for further learning, helps to prevent the system from blocking, and improves several characteristics of the network. It may be useful for classes of applications that require clustering in a very large input space or under rapidly changing environmental conditions.