WIT Press


Exploring the Parameter Space of Unsupervised ART Neural Networks for Data Mining

Price

Free (open access)

Paper DOI

10.2495/DATA020451

Volume

28

Published

2002

Size

698 kb

Author(s)

E A Capuano & C Chauke Nehme

Abstract

Adaptive Resonance Theory (ART), developed by Stephen Grossberg and Gail Carpenter, has attracted enthusiasts worldwide as a promising artificial neural network (ANN) approach for pattern recognition and classification tasks. However, the technology is still not popular among IT professionals, and its adoption for data mining applications has not spread as one might expect. We believe one reason for this is the lack of knowledge about how ART ANN behavior changes across the parameter space. To explore this idea, we executed almost eight hundred experimental tests to map the configuration parameters of ART1, ART2 and Fuzzy-ART. Several analyses were performed, and the results can help system analysts deal with this kind of ANN and exploit its full power. Experimental data from simulated and real databases show that the dependent parameter can be clearly defined in the Euclidean space R² as a function of the vigilance parameter ρ and of other pattern and network parameters in the ART1, ART2 and Fuzzy-ART models. These results establish numerical boundaries that are useful for designers of data mining applications using this technology. For instance, the behavior of the normalized parameter as a function of ρ shows that it is possible to overcome problems with binary coding in ART1 by imposing higher values of ρ to obtain completely featured cluster vectors and to attenuate (or eliminate) the "bit flood effect" of "0" and "1" observed in "1-of-N" or "thermometer" coding. An important conclusion is that ART1, ART2 and Fuzzy-ART neural networks, like many other ANN approaches, can support data mining applications based on unsupervised learning (for on-line systems), overcoming the stability-plasticity dilemma present in other ANN models.
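To make the role of the vigilance parameter ρ concrete, the following is a minimal sketch of ART1-style clustering of binary patterns. It is not the authors' implementation: the choice parameter `beta`, the fast-learning intersection rule, and the example patterns are standard ART1 conventions assumed here for illustration. Raising ρ forces stricter matches and therefore finer clusters, which is the mechanism the abstract describes for recovering completely featured cluster vectors from binary codings.

```python
import numpy as np

def art1_cluster(patterns, rho, beta=1e-6):
    """Cluster binary patterns with a minimal ART1-style procedure.

    rho  -- vigilance parameter in [0, 1]; higher rho -> finer clusters
    beta -- small choice parameter favoring more specific categories
    """
    weights = []   # one binary prototype vector per category
    labels = []    # category index assigned to each input pattern
    for I in patterns:
        I = np.asarray(I, dtype=int)
        # Rank existing categories by the choice function |I AND w| / (beta + |w|)
        order = sorted(
            range(len(weights)),
            key=lambda j: -(np.sum(I & weights[j]) / (beta + np.sum(weights[j]))),
        )
        chosen = None
        for j in order:
            # Vigilance test: fraction of the input matched by the prototype
            match = np.sum(I & weights[j]) / max(np.sum(I), 1)
            if match >= rho:                   # resonance: commit to category j
                weights[j] = I & weights[j]    # fast learning: intersect prototype
                chosen = j
                break
        if chosen is None:                     # every category was reset
            weights.append(I.copy())           # recruit a new category
            chosen = len(weights) - 1
        labels.append(chosen)
    return labels, weights

patterns = [[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1], [0, 0, 0, 1]]
labels_hi, w_hi = art1_cluster(patterns, rho=0.9)  # strict vigilance
labels_lo, w_lo = art1_cluster(patterns, rho=0.3)  # loose vigilance
print(len(w_hi), len(w_lo))  # high rho yields more categories than low rho
```

On this toy data the strict setting splits the patterns into more categories than the loose one, illustrating how a designer can trade cluster granularity against generalization by moving ρ alone.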

Keywords