Please use this identifier to cite or link to this item:
Type: Journal article
Title: Observations on morphological associative memories and the kernel method
Author: Sussner, P
Abstract: The ability of human beings to retrieve information on the basis of associated cues continues to elicit great interest among researchers. Investigations of how the brain is capable of making such associations from partial information have led to a variety of theoretical neural network models that act as associative memories. Several researchers have had significant success in retrieving complete stored patterns from noisy or incomplete input pattern keys by using morphological associative memories. Thus far, morphological associative memories have been employed in two different ways: a direct approach, which is suitable for input patterns containing either dilative or erosive noise, and an indirect one for arbitrarily corrupted input patterns, which is based on kernel vectors. In a recent paper (P. Sussner, in: Proceedings of the International ICSA/IFAC Symposium on Neural Computation, Vienna, September 1998), we suggested how to select these kernel vectors and we deduced exact statements on the amount of noise which is permissible for perfect recall. In this paper, we establish the proofs for all our claims made about the choice of kernel vectors and perfect recall in kernel method applications. Moreover, we provide arguments for the success of both approaches beyond the experimental results presented up to this point. (C) 2000 Elsevier Science B.V. All rights reserved.
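The abstract refers to the two standard morphological memories, one robust to erosive noise and one to dilative noise. The following is a minimal sketch of that construction in the usual Ritter-Sussner max-plus/min-plus formulation; it illustrates the general technique, not the specific proofs or kernel-selection rules of this paper, and all names below are illustrative.

```python
import numpy as np

# Morphological autoassociative memories (standard formulation, sketched here
# for illustration). Patterns are stored as the columns of X.

def build_W(X):
    # W_ij = min over stored patterns xi of (x_i^xi - x_j^xi).
    # Recall with W uses the max-plus product; W tolerates erosive noise
    # (entries of the input key that were lowered).
    return np.min(X[:, None, :] - X[None, :, :], axis=2)

def recall_W(W, x):
    # Max-plus product: (W [+] x)_i = max_j (W_ij + x_j).
    return np.max(W + x[None, :], axis=1)

def build_M(X):
    # Dual memory: M_ij = max over stored patterns of (x_i^xi - x_j^xi).
    # Recall with M uses the min-plus product; M tolerates dilative noise.
    return np.max(X[:, None, :] - X[None, :, :], axis=2)

def recall_M(M, x):
    # Min-plus product: (M [+]' x)_i = min_j (M_ij + x_j).
    return np.min(M + x[None, :], axis=1)

# Two stored patterns (columns): x1 = [1, 0, 1], x2 = [0, 1, 1].
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
W = build_W(X)

# Erode one entry of x1; W still recalls x1 perfectly for this example.
x_eroded = np.array([1.0, 0.0, 0.5])
print(recall_W(W, x_eroded))  # -> [1. 0. 1.]
```

The kernel method mentioned in the abstract goes further: for inputs corrupted by mixed (both dilative and erosive) noise, recall is routed through an intermediate kernel vector z, computed as W-type recall followed by M-type recall, which is the indirect approach whose perfect-recall conditions this paper proves.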
Subject: associative memories
morphological neural networks
kernel method
kernel vectors
Country: Netherlands
Publisher: Elsevier Science BV
Rights: closed access
Identifier DOI: 10.1016/S0925-2312(99)00176-9
Issue Date: 2000
Appears in Collections:Unicamp - Artigos e Outros Documentos

Files in This Item:
File: WOS000086007800012.pdf
Size: 216.67 kB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.