Lluís A. Belanche Muñoz, Universitat Politècnica de Catalunya
This work develops general classes of neuron models that accept heterogeneous inputs by aggregating continuous (crisp or fuzzy) numbers, linguistic information, and discrete (either ordinal or nominal) quantities, with provision also for missing information. The internal stimulation of these neural models is based on an explicit similarity relation between the input tuple and the weight tuple (which is also heterogeneous). The framework is very comprehensive, and several particular models can be derived as instances of it. These networks are capable of learning from non-trivial data sets with an effectiveness comparable to, and often better than, that of classical networks.
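To make the idea concrete, the following is a minimal sketch, not the author's exact model, of a similarity-based heterogeneous neuron: each attribute is compared according to its type (real, ordinal, or nominal), missing values contribute no evidence, and the defined per-attribute similarities are aggregated in the spirit of Gower's similarity coefficient before being squashed into an activation. All function names, the averaging aggregation, and the logistic squashing are illustrative assumptions.

```python
import math

def attr_similarity(x, w, kind, span=1.0):
    """Similarity in [0, 1] between one input value and one weight value.
    Returns None for missing information (no evidence either way)."""
    if x is None or w is None:
        return None
    if kind == "nominal":
        # nominal attributes: simple overlap, match or no match
        return 1.0 if x == w else 0.0
    # real or ordinal attributes: normalized absolute difference
    return 1.0 - abs(x - w) / span

def neuron_similarity(xs, ws, kinds, spans):
    """Aggregate the defined per-attribute similarities by averaging,
    skipping attributes where either value is missing (assumption)."""
    sims = [attr_similarity(x, w, k, s)
            for x, w, k, s in zip(xs, ws, kinds, spans)]
    sims = [s for s in sims if s is not None]
    return sum(sims) / len(sims) if sims else 0.0

def heterogeneous_neuron(xs, ws, kinds, spans):
    """Activation: squash the aggregated similarity with a logistic
    function centred at 0.5, so dissimilar tuples yield low output."""
    s = neuron_similarity(xs, ws, kinds, spans)
    return 1.0 / (1.0 + math.exp(-10.0 * (s - 0.5)))
```

For example, an input tuple `[0.5, "red", None]` matched against a weight tuple `[0.5, "red", 3.0]` yields full similarity on the two defined attributes, while the missing third attribute is simply excluded from the aggregation.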