AAAI Publications, Twenty-Ninth AAAI Conference on Artificial Intelligence

Tensor-Variate Restricted Boltzmann Machines
Tu Dinh Nguyen, Truyen Tran, Dinh Phung, Svetha Venkatesh

Last modified: 2015-02-21


Restricted Boltzmann Machines (RBMs) are an important class of latent variable models for representing vector data. An under-explored area is multimode data, in which each data point is a matrix or a tensor. Applying standard RBMs to such data requires vectorizing the matrices and tensors, which leads to unnecessarily high dimensionality and, at the same time, destroys the inherent higher-order interaction structures. This paper introduces Tensor-variate Restricted Boltzmann Machines (TvRBMs), which generalize RBMs to capture the multiplicative interactions between the data modes and the latent variables. TvRBMs are highly compact: the number of free parameters grows only linearly with the number of modes. We demonstrate the capacity of TvRBMs on three real-world applications: handwritten digit classification, face recognition, and EEG-based alcoholic diagnosis. The features learnt by the model are more discriminative than those of rival methods, resulting in better classification performance.
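To make the compactness claim concrete, the sketch below contrasts the parameter count of a standard RBM on vectorized matrix data with a mode-factored, multiplicative parameterization in which each hidden unit scores the input against a rank-1 filter built from per-mode weight vectors. This is an illustrative NumPy sketch, not the authors' exact TvRBM parameterization: the dimensions, variable names, and the specific rank-1 factorization are assumptions chosen to show why per-mode factors grow only linearly with the number of modes.

```python
import numpy as np

rng = np.random.default_rng(0)

d1, d2, K = 28, 28, 100  # mode sizes and number of hidden units (illustrative)

# Vectorizing a d1 x d2 matrix for a standard RBM needs a full
# d1*d2 x K weight matrix.
params_vectorized = d1 * d2 * K

# A mode-factored (multiplicative) parameterization keeps one factor
# matrix per mode, so parameters grow linearly with the number of modes.
W1 = rng.normal(scale=0.01, size=(d1, K))  # factor for mode 1
W2 = rng.normal(scale=0.01, size=(d2, K))  # factor for mode 2
params_factored = (d1 + d2) * K

X = rng.normal(size=(d1, d2))  # one matrix-variate observation

# Hidden pre-activation: hidden unit k scores X against the rank-1
# filter outer(W1[:, k], W2[:, k]) -- a multiplicative interaction
# between the two data modes.
pre_act = np.einsum('ij,ik,jk->k', X, W1, W2)
hidden_prob = 1.0 / (1.0 + np.exp(-pre_act))  # sigmoid activation

print(params_vectorized, params_factored)  # 78400 vs 5600 parameters
```

For a third mode of size d3, the factored scheme adds only d3*K parameters, whereas the vectorized RBM multiplies its count by d3, which is the linear-versus-multiplicative growth the abstract refers to.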


Keywords: tensor; RBM; restricted Boltzmann machine; TvRBM; multiplicative interaction; EEG
