This is the official Keras implementation of the Deep Embedded Self-Organizing Map (DESOM) model.
DESOM is an unsupervised learning model that jointly learns representations and the code vectors of a self-organizing map (SOM) in order to survey, cluster and visualize large, high-dimensional datasets. Our model is composed of an autoencoder and a custom SOM layer that are optimized in a joint training procedure, motivated by the idea that the SOM prior could help learn SOM-friendly representations. Its training is fast, end-to-end and requires no pre-training.
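To make the joint objective concrete, here is a minimal NumPy sketch of the kind of loss such a model optimizes: reconstruction error plus a SOM distortion term weighted by a Gaussian neighborhood around each sample's best-matching unit. The function names, the Manhattan grid distance, and the gamma trade-off value are illustrative assumptions, not the repository's actual API.

```python
import numpy as np

def som_distortion(z, prototypes, map_size, T):
    """Gaussian-neighborhood SOM distortion over a batch of encoded samples.
    z: (batch, latent_dim) encoded inputs; prototypes: (n_units, latent_dim)
    code vectors; T: neighborhood temperature (typically decayed during training)."""
    rows, cols = map_size
    # grid coordinates of each map unit
    coords = np.array([(i, j) for i in range(rows) for j in range(cols)])
    # squared Euclidean distance from each sample to each prototype: (batch, n_units)
    d2 = ((z[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    bmu = d2.argmin(axis=1)  # best-matching unit per sample
    # Manhattan distance on the map grid between each sample's BMU and every unit
    delta = np.abs(coords[bmu][:, None, :] - coords[None, :, :]).sum(axis=-1)
    weights = np.exp(-delta ** 2 / (2.0 * T ** 2))  # Gaussian neighborhood
    return (weights * d2).sum(axis=1).mean()

def joint_loss(x, x_rec, z, prototypes, map_size, T, gamma=0.001):
    # total loss = reconstruction MSE + gamma * SOM distortion
    # (gamma value here is an assumption for illustration)
    rec = ((x - x_rec) ** 2).mean()
    return rec + gamma * som_distortion(z, prototypes, map_size, T)
```

In the actual model, the encoder, decoder and the prototype weights of the SOM layer are all updated by backpropagating through an objective of this shape.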
When using this code, please cite the following works:
Florent Forest, Mustapha Lebbah, Hanene Azzag and Jérôme Lacaille (2019). Deep Embedded SOM: Joint Representation Learning and Self-Organization. In European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2019).
Florent Forest, Mustapha Lebbah, Hanene Azzag and Jérôme Lacaille (2019). Deep Architectures for Joint Clustering and Visualization with Self-Organizing Maps. In Workshop on Learning Data Representations for Clustering (LDRC), PAKDD 2019.
(see also http://florentfo.rest/publications)
The implementation is divided into several scripts:
The data directory contains USPS and REUTERS-10k datasets.
The main script has several command-line arguments that are explained with:
$ python3 DESOM.py --help
All arguments have default values, so DESOM training can be started simply by running:
$ python3 DESOM.py
For example, to train DESOM on Fashion-MNIST with a 20x20 map, the command is:
$ python3 DESOM.py --dataset fmnist --map_size 20 20
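The command above can be understood as standard argparse-style option parsing. The sketch below is illustrative and consistent with the README's examples; the actual DESOM.py defines more options, and the dataset identifiers and defaults shown here are assumptions.

```python
# Illustrative sketch of argument parsing matching the README's examples;
# the real DESOM.py may use different defaults and additional options.
import argparse

parser = argparse.ArgumentParser(description="Train DESOM")
parser.add_argument("--dataset", default="mnist",
                    help="training dataset (e.g. mnist, fmnist, usps)")
parser.add_argument("--map_size", nargs=2, type=int, default=[8, 8],
                    help="SOM grid height and width, e.g. --map_size 20 20")

# Mirrors: python3 DESOM.py --dataset fmnist --map_size 20 20
args = parser.parse_args(["--dataset", "fmnist", "--map_size", "20", "20"])
```

Note that --map_size takes two integers (map height and width), so a 20x20 map is written as two separate values.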
Training generates several outputs:
Behavior is similar for the kerasom model.
For reference, one training run on MNIST with 10,000 iterations and batch size 256 takes around 2 minutes on a laptop GPU.
A full benchmark of DESOM on the 4 datasets can be started by calling the script desom_benchmark.py. Parameters, number of runs and save directories are specified inside the script. The paper results were obtained using this script with the number of runs set to 10. Similar scripts were used for the other compared models (minisom, kerasom, and DESOM with pre-trained autoencoder weights).
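Since the benchmark repeats training over several runs, the per-run results must be aggregated afterwards. The pandas sketch below shows one way to do this; the column names and metric values are entirely made up for illustration and do not reflect the actual output format of desom_benchmark.py.

```python
import pandas as pd

# Hypothetical per-run metrics (made-up values, for illustration only)
runs = pd.DataFrame({
    "run":    [1, 2, 3],
    "purity": [0.85, 0.87, 0.86],
    "nmi":    [0.60, 0.62, 0.61],
})

# Aggregate each metric over runs, e.g. mean and standard deviation
summary = runs[["purity", "nmi"]].agg(["mean", "std"]).round(3)
print(summary)
```

With 10 runs, the same aggregation yields the averaged figures a benchmark table would report.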
The main dependencies are keras, tensorflow, scikit-learn, numpy, pandas, matplotlib.