
Kernelized Supervised Laplacian Eigenmap for Visualization and Classification of Multi-Label Data

Files in This Item:

The file(s) associated with this item can be obtained from the following URL: https://doi.org/10.1016/j.patcog.2021.108399


Title: Kernelized Supervised Laplacian Eigenmap for Visualization and Classification of Multi-Label Data
Authors: Tai, Mariko
Kudo, Mineichi
Tanaka, Akira
Imai, Hideyuki
Kimura, Keigo
Keywords: Supervised Laplacian eigenmaps
Out-of-sample problem
Multi-label problems
Kernel trick
Separability-guided feature extraction
Issue Date: 26-Oct-2021
Publisher: Elsevier
Journal Title: Pattern Recognition
Volume: 123
Start Page: 108399
Publisher DOI: 10.1016/j.patcog.2021.108399
Abstract: We previously proposed a supervised Laplacian eigenmap for visualization (SLE-ML) that can handle multi-label data. In addition, SLE-ML can control the trade-off between class separability and local structure with a single trade-off parameter. However, SLE-ML cannot transform new data; that is, it has the "out-of-sample" problem. In this paper, we show that this problem is solvable: the same transformation can be simulated perfectly using a set of linear sums of reproducing kernels (KSLE-ML) with a nonsingular Gram matrix. We experimentally showed that the difference between training and testing is small; thus, high class separability in a low-dimensional space is achievable with KSLE-ML by assigning an appropriate value to the trade-off parameter. This opens the possibility of separability-guided feature extraction for classification. In addition, to optimize the performance of KSLE-ML, we conducted both kernel selection and parameter selection, and found that parameter selection is more important than kernel selection. We experimentally demonstrated the advantages of KSLE-ML for visualization and for feature extraction compared with several typical algorithms. © 2021 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
Type: article
URI: http://hdl.handle.net/2115/83634
Appears in Collections: Graduate School of Information Science and Technology / Faculty of Information Science and Technology > Peer-reviewed Journal Articles, etc
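The exact SLE-ML/KSLE-ML algorithm is not reproduced in this record, so the following is only a minimal illustrative sketch (NumPy/SciPy, not the authors' code) of the general idea summarized in the abstract: blend a feature-based affinity with a multi-label affinity through a trade-off parameter, embed the training data with a Laplacian eigenmap, and map new data out-of-sample by expressing each embedding coordinate as a linear sum of reproducing kernels, which is exact when the Gram matrix is nonsingular. The function names and parameters (supervised_affinity, beta, gamma, ridge) are hypothetical and not taken from the paper.

import numpy as np
from scipy.linalg import eigh, solve

def rbf(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Z
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def supervised_affinity(X, Y_labels, beta=0.5, gamma=1.0):
    # Blend a feature affinity with a label-overlap affinity;
    # beta is a hypothetical stand-in for the paper's trade-off parameter.
    W_feat = rbf(X, X, gamma)
    inter = Y_labels @ Y_labels.T                       # shared labels per pair
    union = np.maximum(Y_labels.sum(1)[:, None] + Y_labels.sum(1)[None, :] - inter, 1.0)
    W_lab = inter / union                               # Jaccard-like multi-label similarity
    return (1.0 - beta) * W_feat + beta * W_lab

def laplacian_eigenmap(W, n_components=2):
    # Solve the generalized eigenproblem L v = lambda D v and keep the
    # eigenvectors of the smallest nonzero eigenvalues as the embedding.
    D = np.diag(W.sum(1))
    L = D - W
    _, vecs = eigh(L, D)                                # eigenvalues in ascending order
    return vecs[:, 1:n_components + 1]                  # skip the constant eigenvector

def kernel_out_of_sample(X_train, Y_embed, X_new, gamma=1.0, ridge=0.0):
    # Write each embedding coordinate as a linear sum of reproducing kernels
    # centred at the training points, then evaluate that sum at new points.
    K = rbf(X_train, X_train, gamma)
    C = solve(K + ridge * np.eye(len(K)), Y_embed)      # exact reproduction if K is nonsingular
    return rbf(X_new, X_train, gamma) @ C

# Toy usage with random multi-label data (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
Y_labels = (rng.random((60, 3)) < 0.4).astype(float)    # binary multi-label matrix
W = supervised_affinity(X, Y_labels, beta=0.5, gamma=0.5)
Y_embed = laplacian_eigenmap(W, n_components=2)         # 2-D training embedding
Y_new = kernel_out_of_sample(X, Y_embed, rng.normal(size=(5, 5)), gamma=0.5)
print(Y_embed.shape, Y_new.shape)                       # (60, 2) (5, 2)

With a nonsingular Gram matrix K, the coefficients C = K^{-1} Y_embed reproduce the training embedding exactly, which is the sense in which the abstract says the transformation can be simulated perfectly; the small ridge term is only a numerical safeguard.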
