
Hokkaido University Collection of Scholarly and Academic Papers (HUSCAP)

Ensemble and Multiple Kernel Regressors : Which Is Better?

File: e98-a_11_2315.pdf (549.28 kB, PDF)

Title: Ensemble and Multiple Kernel Regressors : Which Is Better?
Authors: Tanaka, Akira
Takebayashi, Hirofumi
Takigawa, Ichigaku
Imai, Hideyuki
Kudo, Mineichi
Keywords: kernel regression
ensemble kernel regressor
multiple kernel regressor
generalization error
reproducing kernel Hilbert spaces
Issue Date: November 2015
Publisher: IEICE - The Institute of Electronics, Information and Communication Engineers
Journal Title: IEICE transactions on fundamentals of electronics communications and computer sciences
Volume: E98A
Issue: 11
Start Page: 2315
End Page: 2324
Publisher DOI: 10.1587/transfun.E98.A.2315
Abstract: For the last few decades, learning with multiple kernels, represented by the ensemble kernel regressor and the multiple kernel regressor, has attracted much attention in the field of kernel-based machine learning. Although their efficacy has been investigated numerically in many works, their theoretical grounding has not been sufficiently explored, since no theoretical framework for evaluating them has been available. In this paper, we introduce a unified framework for evaluating kernel regressors with multiple kernels. On the basis of this framework, we analyze the generalization errors of the ensemble kernel regressor and the multiple kernel regressor, and give a sufficient condition for the ensemble kernel regressor to outperform the multiple kernel regressor in terms of the generalization error in the noise-free case. We also show by examples that, when the sufficient condition does not hold, either regressor can be better than the other, which underscores the importance of the condition.
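The contrast the abstract draws can be illustrated with a minimal sketch. The following Python snippet is not the paper's method; it is a hypothetical example assuming kernel ridge regression with Gaussian kernels, where the "ensemble" regressor averages the estimates of single-kernel regressors and the "multiple kernel" regressor fits once with the averaged kernel. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X1, X2, gamma):
    # Gram matrix of the Gaussian (RBF) kernel between two sample sets
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(K, y, lam=1e-6):
    # dual coefficients alpha solving (K + lam I) alpha = y
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

# noise-free training samples of a smooth target function
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0])
Xt = np.linspace(-3, 3, 200)[:, None]   # test inputs
yt = np.sin(Xt[:, 0])                   # true test values

gammas = [0.5, 2.0]  # two candidate kernels (illustrative choice)

# ensemble kernel regressor: average the single-kernel estimates
preds = []
for g in gammas:
    alpha = kernel_ridge_fit(gaussian_kernel(X, X, g), y)
    preds.append(gaussian_kernel(Xt, X, g) @ alpha)
f_ens = np.mean(preds, axis=0)

# multiple kernel regressor: one estimate with the averaged kernel
K = np.mean([gaussian_kernel(X, X, g) for g in gammas], axis=0)
alpha = kernel_ridge_fit(K, y)
Kt = np.mean([gaussian_kernel(Xt, X, g) for g in gammas], axis=0)
f_mkl = Kt @ alpha

# compare empirical generalization errors on the test grid
err_ens = np.mean((f_ens - yt) ** 2)
err_mkl = np.mean((f_mkl - yt) ** 2)
```

Which of `err_ens` and `err_mkl` is smaller depends on the kernels and the target function, which is exactly the point of the paper's sufficient condition: neither scheme dominates the other in general.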
Rights: Copyright © 2015 IEICE
Type: article
Appears in Collections: Peer-reviewed Journal Articles, etc.

Submitter: Tanaka, Akira

