Title: Ensemble and Multiple Kernel Regressors: Which Is Better?

Authors: Tanaka, Akira; Takebayashi, Hirofumi; Takigawa, Ichigaku; Imai, Hideyuki; Kudo, Mineichi

Keywords: kernel regression; ensemble kernel regressor; multiple kernel regressor; generalization error; reproducing kernel Hilbert spaces

Abstract: For the last few decades, learning with multiple kernels, represented by the ensemble kernel regressor and the multiple kernel regressor, has attracted much attention in the field of kernel-based machine learning. Although their efficacy has been investigated numerically in many works, their theoretical grounding has not been studied sufficiently, since no theoretical framework for evaluating them has been available. In this paper, we introduce a unified framework for evaluating kernel regressors with multiple kernels. On the basis of this framework, we analyze the generalization errors of the ensemble kernel regressor and the multiple kernel regressor, and give a sufficient condition for the ensemble kernel regressor to outperform the multiple kernel regressor in terms of the generalization error in the noise-free case. We also give examples showing that, when the sufficient condition does not hold, either kernel regressor can be better than the other, which underscores the importance of the condition.

Publisher: IEICE - The Institute of Electronics, Information and Communication Engineers
Type: Journal Article
Journal: IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, Vol. E98-A, No. 11, pp. 2315-2324, November 2015
ISSN: 1745-1337
DOI: 10.1587/transfun.E98.A.2315
Handle: http://hdl.handle.net/2115/60358
Full text: https://eprints.lib.hokudai.ac.jp/dspace/bitstream/2115/60358/1/e98-a_11_2315.pdf
Rights: copyright (c) 2015 IEICE
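To make the two estimators compared in the abstract concrete, the following is a minimal sketch, not the paper's exact formulation: an "ensemble kernel regressor" here averages the predictions of independently trained single-kernel ridge regressors, while a "multiple kernel regressor" trains one ridge regressor on the averaged (combined) kernel. The Gaussian kernels, bandwidths, regularization constant, and the noise-free sine target are illustrative assumptions, chosen only to mirror the abstract's noise-free setting; uniform kernel weights are assumed throughout.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma):
    """Gram matrix of the Gaussian (RBF) kernel between rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(K, y, lam=1e-3):
    """Kernel ridge coefficients: solve (K + lam*I) a = y."""
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0])                       # noise-free target (illustrative)
Xt = np.linspace(-3, 3, 200)[:, None]     # test grid
yt = np.sin(Xt[:, 0])

gammas = [0.5, 5.0]                       # two candidate kernels (assumed)

# Ensemble kernel regressor: average predictions of single-kernel regressors.
preds = []
for g in gammas:
    a = kernel_ridge_fit(gaussian_kernel(X, X, g), y)
    preds.append(gaussian_kernel(Xt, X, g) @ a)
f_ensemble = np.mean(preds, axis=0)

# Multiple kernel regressor: one regressor on the averaged combined kernel.
K_mix = np.mean([gaussian_kernel(X, X, g) for g in gammas], axis=0)
a_mix = kernel_ridge_fit(K_mix, y)
f_multiple = np.mean([gaussian_kernel(Xt, X, g) for g in gammas], axis=0) @ a_mix

err_e = np.mean((f_ensemble - yt) ** 2)
err_m = np.mean((f_multiple - yt) ** 2)
print(f"ensemble MSE: {err_e:.2e}, multiple-kernel MSE: {err_m:.2e}")
```

Which estimator yields the smaller generalization error depends on the kernels and the target, which is exactly the point of the paper's sufficient condition; this sketch only exhibits the two constructions side by side.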