Record source: Hokkaido University Collection of Scholarly and Academic Papers (HUSCAP), https://eprints.lib.hokudai.ac.jp/ (handle: http://hdl.handle.net/2115/65520)

Title: Theoretical Analyses on 2-Norm-Based Multiple Kernel Regressors
Authors: Akira Tanaka; Hideyuki Imai
Journal: IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, Vol. E100-A, No. 3, pp. 877–887 (March 2017)
Publisher: The Institute of Electronics, Information and Communication Engineers (IEICE), http://search.ieice.org/
DOI: https://doi.org/10.1587/transfun.E100.A.877
ISSN: 1745-1337
Version: journal article, Version of Record (VoR); open access, copyright ©2017 IEICE
Full text: https://eprints.lib.hokudai.ac.jp/dspace/bitstream/2115/65520/1/e100-a_3_877.pdf (PDF, 785.07 KB)
Keywords: multiple kernel regressor; reproducing kernel Hilbert space; generalization error; 2-norm criterion; 2-norm regularizer

Abstract: This paper discusses the solution of the standard 2-norm-based multiple kernel regression problem and the theoretical limit of the considered model space. We prove that 1) the solution of the 2-norm-based multiple kernel regressor constructed from a given training data set does not, in general, attain the theoretical limit of the considered model space in terms of the generalization error, even when the training data set is noise-free; and 2) the solution of the 2-norm-based multiple kernel regressor is identical to the solution of a single kernel regressor whose kernel is the sum of the kernels used in the multiple kernel regressor — this holds in the noise-free setting and also in the noisy setting with a 2-norm-based regularizer. The first result motivates the development of a novel framework for multiple kernel regression problems that yields a better solution, closer to the theoretical limit; the second implies that, as long as the 2-norm-based criterion is used, it suffices to use a single kernel regressor with the sum of the given kernels in place of a multiple kernel regressor.
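The paper's second claim — that the 2-norm-based multiple kernel regressor coincides with a single kernel regressor using the sum kernel — can be checked numerically on a toy problem. The sketch below is a hypothetical illustration, not code from the paper: it assumes two Gaussian kernels, solves the stationarity system of the joint regularized 2-norm objective for the two coefficient vectors, and compares the resulting predictions with sum-kernel kernel ridge regression. All data, kernel parameters, and variable names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical, for illustration only).
n = 20
x = rng.uniform(-1.0, 1.0, n)
y = np.sin(3.0 * x) + 0.05 * rng.standard_normal(n)

def rbf_gram(a, b, gamma):
    """Gaussian (RBF) Gram matrix between 1-D sample vectors a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-gamma * d**2)

K1 = rbf_gram(x, x, gamma=1.0)   # wide kernel
K2 = rbf_gram(x, x, gamma=10.0)  # narrow kernel
lam = 1e-2
I = np.eye(n)

# Multiple kernel regressor with the 2-norm criterion and 2-norm regularizer:
#   minimize ||y - K1 a1 - K2 a2||^2 + lam * (a1' K1 a1 + a2' K2 a2)
# Setting the gradients to zero gives
#   K1 [(K1 + lam I) a1 + K2 a2 - y] = 0,
#   K2 [K1 a1 + (K2 + lam I) a2 - y] = 0;
# with positive-definite Gram matrices (distinct points), the leading K1/K2
# factors can be dropped, leaving a 2n x 2n linear system.
A = np.block([[K1 + lam * I, K2],
              [K1, K2 + lam * I]])
a = np.linalg.solve(A, np.concatenate([y, y]))
a1, a2 = a[:n], a[n:]
f_multi = K1 @ a1 + K2 @ a2

# Single kernel regressor with the sum kernel K = K1 + K2.
alpha = np.linalg.solve(K1 + K2 + lam * I, y)
f_single = (K1 + K2) @ alpha

# The two predictors agree, and the multiple-kernel coefficient vectors
# both equal the sum-kernel coefficient vector.
print(np.max(np.abs(f_multi - f_single)))  # a value at machine-precision scale
```

Algebraically, subtracting the two stationarity conditions forces a1 = a2, and substituting back yields (K1 + K2 + lam I) a = y, i.e. exactly the sum-kernel ridge regression solution — which is what the numerical comparison confirms.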