Learning Kernels via Margin-and-Radius Ratios
Most existing MKL approaches employ the large-margin principle to learn kernels. However, we point out that the margin itself cannot adequately describe the goodness of a kernel, since it neglects scaling. We instead use the ratio between the margin and the radius of the minimal enclosing ball of the data in the feature space induced by a kernel to measure how good the kernel is, and propose a new scaling-invariant formulation for kernel learning. The presented formulation can handle both linear and nonlinear combinations of kernels. In the linear case, it is invariant not only to the type of norm constraint on the combination coefficients but also to the initial scalings of the basis kernels. By establishing the differentiability of a general class of multilevel optimal-value functions, we derive a simple and efficient gradient-based kernel learning algorithm. Experiments show that our approach significantly outperforms other state-of-the-art kernel learning methods.
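The margin-to-radius criterion described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the function name `margin_radius_ratio`, the near-hard-margin choice of `C`, and the centroid-based approximation of the minimal-enclosing-ball radius are all our assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def margin_radius_ratio(K, y, C=1e6):
    """Score a kernel matrix K by the squared margin-to-radius ratio.

    The margin comes from an SVM with a very large C (approximating a
    hard margin); the minimal enclosing ball radius is approximated by
    the radius of the ball centred at the kernel-space mean.
    """
    clf = SVC(kernel="precomputed", C=C, tol=1e-8).fit(K, y)
    sv = clf.support_
    a = clf.dual_coef_.ravel()          # alpha_i * y_i for support vectors
    w_sq = a @ K[np.ix_(sv, sv)] @ a    # ||w||^2; geometric margin = 1/||w||
    # Squared kernel-space distance of each point to the centroid:
    d_sq = np.diag(K) - 2 * K.mean(axis=1) + K.mean()
    r_sq = d_sq.max()                   # approximate squared ball radius
    return 1.0 / (w_sq * r_sq)          # margin^2 / radius^2

# Toy usage: the ratio is invariant to rescaling the kernel, while the
# margin alone is not -- the motivation stated in the abstract.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = np.sign(X[:, 0])                    # linearly separable labels
K = X @ X.T                             # linear kernel
s1 = margin_radius_ratio(K, y)
s2 = margin_radius_ratio(10.0 * K, y)   # same kernel, different scaling
```

Scaling K by a constant c scales the squared margin by c and the squared radius by c as well, so their ratio is unchanged; this is the scaling invariance the margin alone lacks.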
Date Found: January 14, 2011
Date Produced: January 12, 2011