Please use this identifier to cite or link to this item: http://repositorio.unicamp.br/jspui/handle/REPOSIP/88844
Type: Conference paper (Artigo de evento)
Title: Gradient Hyper-parameter Optimization For Manifold Regularization
Author: Becker C.O.
Ferreira P.A.V.
Abstract: Semi-supervised learning can be defined as the ability to improve the predictive performance of an algorithm by providing it with data that has not been previously labeled. Manifold Regularization is a semi-supervised learning approach that extends the regularization framework with additional regularization penalties based on the graph Laplacian as an empirical estimator of the underlying manifold. The incorporation of such terms relies on additional hyper-parameters, which, together with the original kernel and regularization parameters, are known to influence algorithm behavior. This paper proposes a gradient approach to the optimization of these hyper-parameters based on the closed form of the generalized cross-validation estimate, which is valid when the learning optimality conditions can be represented as a linear system, as is the case for Laplacian Regularized Least Squares. For the subset of hyper-parameters that are integer-valued, such as the Laplacian matrix hyper-parameters, we propose optimizing the weight components of a sum of base terms. Results of computational experiments are presented to illustrate the proposed technique. © 2013 IEEE.
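The sketch below (not the authors' code) illustrates the kind of procedure the abstract describes: computing a generalized cross-validation (GCV) score for Laplacian Regularized Least Squares, whose optimality conditions form a linear system, and descending on the regularization hyper-parameters. The Gaussian kernel, k-NN Laplacian, GCV variant, and function names are assumptions; the finite-difference gradient stands in for the paper's closed-form gradient.

import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix over all labeled and unlabeled points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def knn_laplacian(X, k=5):
    # Unnormalized graph Laplacian of a symmetrized k-nearest-neighbor graph.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros_like(d2)
    for i in range(len(X)):
        nn = np.argsort(d2[i])[1:k + 1]      # skip the point itself
        W[i, nn] = 1.0
    W = np.maximum(W, W.T)
    return np.diag(W.sum(1)) - W

def gcv_laprls(X, y, labeled, gamma_A, gamma_I, sigma=1.0, k=5):
    # GCV-style score for LapRLS at the given regularization weights.
    n, l = len(X), int(labeled.sum())
    K, L = rbf_kernel(X, sigma), knn_laplacian(X, k)
    J = np.diag(labeled.astype(float))       # selects labeled examples
    M = J @ K + gamma_A * l * np.eye(n) + gamma_I * (l / n ** 2) * (L @ K)
    # Fitted labels are K M^{-1} J y; restrict the hat matrix to labeled rows.
    H = (K @ np.linalg.solve(M, J))[np.ix_(labeled, labeled)]
    r = (np.eye(l) - H) @ y[labeled]
    return (r @ r / l) / (np.trace(np.eye(l) - H) / l) ** 2

def gcv_step(X, y, labeled, log_theta, lr=0.1, eps=1e-4):
    # One gradient step on log(gamma_A), log(gamma_I). The paper derives the
    # gradient in closed form; central differences are used here only to keep
    # the illustration short.
    g = np.zeros_like(log_theta)
    for i in range(len(log_theta)):
        tp, tm = log_theta.copy(), log_theta.copy()
        tp[i] += eps
        tm[i] -= eps
        g[i] = (gcv_laprls(X, y, labeled, *np.exp(tp)) -
                gcv_laprls(X, y, labeled, *np.exp(tm))) / (2 * eps)
    return log_theta - lr * g

For the integer-valued Laplacian hyper-parameters (e.g., the neighborhood size k), the abstract's proposal corresponds to replacing L above with a weighted sum of base Laplacians built at several fixed k values and optimizing the continuous weights of that sum instead of k itself.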
Publisher: IEEE Computer Society
Rights: closed access (fechado)
Identifier DOI: 10.1109/ICMLA.2013.145
Address: http://www.scopus.com/inward/record.url?eid=2-s2.0-84899461109&partnerID=40&md5=8ac106019099d58c5be4cd499e8a7d37
Date Issued: 2013
Appears in Collections: Unicamp - Artigos e Outros Documentos

Files in This Item:
File: 2-s2.0-84899461109.pdf (411.59 kB, Adobe PDF)