Please use this identifier to cite or link to this item: https://repository.iimb.ac.in/handle/2074/11207
DC Field: Value
dc.contributor.author: Gaudioso, Manlio
dc.contributor.author: Gorgone, Enrico
dc.contributor.author: Labbé, Martine
dc.contributor.author: Rodríguez-Chía, Antonio M
dc.date.accessioned: 2020-03-31T13:08:13Z
dc.date.available: 2020-03-31T13:08:13Z
dc.date.issued: 2017
dc.identifier.issn: 0305-0548
dc.identifier.uri: https://repository.iimb.ac.in/handle/2074/11207
dc.description.abstract: We discuss a Lagrangian-relaxation-based heuristic for feature selection in the Support Vector Machine (SVM) framework for binary classification. In particular, we embed into our objective function a weighted combination of the L1 and L0 norms of the normal to the separating hyperplane. We obtain a Mixed Binary Linear Programming problem which is suitable for a Lagrangian relaxation approach. Based on a property of the optimal multiplier setting, we apply a well-established nonsmooth optimization ascent algorithm to solve the resulting Lagrangian dual. In the proposed approach we obtain, at every ascent step, both a lower bound on the optimal objective value and a feasible solution at low computational cost. We present the results of our numerical experiments on some benchmark datasets.
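The abstract summarizes the method at a high level. As an illustration only, and not the authors' Mixed Binary Linear Programming formulation or their Lagrangian dual ascent, the sketch below shows the closely related idea the abstract builds on: training a linear SVM with an L1 penalty on the hyperplane normal, so that uninformative feature weights are driven toward zero and features can be selected by thresholding. All names and parameter values here are hypothetical.

```python
import numpy as np

def l1_svm_features(X, y, C=1.0, lam=0.5, lr=0.01, iters=2000, tol=1e-3):
    """Subgradient descent on  C * hinge loss + lam * ||w||_1.

    Illustrative sketch: the paper instead solves a Mixed Binary Linear
    Program via Lagrangian relaxation; here the L1 term merely shows how
    sparsity in w yields feature selection.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for t in range(iters):
        margins = y * (X @ w + b)
        active = margins < 1                       # points violating the margin
        # Subgradient of the hinge term plus the L1 penalty
        grad_w = -C * (y[active, None] * X[active]).sum(axis=0) + lam * np.sign(w)
        grad_b = -C * y[active].sum()
        step = lr / np.sqrt(t + 1)                 # diminishing step size
        w -= step * grad_w
        b -= step * grad_b
    selected = np.flatnonzero(np.abs(w) > tol)     # features with non-negligible weight
    return w, b, selected
```

On a toy dataset where only the first coordinate determines the label, the first feature receives a large weight while the penalty keeps the noise coordinates small.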
dc.publisher: Elsevier
dc.subject: Feature Selection
dc.subject: Lagrangian Relaxation
dc.subject: Nonsmooth Optimization
dc.subject: SVM Classification
dc.title: Lagrangian relaxation for SVM feature selection
dc.type: Journal Article
dc.identifier.doi: 10.1016/j.cor.2017.06.001
dc.pages: 137-145
dc.vol.no: Vol. 87
dc.journal.name: Computers and Operations Research
Appears in Collections: 2010-2019
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.