Please use this identifier to cite or link to this item: https://repository.iimb.ac.in/handle/2074/10805
DC Field: Value
dc.contributor.author: Frangioni, Antonio
dc.contributor.author: Gendron, Bernard
dc.contributor.author: Gorgone, Enrico
dc.date.accessioned: 2020-03-12T11:55:17Z
dc.date.available: 2020-03-12T11:55:17Z
dc.date.issued: 2018
dc.identifier.issn: 1862-4472
dc.identifier.uri: https://repository.iimb.ac.in/handle/2074/10805
dc.description.abstract: We present and computationally evaluate a variant of Nesterov's fast gradient method that is capable of exploiting information, even if only approximate, about the optimal value of the problem. Such information is available in some applications, among them the computation of bounds for hard integer programs. We show that dynamically changing the smoothness parameter of the algorithm using this information yields a better convergence profile in practice. [An illustrative sketch of this idea follows the metadata record below.]
dc.publisher: Springer
dc.subject: Fast gradient method
dc.subject: Lagrangian relaxation
dc.subject: Convex optimization
dc.title: Dynamic smoothness parameter for fast gradient methods
dc.type: Journal Article
dc.identifier.doi: https://doi.org/10.1007/S11590-017-1168-Z
dc.pages: 43-53
dc.vol.no: Vol. 12
dc.issue.no: Iss. 1
dc.journal.name: Optimization Letters
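
To make the abstract's idea concrete: in Nesterov's smoothing scheme, a nonsmooth objective is replaced by a smooth surrogate whose gradient has Lipschitz constant on the order of 1/mu, and a (possibly approximate) optimal value f* lets one shrink mu as the estimated gap f(x) - f* closes. The sketch below is not the authors' algorithm from the paper; it is a minimal Python illustration on a Huber-smoothed l1-norm, in which the smoothing rule mu = theta * gap / n, the factor theta, and the toy objective are all assumptions made for this example.

    import numpy as np

    def huber_grad(x, mu):
        # Gradient of the Huber-smoothed l1-norm; its Lipschitz constant is 1/mu.
        return np.clip(x / mu, -1.0, 1.0)

    def fast_gradient_dynamic_mu(x0, f_star=0.0, theta=0.5, iters=200):
        # Nesterov's fast gradient iteration in which the smoothness parameter
        # mu is re-chosen every step from the estimated gap f(x) - f_star
        # (an assumed rule, for illustration only).
        x, y, t = x0.copy(), x0.copy(), 1.0
        for _ in range(iters):
            gap = max(np.abs(x).sum() - f_star, 1e-12)  # estimated optimality gap
            mu = theta * gap / x.size                   # dynamic smoothness parameter
            x_next = y - mu * huber_grad(y, mu)         # gradient step, step size 1/L = mu
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
            x, t = x_next, t_next
        return x

    print(fast_gradient_dynamic_mu(np.array([3.0, -2.0, 1.5])))
    # Iterates approach 0, the minimizer of ||x||_1.

The design intent mirrors the abstract: a large gap permits a coarse (large-mu) smooth model with well-conditioned steps, while a shrinking gap tightens the model so the smoothed objective tracks the true nonsmooth one near the optimum.
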
Appears in Collections: 2010-2019
Files in This Item:
File: Gorgone_OL_2018_Vol.12_Iss.1.pdf | Size: 861.12 kB | Format: Adobe PDF