On the Convergence of Smoothed Functional Stochastic Optimization Algorithms

Release Time: 2019-03-11

Indexed by: Journal Article

Date of Publication: 2015-01-01

Journal: IFAC-PapersOnLine

Included Journals: Scopus, EI

Volume: 48

Issue: 28

Page Number: 229-233

ISSN: 2405-8963

Abstract: A smoothed functional gradient algorithm with perturbations distributed according to a Gaussian distribution is considered for a stochastic optimization problem with additive noise. A stochastic approximation algorithm with expanding truncations that uses either a one-sided or a two-sided gradient estimate is given. At each iteration of the algorithm only two observations are required. The algorithm is shown to be convergent under mild conditions. © 2015
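
As a rough illustration of the kind of iteration described in the abstract, the Python sketch below combines a two-sided smoothed functional gradient estimate (Gaussian perturbations, two noisy observations per step) with a simple expanding-truncation reset. The function name, step-size schedule, and truncation schedule are illustrative assumptions, not the notation or exact procedure of the paper.

```python
import numpy as np

def sf_two_sided(objective, x0, steps=2000, beta=0.1, a0=0.5, seed=0):
    """Sketch of a smoothed functional SA iteration with a two-sided
    gradient estimate and expanding truncations.

    objective(x) returns a noisy observation of the objective value;
    beta is the smoothing (perturbation) parameter and a0 scales the
    step sizes a_k = a0 / (k + 1). All names and schedules here are
    illustrative assumptions, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    bound, sigma = 1.0, 0                              # expanding-truncation state
    for k in range(steps):
        delta = rng.standard_normal(x.shape)           # Gaussian perturbation
        y_plus = objective(x + beta * delta)           # observation 1
        y_minus = objective(x - beta * delta)          # observation 2
        grad_est = delta * (y_plus - y_minus) / (2.0 * beta)
        a_k = a0 / (k + 1)
        x_new = x - a_k * grad_est
        if np.linalg.norm(x_new) > bound:              # iterate left the truncation region:
            x_new = np.asarray(x0, dtype=float)        # reset and enlarge the region
            sigma += 1
            bound = 2.0 ** sigma
        x = x_new
    return x

# Usage: minimize a noisy quadratic f(x) = ||x||^2 + additive noise
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    noisy_quadratic = lambda x: float(x @ x) + rng.normal(scale=0.01)
    print(sf_two_sided(noisy_quadratic, x0=np.array([2.0, -1.5])))
```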
