The Improved Stochastic Ranking Evolution Strategy (ISRES) is an algorithm for nonlinearly constrained global optimization, or at least semi-global: it includes heuristics designed to escape local optima, although convergence to the global optimum is not guaranteed.
Usage
isres(
  x0,
  fn,
  lower,
  upper,
  hin = NULL,
  heq = NULL,
  maxeval = 10000,
  pop.size = 20 * (length(x0) + 1),
  xtol_rel = 1e-06,
  nl.info = FALSE,
  deprecatedBehavior = TRUE,
  ...
)
Arguments
- x0
initial point for searching the optimum.
- fn
objective function that is to be minimized.
- lower, upper
lower and upper bound constraints.
- hin
function defining the inequality constraints, that is hin <= 0 for all components.
- heq
function defining the equality constraints, that is heq = 0 for all components.
- maxeval
maximum number of function evaluations.
- pop.size
population size.
- xtol_rel
stopping criterion: the optimization stops when the relative change in the parameters falls below this value.
- nl.info
logical; if TRUE, the original NLopt information is shown.
- deprecatedBehavior
logical; if TRUE (default for now), the old behavior of the inequality function is used, where the inequality is \(\ge 0\) instead of \(\le 0\) (see the sketch after this list). This will be reversed in a future release and eventually removed.
- ...
additional arguments passed to the function.
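To make the two sign conventions concrete, the same constraint x[1] + x[2] <= 1.5 can be written either way. This is an illustrative sketch (the function names are not part of the package); the Examples section below uses the standard convention:

hin_standard <- function(x) x[1] + x[2] - 1.5  # hin <= 0, use deprecatedBehavior = FALSE
hin_old <- function(x) 1.5 - x[1] - x[2]       # hin >= 0, old deprecated behavior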
Value
List with components:
- par
the optimal solution found so far.
- value
the function value corresponding to par.
- iter
number of (outer) iterations, see maxeval.
- convergence
integer code indicating successful completion (> 0) or a possible error number (< 0).
- message
character string produced by NLopt and giving additional information.
Details
The evolution strategy is based on a combination of a mutation rule (with a log-normal step-size update and exponential smoothing) and differential variation (a Nelder-Mead-like update rule). The fitness ranking is simply via the objective function for problems without nonlinear constraints; when nonlinear constraints are included, the stochastic ranking proposed by Runarsson and Yao is employed.
This method supports arbitrary nonlinear inequality and equality constraints in addition to the bounds constraints.
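The ranking itself happens inside NLopt's C implementation; the following minimal R sketch (a hypothetical helper, not part of the package) conveys the idea behind stochastic ranking: in a bubble-sort-like sweep, adjacent individuals are compared by objective value when both are feasible or with probability pf, and by constraint violation otherwise.

stochastic_rank <- function(f, phi, pf = 0.45) {
  ## f: objective values; phi: summed constraint violations (0 = feasible)
  n <- length(f)
  idx <- seq_len(n)
  for (sweep in seq_len(n)) {
    swapped <- FALSE
    for (i in seq_len(n - 1)) {
      a <- idx[i]
      b <- idx[i + 1]
      ## Compare by objective if both are feasible, or with probability pf;
      ## otherwise compare by constraint violation.
      use_f <- (phi[a] == 0 && phi[b] == 0) || runif(1) < pf
      worse <- if (use_f) f[a] > f[b] else phi[a] > phi[b]
      if (worse) {
        idx[c(i, i + 1)] <- c(b, a)
        swapped <- TRUE
      }
    }
    if (!swapped) break
  }
  idx  # indices ordered from best-ranked to worst-ranked
}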
Note
The initial population size for ISRES defaults to \(20 \times (n + 1)\) in \(n\) dimensions, but this can be changed with the pop.size argument. The initial population must be at least \(n + 1\).
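For example, a problem in 2 dimensions gets a default population of 20 * (2 + 1) = 60. A different size can be requested explicitly; the call below is a minimal sketch with a hypothetical sphere objective:

isres(x0 = c(0.5, 0.5), fn = function(x) sum(x^2),
      lower = c(-1, -1), upper = c(1, 1), pop.size = 30)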
References
Thomas Philip Runarsson and Xin Yao, "Search biases in constrained evolutionary optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 35, no. 2, pp. 233-243 (2005).
Examples
## Rosenbrock Banana objective function
rbf <- function(x) {(1 - x[1]) ^ 2 + 100 * (x[2] - x[1] ^ 2) ^ 2}
x0 <- c(-1.2, 1)
lb <- c(-3, -3)
ub <- c(3, 3)
## The function as written above has a minimum of 0 at (1, 1)
isres(x0 = x0, fn = rbf, lower = lb, upper = ub)
#> $par
#> [1] 1.000015 1.000029
#>
#> $value
#> [1] 2.785245e-10
#>
#> $iter
#> [1] 10000
#>
#> $convergence
#> [1] 5
#>
#> $message
#> [1] "NLOPT_MAXEVAL_REACHED: Optimization stopped because maxeval (above) was reached."
#>
## Now subject to the inequality that x[1] + x[2] <= 1.5
hin <- function(x) {x[1] + x[2] - 1.5}
S <- isres(x0 = x0, fn = rbf, hin = hin, lower = lb, upper = ub,
maxeval = 2e5L, deprecatedBehavior = FALSE)
S
#> $par
#> [1] 0.8231316 0.6768683
#>
#> $value
#> [1] 0.03132831
#>
#> $iter
#> [1] 13126
#>
#> $convergence
#> [1] 4
#>
#> $message
#> [1] "NLOPT_XTOL_REACHED: Optimization stopped because xtol_rel or xtol_abs (above) was reached."
#>
sum(S$par)
#> [1] 1.5
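## Equality constraints are supplied analogously via heq. A sketch
## (output not shown) restricting the solution to the line x[1] + x[2] = 1:
heq <- function(x) x[1] + x[2] - 1
isres(x0 = x0, fn = rbf, heq = heq, lower = lb, upper = ub,
      maxeval = 2e5L, deprecatedBehavior = FALSE)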