Step 3: Generate the initial solutions, estimate the constraints, and set up the parameter ranges.
Step 4: Evaluate the fitness of individuals.
Step 5: Evaluate the FAC; if FAC = 1 is satisfied, go to Step 12, otherwise go to Step 6.
Each candidate optimum solution is determined by the FAC [14]. The FAC is a criterion for estimating the convergence of the candidate solutions.
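The defining equation of the FAC did not survive extraction here. A plausible form, consistent with the description below (a normalized inner product of the fitness row vectors of successive generations, so that identical vectors give FAC = 1), would be:

```latex
\mathrm{FAC}=\frac{f_{i-1}\,f_i^{T}}{\sqrt{\left(f_{i-1}\,f_{i-1}^{T}\right)\left(f_i\,f_i^{T}\right)}}
```

This is a reconstruction assumed from the surrounding text, not the paper's original equation; see [14] for the original definition.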
where fi is the row vector formed by the fitness values of the individuals at the ith generation and fiT is its transpose.
The row size depends on the number of optimum solutions required by the designer. Theoretically, the FAC ranges from 0 to 1.0; when the value equals 1, the optimization has converged. In practice, however, the value rarely converges exactly to 1.0 because of the many candidate solutions to be evaluated. Therefore, in this study the FAC threshold is set to 0.9999.
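As a concrete illustration of the Step-5 check, the sketch below computes a FAC of an assumed form (the normalized inner product of the fitness vectors of two successive generations; the paper's own equation is given in [14]) and tests it against the 0.9999 threshold. The fitness values are made up for the example.

```python
import math

def fac(f_prev, f_curr):
    """Convergence measure between two generations' fitness row vectors
    (assumed form: normalized inner product, so identical vectors give 1.0)."""
    dot = sum(a * b for a, b in zip(f_prev, f_curr))
    norm = math.sqrt(sum(a * a for a in f_prev)) * math.sqrt(sum(b * b for b in f_curr))
    return dot / norm

# Step-5 style check with the 0.9999 threshold stated in the text;
# the two fitness vectors are illustrative only.
converged = fac([1.90, 1.80, 1.70], [1.901, 1.801, 1.701]) > 0.9999
```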
Step 6: Update Sh: Sh = { ( XSh, F)| XSh ∈ RN, F ∈ R}, where XSh = [x1, x2, …, xN].
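The archive update of Step 6 can be sketched as a simple list of (X, F) pairs; the function name and sample values here are illustrative, not from the paper.

```python
# Sh archives evaluated designs as pairs (X_Sh, F), with X_Sh in R^N
# and F the corresponding objective value.
Sh = []

def update_archive(Sh, X, F):
    """Append the design vector X = [x1, ..., xN] and its fitness F to Sh."""
    Sh.append((list(X), float(F)))

update_archive(Sh, [0.5, -0.2], 1.23)
```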
Step 7: Perform selection and crossover, and check tabu-list.
Step 8: Construct the response surface from Sh:
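The response-surface equation itself is missing from this copy. A standard second-order polynomial model, consistent with the coefficients named in the following line (the linear coefficients αi presumably also appear in the original), is:

```latex
\hat{F}(\mathbf{x}) = \alpha_0 + \sum_{i=1}^{N}\alpha_i x_i
  + \sum_{i=1}^{N}\alpha_{ii} x_i^2 + \sum_{i<j}\alpha_{ij} x_i x_j
```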
where α0, αii and αij are the coefficients calculated by the least-squares method (LSM).
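A minimal sketch of fitting such a second-order surface by least squares, for two design variables; the sample points and the Rosenbrock objective used to generate F are illustrative stand-ins for the archive Sh, not the paper's data.

```python
import numpy as np

# Illustrative design points and objective values standing in for Sh.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(30, 2))
F = 100.0 * (X[:, 1] - X[:, 0]**2)**2 + (1.0 - X[:, 0])**2

# Basis [1, x1, x2, x1^2, x2^2, x1*x2]; least-squares fit of the coefficients
# (alpha_0, alpha_i, alpha_ii, alpha_ij in the text's notation).
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
alpha, *_ = np.linalg.lstsq(A, F, rcond=None)

def surface(x1, x2):
    """Evaluate the fitted response surface F_hat(x)."""
    return alpha @ np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])
```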
Step 9: Train the RBF network with Sh to construct approximate constraint conditions.
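A sketch of Step 9: a Gaussian RBF network trained on archive points to approximate a constraint function g(x). The centers, width, and the sample constraint below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

# Illustrative training data standing in for Sh: centers and the values of
# an assumed constraint g(x) = x1^2 + x2^2 - 1 <= 0 at those points.
centers = np.random.default_rng(1).uniform(-1.0, 1.0, size=(20, 2))
g_vals = centers[:, 0]**2 + centers[:, 1]**2 - 1.0
width = 0.5

def phi(X, C):
    """Gaussian RBF design matrix between points X and centers C."""
    d2 = ((X[:, None, :] - C[None, :, :])**2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width**2))

# Output-layer weights from the interpolation conditions Phi w = g
# (a tiny ridge term keeps the system well conditioned).
w = np.linalg.solve(phi(centers, centers) + 1e-9 * np.eye(len(centers)), g_vals)

def g_approx(x):
    """Approximate constraint value at a new point x."""
    return float(phi(np.asarray([x]), centers) @ w)
```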
Step 10: Calculate the optimum design X* on the response surface by TS and generate one individual based on X*.
Step 11: Mutate and go to step 4.
Step 12: Search for the optimum solutions by a local concentrated search using the modified SM, starting from the best candidate.
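To illustrate the local concentrated search of Step 12, the sketch below uses a simple compass (pattern) search as a stand-in for the paper's modified SM, refining the best candidate on the Rosenbrock function; the starting point and step schedule are illustrative.

```python
def rosenbrock(x):
    """Rosenbrock test objective: global minimum f = 0 at (1, 1)."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def local_search(f, x0, step=0.1, tol=1e-8, max_iter=10000):
    """Compass search: try +/- step along each axis, halve the step on failure."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5
            if step < tol:
                break
    return x, fx

# Refine a hypothetical best candidate from the global search phase.
best_x, best_f = local_search(rosenbrock, [0.8, 0.8])
```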
3. Numerical examples of function optimizations
3.1 Test function
Three benchmark test functions, shown in Fig. 2, are used to verify the efficiency of the proposed hybrid algorithm. These functions are often used to test optimization methods. The simulations are conducted for the two-dimensional case. The first function is to be maximized, and the other two are to be minimized.
The first one is the four-peak function, which has one global optimum with three local optima and is defined as
where −0.4 ≤ x1, x2 ≤ 1. This test function has a global optimum solution f(x) = 1.954342 at x1 = 0, x2 = 0,
(a) Four-peak function
(b) Rosenbrock function
(c) Rastrigin function
Fig. 2. Test functions.
and three local optimum solutions, f(x) = 1.807849, 1.705973 and 1.559480, as shown in Fig. 2(a). Conventional gradient-based hill-climbing algorithms can easily become stuck at a local optimum because of their dependence on the starting point, whereas global search algorithms generally find the global optimum.
The Rosenbrock function is defined as
f(x1, x2) = 100(x2 − x1²)² + (1 − x1)²
where −2.0 ≤ x1, x2 ≤ 2.0. This function is called the banana function [15]; its shape is presented in Fig. 2(b). The objective is to find the variables x that minimize the function. It has only one optimum solution, f(x) = 0 at x1 = 1.0 and x2 = 1.0. Finding it is difficult because of an extremely deep valley along the parabola x2 = x1² that leads to the global minimum [16].
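A quick sketch verifying the stated minimum of the Rosenbrock function:

```python
def rosenbrock(x1, x2):
    """Rosenbrock (banana) function; global minimum f = 0 at (1, 1)."""
    return 100.0 * (x2 - x1**2)**2 + (1.0 - x1)**2

print(rosenbrock(1.0, 1.0))  # → 0.0
```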
The Rastrigin function is defined as
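The defining equation is missing from this copy; the standard two-dimensional Rastrigin function, consistent with the stated global minimum f(x) = 0 at (0, 0), is:

```latex
f(x_1, x_2) = 20 + \sum_{i=1}^{2}\left[x_i^{2} - 10\cos\left(2\pi x_i\right)\right]
```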
where −5.0 ≤ x1, x2 ≤ 5.0. This function is often used to evaluate global search ability because there are many local minima around the global minimum, as shown in Fig. 2(c). It is not easy to find the global minimum within a limited number of function calls. This function has many local minima and one global minimum, f(x) = 0 at (0, 0).
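A sketch verifying the global minimum of the standard two-dimensional form (assumed to match the paper's definition):

```python
import math

def rastrigin(x1, x2):
    """Standard 2-D Rastrigin function; global minimum f = 0 at (0, 0)."""
    return 20.0 + (x1**2 - 10.0 * math.cos(2.0 * math.pi * x1)) \
                + (x2**2 - 10.0 * math.cos(2.0 * math.pi * x2))

print(rastrigin(0.0, 0.0))  # → 0.0
```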