Optimization techniques

Numerical optimization techniques can be classified, according to the way the design point is improved after each iteration, into three kinds: non-gradient-based, gradient-based, and hybrid optimization techniques. They are described briefly as follows:

Non-gradient-based optimization techniques do not require the objective function, f(x), to be differentiable because the algorithms do not use derivatives of f(x). Examples of non-gradient-based optimization techniques are adaptive simulated annealing, Hooke-Jeeves direct search, and the genetic algorithm (GA). These techniques tend to reach a global optimum but require a huge number of function evaluations. The GA is a well-known non-gradient-based optimization technique; it is a stochastic search or optimization algorithm that mimics Darwin's theory of biological evolution.
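For illustration, the following is a minimal sketch of a real-coded GA in Python; the operator choices (tournament selection, arithmetic crossover, Gaussian mutation) and the quadratic test function are assumptions made only for this example, not details taken from the paper.

import numpy as np

def genetic_algorithm(f, bounds, pop_size=40, generations=100,
                      crossover_rate=0.8, mutation_rate=0.1, seed=None):
    """Minimize f over box bounds with a simple real-coded GA."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T        # bounds: [(lo, hi), ...]
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(pop_size, dim))   # random initial population
    for _ in range(generations):
        fitness = np.array([f(x) for x in pop])
        # Tournament selection: keep the better of two randomly chosen individuals.
        idx = rng.integers(pop_size, size=(pop_size, 2))
        parents = pop[np.where(fitness[idx[:, 0]] < fitness[idx[:, 1]],
                               idx[:, 0], idx[:, 1])]
        # Arithmetic crossover between consecutive parents.
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < crossover_rate:
                w = rng.random()
                children[i]     = w * parents[i] + (1 - w) * parents[i + 1]
                children[i + 1] = w * parents[i + 1] + (1 - w) * parents[i]
        # Gaussian mutation, clipped back into the design bounds.
        mutate = rng.random(children.shape) < mutation_rate
        children[mutate] += rng.normal(0.0, 0.1 * (hi - lo).mean(), mutate.sum())
        pop = np.clip(children, lo, hi)
    fitness = np.array([f(x) for x in pop])
    best = fitness.argmin()
    return pop[best], fitness[best]

# Example: minimize a simple quadratic test function in two variables.
f = lambda x: float(np.sum((x - 1.5) ** 2))
x_best, f_best = genetic_algorithm(f, [(-5, 5), (-5, 5)], seed=0)
print(x_best, f_best)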

Gradient-based techniques define the search directions by the gradient of the objective function at the current point. In practice, there are many kinds of gradient-based optimization techniques, such as the generalized reduced gradient, conjugate gradient, method of feasible directions, mixed-integer optimization, sequential linear programming, sequential quadratic programming, and Davidon-Fletcher-Powell methods. Gradient-based techniques generally converge quickly, but they may require long run times when the number of variables increases. Gradient-based techniques also risk being trapped in a local extremum for highly nonlinear optimization problems.
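The core idea, stepping along the negative gradient from the current point, can be illustrated with a minimal steepest-descent sketch; the test function, step size, and Armijo backtracking constants below are assumptions made for the example.

import numpy as np

def gradient_descent(f, grad, x0, step=0.1, tol=1e-6, max_iter=500):
    """Steepest descent: the search direction at each iterate is -grad f(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:       # gradient small enough: stationary point
            break
        # Backtracking line search keeps f decreasing along the descent direction.
        alpha = step
        while f(x - alpha * g) > f(x) - 0.5 * alpha * np.dot(g, g):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x = x - alpha * g
    return x

# Example: quadratic objective with an analytic gradient.
f = lambda x: float((x[0] - 3) ** 2 + 2 * (x[1] + 1) ** 2)
grad = lambda x: np.array([2 * (x[0] - 3), 4 * (x[1] + 1)])
print(gradient_descent(f, grad, x0=[0.0, 0.0]))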

Hybrid optimization techniques apply non-gradient-based and gradient-based techniques in combination, one after the other, in order to keep the advantages and reduce the disadvantages of each single optimization technique. Presenting all of these optimization techniques is beyond the scope of this paper.
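A minimal sketch of this two-phase idea is given below. It uses SciPy's differential_evolution as a stand-in for the evolutionary global phase and a quasi-Newton search for the local refinement; these particular solvers and the Rastrigin test function are assumptions for illustration, not the specific hybrid scheme of the paper.

import numpy as np
from scipy.optimize import differential_evolution, minimize

def hybrid_optimize(f, bounds):
    """Global evolutionary search followed by gradient-based local refinement."""
    # Phase 1: population-based global search (stands in for the GA phase).
    global_result = differential_evolution(f, bounds, maxiter=50, seed=0)
    # Phase 2: quasi-Newton refinement started from the best global point;
    # gradients are approximated internally by finite differences.
    local_result = minimize(f, global_result.x, method="BFGS")
    return local_result.x, local_result.fun

# Example: a multimodal test function (Rastrigin) where a purely local
# method started from a poor point would stall in a local minimum.
def rastrigin(x):
    x = np.asarray(x)
    return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

x_opt, f_opt = hybrid_optimize(rastrigin, bounds=[(-5.12, 5.12)] * 2)
print(x_opt, f_opt)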

2.2. The common optimization methods

The terminology "optimization method" used in this paper refers to whether or not explicit objective functions are formulated. For simulation-based optimization, the objective functions are often in the form of implicit equations: the value of the objective function is unknown until simulation results are obtained. Two approaches are used to resolve the optimization problem, the direct optimization method and the metamodel-based optimization method, as shown in Fig. 1. The details of these two optimization methods are described as follows.
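To make the notion of an implicit objective concrete, the sketch below wraps a black-box simulation call as an objective function; run_simulation, dummy_simulation, and the "warpage" response are hypothetical names used only for illustration.

def make_objective(run_simulation):
    """Wrap a black-box simulation as an objective function.

    run_simulation is a hypothetical stand-in for the solver call; its cost
    dominates the optimization, so the number of evaluations is tracked.
    """
    calls = {"count": 0}

    def objective(x):
        calls["count"] += 1
        outputs = run_simulation(x)      # value is unknown until the run finishes
        return outputs["warpage"]        # hypothetical response to be minimized

    return objective, calls

# Example with a cheap dummy simulation standing in for the real solver.
def dummy_simulation(x):
    return {"warpage": (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2}

objective, calls = make_objective(dummy_simulation)
print(objective([0.0, 0.0]), calls["count"])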

2.2.1. Direct optimization methods

Direct numerical optimization is an approach in which explicit objective functions are not required. Both gradient-based and non-gradient-based optimization techniques can be applied to solve the optimization problem. Sometimes, direct optimization methods combine the GA with other optimization techniques.

[Fig. 1. Classification of optimization methods: the direct discrete optimization method (no explicit mathematical functions showing the relationship between inputs and outputs) is divided into gradient-based optimization techniques (derivatives obtained by finite differences) and non-gradient-based optimization techniques.]

It is well known that the GA tends to reach a global extremum but requires a large number of function evaluations. On the contrary, gradient-based methods efficiently converge to a local extremum. If these two algorithms are combined as a hybrid system, they can strengthen the advantages and remove the disadvantages of each.
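When the objective is an implicit, simulation-based function, the gradients needed by the local phase are typically approximated by finite differences, as indicated in Fig. 1. A minimal forward-difference sketch follows; the step size and test function are assumptions for the example, and each gradient costs one extra objective evaluation per design variable, which is why run times grow with the number of variables.

import numpy as np

def finite_difference_gradient(f, x, h=1e-6):
    """Forward-difference approximation of the gradient of a black-box objective."""
    x = np.asarray(x, dtype=float)
    f0 = f(x)
    grad = np.zeros_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h                       # perturb one design variable at a time
        grad[i] = (f(xp) - f0) / h
    return grad

# Example on a smooth test function.
f = lambda x: float(np.sum(np.sin(x) + 0.5 * x ** 2))
print(finite_difference_gradient(f, [0.3, -1.2]))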
