
OPTIMIZATION METHOD

The optimization approach taken in this study is based on the steepest descent method, which utilizes an adaptive cost function in conjunction with a backtracking strategy for the line search. The backtracking algorithm utilizes quadratic and cubic polynomials to accelerate the convergence, and the initial backtracking step employs an adaptive step size mechanism which depends on the steepness of the search direction. Following a precise problem statement, these algorithms and their implementations are then described in more detail.

Problem Formulation

In this study we consider the constrained optimization problem where the specific fuel consumption is minimized subject to prescribed emission levels. More formally, this is expressed as minimizing the objective function, i.e., the specific fuel consumption, s : X → R, subject to the constraints g(x) = 0. Here, X ⊂ R^m represents the admissible set of engine input parameters and g : X → R^2 denotes the emissions, which in this case are the nitric oxide and the particulate mass.
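In display form (LaTeX notation, using only the symbols just defined), this constrained problem reads:

\min_{x \in X} s(x) \quad \text{subject to} \quad g(x) = 0,
\qquad X \subset \mathbb{R}^m, \quad s : X \to \mathbb{R}, \quad g : X \to \mathbb{R}^2 .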

Table 1 Adaptive steepest descent algorithm.

x0 = initial guess                          (starting point)
compute ρ0                                  (initialize penalty parameter at x0)
for k = 0, 1, 2, ...                        (repeat until optimum is reached)
    pk = −∇f(xk, ρk)                        (search direction)
    xk+1 = min x∈X f(x, ρk) in direction pk (line search gives new pivot xk+1)
    compute ρk+1                            (new penalty parameter at xk+1)
    ρk+1 ← max{ρk, ρk+1}                    (ensure that {ρk} is non-decreasing)
end

This problem can be reformulated as an unconstrained optimization problem by introducing the penalty term g(x)ᵀDg(x), where D is an appropriate positive definite matrix, usually taken to be diagonal. This leads to the unconstrained optimization problem of minimizing the cost function (also called merit or penalty function) over the admissible parameter set X, i.e.,

min_{x∈X} f(x; ρ) = min_{x∈X} { s(x) + ρ g(x)ᵀ D g(x) }.    (1)

Here, ρ > 0 is a penalty parameter such that if x*_ρ minimizes f(x; ρ) over X, then lim_{ρ→∞} x*_ρ = x*, where x* is the solution of the original constrained optimization problem (cf. [29]). In practice this means that a suitable choice of ρ yields a solution x*_ρ which is as close to the optimum, x*, as desired.
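To make the construction concrete, a minimal Python sketch of this merit function is given below. The evaluators s and g stand in for the engine-simulation outputs (specific fuel consumption and the nitric oxide/particulate emissions); these callables, the weighting matrix D, and all variable names are illustrative assumptions, not part of the original text.

import numpy as np

def penalty_cost(x, rho, s, g, D):
    """Penalty (merit) function f(x; rho) = s(x) + rho * g(x)^T D g(x).

    s   : callable returning the specific fuel consumption at x (assumed evaluator)
    g   : callable returning the emission constraint vector (NOx, particulate) at x
    D   : positive definite, usually diagonal, weighting matrix
    rho : penalty parameter, rho > 0
    """
    gx = np.asarray(g(x))
    return s(x) + rho * gx @ D @ gx

# Hypothetical usage, with sfc_model and emissions_model standing in for the
# engine simulation:
#   D = np.diag([w_nox, w_pm])
#   value = penalty_cost(x, rho, sfc_model, emissions_model, D)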

The Adaptive Steepest Descent Method

In each iteration step, k, the steepest descent method determines a search direction, pk = −∇f(xk; ρk), at the pivot, xk. (The gradient, ∇, is taken with respect to x, keeping ρ fixed.) The cost function f is then minimized along this search direction, using the backtracking algorithm described below. This minimization process, called the line search, yields a new pivot xk+1 and a new penalty parameter ρk+1 (see below), which allow the determination of the new search direction pk+1. This iteration process is continued until either the target is reached or a minimum is encountered. The algorithm for this adaptive steepest descent method is given in Table 1.
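A compact Python outline of the loop in Table 1 might look as follows. The finite-difference gradient, the stopping test, and the update_rho rule for the new penalty parameter are assumptions made for illustration; the text does not spell these details out here.

import numpy as np

def numerical_gradient(func, x, h=1e-3):
    """Central-difference gradient (an assumption; the text does not specify
    how the gradient of the cost function is obtained)."""
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (func(x + e) - func(x - e)) / (2.0 * h)
    return grad

def adaptive_steepest_descent(f, x0, rho0, line_search, update_rho,
                              max_iter=50, tol=1e-4):
    """Outline of the adaptive steepest descent loop in Table 1.

    f(x, rho)   : penalty cost function (e.g. the penalty_cost sketch above)
    line_search : backtracking routine returning the new pivot x_{k+1}
    update_rho  : assumed rule giving a candidate penalty parameter at the pivot
    """
    x, rho = np.asarray(x0, dtype=float), float(rho0)
    for k in range(max_iter):
        grad = numerical_gradient(lambda z: f(z, rho), x)
        p = -grad                                  # search direction p_k
        if np.linalg.norm(p) < tol:                # minimum (or target) reached
            break
        x = line_search(lambda z: f(z, rho), grad, x, p)  # new pivot x_{k+1}
        rho = max(rho, update_rho(x))              # keep {rho_k} non-decreasing
    return x

Here f could be the penalty_cost sketch above with s, g, and D bound in advance (e.g. via functools.partial), and line_search the backtracking routine sketched in the next subsection.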

The Backtracking Algorithm

The backtracking algorithm used in this study is a modified version of the one presented in [30]. It is equipped with an additional adaptive initial step size for the first backtracking step and minimizes either a quadratic or a cubic polynomial fit to find the subsequent step sizes. Note that during the backtracking phase the penalty parameter ρ remains constant and, therefore, in order to simplify the notation, the dependence of f on ρ is ignored in this subsection, i.e., f(x) = f(x; ρ). This algorithm is summarized in Table 2.

In this algorithm, a new pivot, xk+1, is determined such that the cost function, f, is minimized in the descent direction, pk, starting from the present pivot, xk. The endpoint of the first backtracking interval, xs, is given by xs = xk + (λ/|δk|) pk/||pk||, where λ is a user input constant and δk = <∇f(xk), pk>/||pk|| is the derivative of f at xk in the direction pk, i.e., δk = ∂f(xk)/∂pk. If xs lies outside the parameter space (the unit hypercube in this study) then it is projected onto its boundary. Now, using the slope δk at xk, a parabola is fitted through the three points {(xk, f(xk)), (xk, δk), (xs, f(xs))} and its minimum, xm, is determined. Again, if xm lies outside the parameter space then it is projected onto its boundary. This gives the new backtracking step Δx = ||xm − xk||. If f(xm) is larger than the difference between f(xk) and the tolerance, β|δk|Δx, where β is a constant, then the backtracking step is changed (usually reduced) as follows: a cubic polynomial is fitted through the four points {(xk, f(xk)), (xk, δk), (xm, f(xm)), (xs, f(xs))}, and then its minimum point xc is determined. Once more, if xc lies outside the unit hypercube then it is projected onto its boundary. Next, set the new interval endpoint xs = xm and the new pivot candidate xm = xc. This new pivot candidate xm determines the new step size Δx = ||xm − xk|| and the associated change in the tolerance. This iterative process is continued until f(xm) is smaller than the difference between f(xk) and the tolerance; this then qualifies xm as the new pivot for the next search direction.

The first step in the backtracking algorithm is normally the largest distance used for finding a new pivot. In each subsequent iteration this distance is usually reduced. The backtracking algorithm in this study is equipped with an adaptive initial step size control to make this first step as large as possible. This is achieved by linking the first step size to the steepness of the cost function derivative in the search direction, ∂f(xk)/∂pk = δk. This has the effect that if δk is very steep then a small step size is chosen, and if the gradient is flat, then the first step size is taken to be large. This behavior is achieved by taking the initial backtracking interval as Δx = λ/|δk|.

Note that the parabola fit during the first backtracking step serves the purpose of accelerating the cost function minimization. A heuristic argument shows that in most cases, except possibly in some pathological situations, the cost function at the parabola minimum xm satisfies f(xm) < f(xk) and f(xm) < f(xs). The cubic polynomial fit is used in the subsequent backtracking steps to increase the speed of the cost function minimization even further. The positive constant β helps to control the tolerance used in the search for the new pivot, and λ, also positive, provides user control over the initial backtracking step. Both of these constants influence the convergence rate and the number of function evaluations. The values taken in this study are β = 0.01 and λ = 0.5.
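The sketch below illustrates this line search in Python, assuming the unit hypercube [0, 1]^m as the parameter space. It follows the same pattern (adaptive first interval λ/|δk|, a quadratic fit for the first step, cubic fits for the refinements, and the tolerance test f(xm) ≤ f(xk) − β|δk|Δx), but it simplifies the projection onto the boundary to element-wise clipping and adds a halving fallback as a safeguard, so it should be read as an approximation of the scheme rather than a reproduction of it.

import numpy as np

def backtracking_line_search(f, grad, x, p, lam=0.5, beta=0.01, max_steps=20):
    """Illustrative backtracking line search with a quadratic first fit and
    cubic refinements. Points that leave the unit hypercube are clipped back
    onto its boundary (a simplification of the projection described above)."""
    u = p / np.linalg.norm(p)                       # unit search direction
    delta = float(grad @ u)                         # directional derivative (< 0)
    f0 = f(x)
    point = lambda t: np.clip(x + t * u, 0.0, 1.0)  # step of length t, clipped
    phi = lambda t: f(point(t))

    # adaptive first interval: large when the slope is flat, small when steep
    ts = lam / abs(delta)
    # quadratic fit through phi(0), slope phi'(0) = delta, and phi(ts)
    a = (phi(ts) - f0 - delta * ts) / ts**2
    tm = -delta / (2.0 * a) if a > 0.0 else 0.5 * ts

    for _ in range(max_steps):
        dx = np.linalg.norm(point(tm) - x)          # actual step length Δx
        if phi(tm) <= f0 - beta * abs(delta) * dx:  # sufficient decrease reached
            return point(tm)
        # cubic fit through phi(0), phi'(0) = delta, phi(tm) and phi(ts)
        r1 = phi(tm) - f0 - delta * tm
        r2 = phi(ts) - f0 - delta * ts
        a3 = (r1 / tm**2 - r2 / ts**2) / (tm - ts)
        b3 = (-ts * r1 / tm**2 + tm * r2 / ts**2) / (tm - ts)
        disc = b3 * b3 - 3.0 * a3 * delta
        tc = (-b3 + np.sqrt(disc)) / (3.0 * a3) if (a3 != 0.0 and disc > 0.0) else 0.5 * tm
        if not (0.0 < tc < tm):                     # safeguard (not part of the paper):
            tc = 0.5 * tm                           # fall back to simple halving
        ts, tm = tm, tc                             # shift interval endpoint and candidate
    return point(tm)

With pk = −∇f(xk; ρk) supplied by the outer loop sketched earlier, this routine returns the new pivot xk+1 for the next search direction.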
