connection weights between the j-th rule and the output node; all of them are adjustable parameters.

4.2 Weight adjustment

The choice of weights has a tremendous influence on system performance; if the weights are inappropriate, the convergence speed of the neural network is reduced. In this article, the network weights are trained by the gradient descent method.

Define the objective function as follows:

E = \frac{1}{2}\sum_{j=1}^{M}\bigl(d^{(4)} - O_j^{(4)}\bigr)^2    (13)

where d^{(4)} is the desired output of the network, while O_j^{(4)} is the actual output of the network.

Suppose that the learning rates of V, a_{ij} and b_{ij} are η1, η2 and η3, respectively; the adjustment values are

\Delta V = -\eta_1 \frac{\partial E}{\partial V}, \quad \Delta a_{ij} = -\eta_2 \frac{\partial E}{\partial a_{ij}}, \quad \Delta b_{ij} = -\eta_3 \frac{\partial E}{\partial b_{ij}}    (14)

where V stands for w^{(4)}, v^{(4)} and u^{(4)}, respectively. Suppose w^{(4)},

As the control laws Eqs. (7)−(8) contain sgn(s), the system easily produces high-frequency chattering. Therefore, the saturation function is used instead of the sign function to smooth the control signal. Then, the control laws u1 (Eq. (21)) and u2 (Eq. (22)) take the same form as Eqs. (7)−(8), with sgn(s1) and sgn(s2) replaced by sat(s1) and sat(s2).

5 Particle swarm optimization

5.1 Principle of particle swarm optimization

As seen in Eqs. (21)−(22), the controller parameters cx, cl, cθ, α, ε1, k1, ε2 and k2 have a direct impact on the system control law. The greater the values of cx, cl, cθ, α, ε1, k1, ε2 and k2, the faster the system approaches the sliding surface. However, too large values would make the control action excessive and may cause system chattering, which degrades the dynamic performance of the reaching process. Conversely, if the values are smaller, the system chattering is weakened, but the speed
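The sgn-to-sat substitution behind Eqs. (21)−(22) can be illustrated with a minimal sketch; the boundary-layer width delta and the sample values of s are illustrative assumptions, not quantities from this work:

```python
import numpy as np

def sgn(s):
    # discontinuous switching term: jumps between -1 and 1 at s = 0,
    # which excites high-frequency chattering in the control signal
    return np.sign(s)

def sat(s, delta=0.05):
    # saturation function: linear inside the boundary layer |s| <= delta,
    # identical to sgn(s) outside it, so the control term stays continuous
    return np.clip(s / delta, -1.0, 1.0)

# near the sliding surface the two terms differ: sgn switches abruptly,
# while sat interpolates linearly through the boundary layer
for s in (-0.2, -0.01, 0.0, 0.01, 0.2):
    print(s, sgn(s), sat(s))
```

Outside the boundary layer the two functions agree, so the reaching behavior of the original law is preserved; only the switching near s = 0 is smoothed.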
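The trade-off above, where larger gains speed up reaching but aggravate chattering, is what motivates tuning the parameters by particle swarm optimization. A minimal sketch of the PSO principle follows; the inertia weight w, acceleration coefficients c1 and c2, swarm size, and the toy sphere objective are all illustrative assumptions (the actual fitness function would score a candidate parameter set cx, cl, cθ, α, ε1, k1, ε2, k2 against the closed-loop response):

```python
import random

def pso(objective, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    # initialize particle positions randomly and velocities to zero
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [objective(p) for p in pos]          # personal best fitness
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best so far

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # classic velocity update: inertia + cognitive + social terms
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = objective(pos[i])
            if f < pbest_val[i]:                     # improve personal best
                pbest_val[i], pbest[i] = f, pos[i][:]
                if f < gbest_val:                    # improve global best
                    gbest_val, gbest = f, pos[i][:]
    return gbest, gbest_val

# toy usage: minimize the 2-D sphere function, whose optimum is the origin
best, val = pso(lambda x: sum(t * t for t in x), dim=2)
print(best, val)
```

Each particle is pulled toward its own best-known position and the swarm's best-known position, so the population gradually concentrates around good parameter sets without needing gradient information.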