**1 Answer**


*Error back propagation:*

- The algorithm reawakened the engineering and scientific communities' interest in modeling many quantitative phenomena using neural networks.
- It allows a multilayer network to acquire input-output mapping knowledge from experience, i.e. from training examples.
- As in the simple cases of the delta learning rule studied before, input patterns are submitted sequentially during back-propagation training.
- If a submitted pattern's classification or association is found to be erroneous, the synaptic weights and thresholds are adjusted so that the current least-mean-square classification error is reduced.

*Error back propagation algorithm:*

Given are P training pairs $\{(z_1,d_1),(z_2,d_2),\dots,(z_P,d_P)\}$,

where $z$ is $I \times 1$, $d$ is $K \times 1$, $y$ is $J \times 1$, and $o$ is $K \times 1$.

The $I$-th component of $z$ and the $J$-th component of $y$ are fixed at $-1$ (augmented bias entries).

**Step 1:** Choose $\eta > 0$ and some $E_{max}$; initialize $W$ ($K \times J$) and $V$ ($J \times I$) with small random values; set $q \leftarrow 1$, $p \leftarrow 1$, $E \leftarrow 0$.

**Step 2:** Training cycle starts here.

Input: $z \leftarrow z_p$, $d \leftarrow d_p$

Output: $y_j \leftarrow f(v_j^T z)$ for $j = 1,\dots,J$

$o_k \leftarrow f(w_k^T y)$ for $k = 1,\dots,K$

where $v_j$ is the $j$-th row of $V$ and $w_k$ is the $k$-th row of $W$.

**Step 3:** Compute the error:

$E \leftarrow \frac{1}{2}\sum_{k=1}^{K}(d_k - o_k)^2 + E$

**Step 4:** Compute the error signals:

$\delta o_k = \frac{1}{2}(d_k - o_k)(1 - o_k^2)$

$\delta y_j = \frac{1}{2}(1 - y_j^2)\sum_{k=1}^{K} \delta o_k\, w_{kj}$
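The $\frac{1}{2}(1 - o_k^2)$ factor in these signals follows if $f$ is taken to be the bipolar sigmoid activation (an assumption consistent with the $\pm 1$ augmented components used above):

```latex
f(net) = \frac{2}{1 + e^{-net}} - 1,
\qquad
f'(net) = \frac{1}{2}\bigl(1 - f(net)^2\bigr)
```

so that $\delta o_k = (d_k - o_k)\, f'(net_k)$ at the output layer, and similarly $\delta y_j = f'(net_j)\sum_k \delta o_k\, w_{kj}$ at the hidden layer.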

**Step 5:** Adjust the output-layer weights: $w_{kj} \leftarrow w_{kj} + \eta\, \delta o_k\, y_j$

**Step 6:** Adjust the hidden-layer weights: $v_{ji} \leftarrow v_{ji} + \eta\, \delta y_j\, z_i$

**Step 7:** If $p < P$, then $p \leftarrow p + 1$, $q \leftarrow q + 1$, and go to Step 2; otherwise, go to Step 8.

**Step 8:** If $E < E_{max}$, terminate the training session. Otherwise ($E \geq E_{max}$), set $E \leftarrow 0$, $p \leftarrow 1$, and go to Step 2 to start a new training cycle.
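The steps above can be sketched in plain Python. This is a minimal illustration under stated assumptions, not a definitive implementation: the bipolar sigmoid activation, the `train` helper, its parameters, and the handling of the hidden bias unit are all assumptions layered on the algorithm's notation.

```python
import math
import random

def f(net):
    # Assumed bipolar sigmoid: f(net) = 2/(1 + e^{-net}) - 1,
    # whose derivative is (1/2) * (1 - f(net)^2), matching Step 4.
    return 2.0 / (1.0 + math.exp(-net)) - 1.0

def train(pairs, I, J, K, eta=0.5, E_max=0.05, max_cycles=5000, seed=0):
    """Error back-propagation sketch; variable names follow the steps above."""
    rng = random.Random(seed)
    # Step 1: small random weights; V is J x I, W is K x J.
    V = [[rng.uniform(-0.5, 0.5) for _ in range(I)] for _ in range(J)]
    W = [[rng.uniform(-0.5, 0.5) for _ in range(J)] for _ in range(K)]
    for _ in range(max_cycles):
        E = 0.0                                   # reset cycle error
        for z, d in pairs:                        # Step 2: present z_p, d_p
            y = [f(sum(V[j][i] * z[i] for i in range(I))) for j in range(J)]
            y[-1] = -1.0                          # J-th component fixed at -1
            o = [f(sum(W[k][j] * y[j] for j in range(J))) for k in range(K)]
            # Step 3: accumulate the squared error.
            E += 0.5 * sum((d[k] - o[k]) ** 2 for k in range(K))
            # Step 4: error signals for the output and hidden layers.
            do = [0.5 * (d[k] - o[k]) * (1.0 - o[k] ** 2) for k in range(K)]
            dy = [0.5 * (1.0 - y[j] ** 2) * sum(do[k] * W[k][j] for k in range(K))
                  for j in range(J)]
            # Steps 5-6: weight adjustments.
            for k in range(K):
                for j in range(J):
                    W[k][j] += eta * do[k] * y[j]
            for j in range(J):
                for i in range(I):
                    V[j][i] += eta * dy[j] * z[i]
        if E < E_max:                             # Step 8: stop if error is small
            break
    return V, W, E
```

With bipolar $\pm 1$ training pairs whose augmented $-1$ component is already included in each `z`, one iteration of the outer loop corresponds to one full training cycle (Steps 2 through 8).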

*Error back propagation flow chart:*