Explain perceptron learning rule convergence theorem
1 Answer

Perceptron Convergence Theorem:

For the classification of linearly separable patterns belonging to two classes, the training task for the classifier is to find a weight vector w such that:

\(w^T x > 0 \quad \text{for each } x \in X_1\)
\(w^T x < 0 \quad \text{for each } x \in X_2\)
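
A standard step used when proving the theorem (not shown in the original answer, so treat it as an illustrative aside) is to fold the two inequalities into a single one by negating the patterns of class \(X_2\):

```latex
% Illustrative only: the usual sign-normalization step in convergence proofs.
% Replacing each pattern x from X_2 by -x collapses both class conditions
% into the single requirement that the fixed-correction rule enforces.
\[
  w^{T} y > 0 \qquad \text{for every normalized pattern } y,
  \quad\text{where } y =
  \begin{cases}
    x,  & x \in X_1 \\
    -x, & x \in X_2 .
  \end{cases}
\]
```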

The perceptron convergence theorem states that, provided such a separating weight vector exists (i.e. the two classes are linearly separable), training with the fixed-correction training rule terminates in a finite number of steps, for any initial weight vector and any positive correction increment constant.
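
As a concrete illustration of the rule the theorem refers to, here is a minimal sketch of fixed-correction perceptron training in Python. The function name, the example data, and the use of NumPy are my own assumptions for illustration, not part of the original answer:

```python
import numpy as np

def perceptron_fixed_correction(X1, X2, c=1.0, max_epochs=1000):
    """Fixed-correction perceptron training on two linearly separable classes.

    X1, X2: arrays of shape (n1, d) and (n2, d) with patterns from each class.
    c: correction increment constant.
    Returns the augmented weight vector (last entry acts as the bias) once all
    patterns are classified correctly, or None if max_epochs is exceeded.
    """
    # Augment each pattern with a constant 1 for the bias term and negate the
    # class-2 patterns, so the goal becomes w^T y > 0 for every pattern y.
    aug = lambda X: np.hstack([X, np.ones((X.shape[0], 1))])
    patterns = np.vstack([aug(X1), -aug(X2)])

    w = np.zeros(patterns.shape[1])        # arbitrary initial weight vector
    for _ in range(max_epochs):
        errors = 0
        for y in patterns:
            if w @ y <= 0:                 # misclassified pattern
                w = w + c * y              # fixed-correction update
                errors += 1
        if errors == 0:                    # all patterns satisfy w^T y > 0
            return w
    return None                            # did not converge within the budget

# Hypothetical example: two linearly separable 2-D classes
X1 = np.array([[2.0, 1.0], [3.0, 2.0], [2.5, 3.0]])
X2 = np.array([[-1.0, -2.0], [-2.0, -1.5], [-0.5, -3.0]])
print("learned augmented weights:", perceptron_fixed_correction(X1, X2))
```

On separable data such as the toy example above, the loop stops after a finite number of corrections, which is exactly what the theorem guarantees.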
