Oja’s rule
Hebb’s rule, but normalize the weights after each increment
1. normalization
Here, normalization is defined as ensuring the property that \(||w^t|| = 1\).
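In code, this is a division by the Euclidean norm; a one-line NumPy sketch (the helper name is an assumption, not from the note):

```python
import numpy as np

def normalize(w):
    # enforce ||w|| = 1 by dividing by the Euclidean (L2) norm
    return w / np.linalg.norm(w)
```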
2. where to normalize
- after weight initialization
- after each weight increment (both normalization points are sketched below)
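A minimal NumPy sketch of both normalization points; the dimension, learning rate, and sample below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

w = rng.normal(size=5)          # illustrative 5-dimensional weights
w /= np.linalg.norm(w)          # (1) normalize after weight initialization

eta = 0.01                      # learning rate (assumption)
x = rng.normal(size=5)          # one input sample (assumption)
y = w @ x                       # linear neuron output
w = w + eta * y * x             # Hebb's rule increment
w /= np.linalg.norm(w)          # (2) normalize after each weight increment
```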
3. when to stop
- convergence criterion (see the sketch after this list)
  - the largest change in any weight is smaller than a parameter \(\delta\) (small, positive, real), i.e. \(\max_i |w_i^t - w_i^{t-1}| \leq \delta\)
  - check after weight increment + normalization
  - note: the change queried here is in the normalized weights, not the raw \(\Delta w_i\)
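Putting the pieces together, a minimal sketch of the whole loop with this stopping check; NumPy, the defaults for \(\eta\) and \(\delta\), and the sample-cycling scheme are all assumptions, not from the note:

```python
import numpy as np

def oja_train(X, eta=0.01, delta=1e-6, max_steps=100_000, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)          # normalize after initialization
    for t in range(max_steps):
        x = X[t % len(X)]           # cycle through samples (assumption)
        w_prev = w.copy()
        y = w @ x                   # linear neuron output
        w = w + eta * y * x         # Hebb's rule increment
        w /= np.linalg.norm(w)      # normalize after the increment
        # stop when max_i |w_i^t - w_i^{t-1}| <= delta, measured on the
        # normalized weights (per the note), checked after each
        # increment + normalization
        if np.max(np.abs(w - w_prev)) <= delta:
            break
    return w
```

For zero-mean data, the fixed point of this normalized Hebbian procedure is (up to sign) the leading eigenvector of the input correlation matrix, which is the classical result associated with Oja's rule.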
Backlinks
unsupervised learning
unsupervised learning algorithms are learning algorithms that do not use labels.