Wednesday 13 March 2013

Machine learning

Perceptron

Repeat until convergence:
  For t = 1, ..., n:
    1. \(y' = \mathrm{sign}(\underline{x}_t\cdot\underline{\theta})\)
    2. If \(y'\ne y_t\) Then \(\underline{\theta} = \underline{\theta} + y_t\underline{x}_t\), Else leave \(\underline{\theta}\) unchanged.
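The loop above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the post: the function name, the zero initialization of \(\underline{\theta}\), and the `max_epochs` cap (a practical guard in case the data are not linearly separable) are my additions.

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Train a perceptron. X is an (n, d) array of examples,
    y an array of labels in {-1, +1}. Returns the weight vector theta."""
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(max_epochs):
        mistakes = 0
        for t in range(n):
            # Step 1: y' = sign(x_t . theta)  (treating sign(0) as +1)
            y_pred = 1 if X[t] @ theta >= 0 else -1
            # Step 2: update theta only on a mistake
            if y_pred != y[t]:
                theta = theta + y[t] * X[t]
                mistakes += 1
        if mistakes == 0:  # a full pass with no mistakes = convergence
            break
    return theta
```

On linearly separable data the mistake count reaches zero after finitely many passes, which is exactly the "repeat until convergence" condition.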

Kernel form of the perceptron

  • Definition: for any \(\underline{x}\), define \(g(\underline{x}) = \sum_{j=1}^n\alpha_jy_jK(\underline{x}_j,\underline{x})\) where \(K(\underline{x}_j,\underline{x}) = \phi(\underline{x}_j)\cdot\phi(\underline{x})\)
  • Repeat until convergence:
    • For t = 1 ... n
      1. \(y' = \mathrm{sign}(g(\underline{x}_t))\)
      2. If \(y'\ne y_t\) Then \(\alpha_t = \alpha_t + 1\)
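A sketch of the kernel form, again as an illustration rather than code from the post: the kernel is passed in as a function `K`, the Gram matrix precomputation and the `max_epochs` cap are my additions, and `linear` is one hypothetical choice of kernel (for which \(g\) reduces to the ordinary perceptron's \(\underline{x}\cdot\underline{\theta}\)).

```python
import numpy as np

def kernel_perceptron(X, y, K, max_epochs=100):
    """Kernel perceptron. alpha[j] counts the mistakes made on example j,
    so g(x) = sum_j alpha[j] * y[j] * K(x_j, x)."""
    n = X.shape[0]
    alpha = np.zeros(n)
    # Precompute the Gram matrix G[j, t] = K(x_j, x_t)
    G = np.array([[K(X[j], X[t]) for t in range(n)] for j in range(n)])
    for _ in range(max_epochs):
        mistakes = 0
        for t in range(n):
            # Step 1: y' = sign(g(x_t))  (treating sign(0) as +1)
            g = np.sum(alpha * y * G[:, t])
            y_pred = 1 if g >= 0 else -1
            # Step 2: on a mistake, increment alpha_t
            if y_pred != y[t]:
                alpha[t] += 1
                mistakes += 1
        if mistakes == 0:
            break
    return alpha

def linear(x, z):
    """Linear kernel K(x, z) = x . z, i.e. phi is the identity map."""
    return x @ z
```

Note that the update \(\alpha_t \leftarrow \alpha_t + 1\) is the kernelized counterpart of \(\underline{\theta} \leftarrow \underline{\theta} + y_t\underline{x}_t\): instead of storing \(\underline{\theta}\) explicitly, we record how often each example contributed an update.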
