Support Vector Machine:
Recall the cost of a single example in logistic regression: \[-y\log(h) - (1-y)\log(1-h)\]
Where the hypothesis $h$ was the sigmoid: \[h = \frac{1}{1 + e^{-\theta^T x}}\]
And our optimization objective was to minimize the average of this cost over all $m$ training examples.
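As a concrete sketch of that objective, here is a minimal NumPy implementation of the mean logistic cost (the function and variable names are my own, not from the original notes):

```python
import numpy as np

def sigmoid(z):
    # hypothesis h = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost(theta, X, y):
    # mean of  -y*log(h) - (1-y)*log(1-h)  over all examples
    h = sigmoid(X @ theta)
    return np.mean(-y * np.log(h) - (1 - y) * np.log(1 - h))

# toy data, purely illustrative
X = np.array([[1.0, 2.0],
              [1.0, -1.0]])
y = np.array([1.0, 0.0])
```

With `theta = 0` every prediction is $h = 0.5$, so the cost is $\log 2 \approx 0.693$, a handy sanity check.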
The support vector machine is very similar to logistic regression: \[\min_{\theta} \; C \sum_{i=1}^{m} \left[ y^{(i)}\,\text{cost}_1(\theta^T x^{(i)}) + (1 - y^{(i)})\,\text{cost}_0(\theta^T x^{(i)}) \right]\]
You can add the regularization term $\frac{1}{2}\sum_j \theta_j^2$ if you want to.
The $\text{cost}_1$ and $\text{cost}_0$ functions are as follows: $\text{cost}_0(z)$ is zero when $z \leq -1$, and $\text{cost}_1(z)$ is zero when $z \geq 1$.
Thus if $y = 1$ for a certain example, the cost for that example becomes zero when $\theta^T x \geq 1$; and if $y = 0$, the cost becomes zero when $\theta^T x \leq -1$.
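The two cost functions described above can be sketched as hinge-shaped functions in NumPy. This is an illustrative implementation under the common assumption that the costs grow linearly outside their zero region (names like `svm_cost` and the toy data are my own):

```python
import numpy as np

def cost1(z):
    # zero when z >= 1, grows linearly as z drops below 1
    return np.maximum(0.0, 1.0 - z)

def cost0(z):
    # zero when z <= -1, grows linearly as z rises above -1
    return np.maximum(0.0, 1.0 + z)

def svm_cost(theta, X, y, C=1.0):
    # C * sum of per-example hinge costs, plus 0.5 * ||theta||^2 regularization
    z = X @ theta
    hinge = y * cost1(z) + (1 - y) * cost0(z)
    return C * np.sum(hinge) + 0.5 * np.sum(theta ** 2)

# toy data, purely illustrative
X = np.array([[1.0, 2.0],
              [1.0, -1.0]])
y = np.array([1.0, 0.0])
```

For example, `cost1(2.0)` is zero (since $z \geq 1$) while `cost1(0.0)` is 1, matching the description of when each cost vanishes.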