ML | Cost Function in Logistic Regression
In the case of linear regression, the cost function is:
![Rendered by QuickLaTeX.com J(\Theta) = \frac{1}{m} \sum_{i = 1}^{m} \frac{1}{2} [h_{\Theta}(x^{(i)}) - y^{(i)}]^{2}](https://mangodoc.oss-cn-beijing.aliyuncs.com/geek8geeks/ML_%7C_Cost_function_in_Logistic_Regression_0.png)
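As a quick illustration, here is a minimal NumPy sketch of this squared-error cost; `linear_cost`, `X`, `y`, and `theta` are hypothetical names for the function, design matrix, target vector, and parameter vector.

```python
import numpy as np

def linear_cost(theta, X, y):
    """Squared-error cost: J(theta) = (1/m) * sum_i (1/2) * (h(x^(i)) - y^(i))^2."""
    m = len(y)
    h = X @ theta                       # h_theta(x) = theta^T x for every example
    return (1.0 / m) * np.sum(0.5 * (h - y) ** 2)
```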
But for logistic regression,
![Rendered by QuickLaTeX.com h_{\Theta}(x) = g(\Theta^{T}x)](https://mangodoc.oss-cn-beijing.aliyuncs.com/geek8geeks/ML_%7C_Cost_function_in_Logistic_Regression_1.png)
Plugging this hypothesis into the squared-error cost would result in a non-convex cost function, one with many local optima, which is a very big problem for gradient descent when it tries to reach the global optimum.
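For concreteness, here is a minimal sketch of this hypothesis, taking g to be the standard sigmoid (logistic) function; `sigmoid` and `hypothesis` are hypothetical helper names.

```python
import numpy as np

def sigmoid(z):
    """g(z) = 1 / (1 + e^{-z}), the logistic (sigmoid) function."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, X):
    """h_theta(x) = g(theta^T x), a probability in (0, 1) for each row of X."""
    return sigmoid(X @ theta)
```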
Hence, for logistic regression, the cost function is
![Rendered by QuickLaTeX.com Cost(h_{\Theta}(x),y) = \left\{\begin{matrix} -log(h_{\Theta}(x)) & if&y=1\\ -log(1-h_{\Theta}(x))& if& y = 0 \end{matrix}\right.](https://mangodoc.oss-cn-beijing.aliyuncs.com/geek8geeks/ML_%7C_Cost_function_in_Logistic_Regression_3.png)
Case 1: If y = 1
If h_Θ(x) = 1, then Cost = 0,
but as h_Θ(x) → 0, Cost → ∞.

Case 2: If y = 0
If h_Θ(x) = 0, then Cost = 0,
but as h_Θ(x) → 1, Cost → ∞.

So,
![Rendered by QuickLaTeX.com Cost(h_{\Theta}(x),y) = \left\{\begin{matrix} 0 &if &h_{\Theta}(x)=y\\ \infty & if & y=0 &and &h_{\Theta}(x)\rightarrow 1 \\ \infty & if &y=1 &and &h_{\Theta}(x)\rightarrow 0 \end{matrix}\right.](https://mangodoc.oss-cn-beijing.aliyuncs.com/geek8geeks/ML_%7C_Cost_function_in_Logistic_Regression_6.png)
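A small numeric check of these limiting cases (a sketch using natural logs, matching the formulas above):

```python
import numpy as np

# y = 1: cost -log(h) vanishes as h -> 1 and blows up as h -> 0
for h in [0.99, 0.5, 0.01]:
    print(f"y=1, h={h}: cost = {-np.log(h):.3f}")

# y = 0: cost -log(1 - h) vanishes as h -> 0 and blows up as h -> 1
for h in [0.01, 0.5, 0.99]:
    print(f"y=0, h={h}: cost = {-np.log(1 - h):.3f}")
```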
![Rendered by QuickLaTeX.com Cost(h_{\Theta}(x),y) = -y log(h_{\Theta}(x)) - (1-y) log(1-h_{\Theta}(x))](https://mangodoc.oss-cn-beijing.aliyuncs.com/geek8geeks/ML_%7C_Cost_function_in_Logistic_Regression_7.png)
![Rendered by QuickLaTeX.com J({\Theta}) = \frac{-1}{m}\sum_{i=1}^{m} Cost(h_{\Theta}(x),y)](https://mangodoc.oss-cn-beijing.aliyuncs.com/geek8geeks/ML_%7C_Cost_function_in_Logistic_Regression_8.png)
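A minimal sketch of this cost in NumPy; here the leading minus sign is applied to the raw log terms, which is equivalent to averaging the Cost values defined above. `logistic_cost` is a hypothetical name, and `sigmoid` repeats the hypothesis helper from earlier.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost(theta, X, y):
    """J(theta) = -(1/m) * sum_i [ y_i log(h_i) + (1 - y_i) log(1 - h_i) ]."""
    m = len(y)
    h = sigmoid(X @ theta)              # h_theta(x^(i)) for every example
    return -(1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))
```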
To fit the parameters Θ, J(Θ) has to be minimized, and for that gradient descent is required.
Gradient descent: the update rule looks identical to the one for linear regression, but the difference lies in the hypothesis h_Θ(x), which is now g(Θ^T x).
![Rendered by QuickLaTeX.com \Theta_{j} := \Theta_{j} - \alpha \sum_{i = 1}^{m}(h_\Theta(x^{(i)})- y^{(i)})x_j^{(i)}](https://mangodoc.oss-cn-beijing.aliyuncs.com/geek8geeks/ML_%7C_Cost_function_in_Logistic_Regression_9.png)
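A sketch of this update, vectorized over all parameters j simultaneously; `alpha` and `num_iters` are hypothetical hyperparameters, and the gradient is a plain sum over examples as in the formula above (many treatments also divide by m).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(theta, X, y, alpha=0.01, num_iters=1000):
    """Repeated simultaneous update for every j:
    theta_j := theta_j - alpha * sum_i (h_theta(x^(i)) - y^(i)) * x_j^(i)
    """
    for _ in range(num_iters):
        h = sigmoid(X @ theta)          # current predictions h_theta(x^(i))
        theta = theta - alpha * (X.T @ (h - y))   # X.T @ (h - y) sums over i for all j
    return theta
```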