ML: Advantages and Disadvantages of Linear Regression
Linear Regression is a machine learning algorithm based on supervised learning that performs a regression task. A regression model predicts a target value from one or more independent variables. It is mostly used for finding the relationship between variables and for forecasting. Please refer to Linear Regression for a complete reference.
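As a quick illustration, here is a minimal sketch of fitting a linear regression with scikit-learn. The dataset, coefficients, and the point at which we predict are purely hypothetical and only serve to show the fit-and-predict workflow under these assumptions.

```python
# Minimal sketch: fit a linear regression on synthetic data (illustrative only).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))              # one independent variable
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1, 100)    # linear target with noise

model = LinearRegression()
model.fit(X, y)

print("coefficient:", model.coef_)      # estimated slope, easy to interpret
print("intercept:", model.intercept_)   # estimated offset
print("prediction at x=5:", model.predict([[5.0]]))
```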
Let us discuss some of the advantages and disadvantages of Linear Regression.
Advantages | Disadvantages |
---|---|
Linear Regression is simple to implement, and the output coefficients are easy to interpret. | Outliers can have a large effect on the fitted regression, and the decision boundaries produced by this technique are linear. |
When the independent and dependent variables are known to have a linear relationship, this algorithm is the best choice because it is less complex than other algorithms. | Linear regression assumes a linear (straight-line) relationship between the dependent and independent variables, and it also assumes that the attributes are independent of each other. |
Linear Regression is susceptible to over-fitting, but this can be mitigated with dimensionality reduction, regularization (L1 and L2), and cross-validation, as shown in the sketch after the table. | Linear regression only models the relationship between the mean of the dependent variable and the independent variables. Just as the mean is not a complete description of a single variable, linear regression is not a complete description of the relationships among variables. |
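The following sketch illustrates the mitigations mentioned in the table: ordinary least squares compared against L2 (Ridge) and L1 (Lasso) regularization, each scored with 5-fold cross-validation. The synthetic dataset, the alpha values, and the R^2 scoring metric are illustrative assumptions, not prescriptions.

```python
# Sketch: regularization (L1 and L2) plus cross-validation to curb over-fitting.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(60, 20))                      # few samples, many features: over-fitting risk
y = 2.0 * X[:, 0] + rng.normal(0, 0.5, size=60)    # only the first feature actually matters

for name, estimator in [("OLS", LinearRegression()),
                        ("Ridge (L2)", Ridge(alpha=1.0)),
                        ("Lasso (L1)", Lasso(alpha=0.1))]:
    scores = cross_val_score(estimator, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R^2 = {scores.mean():.3f}")
```

Regularization shrinks (L2) or zeroes out (L1) coefficients of uninformative features, which is why the regularized models typically generalize better than plain OLS in this setting.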
Summary:
Linear Regression is a good tool for analysing the relationships between variables, but it is not recommended for most practical applications because it over-simplifies real-world problems by assuming a linear relationship between the variables.