LSRL: Least Squares Regression Line
The "least squares regression line" is a widely used statistical method in data analysis. Its full English name is "Least Squares Regression Line", commonly abbreviated as LSRL for convenience. The method finds the best-fitting straight line by minimizing the sum of squared errors, and it is applied across statistics, economics, machine learning, and related fields to reveal linear relationships between variables.
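The closed-form LSRL fit can be sketched in a few lines. This is a minimal illustration (not from the original entry): for data points (x_i, y_i), the slope is b = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)² and the intercept is a = ȳ − b·x̄.

```python
# Minimal sketch: fit the least squares regression line y = a + b*x
# using the standard closed-form formulas:
#   b = sum((x_i - x_mean) * (y_i - y_mean)) / sum((x_i - x_mean)**2)
#   a = y_mean - b * x_mean

def lsrl(xs, ys):
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    sxx = sum((x - x_mean) ** 2 for x in xs)
    b = sxy / sxx            # slope: minimizes the sum of squared residuals
    a = y_mean - b * x_mean  # intercept: line passes through (x_mean, y_mean)
    return a, b

# Points lying exactly on y = 1 + 2x recover a = 1.0, b = 2.0
a, b = lsrl([1, 2, 3, 4], [3, 5, 7, 9])
print(a, b)  # → 1.0 2.0
```

Note that the fitted line always passes through the mean point (x̄, ȳ), which is why the intercept follows directly once the slope is known.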
Example sentences
- First, we transform the nonlinear regression model into a linear model via variable substitution, then estimate the model parameters by ordinary least squares, principal component analysis, and partial least squares regression, from which the regression line itself is obtained.
- In addition, the paper discusses the history of the least absolute deviation method and the least squares method, and their differences regarding multiplicity of solutions and the effects of outliers on the regression line.
- This paper introduces the fundamental theory of least squares support vector machine (LS-SVM) regression, proposes an LS-SVM-based predictor for sensor fault detection and data recovery, and presents the predictor's principle and its online algorithm.