Beyond Knowledge Innovation

Linear regression model coefficients

February 28, 2024 (updated March 1, 2024)

Model coefficients, also known as regression coefficients or weights, are the values assigned to the features (independent variables) in a regression model. In a linear regression model, the relationship between the input features (X) and the predicted output (y) is represented as:

\(y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \ldots + \beta_n x_n \)

Here:

  • \(y\) is the predicted output.
  • \(\beta_0\) is the intercept term, representing the value of \(y\) when all input features are zero.
  • \(\beta_1, \beta_2, \ldots, \beta_n\) are the coefficients assigned to the corresponding input features \(x_1, x_2, \ldots, x_n\).

The model coefficients are estimated during the training of the regression model. The goal of the training process is to find the values of \(\beta_0, \beta_1, \ldots, \beta_n\) that minimize the difference between the predicted values and the actual values in the training data.
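Concretely, in ordinary least squares "minimizing the difference" means choosing the coefficients that minimize the sum of squared residuals over the \(m\) training examples:

\(\hat{\beta} = \underset{\beta}{\arg\min} \sum_{i=1}^{m} \left( y_i - \beta_0 - \beta_1 x_{i1} - \ldots - \beta_n x_{in} \right)^2 \)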

The coefficients describe the strength and direction of the relationship between each feature and the target variable. A positive coefficient means the prediction increases as the feature increases; a negative coefficient means it decreases. The magnitude of a coefficient reflects the impact of the corresponding feature on the predicted output, but magnitudes are only directly comparable when the features are on similar scales.
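As a sketch of why scale matters, consider a small synthetic example (the data, seed, and feature scales here are illustrative assumptions, not from this post): the raw coefficient of the large-scale feature looks tiny, yet after standardizing the features its coefficient turns out to be the larger one.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

# Hypothetical data: x2 is on a much larger scale than x1.
rng = np.random.default_rng(1)
X = np.column_stack([rng.normal(size=200), rng.normal(scale=1000, size=200)])
y = 5.0 * X[:, 0] + 0.01 * X[:, 1]

# Raw coefficients reflect the units of each feature.
raw = LinearRegression().fit(X, y)
print(raw.coef_)  # x1's coefficient looks much larger

# After standardizing, coefficients are directly comparable:
# here x2 actually moves the prediction more per standard deviation.
X_std = StandardScaler().fit_transform(X)
std = LinearRegression().fit(X_std, y)
print(std.coef_)
```

So ranking features by raw coefficient magnitude can be misleading unless the inputs were standardized first.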

Example:

import pandas as pd
import numpy as np

from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# X_train and y_train are assumed to come from an earlier
# train_test_split of the housing data used in this series
regression_model = LinearRegression()
regression_model.fit(X_train, y_train)

# Stack the coefficients and the intercept into a single DataFrame
pd.DataFrame(
    np.append(regression_model.coef_, regression_model.intercept_),
    index=X_train.columns.tolist() + ['Intercept'],
    columns=['Coefficients'],
)
           Coefficients
CRIM          -0.113845
ZN             0.061170
INDUS          0.054103
CHAS           2.517512
NX           -22.248502
RM             2.698413
AGE            0.004836
DIS           -1.534295
RAD            0.298833
TAX           -0.011414
PTRATIO       -0.988915
LSTAT         -0.586133
Intercept     49.885235

Equation of the fit

# Let us write the equation of linear regression
Equation = "Price = " + str(regression_model.intercept_)
print(Equation)

# One term per feature: "+ (coefficient)*feature"
for col, coef in zip(X_train.columns, regression_model.coef_):
    print(f"+ ({coef})*{col}")
Price = 49.88523466381736 
+ (-0.11384484836914008)*CRIM
+ (0.06117026804060645)*ZN
+ (0.05410346495874601)*INDUS
+ (2.5175119591227144)*CHAS
+ (-22.248502345084372)*NX
+ (2.6984128200099113)*RM
+ (0.004836047284751951)*AGE
+ (-1.5342953819992557)*DIS
+ (0.29883325485901313)*RAD
+ (-0.011413580552025043)*TAX
+ (-0.9889146257039406)*PTRATIO
+ (-0.5861328508499133)*LSTAT
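To see that `coef_` and `intercept_` really recover the generating equation, here is a minimal self-contained check on synthetic, noise-free data (the data and coefficients below are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical synthetic data following y = 3 + 2*x1 - 1*x2 exactly,
# so the fitted model should recover these values (up to float error).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 3.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1]

model = LinearRegression()
model.fit(X, y)

print(model.intercept_)  # close to 3.0
print(model.coef_)       # close to [2.0, -1.0]
```

With noisy real-world data the coefficients will not match any "true" values this cleanly, but the same attributes give the fitted equation.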
Tags: coefficient, intercept, linear, regression
