Complete linkage hierarchical clustering

March 15, 2024
Complete linkage hierarchical clustering is another method used in cluster analysis. Like single linkage clustering, it builds a hierarchy of clusters, but it uses a different rule for measuring the distance between clusters.

In complete linkage clustering, the distance between two clusters is defined as the maximum distance between any pair of points, one taken from each cluster. In other words, two clusters are considered as far apart as their farthest pair of points. This approach tends to produce more compact, roughly spherical clusters than single linkage clustering.
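More precisely, if d(a, b) denotes the distance between individual points (Euclidean distance is a common choice, assumed here for illustration), the complete linkage distance between two clusters A and B can be written as:

D(A, B) = \max_{a \in A,\; b \in B} d(a, b)

Single linkage replaces the max with a min, which is why the two methods can produce very different cluster shapes on the same data.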

Complete linkage clustering is less sensitive to noise and outliers than single linkage clustering, and it tends to handle uneven cluster sizes better. However, it can be more computationally demanding on large datasets, since every merge step requires tracking the maximum pairwise distance between each pair of clusters.

Overall, complete linkage hierarchical clustering is a useful method for finding compact and well-separated clusters in data, especially when the clusters have distinct boundaries.
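As an illustration, below is a minimal sketch of complete linkage clustering using SciPy's scipy.cluster.hierarchy module. The toy data, the random seed, and the choice of three flat clusters are assumptions made for this example, not part of the original post.

```python
# A minimal sketch of complete linkage hierarchical clustering with SciPy.
# The toy data and the choice of 3 clusters are illustrative assumptions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)

# Toy data: three loose blobs in 2-D.
X = np.vstack([
    rng.normal(loc=[0, 0], scale=0.5, size=(20, 2)),
    rng.normal(loc=[5, 5], scale=0.5, size=(20, 2)),
    rng.normal(loc=[0, 5], scale=0.5, size=(20, 2)),
])

# method='complete' merges clusters based on the maximum pairwise distance
# between their points (the complete linkage criterion).
Z = linkage(X, method='complete', metric='euclidean')

# Cut the resulting dendrogram into 3 flat clusters.
labels = fcluster(Z, t=3, criterion='maxclust')
print(labels)
```

Passing method='single' instead of method='complete' to linkage gives single linkage clustering, which makes it easy to compare the two approaches on the same data.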

Tags: complete, linkage, unsupervised

