Minkowski Distance (Lₚ-Norm)
Calculator for the Minkowski distance with formulas and examples
Minkowski Distance Calculator
What is calculated?
The Minkowski distance (also called the Lₚ-norm) is a generalization of Euclidean and Manhattan distances. The parameter p determines the type of distance measurement.
Properties
Minkowski distance:
- Generalized Lₚ-norm
- Parameter p ≥ 1 required
- Special cases: Manhattan, Euclidean
- Limit: Chebyshev (p→∞)
Flexibility: By adjusting p you can cover different distance types and application scenarios.
Special cases
Related distances
→ Manhattan distance (p=1)
→ Euclidean distance (p=2)
→ Chebyshev distance (p=∞)
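These special cases can be checked numerically. A minimal sketch, assuming SciPy is available (scipy.spatial.distance):

```python
from scipy.spatial.distance import minkowski, cityblock, euclidean, chebyshev

a = [3, 4, 5]
b = [2, 3, 6]

print(minkowski(a, b, p=1), cityblock(a, b))    # Manhattan: 3.0
print(minkowski(a, b, p=2), euclidean(a, b))    # Euclidean: ~1.732
print(minkowski(a, b, p=50), chebyshev(a, b))   # large p approaches Chebyshev: ~1.02 vs 1.0
```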
Formulas for Minkowski distance
Basic formula (Lₚ-norm):
d(x, y) = (Σᵢ |xᵢ − yᵢ|^p)^(1/p)
Vector norm:
‖x‖ₚ = (Σᵢ |xᵢ|^p)^(1/p)
Manhattan (p=1):
d(x, y) = Σᵢ |xᵢ − yᵢ|
Euclidean (p=2):
d(x, y) = √(Σᵢ (xᵢ − yᵢ)²)
Chebyshev (p→∞):
d(x, y) = maxᵢ |xᵢ − yᵢ|
Weighted form:
d(x, y) = (Σᵢ wᵢ |xᵢ − yᵢ|^p)^(1/p)
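A plain NumPy sketch of the basic and weighted formulas above (not tied to any particular library; the function name is my own):

```python
import numpy as np

def minkowski_distance(x, y, p=2.0, w=None):
    """d(x, y) = (sum_i w_i * |x_i - y_i|**p) ** (1/p), with p >= 1."""
    if p < 1:
        raise ValueError("p must be >= 1 for a valid metric")
    diff = np.abs(np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
    if np.isinf(p):
        return float(diff.max())          # Chebyshev limit for p -> infinity
    if w is None:
        w = np.ones_like(diff)
    return float(np.sum(np.asarray(w, dtype=float) * diff ** p) ** (1.0 / p))

print(minkowski_distance([3, 4, 5], [2, 3, 6], p=1))       # 3.0  (Manhattan)
print(minkowski_distance([3, 4, 5], [2, 3, 6], p=2))       # ~1.732 (Euclidean)
print(minkowski_distance([3, 4, 5], [2, 3, 6], p=3))       # ~1.442
print(minkowski_distance([3, 4, 5], [2, 3, 6], p=np.inf))  # 1.0  (Chebyshev)
```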
Detailed calculation example
Example: Minkowski([3,4,5], [2,3,6], p=3)
Given:
- Point A = [3, 4, 5]
- Point B = [2, 3, 6]
- Parameter p = 3
Step 1 - Absolute differences:
- |3 - 2| = 1
- |4 - 3| = 1
- |5 - 6| = 1
Step 2 - Power (p=3):
- 1³ = 1
- 1³ = 1
- 1³ = 1
Step 3 - Sum and root:
- Sum: 1 + 1 + 1 = 3
- Root: 3^(1/3) ≈ 1.442
Interpretation: The cubic Minkowski distance is about 1.442, smaller than the Manhattan (3.0) and Euclidean (≈1.732) distances and larger than the Chebyshev distance (1.0), in line with the monotone decrease of the Lₚ-norm as p grows.
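The same worked example, cross-checked with SciPy (assuming it is installed):

```python
from scipy.spatial.distance import minkowski

a, b = [3, 4, 5], [2, 3, 6]
for p in (1, 2, 3):
    print(f"p={p}: {minkowski(a, b, p=p):.3f}")
# p=1: 3.000   p=2: 1.732   p=3: 1.442
```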
p-value comparison
For points [0,0] and [3,4] at different p values
p = 1 (Manhattan)
|3| + |4| = 7
p = 2 (Euclidean)
√(3² + 4²) = 5
p = 3 (Cubic)
(3³ + 4³)^(1/3) = 91^(1/3) ≈ 4.498
p = ∞ (Chebyshev)
max(3, 4) = 4
Observation: As p increases, the distance decreases monotonically and approaches the largest coordinate difference (the Chebyshev distance).
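The comparison can be reproduced for arbitrary p with a short NumPy sketch:

```python
import numpy as np

diff = np.abs(np.array([0.0, 0.0]) - np.array([3.0, 4.0]))
for p in (1, 2, 3, 10, 100):
    print(p, round((diff ** p).sum() ** (1.0 / p), 3))
print("inf", diff.max())
# 1 -> 7.0, 2 -> 5.0, 3 -> ~4.498, 10 -> ~4.022, 100 -> ~4.0, inf -> 4.0
```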
Unit ball shapes
How the unit ball changes with p
2D unit balls (d ≤ 1):
- p = 1: Diamond ♦
- p = 2: Circle ●
- p = 4: Superellipse (rounded square)
- p = ∞: Square ■
3D unit balls (d ≤ 1):
- p = 1: Octahedron (8 faces)
- p = 2: Sphere
- p = 4: Supersphere
- p = ∞: Cube
Trend: Smaller p values give pointier, diamond-like shapes; larger p values give increasingly square (or cube-like) shapes.
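A short plotting sketch of the 2D unit balls, assuming Matplotlib is available:

```python
import numpy as np
import matplotlib.pyplot as plt

xs = np.linspace(-1.5, 1.5, 401)
X, Y = np.meshgrid(xs, xs)

fig, ax = plt.subplots(figsize=(5, 5))
for p in (1, 2, 4):
    Z = (np.abs(X) ** p + np.abs(Y) ** p) ** (1.0 / p)
    ax.contour(X, Y, Z, levels=[1.0])        # boundary of the Lp unit ball
ax.contour(X, Y, np.maximum(np.abs(X), np.abs(Y)), levels=[1.0])  # p = infinity
ax.set_aspect("equal")
ax.set_title("2D unit balls for p = 1, 2, 4, ∞")
plt.show()
```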
Practical applications
Machine Learning
- k-Nearest Neighbors (various p; see the sketch after this section)
- Clustering algorithms
- Similarity measurement
- Feature matching
Data analysis
- Outlier detection
- Data quality
- Multivariate statistics
- Dimensionality reduction
Computer graphics
- Collision detection
- Pathfinding
- Texture matching
- 3D modeling
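As an illustration of the k-Nearest-Neighbors item above, a sketch assuming scikit-learn; the Iris dataset and the parameters are placeholder choices, not a recommendation:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# KNeighborsClassifier uses the Minkowski metric by default;
# p selects the distance type (1 = Manhattan, 2 = Euclidean, ...).
for p in (1, 2, 3):
    knn = KNeighborsClassifier(n_neighbors=5, p=p)
    print(f"p={p}: mean CV accuracy {cross_val_score(knn, X, y, cv=5).mean():.3f}")
```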
Mathematical properties
Norm properties (p ≥ 1)
- Positivity: ‖x‖ₚ ≥ 0, ‖x‖ₚ = 0 ⟺ x = 0
- Homogeneity: ‖αx‖ₚ = |α|‖x‖ₚ
- Triangle inequality: ‖x+y‖ₚ ≤ ‖x‖ₚ + ‖y‖ₚ
- Monotonicity: ‖x‖∞ ≤ ‖x‖ₚ ≤ ‖x‖₁ for p ≥ 1
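These axioms can be spot-checked numerically; a minimal sketch with NumPy and random vectors (tolerances are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.normal(size=5), rng.normal(size=5)

def lp_norm(v, p):
    return np.sum(np.abs(v) ** p) ** (1.0 / p)

for p in (1, 1.5, 2, 3):
    assert lp_norm(x, p) >= 0                                          # positivity
    assert np.isclose(lp_norm(-2.5 * x, p), 2.5 * lp_norm(x, p))       # homogeneity
    assert lp_norm(x + y, p) <= lp_norm(x, p) + lp_norm(y, p) + 1e-12  # triangle inequality
    assert np.max(np.abs(x)) <= lp_norm(x, p) <= lp_norm(x, 1) + 1e-12 # monotonicity
print("all norm properties hold for this sample")
```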
Convergence properties
- Limit: lim[p→∞] ‖x‖ₚ = ‖x‖∞
- Continuity: ‖x‖ₚ is continuous in p
- Monotone relation: p₁ < p₂ ⟹ ‖x‖ₚ₂ ≤ ‖x‖ₚ₁
- Hölder inequality: basis for proving the triangle inequality (Minkowski's inequality)
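A small numeric illustration of the convergence and the monotone relation (a sketch, not a proof):

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

def lp_norm(v, p):
    return np.sum(np.abs(v) ** p) ** (1.0 / p)

previous = np.inf
for p in (1, 2, 4, 8, 16, 32, 64):
    n = lp_norm(x, p)
    assert n <= previous + 1e-12      # ||x||_p decreases as p grows
    previous = n
    print(p, round(n, 4))
print("max norm:", np.max(np.abs(x)))  # limit for p -> infinity: 4.0
```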
Relations between norms
General relation:
‖x‖∞ ≤ ‖x‖ₚ ≤ n^(1/p) ‖x‖∞
Specific inequalities:
‖x‖₂ ≤ ‖x‖₁ ≤ √n ‖x‖₂
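These inequalities can also be checked numerically for a sample vector (again a sketch, with n the vector length):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=8)
n = x.size

def lp_norm(v, p):
    return np.sum(np.abs(v) ** p) ** (1.0 / p)

linf = np.max(np.abs(x))
for p in (1, 2, 3, 5):
    assert linf <= lp_norm(x, p) <= n ** (1.0 / p) * linf                # general relation
assert lp_norm(x, 2) <= lp_norm(x, 1) <= np.sqrt(n) * lp_norm(x, 2)      # L1/L2 inequality
print("norm relations hold for this sample")
```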
p-parameter selection guide
When to use which p value?
p = 1 (Manhattan):
- Urban planning, navigation
- Robust to outliers
- Sparse data, LASSO
- Discrete/raster problems
p = 2 (Euclidean):
- Physical distances
- Standard ML algorithms
- Gaussian distributions
- "Natural" geometric distance
p > 2 (Higher norms):
- Emphasize dominant dimensions
- More strongly influenced by the largest individual differences (outliers)
- Specialized applications
- Approaches maximum norm
p = ∞ (Chebyshev):
- Worst-case scenarios
- Approximation theory
- Chessboard distance
- Uniform norms
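To make the outlier trade-off in this guide concrete, a small sketch comparing how one large coordinate difference influences the distance at different p (the vectors are illustrative only):

```python
import numpy as np

def minkowski_distance(x, y, p):
    d = np.abs(np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
    return float(d.max()) if np.isinf(p) else float((d ** p).sum() ** (1.0 / p))

origin = np.zeros(5)
clean = np.ones(5)                           # every coordinate differs by 1
with_outlier = np.array([1, 1, 1, 1, 10.0])  # one coordinate differs by 10

for p in (1, 2, 4, np.inf):
    d_clean = minkowski_distance(origin, clean, p)
    d_out = minkowski_distance(origin, with_outlier, p)
    print(f"p={p}: clean={d_clean:.3f}  with outlier={d_out:.3f}  ratio={d_out / d_clean:.1f}x")
# The ratio grows with p: p=1 is the least affected by the single
# large difference, while p -> infinity is dominated by it.
```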