Minimize Manhattan Distances - Problem
You are given an array points representing integer coordinates of some points on a 2D plane, where points[i] = [xi, yi].
The Manhattan distance between two points (x1, y1) and (x2, y2) is defined as |x1 - x2| + |y1 - y2|.
Return the minimum possible value for the maximum distance between any two points by removing exactly one point.
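Before looking at the optimized approach, the statement can be checked directly with a brute force that tries every removal. This is an illustrative sketch (the function name is ours, not from the problem), usable only for small inputs since it is O(n³):

```python
from itertools import combinations

def brute_force_min_max(points):
    # Maximum Manhattan distance over a list of points (0 if fewer than 2).
    def max_dist(pts):
        if len(pts) < 2:
            return 0
        return max(abs(x1 - x2) + abs(y1 - y2)
                   for (x1, y1), (x2, y2) in combinations(pts, 2))

    # Try removing each point in turn and keep the best result.
    return min(max_dist(points[:i] + points[i + 1:])
               for i in range(len(points)))
```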
Input & Output
Example 1 — Basic Case
Input:
points = [[3,10],[5,15],[10,2],[4,4]]
Output:
12
💡 Note:
The best choice is to remove [10,2]. The remaining points are [3,10], [5,15], and [4,4], and the maximum distance among them is between [5,15] and [4,4]: |5-4| + |15-4| = 1 + 11 = 12. Any other removal leaves a larger maximum; for example, removing [3,10] leaves [5,15] and [10,2] at distance |5-10| + |15-2| = 18. So the answer is 12.
Example 2 — Minimum Size
Input:
points = [[1,1],[3,3]]
Output:
0
💡 Note:
With only 2 points, removing one leaves just 1 point, so maximum distance is 0.
Example 3 — Three Points
Input:
points = [[1,1],[1,3],[3,3]]
Output:
2
💡 Note:
Remove [1,1]. The remaining points [1,3] and [3,3] are at distance |1-3| + |3-3| = 2, and no other removal does better, so the answer is 2.
Constraints
- 3 ≤ points.length ≤ 10⁵
- points[i].length == 2
- -10⁸ ≤ points[i][0], points[i][1] ≤ 10⁸
Visualization
Understanding the Visualization
1. Input: an array of 2D points with the Manhattan distance metric
2. Process: find the point whose removal minimizes the maximum pairwise distance
3. Output: the minimum possible maximum distance after removal
Key Takeaway
🎯 Key Insight: Transform each point (x, y) to (x + y, x - y). The maximum Manhattan distance over a set equals the larger of the spreads of x + y and x - y, so only the few points attaining those extremes can change the answer when removed.
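The insight above can be sketched as follows, assuming a LeetCode-style `minimumDistance` entry point. Only a point attaining an extreme of x + y or x - y can lower the maximum distance, so it suffices to try removing those (at most four) candidates, giving an O(n) solution overall:

```python
def minimumDistance(points):
    # Chebyshev transform: for s = x + y and d = x - y, the maximum
    # Manhattan distance over a set equals max(spread of s, spread of d).
    s = [x + y for x, y in points]
    d = [x - y for x, y in points]
    n = len(points)

    def max_dist_without(skip):
        # Maximum Manhattan distance over all points except index `skip`.
        hi_s = max(s[i] for i in range(n) if i != skip)
        lo_s = min(s[i] for i in range(n) if i != skip)
        hi_d = max(d[i] for i in range(n) if i != skip)
        lo_d = min(d[i] for i in range(n) if i != skip)
        return max(hi_s - lo_s, hi_d - lo_d)

    # Removing a non-extreme point leaves the maximum unchanged, so only
    # the indices attaining an extreme of s or d need to be tried.
    candidates = {s.index(max(s)), s.index(min(s)),
                  d.index(max(d)), d.index(min(d))}
    return min(max_dist_without(i) for i in candidates)
```

On Example 1, the candidates are the points attaining the extremes of x + y and x - y, and removing [10,2] yields the optimal value 12.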