Find Duplicate File in System - Problem

Given a list paths of directory info strings, each containing a directory path and all the files (with their contents) in that directory, return all the duplicate files in the file system in terms of their paths.

You may return the answer in any order.

A group of duplicate files consists of at least two files that have the same content.

A single directory info string in the input list has the following format:

"root/d1/d2/.../dm f1.txt(f1_content) f2.txt(f2_content) ... fn.txt(fn_content)"

It means there are n files (f1.txt, f2.txt, ..., fn.txt) with contents (f1_content, f2_content, ..., fn_content), respectively, in the directory "root/d1/d2/.../dm".

Note that n >= 1 and m >= 0. If m = 0, it means the directory is just the root directory.

The output is a list of groups of duplicate file paths. For each group, it contains all the file paths of the files that have the same content.

A file path is a string that has the following format: "directory_path/file_name.txt"
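The directory info format above can be parsed with plain string splitting. As a minimal sketch (the function name is illustrative, not part of the problem statement), each info string splits on spaces into a directory path followed by "name(content)" entries:

```python
def parse_directory_info(info):
    """Split one directory info string into (directory, name, content) triples.

    Example input format: "root/a 1.txt(abcd) 2.txt(efgh)"
    """
    directory, *entries = info.split(" ")
    files = []
    for entry in entries:
        # "1.txt(abcd)" -> name "1.txt", content "abcd"
        name, _, content = entry.partition("(")
        files.append((directory, name, content.rstrip(")")))
    return files

# parse_directory_info("root/a 1.txt(abcd) 2.txt(efgh)")
# → [("root/a", "1.txt", "abcd"), ("root/a", "2.txt", "efgh")]
```

The full path for each file is then directory + "/" + name.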

Input & Output

Example 1 — Basic Duplicate Detection
$ Input: paths = ["root/a 1.txt(abcd) 2.txt(efgh)", "root/c 3.txt(abcd)", "root/c/d 4.txt(efgh)", "root 4.txt(efgh)"]
Output: [["root/a/2.txt","root/c/d/4.txt","root/4.txt"],["root/a/1.txt","root/c/3.txt"]]
💡 Note: Files with content 'efgh': root/a/2.txt, root/c/d/4.txt, root/4.txt. Files with content 'abcd': root/a/1.txt, root/c/3.txt
Example 2 — No Duplicates
$ Input: paths = ["root/a 1.txt(abcd) 2.txt(efgh)", "root/c 3.txt(ijkl)"]
Output: []
💡 Note: All files have unique content, so no duplicates exist
Example 3 — Single Directory
$ Input: paths = ["root 1.txt(same) 2.txt(same) 3.txt(different)"]
Output: [["root/1.txt","root/2.txt"]]
💡 Note: Only files with 'same' content are duplicates: root/1.txt and root/2.txt

Constraints

  • 1 ≤ paths.length ≤ 2 × 10⁴
  • 1 ≤ sum of all paths[i].length ≤ 5 × 10⁵
  • paths[i] consists of English letters, digits, '/', '.', '(', ')', and ' '.
  • You may assume no files or directories share the same name in the same directory.
  • You may assume each given directory info represents a unique directory.
  • A single blank space separates the directory path and the file info.
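Within these constraints, a single pass with a hash map keyed by file content solves the problem in linear time over the total input length. A minimal sketch (function name chosen here for illustration):

```python
from collections import defaultdict
from typing import List

def find_duplicate(paths: List[str]) -> List[List[str]]:
    """Group full file paths by content, then keep groups of size >= 2."""
    groups = defaultdict(list)  # content -> list of full file paths
    for info in paths:
        directory, *entries = info.split(" ")
        for entry in entries:
            # "1.txt(abcd)" -> name "1.txt", content "abcd"
            name, _, content = entry.partition("(")
            groups[content.rstrip(")")].append(directory + "/" + name)
    # a group of duplicates needs at least two files with the same content
    return [group for group in groups.values() if len(group) > 1]
```

The answer may be returned in any order, so the order of groups and of paths within a group does not matter when comparing against the expected output.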

Visualization

[Diagram: Find Duplicate Files in System. Input directory strings ("root/a 1.txt(abcd) 2.txt(efgh)", "root/c 3.txt(abcd) 4.txt(efgh)") are parsed and grouped by content (abcd → [root/a/1.txt, root/c/3.txt]; efgh → [root/a/2.txt, root/c/4.txt]), then filtered to output the duplicate groups. Key insight: a hash map keyed by content groups duplicate files automatically.]
Understanding the Visualization

1. Input: an array of directory strings with file info.
2. Process: parse each string and group files by content.
3. Output: groups of files with identical content.

Key Takeaway: a hash map with content as the key automatically groups duplicate files.