13 events
Aug 8, 2020 at 15:04 history closed terdon bash Duplicate of Find duplicate files
Aug 8, 2020 at 13:20 comment added Stéphane Chazelas There are literally hundreds of similar questions on this site. Look for fdupes for instance.
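A minimal sketch of the fdupes suggestion, assuming a stock fdupes install: -r recurses into subdirectories, -d deletes with a prompt per duplicate set, and adding -N keeps the first file of each set and removes the rest without prompting.
    # list duplicate sets under the current directory
    fdupes -r .
    # interactively choose which copies to delete
    fdupes -rd .
    # non-interactive: keep the first file in each set, delete the others (use with care)
    fdupes -rdN .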
S Aug 8, 2020 at 12:14 history suggested user413007: removed linux tag (question not specific to linux)
Aug 8, 2020 at 12:12 comment added Notme It sounds like a good reminder. Thank you.
Aug 8, 2020 at 12:11 review Suggested edits (completed Aug 8, 2020 at 12:14)
Aug 8, 2020 at 12:00 answer added Notme timeline score: 1
Aug 8, 2020 at 11:42 comment added A.B Nice, duff appears to be made for this. I saw the command but never used it. You should post this as an answer to your own question. But don't use -r on rm, it's not needed and dangerous: use -- instead. You'd probably have to rework the xargs command for extra safety unless you are sure there are no spaces in your file names.
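A minimal sketch of that safer variant, assuming duff's -e (excess) mode prints one duplicate path per line: a read loop copes with spaces in file names (though not with embedded newlines), and -- keeps rm from treating leading-dash names as options.
    # sketch only: assumes `duff -re .` prints one excess duplicate per line
    duff -re . | while IFS= read -r f; do
        printf 'removing %s\n' "$f"
        rm -- "$f"    # no -r needed for plain files; -- ends option parsing
    done
Swapping rm -- "$f" for an echo on a first pass is a cheap dry run before deleting anything.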
Aug 8, 2020 at 11:39 comment added Notme Thank you, my friend. I have deleted the duplicate files using duff -re . | xargs rm -r (Ref link)
Aug 8, 2020 at 11:09 comment added A.B Here's the idea: the key point is to take a hash (e.g. sha256sum) of each file's contents and then sort the hashes to find duplicates, which is far faster than comparing the files' contents directly. Some tools might already do part of this (e.g. duff).
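A minimal sketch of that hash-then-sort idea with standard tools (GNU coreutils assumed for uniq's -w and --all-repeated options): sha256sum prints a 64-character hash before each file name, so sorting and comparing only that prefix groups byte-identical files into clusters separated by blank lines.
    # sketch only: list clusters of files with identical contents
    find . -type f -exec sha256sum {} + |
        sort |
        uniq -w64 --all-repeated=separate
File names containing newlines would still confuse this line-based pipeline; dedicated tools such as duff or fdupes handle that case more robustly.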
Aug 8, 2020 at 10:44 comment added Notme Yes, I want to delete the duplicates, not by file size but byte by byte.
Aug 8, 2020 at 10:24 comment added A.B Are you talking about exact, byte-by-byte duplicates?
Aug 8, 2020 at 9:33 review First posts (completed Aug 8, 2020 at 15:06)
Aug 8, 2020 at 9:32 history asked Notme CC BY-SA 4.0