[sclug] Getting rid of duplicate files
Jonathan H N Chin
jc254 at newton.cam.ac.uk
Fri Sep 29 15:04:40 UTC 2006
Tim Sutton <tim at linfiniti.com> wrote:
> I'm trying to free up space on my hard disk. In particular I'm trying
> to get rid of duplicate images that don't have matching file names. [...]
In a slightly different vein from the other suggestions:
Rob Kudla wrote a program called "findimagedupes" that looks for
images that are "similar" to each other. It used to be in debian.
I rewrote it from scratch and my version recently replaced Rob's in
unstable. You can find it in a debian repository near you, or get
the unpackaged version from:
http://www.jhnc.org/findimagedupes/
If you use it like:
find [...] -print0 | findimagedupes -0 -s SCRIPT -- -
it will (eventually) spit out a shell script called "SCRIPT".
Open the script in a text editor and change the VIEW function to do
whatever you like. For example, to open groups of images with your
preferred image viewing program, you might use something like:
VIEW(){
    xv -imap -cmap "$@"
}
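If you'd rather review the groupings in the terminal before opening anything,
VIEW can simply print them. A minimal sketch, assuming (as in the xv example
above) that the generated script calls VIEW once per group, passing the
filenames as arguments:

```shell
# Hedged sketch: list each group of similar files instead of opening a viewer.
# Assumes SCRIPT invokes VIEW once per group with the filenames as arguments.
VIEW(){
    printf 'group:\n'
    # print each filename on its own line, indented
    printf '  %s\n' "$@"
}

# Example call, as the script might make it:
VIEW photo1.jpg photo1-copy.jpg
```

Once you're satisfied the groups look sane, switch VIEW back to an
interactive viewer and weed out the redundant copies by hand.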
Note that the program will almost certainly group some files that
aren't duplicates at all, even if you use --threshold=100.
-jonathan
--
Jonathan H N Chin, 2 dan | deputy computer | Newton Institute, Cambridge, UK
<jc254 at newton.cam.ac.uk> | systems mangler | tel/fax: +44 1223 767091/330508
"respondeo etsi mutabor" --Rosenstock-Huessy