This is the blog of Adam Kalsey. Unusual depth and complexity. Rich, full body with a hint of nutty earthiness.
Excerpt: Run a couple of unix commands and rid yourself of duplicate MP3s.
This is useful, I never knew about the tr command. I'm pretty new to bash, so am I missing something, or could you have piped the output of duff right into grep without having to save the contents to a file?
I could have, but creating the file accomplished several things.

**Inspecting the job first.** My Japanese language instruction files are named "Pimsleur Learning Japanese 1.mp3" and so forth, so the grep for 1.mp3 would have matched them and they would have been gone. And somehow I ended up with **only** the duplicate files of a couple of songs; I only had the ...1.mp3 copy of some. So I edited out the lines I didn't want to delete.

**Avoiding multiple passes.** Duff is fast, but it still takes an hour or so to process 10k files on a NAS device over Wifi. Since I'm running the removal command multiple times (1.mp3, 2.mp3, etc.), I didn't want to make Duff do the same job repeatedly. Just save the output once and work from there.

**Making sure it's really done.** By visually inspecting the file, I discovered that some duplicate files ended up with much higher numbers at the end. If iTunes sees a number at the end of a filename, it just increments that number when making its copy. So my copy of Nelson Riddle's Theme from Route 66 became "Theme from Route 66 67.mp3"
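For readers following along, the save-then-prune workflow described above can be sketched like this. It's a minimal sketch with made-up demo file names; in real use the report file would come from Duff itself (e.g. `duff -r . > duplicate.txt`) rather than being faked with printf.

```shell
# Hypothetical demo of the save-then-prune workflow; in real use the
# report comes from Duff:  duff -r . > duplicate.txt
mkdir -p demo && cd demo
touch 'Song.mp3' 'Song 1.mp3'               # an original and an iTunes-style copy
printf '%s\n' 'Song 1.mp3' > duplicate.txt  # stand-in for Duff's saved report

# Inspect first: see exactly which lines the pattern would delete
grep ' 1\.mp3' duplicate.txt

# Then delete; tr + xargs -0 keeps filenames with spaces intact
grep ' 1\.mp3' duplicate.txt | tr '\012' '\000' | xargs -0 rm
ls    # Song.mp3 survives; Song 1.mp3 is gone
```

The key point is that the slow Duff pass happens once; every later grep runs against the saved text file.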
That's a nice way of cleaning out the dupes. No more manual deleting, that's a timesaver. It would be nice, though, if there were some kind of option in iTunes to filter dupes from the library. Thanks for the tip.
Hi Adam, I'm having a slightly different problem. iTunes indicates that I have 28.58GB of music in the library. When I go to my Music folder and highlight all the music folders, the total size is 32GB. 3.42GB is a pretty big discrepancy. Do you know of any program that would match the iTunes library against the actual files on the hard drive and delete the ones that are not in the library? Thanks
Create empty working files:

`touch .tmpDupeFile .tmpSortedFile`

Build a big list of md5 signatures:

`find . -type f -print0 | xargs -0 md5 -r > .tmpDupeFile`

Sort the signatures:

`cat .tmpDupeFile | sort > .tmpSortedFile`

Create a list of duplicated files:

`cat .tmpSortedFile | awk '{ if ($1 == oldmd5) { printf "rm %s \n", $2 } oldmd5 = $1 }' > duplicates.sh`

Clean up:

`rm .tmpDupeFile .tmpSortedFile`
Bill, I appreciate your help but I'm afraid that I just don't know what to do with all that information since I'm not a programmer. I was hoping for some kind of software solution to my problem. Thanks.
Bill, your awk script doesn't take into account that most iTunes filenames contain spaces. A file called "My Music" ends up creating `rm My` instead of `rm "My Music"`. I changed it to:

`cat .tmpSortedFile | awk '{ if ($1 == oldmd5) { printf "rm \"%s\" \n", substr($0, index($0, " ")+1) } oldmd5 = $1 }' > duplicates.sh`

This grabs the whole filename and wraps it in quotes so that duplicates.sh works properly.
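For what it's worth, here is a hedged sketch of the same idea that handles spaces without the substr trick. It assumes `md5sum` (the Linux counterpart of `md5 -r`), whose output puts the hash first and the filename after two spaces; on macOS, `md5 -r` uses a single space, so the field separator below would need adjusting.

```shell
# Sketch of the md5-dedupe pipeline with spaces-in-names handled.
# Assumes md5sum output format: "<hash>  <filename>" (two spaces).
find . -type f -print0 | xargs -0 md5sum | sort |
  awk 'BEGIN { FS = "  " }                       # split on the two-space separator
       $1 == prev { printf "rm \"%s\"\n", $2 }   # later copies of a hash: mark for removal
       { prev = $1 }' > duplicates.sh

# Review duplicates.sh by hand before running:  sh duplicates.sh
```

Filenames containing double quotes or two consecutive spaces would still trip this up; it's a sketch, not a hardened tool.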
Adam, you are THE MAN! Thank you so much for this. I had imported my MP3s into iTunes from an external drive, and M3U files in the folders caused duplicates to be created in my iTunes library. I was just getting ready to delete my entire iTunes library and re-import because of the duplicates. Your work here saved me hours of reimporting.

For some reason, I didn't find your site earlier (when Googling for iTunes duplicate solutions), but did find it when searching for info on making sure deleting items from iTunes would also delete the files (hah). I had booted to Windows to use Robocopy to move my M3U files into a backup folder, and I was literally just launching Mac OS to delete my library and start over when I found your site.

I'm running Mac OS X Leopard, and I'm not sure that you are. Either way, I found some discrepancies in your method when running under OS X. I've created a blog entry on my site with the changes for OS X, including a quick tutorial on running the compilation process for Duff, for those who might not be familiar with it. (I wasn't, so I documented it as I went.) You can check out my post here: http://www.togeo.com/togeo/wordpress/?p=47

Thanks again for the excellent post, you've got a cool blog otherwise, too!!!
How do you send Duff's output to a text file? I'm dumb. I'm running Leopard on an iMac and I have triples and quadruples of the same songs and it's killing storage capacity.
Thanks for the informative-as-always post, Adam. One thing that you seem to overlook is that the duplicates are not removed from the iTunes library. This means that, while the files are gone off the drive, you will still see multiple copies of tracks, some marked with (!) markers. I used Super Remove Dead Tracks from Doug's AppleScripts to clean up the catalog. http://dougscripts.com/itunes/scripts/ss.php?sp=removedeadsuper It took a while to run, but that finished the job and got my iTunes library back in shape.
Hi, for some reason the 2nd step did not work for me. I use Mac OS X 10.5.4 and had the following error:

> cat duplicate.txt | grep 1.mp3 | tr '\012' ' \000' | xargs -0 rm
rm: /bibliotheque/iTunesFusion/iTunes Music/Daft Punk/Unknown Album/32 Daft Punk - Revolution 909(1) 1.mp3 /bibli.. ...arepusher/Lost in Translation/04 Tommib 1.mp3 : File name too long

For some reason, xargs is not splitting the entries! I think I understand the \n vs. null-terminated replacement, and I don't see why it doesn't work... It DID work, however, after replacing the command with

cat duplicate.txt | grep " 1.mp3" | while read file; do echo \"$file\" | xargs echo ; done

to see what would actually be deleted, and

cat duplicate.txt | grep " 1.mp3" | while read file; do echo \"$file\" | xargs rm -v ; done

to delete the files.
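In hindsight, the culprit looks like the stray space in the second tr set: in `tr '\012' ' \000'` the second set starts with a space, and tr only uses its first character, so every newline becomes a space and xargs -0 receives one enormous argument, hence "File name too long". Dropping the space restores the intended newline-to-NUL translation, as this small demo with made-up names shows:

```shell
# With the stray space removed, newline -> NUL, and xargs -0 splits
# correctly even when filenames contain spaces:
printf 'a b.mp3\nc d.mp3\n' | tr '\012' '\000' | xargs -0 -n1 echo rm
# prints:
#   rm a b.mp3
#   rm c d.mp3
```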
Someone was looking for a software solution... try Beyond Compare by Scooter Software. It's really easy to compare files and folders, then you can consolidate everything into one folder or the other so you have everything in one spot. Awesome for everything, not just music. Just Google it, I don't know the URL. HTH :)
To improve your one-liner, just use a regex like the following (quoting the pattern so the shell doesn't try to glob-expand it):

cat dupes.txt | grep '[1-9].mp3' | tr '\012' '\000' | xargs -0 rm
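A cautious way to use that combined pattern is a dry run first. This sketch uses stand-in file names (in real use the list comes from a saved Duff report); note that the pattern also catches the higher-numbered copies like "Theme from Route 66 67.mp3" mentioned earlier.

```shell
# Stand-in for a saved Duff report; real names come from your library
printf '%s\n' 'Song 1.mp3' 'Theme from Route 66 67.mp3' > dupes.txt

# Dry run: print what the one-liner *would* remove
grep '[1-9].mp3' dupes.txt | tr '\012' '\000' | xargs -0 -n1 echo rm

# When the list looks right, swap `echo rm` for plain `rm`
```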
In iTunes 9.0.2 with Home Sharing set up, there is a "Show" box in the bottom left of the screen. You can select "Show items not in my library" and get a list of allegedly non-duplicate items that you can then transfer. This works pretty well. However, it is not perfect: the filtered list it showed me excluded some items that were not in my library...
Hello Adam, your changed awk script doesn't run:

cat .tmpSortedFile | awk ‘{ if ($1 == oldmd5) { printf “rm "%s" \n”, substr($0, index($0, ” “)+1) } oldmd5 = $1 }’ > duplicates.sh

-bash: syntax error near unexpected token `('

What's wrong here?
On the web page, the quotes are being changed to curly quotes. They're typographically very nice, but don't actually work as part of the unix command line. Take those double and single quotes and re-type them to make them just normal quotation marks.
I'm using "DuplicateFilesDeleter". It's a guaranteed fix for duplicates.
This discussion has been closed.
Jonathan Dingman
September 4, 2007 5:08 AM
Awesome, Adam, thanks so much. I just transferred all my music over from my desktop to my Mac, and I was dreading having to go through and remove all the duplicate songs. This is a life saver, thanks again.