Find large files in multiple directories

I haven’t posted in a while because I have been super busy. I don’t think I’ve had a chance to post anything since it was announced at the beginning of May that the company I work for was being bought out. The sale went through earlier this week, but the workload hasn’t decreased in the slightest.

Anyway, most admins know how to find the largest files inside a single directory:

du -sk * | sort -n

The other day I needed to find the largest of 10,000-plus files spread across hundreds of directories. So, with a little command-combo magic, I used this:

for i in `find . -type f`; do du -sk $i; done | sort -n

This runs du -sk on every file in the current directory and all sub-directories, then pipes the results to sort, so the largest files end up at the bottom of the output.
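One caveat with looping over backtick-expanded find output: the shell splits the file list on whitespace, so filenames containing spaces get torn apart. A variant that avoids the loop entirely by letting find hand the names straight to du (a sketch, not the command from the original post) would be:

```shell
# Let find pass filenames directly to du; no shell word-splitting
# ever happens, so names with spaces or newlines survive intact.
# The `+` batches many files into each du invocation.
find . -type f -exec du -sk {} + | sort -n
```

Because find invokes du itself, this also sidesteps quoting problems that the for-loop version can hit.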

Looking at it again today, I guess I probably could have removed the for loop and just done:

du -sk `find . -type f` | sort -n

Testing it out real quick shows that it works; however, with as many files as I had to go through, the expanded file list blows past the kernel's argument-length limit for a single command, and the shell complains:

-bash: /usr/bin/du: Argument list too long

So it looks like the for loop was needed after all. Personally, I love running for loops on the command line.
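The limit being hit here is the kernel's cap (ARG_MAX) on the total size of arguments passed to one exec call, not anything in du itself. Besides the for loop, another standard workaround (not the one used above) is xargs, which chops the file list into chunks that each fit under the limit:

```shell
# xargs batches the arbitrarily long file list into multiple
# du invocations, each safely under ARG_MAX.
# -print0 / -0 delimit names with NUL bytes, so spaces and
# newlines in filenames cannot break the pipeline.
find . -type f -print0 | xargs -0 du -sk | sort -n
```

Even though du runs several times, the final sort still orders the combined output, largest files last.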

Happy Father’s Day to all you fathers out there!


  1. This takes a while to run, but try

    du -mxS / | sort -n | tail -n 40

    I can’t remember where I picked that up, or what each of the options is, but it gives you the 40 largest directories on the system.
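For reference, on GNU du those options break down as follows (the -S flag in particular behaves differently on BSD systems):

```shell
# GNU du flags in the commenter's one-liner:
#   -m  report sizes in megabytes
#   -x  stay on one filesystem (skip other mount points)
#   -S  report each directory's own size, excluding subdirectories
du -mxS / | sort -n | tail -n 40
```

Because -S excludes subdirectory totals, the listing points at the directories whose *own* files are large, rather than parents that merely contain big subtrees.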
