Finding files that use the most disk space
Is it possible to list the largest files on my hard drive? I frequently use df -H to display my disk usage, but this only gives the percentage full, GBs remaining, etc.

I do a lot of data-intensive calculations, with a large number of small files and a very small number of very large files. Since most of my used disk space is in a very small number of files, it can be difficult to track down where these large files are. Deleting a 1 kB file does not free much space, but deleting a 100 GB file does. Is there any way to sort the files on the hard drive by size?

Thanks.

linux ubuntu df
For 'tdu', see also: unix.stackexchange.com/questions/425615/…
– Joseph Paul
Feb 17 at 12:05
asked Apr 24 '12 at 16:44 by Andrew
9 Answers
With standard available tools:

To list the top 10 largest files from the current directory: du . | sort -nr | head -n10

To list the largest directories from the current directory: du -s * | sort -nr | head -n10

UPDATE: These days I usually use a more readable form (as Jay Chakra explains in another answer) and leave off the | head -n10, simply letting it scroll off the screen. The last line then has the largest file or directory (tree).

Sometimes, e.g. when you have lots of mount points in the current directory, instead of using -x or multiple --exclude=PATTERN options, it is handier to mount the filesystem on an unused mount point (often /mnt) and work from there.

Mind you that when working with large (NFS) volumes, running du over lots of (sub)directories can cause a substantial load on the storage backend (filer). In that case it is better to consider setting a quota on the volume.
For your first option, can't you just list them with ls -Sl | head?
– Bernhard
Apr 24 '12 at 18:23

No, du traverses the whole directory tree, whereas ls -S only checks the current directory.
– jippie
Jun 5 '12 at 17:25
edited Feb 25 '17 at 21:36; answered Apr 24 '12 at 17:35 by jippie
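Note that du without -a only reports directories; to rank individual files, one sketch uses GNU find's -printf (a GNU extension, not in POSIX find) to print each file's size directly:

```shell
# List the 10 largest individual files under the current directory:
# print "size-in-bytes path" for each regular file, sort the sizes
# numerically in descending order, and keep the top 10.
find . -type f -printf '%s %p\n' | sort -nr | head -n10
```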
Adding to jippie's answer:

To list the largest directories from the current directory in human-readable format:
du -sh * | sort -hr | head -n10
Sample:
[~]$ du -sh * | sort -hr | head -n10
48M app
11M lib
6.7M Vendor
1.1M composer.phar
488K phpcs.phar
488K phpcbf.phar
72K doc
16K nbproject
8.0K composer.lock
4.0K README.md
It makes it more convenient to read :)
edited Aug 9 '15 at 20:53; answered Aug 9 '15 at 7:05 by Jay Chakra
Try ncdu, as it can give you an overview of disk usage. From its website:

A disk usage analyzer with an ncurses interface, aimed to be run on a remote server where you don't have an entire graphical setup, but have to do with a simple SSH connection. ncdu aims to be fast, simple and easy to use, and should be able to run in any minimal POSIX-like environment with ncurses installed.
answered Apr 24 '12 at 16:57 by Renan
(GNU du)

du -max /dir | sort -n

will display big files as well as big directories (-m for megabyte units, -a to include files, -x to stay on one filesystem) and can be used to identify where you need to do some cleanup.
du -max | sort -n | tail -1000
...
46632 ./i386/update/SuSE-SLES/8/rpm/i586/kernel-source-2.4.21-138.i586.rpm
49816 ./UnitedLinux/apt/i386/RPMS.updates/k_debug-2.4.21-138.i586.rpm
679220 ./UnitedLinux/apt/i386/RPMS.updates
679248 ./UnitedLinux/apt/i386
679252 ./UnitedLinux/apt
691820 ./UnitedLinux/i586
691836 ./i386/update/SuSE-SLES/8/rpm/i586
695192 ./i386/update/SuSE-SLES/8/rpm
695788 ./i386/update/SuSE-SLES/8
695792 ./i386/update/SuSE-SLES
695804 ./i386/update
695808 ./i386
1390184 ./UnitedLinux
(I know that's a quite old tree :p )
edited Feb 14 at 11:33; answered Nov 5 '14 at 16:53 by Emmanuel
There is a simple and effective way to find the size of every file and directory in Ubuntu:

Applications > Accessories > Disk Usage Analyzer

In this window, click the "Scan Filesystem" button on the toolbar. After a short time (seconds) you have the disk usage of every directory and file.
answered Apr 30 '12 at 6:26 by Sam
If you prefer a graphical tool, there's https://github.com/shundhammer/qdirstat
answered Feb 12 at 14:43 by Hannes Snögren
You can try this command; it will list all files larger than 20 MB:

find / -type f -size +20000k -exec ls -lh {} \; 2> /dev/null | awk '{ print $NF ": " $5 }' | sort -hrk 2,2
If the biggest file on your filesystem is 20MB, you probably wouldn't be running low on disk space. At least with any HD made this millennium.
– Kevin
Apr 24 '12 at 17:15

That's only an example; you put there whatever you want. It will find everything bigger than 20MB, not only 20MB files.
– patseb
Apr 24 '12 at 19:25

ls -lh then sort?? ls -s or stat -c %b are probably better.
– Mikel
Apr 24 '12 at 19:54

I don't get it. My example uses ls and sort. He wanted to find files over the whole disk, not one directory.
– patseb
Apr 24 '12 at 19:57
edited Apr 30 '12 at 5:56 by Mat; answered Apr 24 '12 at 16:56 by patseb
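On recent GNU find the size threshold can also be written with a unit suffix, which reads more clearly than counting 1024-byte blocks. A sketch (the +100M threshold is an arbitrary example; adjust to taste):

```shell
# Files larger than 100 MiB anywhere under /, sorted numerically so
# the biggest offenders end up at the bottom of the output.
find / -type f -size +100M -printf '%s\t%p\n' 2>/dev/null | sort -n
```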
You can try this command; it will list the largest file:

ls -lrS | tail -1
shows the current directory, not the entire HDD.
– slm♦
Nov 5 '14 at 13:05
edited Nov 5 '14 at 12:40 by Anthon; answered Nov 5 '14 at 12:29 by thiva
du -csb `ls` | sort -nr | head -n10
mywiki.wooledge.org/ParsingLs
– sourcejedi
Feb 12 at 14:44
edited Feb 12 at 16:56 by Jeff Schaller; answered Feb 12 at 14:36 by yhj
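As the ParsingLs link warns, substituting ls output breaks on filenames containing whitespace; letting the shell expand a glob gives the same totals without that hazard. A sketch with the same GNU du flags:

```shell
# Apparent size in bytes (-b) of each entry plus a grand total (-c);
# "--" protects against filenames that begin with a dash.
du -csb -- * | sort -nr | head -n10
```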