Get size of multiple folders - How to calculate the server resource consumption of the process?
I need to get the size of multiple folders (each with subfolders and several files) and update a database, in order to limit users' uploads to my server. This process must run periodically, in the shortest time possible.
I created a crontab entry to execute this procedure every 30 minutes, but it would be great if this interval could be reduced to 10 minutes or less. I have no idea how many resources this process takes from the server. I have a humble VPS with 1 GB RAM, 1 CPU, Debian 8, Apache2 and MariaDB.
My question is: how do I calculate the server resource consumption of this process, in order to estimate which interval to use?
Btw, here is the PHP function I am using to get a folder's (and its subfolders') size:
function dirSize($dir) {
    // Shell out: list everything under $dir recursively and sum the size column with awk
    $cmd = popen("ls -sR " . escapeshellarg($dir) . " | awk 'BEGIN{sum=0} {sum=sum+$1} END{print sum}'", 'r');
    $size = intval(fgets($cmd, 80));
    pclose($cmd);
    return $size;
}
Thanks!
EDIT:
I received a good suggestion from @meuh to update the database each time a file is uploaded or deleted, adding or subtracting the size of that single file. I'd like to know, in terms of resource consumption, whether this is a better approach. Thanks!
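For reference, a minimal sketch of what that incremental accounting could look like (the user_quota table and its user_id/used_bytes columns are hypothetical names, just to illustrate the idea):

// Call after a successful upload; pass a negative $bytes after a delete.
// Assumes a (hypothetical) user_quota table with user_id and used_bytes columns.
function updateQuota(PDO $db, $userId, $bytes) {
    $stmt = $db->prepare(
        'UPDATE user_quota SET used_bytes = used_bytes + :bytes WHERE user_id = :id'
    );
    $stmt->execute([':bytes' => $bytes, ':id' => $userId]);
}

Checking a user's quota before accepting an upload would then be a single SELECT on used_bytes instead of a full directory walk.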
debian php resources
asked Nov 5 '17 at 13:21 by Guybrush (edited Nov 5 '17 at 16:28)
If you want to manually observe resource usage, most people start with top.
– B Layer, Nov 5 '17 at 14:04
It would make more sense to calculate this only for a given user when they were actually trying to upload a file.
– meuh, Nov 5 '17 at 14:10
@BLayer, I am preparing the server to go live; today we have only a few test directories and files. But I will try to add more files and use htop. Do you think it is fast enough to show resource changes when testing the process with, let's say, 10 folders and 10,000 files?
– Guybrush, Nov 5 '17 at 14:28
@meuh, files will be uploaded all the time, by multiple users at the same time. So I do not think it is a good idea to calculate the folder size every time a new file is uploaded...
– Guybrush, Nov 5 '17 at 14:29
If your PHP code is the only thing allowing new files to be created, then clearly you should simply keep in your database a count of how many bytes each user has used so far, and update it every time they upload a new file. Adding a number to an entry in a database is much better than doing a du, as du will make your disk cache unable to hold anything useful except all the directories that du keeps reading.
– meuh, Nov 5 '17 at 14:34
1 Answer
You could try the du command:
du - estimate file space usage
Example:
$ du -d 0 -h Documents/
3.2M Documents/
The du command has lots of options; you could even pass it all the directories at once and get a total of the disk usage (check man du for more info).
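For instance (directory names and sizes here are illustrative), -s gives a per-directory summary and -c appends a grand total:

$ du -shc dir1 dir2
1.1M dir1
2.1M dir2
3.2M total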
To find the time taken (in a practical way), you could prepend time to it:
$ time du -d 0 -h Documents/
3.2M Documents/
real 0m0.004s
user 0m0.002s
sys 0m0.001s
This time will likely vary with the number of files and folders in the directory.
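Since the question is about resource consumption rather than just wall-clock time, GNU time's verbose mode is also worth a look: it reports CPU time and peak memory. On Debian it ships in the time package as /usr/bin/time (the shell builtin time does not support -v); the output below is illustrative:

$ /usr/bin/time -v du -sh Documents/
3.2M Documents/
	Command being timed: "du -sh Documents/"
	User time (seconds): 0.00
	System time (seconds): 0.00
	Maximum resident set size (kbytes): 2100
	...

Watching the run in top or htop, as suggested in the comments, gives a rougher but live view of the same numbers.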
answered Nov 5 '17 at 13:43 by Aaditya Bagga
Hi Aaditya! Ok, du seems to be much faster than the ls used in the current function. But my question is about resource consumption. Do you think I can use du every 10 minutes to get the size of maybe 10,000 directories, each one with up to 10 subdirectories and hundreds of files, without heavily impacting the server's resources? Thank you!
– Guybrush, Nov 5 '17 at 14:00
I find that du -s is pretty fast even with large file trees. Note that the tool description says "estimate file space usage". I think it's fairly well optimized.
– B Layer, Nov 5 '17 at 14:03
@BLayer, an "estimated" size is enough for me. I will do some research on du to understand how it works. Maybe it gets the size from some pre-calculated per-directory information table generated by Linux. Thanks!
– Guybrush, Nov 5 '17 at 14:32