A cron job to delete a specific file once it reaches more than 1 GB in size?

My server broke down with extensive damage when a log file reached 27 GB in a few hours. Log files are compressed daily and are usually very small, maybe up to 10 MB in 24 hours. Today there was an issue that caused a stack trace to be printed 20 times a second. I was asleep, the log grew to 27 GB, and when the hard drive became full, serious damage and data loss occurred.
The log file will always be called "latest.log".
I need a cron job that will delete this file if it becomes larger than 1 GB, to prevent this train wreck in the future.
Thanks for helping me.

linux cron

asked Nov 15 '15 at 10:31 by user2656801

  • Can you be more specific? Some frameworks allow you to define rotation and/or a maximum log size.

    – Rui F Ribeiro
    Nov 15 '15 at 10:37
  • You can limit a process's maximum file size with ulimit -f (see the sketch after these comments).

    – meuh
    Nov 15 '15 at 10:40
  • But then the process does not hang, meuh?

    – Rui F Ribeiro
    Nov 15 '15 at 10:44
  • Anything that prints such messages without limiting the rate (or just printing the first few before giving up the chatter, or just quitting if a serious problem repeats too often) is suspect of not-up-to-snuff programming...

    – vonbrand
    Nov 15 '15 at 13:43
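
To make the ulimit -f suggestion above concrete, here is a minimal sketch of a wrapper that starts the logging process with a hard cap on file size. The start command and the exact limit are assumptions to adapt; note that bash (outside POSIX mode) counts this limit in 1024-byte blocks, and that exceeding it delivers SIGXFSZ, which terminates the process by default.

#!/bin/bash
# Sketch: cap the size of any file this process (and its children) can write.
# 1048576 blocks of 1024 bytes = 1 GiB.
ulimit -f 1048576
# Placeholder start command; replace with whatever launches your service.
exec java -jar server.jar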

1 Answer

I would not do it with cron, although if you insist on that, a simple find one-liner will do it. Be aware that you would also need to restart the service, because a file on Unix only really goes away once nothing is using it any more.



In the crontab:



*/10 * * * * find /dir -name latest.log -size +1G -exec rm -f {} \; -exec command_to_restart_your_service \;
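
If it helps to see the same idea spelled out, here is a minimal script sketch that cron could call every ten minutes instead. The log path and the restart command are placeholders, and it assumes GNU stat:

#!/bin/sh
# Sketch: remove latest.log once it exceeds 1 GiB, then restart the writer.
LOG=/dir/latest.log
LIMIT=$((1024 * 1024 * 1024))   # 1 GiB in bytes

size=$(stat -c %s "$LOG" 2>/dev/null || echo 0)
if [ "$size" -gt "$LIMIT" ]; then
    rm -f "$LOG"
    # Disk space is only released once the writing process closes the file,
    # so restart whatever is logging there (placeholder command):
    service your_service restart
fi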


As you talk about stack traces, I assume you are running Tomcat. Have a look at the post linked below. Nevertheless, if that server is so important, I would forward all the logs to a remote log server. Why did you suffer damage and data loss? Are you running an SQL server there too? I would run that on a separate server.



Here is the link to the question about limiting log size:



https://stackoverflow.com/questions/8342336/how-to-set-maximim-number-of-rolls-and-maximum-log-size-for-tomcat
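
On the remote-logging point: if the service logged through syslog (a plain latest.log does not by itself), forwarding everything to a central host can be a single rsyslog rule. This is only a sketch, and loghost.example.com is a placeholder:

# /etc/rsyslog.d/forward.conf
# Send all syslog messages to a remote collector over TCP (@@ = TCP, @ = UDP).
*.* @@loghost.example.com:514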

edited May 23 '17 at 12:40 by Community
answered Nov 15 '15 at 10:51 by Rui F Ribeiro

  • No, it's latest.log from Java Minecraft. You said you didn't want me to use a cron job; what do you suggest instead? I'm very inexperienced and I welcome any help or advice. Do I just put "find /home/minecraft/multicraft/servers/server1/ -name latest.log -size +1GB -exec rm -f " in /etc/rc.local?

    – user2656801
    Nov 15 '15 at 11:09
  • I edited it to run every 10 minutes from cron. Mind you, the cron job must run as the user which is running java, or as root. I did not put in such a long directory name, just for clarity.

    – Rui F Ribeiro
    Nov 15 '15 at 11:15
  • Deleting the file does not help if a process still has it open. Even truncating it will not help (the nameless file keeps on growing as writes happen). You need to restart whatever is logging there to get it to re-open that file (after you delete it). In Unix/Linux, removing a file only removes the name; the file only goes away once there are zero names and zero opens.

    – Skaperen
    Nov 15 '15 at 11:24
  • You should be careful with running -exec rm -f: it will automatically delete any files that find matches. You should at least test the command first by replacing -exec rm -f with -exec ls -l.

    – the_velour_fog
    Nov 15 '15 at 11:25
  • I would look into using logrotate - managing log files when they hit a predetermined size or date is exactly what it's designed for (see the config sketch below). Or as @RuiFRibeiro suggests, getting the logs off the server is a good idea - you could do that with rsyslog.

    – the_velour_fog
    Nov 15 '15 at 11:32
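
To make the logrotate suggestion in the comment above concrete, something along these lines could go in a file under /etc/logrotate.d/. This is only a sketch: the log path and the restart command are placeholders, and size-based rotation is only checked when logrotate itself runs (e.g. hourly or daily from cron).

/dir/latest.log {
    size 1G
    rotate 3
    compress
    missingok
    notifempty
    postrotate
        # Placeholder: make the writer close and re-open its log file.
        service your_service restart
    endscript
}

If restarting the service is not an option, logrotate's copytruncate directive is a common alternative: it copies the log and truncates it in place, at the cost of possibly losing a few lines written during the copy.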