Synchronise with remote machine via HTTP, and delete older files

I have a folder on my system (Ubuntu) that gets synchronized with work using wget. The file names are in the following format: A156.0.1.x, A156.0.y, A156.0.z, A156.0.a, A156.0.b. All files are created at some point in my office and all have the same date and time. Rsync and any other kind of connection to the office is not permitted.

I synchronize four times a day, and there is no pattern to how often the files are created. There might be no change in the folder for a couple of weeks, or there might be ten changes in a day. Once a new set is created, the files will be named something like A156.1.[a,b,x,y,z]. Each file is huge (~500 MB).

So I end up with more than one set of (5) files on my system, and I have in total 10 files × 500 MB = 5 GB.

Is there an easy script that cron could run to check the folder frequently and delete the older files, so that I end up with only the latest set of five? I could delete files older than x days, but we are never sure when the next set of files will be created.
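
For the scheduling side, a crontab entry along these lines would run a cleanup script four times a day, shortly after each sync. The path /usr/local/bin/prune-old-sets.sh and the sync times (00:00, 06:00, 12:00, 18:00) are purely illustrative assumptions, standing in for whatever script the answers below produce:

# m  h          dom mon dow  command
15   0,6,12,18  *   *   *    /usr/local/bin/prune-old-sets.sh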







asked Jan 4 at 14:28 by john; edited Jun 25 at 7:45 by ctrl-alt-delor

  • Are files ever edited, or removed? Or are they only added? (on remote server)
    – ctrl-alt-delor
    Jun 25 at 7:51
3 Answers

You can use find piped into sort to list the files ordered by date, then use cut on the output to produce the list of file names, and then use rm to delete all files except the latest five. Running this periodically should give the result you are looking for.

I don't know of an existing script, but this should be fairly trivial to implement.
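
A minimal sketch of that pipeline, assuming GNU find, sort and xargs, a flat directory, and file names without newlines (the directory path is a placeholder):

#!/bin/sh
# Keep only the 5 most recent files in $dir: find prints "mtime path",
# sort orders newest first, tail skips the 5 newest lines, cut drops the
# timestamp, and xargs rm removes what is left (a no-op if nothing is left).
dir=/path/to/work-folder

find "$dir" -maxdepth 1 -type f -printf '%T@ %p\n' |
    sort -rn |
    tail -n +6 |
    cut -d' ' -f2- |
    xargs -r -d '\n' rm --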






answered Jan 4 at 14:39 by imbuedHope

  • Now i've lost it.
    – john
    Jan 5 at 6:35

  • @imbuedHope does find work over http?
    – ctrl-alt-delor
    Jun 25 at 7:38

If you can use zsh, it has glob qualifiers that make this pretty easy:



zsh -c 'rm work-folder/*(om[6,-1])'


That says to select all of the files in the work-folder directory, ordered (o) by modification time (m), and then take only the range from the 6th entry to the end. This leaves the most recent five files in the folder.



This assumes that you have 6 or more files in the directory to begin with; you could wrap a test around the removal to be safer (all in zsh):



files=(work/*(om))
[ $#files[@] -gt 5 ] && echo rm "$files[6,-1]"


It's more work in bash, as you need to call stat on each file and keep track of the ordering yourself, along the lines of the sketch below.
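
A rough bash counterpart, as a sketch only: it calls stat on each file, sorts newest-first, and removes everything past the five most recent. It assumes GNU stat, file names without newlines, and the same hypothetical work-folder directory used above:

#!/usr/bin/env bash
# Collect the files, order them newest-first by mtime, delete the rest.
shopt -s nullglob
files=(work-folder/*)

if (( ${#files[@]} > 5 )); then
    # "%Y %n" prints "<mtime> <path>" per file; cut keeps only the path.
    mapfile -t by_age < <(stat --format='%Y %n' -- "${files[@]}" | sort -rn | cut -d' ' -f2-)
    rm -- "${by_age[@]:5}"
fi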






answered Jan 9 at 18:14 by Jeff Schaller

  • does it work over http?
    – ctrl-alt-delor
    Jun 25 at 7:39

  • I don’t understand, @ctrl-alt-delor; it’s a zsh command, and is not meant for use in the HTTP protocol. Did you have a separate Question?
    – Jeff Schaller
    Jun 27 at 21:04

  • Read the question:
    – ctrl-alt-delor
    Jun 28 at 7:22

The following script will display a list of "new files" and "old files" in a directory. By "new files" is meant files that have been modified after the last run of the script, and by "old files" is meant files that have not been modified since the last run of the script.



The script writes the output of date to a "timestamp file" and uses that file on the next run to determine which files have changed. On the first run, no output is produced.



The script is meant to be run manually, and as written it only gives you a way to detect which files have been modified in a particular directory.



#!/bin/sh

topdir=$HOME   # change this to point to the top directory where your files are
stamp="$topdir/timestamp"

if [ -f "$stamp" ]; then
    echo 'New files:'
    find "$topdir" -type f ! -name timestamp -newer "$stamp"

    echo 'Old files:'
    find "$topdir" -type f ! -name timestamp ! -newer "$stamp"
fi

date >"$stamp"


This could be modified to (see the sketch after this list)



  • prompt the user for deleting the old files,

  • detect only files matching a certain pattern (using -name 'pattern', e.g. -name 'A156.1.[abxyz]'),

  • look at the inode change time ("ctime") instead of the modification time (using -cnewer instead of -newer if your find supports it),

  • etc.
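
As a hedged sketch only, combining the pattern-matching and deletion ideas above: it deletes the previous set only when a newer one has actually arrived, so the current set is never removed when nothing changed. The path and the A156 name pattern are assumptions taken from the question:

#!/bin/sh

topdir=$HOME/work-folder   # hypothetical location of the synced files
stamp="$topdir/timestamp"

if [ -f "$stamp" ]; then
    # Count files matching the pattern that are newer than the last run.
    new=$(find "$topdir" -type f -name 'A156.*' -newer "$stamp" | wc -l)

    # Only if a newer set exists, remove the files from before the last run.
    if [ "$new" -gt 0 ]; then
        echo 'Removing the superseded set:'
        find "$topdir" -type f -name 'A156.*' ! -newer "$stamp" -print -exec rm -- {} +
    fi
fi

date >"$stamp"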





answered Jun 25 at 7:03 by Kusalananda

  • does find work over http?
    – ctrl-alt-delor
    Jun 25 at 7:38

  • @ctrl-alt-delor No.
    – Kusalananda
    Jun 25 at 7:41









