Synchronise with remote machine via HTTP, and delete older files
I have a folder on my system (Ubuntu) that gets synchronized with work using wget. File names are in the following format: A156.0.1.x, A156.0.y, A156.0.z, A156.0.a, A156.0.b. All the files are created at some point in my office and all have the same date and time. Rsync and any other kind of connection to the office is not permitted.

I synchronize 4 times a day and there is no pattern to how often the files are created. There might be no change in the folder for a couple of weeks, or there might be 10 changes in a day. Once a new set of files is created, they will be named something like A156.1.[a,b,x,y,z]. Each file is huge (~500 MB).

So I end up with more than one set of files (5 per set) on my system, for a total of 10 files × 500 MB = 5 GB.

Is there an easy script that can be run by cron to check the folder frequently and delete the older files, so that I end up with only the latest set of 5? I could run something like "delete files older than x days", but we are never sure when the next set of files will be created.

bash files date rm
asked Jan 4 at 14:28 by john · edited Jun 25 at 7:45 by ctrl-alt-delor
Are files ever edited, or removed? Or are they only added? (on remote server) – ctrl-alt-delor, Jun 25 at 7:51
3 Answers
You can use find piped into sort to list the files ordered by date, then use cut on the output to generate the list of file names, and finally use rm to delete all files except the latest 5. Running this periodically should have the result you are looking for.

I don't know of an existing script, but this should be fairly trivial to implement.
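A minimal sketch of that pipeline, assuming GNU find (as on Ubuntu), a hypothetical download directory ~/work, and file names without spaces (as in the examples above):

# list files as "mtime path", newest first, skip the 5 newest, strip the mtime, remove the rest
find ~/work -maxdepth 1 -type f -name 'A156.*' -printf '%T@ %p\n' \
  | sort -rn \
  | tail -n +6 \
  | cut -d' ' -f2- \
  | xargs -r rm --

Test it first by replacing rm -- with echo to see which files would be deleted.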
Now I've lost it. – john, Jan 5 at 6:35
@imbuedHope does find work over http? – ctrl-alt-delor, Jun 25 at 7:38
If you can use zsh, it has glob qualifiers that make this pretty easy:

zsh -c 'rm work-folder/*(om[6,-1])'

That selects all of the files in the work-folder directory, ordered (o) by modification time (m), and then restricts the selection to the range from the 6th newest to the last. This leaves the most recent 5 files in the folder.

This assumes that you have 6 or more files in the directory to begin with; you could wrap a test around the removal to be safer (all in zsh):

files=(work/*(om))
[ $#files[@] -gt 5 ] && echo rm "$files[6,-1]"

(Note the echo, which makes this a dry run; drop it to actually remove the files.)

It's more work in bash, as you need to call stat on each file and keep track of the ordering yourself.
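A rough bash sketch of that approach, assuming bash 4+ and GNU stat (both standard on Ubuntu) and the same hypothetical work-folder directory; put an echo in front of the rm while testing:

#!/bin/bash
# sort the files newest-first by calling stat on each, keep the 5 newest, delete the rest
shopt -s nullglob
mapfile -t files < <(
    for f in work-folder/*; do
        printf '%s\t%s\n' "$(stat -c %Y "$f")" "$f"
    done | sort -rn | cut -f2-
)
(( ${#files[@]} > 5 )) && rm -- "${files[@]:5}"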
does it work over http? – ctrl-alt-delor, Jun 25 at 7:39
I don't understand, @ctrl-alt-delor; it's a zsh command, and is not meant for use in the HTTP protocol. Did you have a separate question? – Jeff Schaller, Jun 27 at 21:04
Read the question: – ctrl-alt-delor, Jun 28 at 7:22
The following script will display a list of "new files" and "old files" in a directory. By "new files" is meant files that have been modified after the last run of the script, and by "old files" is meant files that have not been modified since the last run of the script.

The script writes the output of date to a "timestamp file", and uses this file on the next run to determine which files have changed. On the first run, no output will be produced.

The script should be run manually, and as written it will only give you a way to detect which files have been modified in a particular directory.

#!/bin/sh

topdir=$HOME  # change this to point to the top directory where your files are
stamp="$topdir/timestamp"

if [ -f "$stamp" ]; then
    echo 'New files:'
    find "$topdir" -type f ! -name timestamp -newer "$stamp"

    echo 'Old files:'
    find "$topdir" -type f ! -name timestamp ! -newer "$stamp"
fi

date >"$stamp"

This could be modified to

- prompt the user before deleting the old files (see the sketch below),
- detect only files matching a certain pattern (using -name 'pattern', e.g. -name 'A156.1.[abxyz]'),
- look at the inode change time ("ctime") instead of the modification time (using -cnewer instead of -newer, if your find supports it),
- etc.
does find work over http? – ctrl-alt-delor, Jun 25 at 7:38
@ctrl-alt-delor No. – Kusalananda, Jun 25 at 7:41