Does rsync --compare-dest evaluate once or after each file?
I am using rsync to archive a very large number of files from a 36TB ZFS pool to three large separate disks.
- I don't care which file ends up on which of the three disks
- I do not want duplicates
- I would like to run them in tandem
If I use rsync --compare-dest=/mnt/ez1 --compare-dest=/mnt/ez2 /store1/files /mnt/ez3/, will rsync "reevaluate" those comparison destinations after completing each file? Or is the list of files it intends to sync determined up front?
rsync
If you have two rsync processes handling the same file at the same time, neither will have anything for --compare-dest to match.
– roaima
16 hours ago
You should consider having the -a flag (--archive) for your rsync so that metadata is copied across. This will help subsequent runs perform more efficiently.
– roaima
16 hours ago
It might be simpler to create a complete list of all the files, split it into 3 disjoint lists, then use 3 rsyncs, each working on one list and one destination disk. This also lets you easily redo a verification rsync later.
– meuh
16 hours ago
@roaima I omitted my flags, but I am using rsync -auvP currently.
– erode
8 hours ago
@roaima I think I understand what you mean: multiple rsync processes working on the same file at the same time would create a big problem. The files are generally quite large (3-40 GB), so I figured it might be safe for the processes to work in the same directory simultaneously, since collisions would be unlikely. Or could I have one process work "backwards" through the list?
– erode
8 hours ago
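meuh's suggestion can be sketched as follows. The mktemp directories are stand-ins for /store1/files and /mnt/ez1..3, and the list-to-disk pairing is one possible arrangement:

```shell
# Build one complete file list, split it into three disjoint lists, then run
# three rsyncs in tandem, each feeding one list to one destination disk.
SRC=$(mktemp -d); D1=$(mktemp -d); D2=$(mktemp -d); D3=$(mktemp -d)
LISTS=$(mktemp -d)
for n in 1 2 3 4 5 6; do echo "payload $n" > "$SRC/file$n"; done

# 1. complete list of files, relative to the source root
(cd "$SRC" && find . -type f) > "$LISTS/all.txt"

# 2. split by line count into three disjoint lists (part.aa, part.ab, part.ac;
#    -n l/3 is a GNU coreutils option)
split -n l/3 "$LISTS/all.txt" "$LISTS/part."

# 3. one rsync per list/disk, running concurrently; --files-from reads paths
#    relative to the source argument
set -- "$D1" "$D2" "$D3"
for list in "$LISTS"/part.*; do
    rsync -a --files-from="$list" "$SRC/" "$1/" &
    shift
done
wait   # every file lands on exactly one disk, so no duplicates are possible
```

Because the three lists are disjoint, no two processes ever touch the same file, which sidesteps the race roaima describes; a later verification pass can rerun rsync with the same list against the same disk.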
asked 21 hours ago by erode