Discussion: DFSR and large folders
David Goldsmith
2006-07-19 15:15:02 UTC
If I were to set up DFS Replication for a user home directory folder that
currently holds roughly 70GB of data, would it be better to set it up on the
original folder and let it replicate the entire contents during initial
replication, or to set it up on new, empty folders and then move the data in
at a measured rate so it replicates gradually? I'm just concerned about
overburdening the server when it's trying to stage that amount of data
during initial replication... any thoughts?
Ned Pyle [MSFT]
2006-07-20 12:28:16 UTC
Hi,

The fastest and lowest-cost (in CPU and network terms) method is to prestage
the data. You'd take a backup of just the home directory tree into a BKF
file with NTBACKUP, then restore that file on the new server.

I always recommend NTBACKUP or some real backup tool as opposed to
zipping/RARing, because it's faster and safer.

Once you prestage the data and set up the replication, the initial sync will
only need to use RDC to confirm that the files are unchanged and mark
everything correctly in the database. I would still recommend doing this
after hours, of course, if only to prevent conflicts with user data while
everything gets in sync.
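
For illustration, a minimal sketch of the command-line side of that - the
path and job name below are placeholders, not anything from this thread:

rem Back up only the home directory tree into a single BKF file
ntbackup backup "D:\HomeDirs" /j "Prestage HomeDirs" /f "E:\Staging\HomeDirs.bkf"

The restore has to be done through the NTBACKUP GUI on the destination
server, since the ntbackup command line only supports backup jobs; restore
the tree into the folder that will become the replicated folder before the
membership is enabled.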
--
Ned Pyle
Microsoft Enterprise Platforms Support

All postings on this newsgroup are provided "AS IS" with no warranties, and
confer no rights.


For more information please visit
http://www.microsoft.com/info/cpyright.mspx to find terms of use.
LIMKT
2006-08-18 03:55:01 UTC
Hi,

What do you mean by safer? Could you please elaborate?
I have a situation where I prestaged the data (about 30GB), but it seems
that DFS is replicating everything all over again. I copied the source data
to a USB HDD and then copied it into the destination folder. Have I missed
something?

I read that FRS does not work if data are prestaged in this manner - will it
work for DFS?

Thanks
KT
Ned Pyle [MSFT]
2006-08-18 18:01:40 UTC
If you are using FRS, it's a completely different scenario - FRS will
re-replicate the data if prestaging was not done with the exact steps
outlined in http://support.microsoft.com/kb/266679/en-us
--
Ned Pyle
Microsoft Enterprise Platforms Support

LIMKT
2006-08-18 22:09:02 UTC
I am using W2K3 R2 for all my servers in the replication, so I should be
using the new DFS-R, right?
However, my prestaging does not seem to work (this is already server number
3); the replication appears to copy all the data over again. What could be
wrong?

You mentioned that using backup tools is safer - why is that?
I only use copy/robocopy to copy the data to a USB HDD and back.
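
For illustration only - not necessarily the exact command used here - a
robocopy invocation that carries security descriptors, owners and
timestamps along with the file data would look something like the
following (placeholder paths); whether that is enough for DFS-R to treat
the files as already prestaged is exactly the open question:

rem Mirror the tree, copying data, attributes, timestamps, security, owner and auditing info
robocopy D:\HomeDirs F:\HomeDirs /MIR /COPYALL /B /R:1 /W:1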

Appreciate your help
Thanks
KT
Ned Pyle [MSFT]
2006-08-21 13:38:00 UTC
Post by LIMKT
I am using W2K3 R2 for all my servers in the replication, so I should be
using the new DFS-R, right?
Only if you told it to... did you?
--
Ned Pyle
Microsoft Enterprise Platforms Support

LIMKT
2006-08-21 15:16:02 UTC
Yes, I did... so why is the prestaging not working?
Ned Pyle [MSFT]
2006-08-21 15:58:15 UTC
I guess I'm wondering why you think it's replicating everything all over
again. Is that because you see all the files going into staging?

If so - that does not mean re-replication. All files over 64KB will be
staged so that RDC hashes can be computed for them, for later use in saving
replication bandwidth. It does not necessarily mean that the files are
actually being replicated. Check out a post a little ways up called 'DFS
Report', which explains how to see whether files are actually being
replicated (and you can always use the DFSR Health Report and Perfmon to see
in what quantities, if you are not concerned about specific files).
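
For example - the group, folder, and server names below are placeholders -
the dfsrdiag tool installed with the R2 DFS Replication bits can count the
files actually queued to replicate between two members, and a health report
can be generated from the DFS Management console:

rem Count files waiting to replicate from SERVER1 to SERVER2 for one replicated folder
dfsrdiag backlog /rgname:"HomeDirs RG" /rfname:"HomeDirs" /sendingmember:SERVER1 /receivingmember:SERVER2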
--
Ned Pyle
Microsoft Enterprise Platforms Support
