The fastest and lowest-cost (in CPU and network terms) method is to prestage
the data. So you'd take a backup of just the home directory into a BKF file
with NTBACKUP, then restore that file to the new server.
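As a rough sketch (the drive letters, path, and job name below are just
placeholders for your environment), the backup half can be scripted from the
command line; the restore side of NTBACKUP has to be run through the GUI on
the new server:

    ntbackup backup "D:\Users\Home" /J "Prestage home dirs" /F "E:\Prestage\home.bkf" /V:yes

That writes a verified BKF of the home directory tree, which you can then
copy over and restore into the target replicated folder.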
I always recommend NTBACKUP or some other real backup tool as opposed to
zipping/RARing, because it's faster and safer: it preserves the security
descriptors and attributes on the files, which DFSR takes into account when
hashing them, so the prestaged copies actually match.
Once you prestage the data and set up the replication, the initial sync will
only need to use RDC to confirm that the files are unchanged and mark
everything correctly in the database. I would still recommend doing this
after hours, of course, if only to prevent conflicts with user data while
everything gets in sync.
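If you want to keep an eye on how the initial sync is progressing, something
along these lines should do it once the replication group is in place (the
group, folder, and server names here are made up for illustration):

    dfsrdiag backlog /rgname:"Home Directories" /rfname:"Home" /sendingmember:OLDSERVER /receivingmember:NEWSERVER

A backlog that drains quickly is a good sign that the prestaged files are
being recognized rather than copied again in full.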
Microsoft Enterprise Platforms Support
All postings on this newsgroup are provided "AS IS" with no warranties, and
confer no rights.
Post by David Goldsmith
If I were to set up DFS Replication for a user home directory folder, which
has roughly 70GB of data currently in it, would it be better to set it up on
the original folder and let it replicate the entire contents during initial
replication, or to set it up on new empty folders, then move the data into
the folders at a measured rate and let it replicate gradually? I'm just
concerned about overburdening the server when it's trying to stage that
amount of data during initial replication...any thoughts?