2006-07-19 22:04:03 UTC
I'm currently investigating using DFS-R in a 'Data Collection' scenario,
where our branch offices will replicate data back to our data center, where
the data can then be centrally backed up. After setting it up in our test
lab, though, I have some questions I cannot readily find answers to, and
I'm hoping someone in this newsgroup can point me in the right direction.
1) In a data-collection scenario, where a branch office replicates changed
data back to a central server (and for argument's sake let's say the
replication schedule only allows replication in the evening), is there any
way to determine which file(s) did not replicate during the replication
window? For example, locked files, open files, encrypted files, etc. Can
this report be run in any sort of automated fashion?
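For what it's worth, the closest I've come to automating this is scripting around `dfsrdiag backlog` after the replication window closes and parsing its output. A rough sketch of the parsing side, where the sample output format is an assumption based on what my test lab prints (verify against your own version of dfsrdiag):

```python
# Hedged sketch: automating a "what didn't replicate" check by parsing the
# output of "dfsrdiag backlog". The output format assumed below ("Backlog
# File Count: N" followed by a numbered file list) is what I believe the
# tool prints; adjust the regexes to match your version.
import re

def parse_backlog(output: str):
    """Return (file_count, file_names) from dfsrdiag backlog output."""
    count = 0
    m = re.search(r"Backlog File Count:\s*(\d+)", output)
    if m:
        count = int(m.group(1))
    # Backlogged files appear to be listed as "1. name", "2. name", ...
    names = re.findall(r"^\s*\d+\.\s+(.+?)\s*$", output, re.MULTILINE)
    return count, names

# Made-up sample output; a real run would be something like
#   dfsrdiag backlog /rgname:BranchRG /rfname:Data /smem:BRANCH01 /rmem:HUB01
# scheduled after the replication window (names here are hypothetical).
sample = """\
Member <HUB01> Backlog File Count: 2
Backlog File Names (first 100 files)
     1. budget.xls
     2. plans.doc
Operation Succeeded
"""
count, names = parse_backlog(sample)
print(count, names)
```

That only tells you what is still queued, not *why* a given file was skipped, so it answers the "which files" half of the question but not the "locked vs. open vs. encrypted" half.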
2) When using cross-site RDC, is there any way to view or verify that it is
actually working? For example, using the DFS Management MMC you can run a
report that shows how much data remote differential compression saved during
replication. I assume cross-site RDC benefits are included in that metric.
Can you separate them out, though?
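To be clear about the aggregate number I'm referring to: I believe it is just the bandwidth-savings figure you could also derive yourself from byte counts (the counter names I've seen, like "RDC Bytes Received" under the DFS Replicated Folders performance object, are from memory, so treat them as an assumption). Something like:

```python
# Hedged sketch: the aggregate "RDC savings" figure, computed from raw byte
# counts. In practice the inputs would come from DFSR performance counters
# (names assumed, not verified): total size of files received vs. bytes
# actually transferred thanks to RDC.
def rdc_savings(total_file_bytes: int, rdc_bytes_received: int) -> float:
    """Percent of bandwidth saved, i.e. bytes NOT sent thanks to RDC."""
    if total_file_bytes == 0:
        return 0.0
    return 100.0 * (total_file_bytes - rdc_bytes_received) / total_file_bytes

# Hypothetical numbers: a 100 MB file changed, but RDC shipped only 5 MB.
print(round(rdc_savings(100 * 2**20, 5 * 2**20), 1))  # → 95.0
```

The part I can't find is any per-hop breakdown that would let me attribute savings specifically to the cross-site stage.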
3) In the same scenario as question #1, if a user deletes a folder at the
branch office (and assuming it isn't available for recovery via the local
server's VSS), is there a way to force the main office's hub server to
replicate the folder back? My testing seems to indicate that unless I
restore a different version of the folder and its files, it will not
replicate down to the branch office. This is expected, though; otherwise you
could never delete a file at the branch office without it simply replicating
back from the hub server. But is there a way to override the default
behaviour, to in essence mark the existing folder on the hub server as
"authoritative", for lack of a better word, so DFS-R recognizes it as the
good copy and forces the branch server to replicate it back (or retrieve it
from the Conflict and Deleted items folder)?
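On the Conflict and Deleted side, the one thing I have managed is reading the ConflictAndDeletedManifest.xml under the replicated folder's DfsrPrivate directory to map the mangled file names back to their original paths. The element names below (<Resource>, <Path>, <NewName>) reflect what I saw in my test lab's manifest, so treat the structure as an assumption and check your own file:

```python
# Hedged sketch: listing what DFSR stashed in the ConflictAndDeleted folder
# by parsing ConflictAndDeletedManifest.xml. Element names are assumed from
# my test lab's manifest; verify against yours before relying on this.
import xml.etree.ElementTree as ET

def deleted_items(manifest_xml: str):
    """Map original path -> mangled name inside ConflictAndDeleted."""
    root = ET.fromstring(manifest_xml)
    return {r.findtext("Path"): r.findtext("NewName")
            for r in root.iter("Resource")}

# Made-up manifest fragment for illustration (paths are hypothetical):
sample = """<ConflictAndDeletedManifest>
  <Resource>
    <Path>C:\\Data\\Reports\\q2.doc</Path>
    <NewName>q2-{GUID}-v42.doc</NewName>
  </Resource>
</ConflictAndDeletedManifest>"""
print(deleted_items(sample))
```

That tells me what could be recovered manually, but it still doesn't give me the "mark the hub copy authoritative and push it back down" behaviour I'm after.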