Services that move a lot of data - tens of gigabytes at a time, I mean - need three things: off-peak scheduling, bandwidth limiting and error recovery. You need to be able to schedule file transfers for times when the network is not in high demand, or to fit the off-peak quota periods that some domestic plans offer. It's a kind of Quality of Service setting, I suppose. Bandwidth limiting means you can cap the rate at which data is transferred, either to stay under a quota or, again, to maintain the quality of other services. Error recovery is essential, too, since operations of that size are bound to run into problems sooner or later, and retrying manually is a very large burden. If you can't recover automatically from errors, everything is going to take much longer and be much more frustrating.
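Just to make the three requirements concrete, here's a minimal Python sketch of each one: an off-peak window check, a rate-limited copy loop, and a retry wrapper with exponential backoff. The function names, the window hours and the retry counts are all my own illustrative choices, not anything from a real transfer product.

```python
import io
import time

def in_off_peak(hour, start=23, end=7):
    # Off-peak scheduling: is this hour inside the quiet window?
    # The window may wrap past midnight (e.g. 11pm to 7am).
    if start <= end:
        return start <= hour < end
    return hour >= start or hour < end

def copy_limited(src, dst, rate_bps, chunk=64 * 1024):
    # Bandwidth limiting: copy src to dst, sleeping whenever we get
    # ahead of the allowed bytes-per-second rate. Returns bytes copied.
    started = time.monotonic()
    sent = 0
    while True:
        data = src.read(chunk)
        if not data:
            return sent
        dst.write(data)
        sent += len(data)
        expected = sent / rate_bps      # time this much data *should* take
        elapsed = time.monotonic() - started
        if expected > elapsed:
            time.sleep(expected - elapsed)

def with_retries(op, attempts=5, base_delay=1.0):
    # Error recovery: run op(), retrying with exponential backoff
    # instead of making a human babysit the transfer.
    for n in range(attempts):
        try:
            return op()
        except OSError:
            if n == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** n)
```

A real service would combine them: wait until `in_off_peak` is true, then run `copy_limited` wrapped in `with_retries`, resuming from the last byte written rather than starting over.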
Mokalus of Borg
PS - A lot of our problems at work would be solved if we had software that did this.
PPS - Guess I'd better get writing.