What's the current general opinion on using dump to back up active file
systems? Is this asking for trouble? What kinds of problems can be
expected when restoring dumps made from active file systems? The
machines in question would be running HP-UX 9 and 10, and SunOS 4.1.3.
I have until now used mainly tar and cpio to do my backups, but have
been considering switching to Amanda, and Amanda really needs dump's
feature of a quick estimate of the number of blocks the dump will
need. Emulating dump with tar or cpio means getting the size estimate
(GNU tar's --totals) is very slow.
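For reference, this is roughly what the slow estimate looks like today: GNU tar reads every file just to count bytes, even though the archive itself is thrown away (the /home path here is only an example):

```shell
# Estimate archive size by writing to /dev/null; --totals reports
# "Total bytes written: N" on stderr once the run finishes.
# This reads the full contents of every file, so it takes about as
# long as a real backup.
tar -cf /dev/null --totals /home
```

The time cost is essentially a full pass over the data, which is why dump's block-count estimate (computed from inode metadata, not file contents) is so much cheaper.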
Perhaps a good idea would be to implement a --no-output switch for GNU
tar that would cause it not to actually open any files or write any
archive, just calculate the size the archive would have, and report it
as with --totals. Or maybe GNU tar could automatically check whether
the archive is actually being written to /dev/null, and in that case
skip opening and reading the files.
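Until something like that exists, the hypothetical switch can be approximated from metadata alone: sum each file's size rounded up to tar's 512-byte block granularity, plus one 512-byte header per file. This is only a sketch (it assumes GNU find's -printf, and ignores long names, sparse files, directory entries, and end-of-archive padding), but it avoids reading any file contents:

```shell
# Rough tar-size estimate from file metadata only: 512-byte header
# per file plus data rounded up to a 512-byte boundary.
find /home -type f -printf '%s\n' |
  awk '{ total += 512 + int(($1 + 511) / 512) * 512 }
       END { print total }'
```

Since it stats files instead of reading them, this runs at roughly the speed of du, which is the kind of quick estimate Amanda wants.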