I'm trying to determine the parameters to use for the dump command. We use a new
HP 2000DC DAT unit on our network.
I've never had any problems determining the parameters before, because they
were stated in the manuals. This time, they are not.
When I calculate the density to use for the HP 2000DC drive (roughly 8 GB
stated in the manual for a 90 metre tape), I do something like:
90 m = approx. 3543 inches
8 GB = 8 * 1024 * 1024 * 1024 * 8 bits
bits / inches available = BPI
But when I look at the parameters Sun proposes for its 2.3 GB Exabyte tape
unit (6000 feet -- 54000 BPI), it seems this method does not work.
So, how can I calculate the right numbers to use for the dump command? Or do
I just use trial and error?
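For what it's worth, here is a sketch of the arithmetic I'd expect to apply,
assuming (and this is only my assumption, not anything from the manuals) that
dump treats the density value as bytes per inch (on nine-track tape one frame
is one byte, so BPI and bytes/inch coincide) and the size value as feet, and
that it only uses the two to estimate when to prompt for the next tape:

```python
# Sketch of how dump(8) seems to budget tape capacity.
# Assumption: density d is in bytes per inch, size s is in feet,
# so assumed capacity is roughly d * s * 12 bytes (record gaps ignored).

def capacity_bytes(density_bpi, size_feet):
    """Capacity dump would assume from d and s, ignoring record gaps."""
    return density_bpi * size_feet * 12

def density_for(capacity, size_feet):
    """Back-solve a density value for a drive of known capacity."""
    return capacity // (size_feet * 12)

# Sun's published Exabyte parameters: 54000 BPI, 6000 ft, for a 2.3 GB drive.
print(capacity_bytes(54000, 6000))   # 3888000000, i.e. ~3.9 GB, > 2.3 GB

# Back-solving for a hypothetical 2 GB cartridge, 90 m = approx. 295 ft:
print(density_for(2 * 10**9, 295))   # 564971
```

Note that under this model Sun's numbers imply more capacity than the drive
actually has, which matches my observation above that the arithmetic does not
line up; my guess is that overstating is harmless because the drive signals
end-of-tape anyway, and the parameters only affect when dump asks for a new
reel.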
--
(040-7)35983 | nlevdpsb.c849095i