algorithm for maximal fit (cd-r backups) needed


Post by Sam Hold » Mon, 20 Sep 1999 04:00:00





Quote:>looking for ideas in an optimal solution toward solving a problem in
>cd-r backups (using linux, initially at least; maybe more o/s' after
>the general algorithm is worked out and implemented).

>the idea is to take a large hard disk (or partition) and break it up
>into directories and/or files and having the collection of these
>objects span several cd-r's when making a backup.  and minimizing
>wasted trailing space on each cd-r in the saveset.

>initially I'm thinking of taking a dir listing and sorting it by file
>size (largest -> smallest).  then figuring out how many cd-r's it
>would take to cover that saveset (rough guess).  perhaps next I'll
>round-robin save the largest files going from cd-r to cd-r (idea being
>that once you've placed the largest files on each of the cd-r's, the
>smaller ones will be easier to figure out where to place, in the
>spaces on the discs that are unclaimed).

>in the end, I'd like to have the smallest # of cd-r's with the least
>amount of empty (wasted) space on each.

This is _really_ _really_ _really_ hard. If you manage to come up with
a solution then make sure you publish it.

You are trying to solve the 'Minimum Bin Packing' problem.

Here it is more formally:

Given a finite set U of items (files on the disk in this case), a size for
each (size of the file in this case), and a positive integer bin capacity B
(capacity of the cd-r media in this case), give the minimum number of
bins required to pack all of U.

The very bad news is that this problem is NP-hard, and thus not solvable in
reasonable time with any known algorithm. Approximations exist; I don't
know them off the top of my head, but here are some references:

New worst-case results for the bin-packing problem
        Simchi-Levi, D. (1994),
        Naval Res. Logistics 41, 579-585.

A simple proof of the inequality 'MFFD(L) <= 71/60 OPT(L) + 78/71, all L' for
                the MFFD bin-pack algorithm
        Yue, M. (1991),
        Technical Report RRR # 20-91, Rutcor, Rutgers Center for Operations
                Research, Rutgers University, New Jersey.

A 71/60 theorem for bin-packing
        Johnson, D. S., and Garey, M. R. (1985),
        J. Complexity 1, 65-106.

Algorithm textbooks should have simple algorithms that make sure
you get a reasonable packing. Your proposed solution is one of those, and
it is reasonable - especially since cd-r media is cheap anyway.
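
For concreteness, here is a rough first-fit-decreasing sketch in Python
(mine, not from any of the posts; the 650MB capacity and the (name, size)
representation are just assumptions):

CAPACITY = 650 * 1024 * 1024  # assumed CD-R capacity in bytes

def first_fit_decreasing(files, capacity=CAPACITY):
    """files: list of (name, size) pairs.
    Returns a list of discs, each a list of (name, size)."""
    discs = []   # contents of each disc
    free = []    # remaining space on each disc
    for name, size in sorted(files, key=lambda f: f[1], reverse=True):
        for i, room in enumerate(free):
            if size <= room:
                discs[i].append((name, size))
                free[i] -= size
                break
        else:
            # no existing disc has room; start a new one
            discs.append([(name, size)])
            free.append(capacity - size)
    return discs

Run over a typical filesystem this tends to come within a disc or so of
optimal, which is usually good enough given the cost of the media.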

Quote:

>for files that are larger than a single cd-r, they will obviously have
>to be split.  that's another problem - what size and how many chunks
>to split them into?  first idea: take the dumb approach and divide
>them by the media size so that you fill each cd-r with as much of the
>file as possible then span to the next one, etc.  but there are
>probably better ideas that would allow better optimization since I
>think the larger the chunks to place, the harder it will be to fill
>all the gaps and leave the least amount of wasted space (just guessing
>- maybe I'm wrong).

>I'm not so much concerned with restore or backup speed, just
>minimizing wasted space so that archiving will take the minimal amount
>of physical discs.

If you allow files to be split into pieces then it is trivial to
get optimal packing. You just add the files one by one, and when you
run out of room, you add the rest of the file onto the next disk.

This doesn't affect back-up too much. It makes restoring a bit more tedious if
the file you wish to restore happens to be split across media.
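
A minimal sketch of that sequential spanning idea (again mine, in Python;
sizes in bytes, capacity assumed):

CAPACITY = 650 * 1024 * 1024  # assumed capacity in bytes

def pack_with_spanning(files, capacity=CAPACITY):
    """files: list of (name, size).
    Returns a list of discs, each a list of (name, offset, length) chunks."""
    discs, current, free = [], [], capacity
    for name, size in files:
        offset = 0
        while size > 0:
            chunk = min(size, free)
            current.append((name, offset, chunk))
            offset += chunk
            size -= chunk
            free -= chunk
            if free == 0:           # disc is full, start the next one
                discs.append(current)
                current, free = [], capacity
    if current:
        discs.append(current)
    return discs

Every disc except the last comes out completely full, which is why the
packing is trivially optimal once splitting is allowed.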

Quote:

>so, any ideas? ;-) I've not seen this done before (for freeware, at
>least) and I'd like to make an opensource linux cd-r backup util based
>on this idea.  and since linux can read dos/win partitions, its
>conceivable that a bootable linux ramdisk image could be used to
>backup/restore dos/win files..

Backing up win files is fraught with danger due to the stupidity of files
with two names, where the shorter names are stored in system registries on
occasion.

Quote:

>any ideas on this subject?

Is just writing a tar file to the CD reasonable - with no file-system? I don't
know if that would work, but it should give you the maximum amount of space
on the cd-r. Plus you then have a tar file to restore from, so existing
software should grok it.

Splitting the files amongst the cd-r's is still a problem that has to be solved.
But using the largest-fit heuristic you mentioned above should
give reasonable results, since there are (usually) lots of small files to pad
with.

--
Sam

Simple rule: include files should never include include files.
        --Rob Pike

 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by Martin Payn » Mon, 20 Sep 1999 04:00:00





>>looking for ideas in an optimal solution toward solving a problem in
>>cd-r backups (using linux, initially at least; maybe more o/s' after
>>the general algorithm is worked out and implemented).

>>the idea is to take a large hard disk (or partition) and break it up
>>into directories and/or files and having the collection of these
>>objects span several cd-r's when making a backup.  and minimizing
>>wasted trailing space on each cd-r in the saveset.

>>initially I'm thinking of taking a dir listing and sorting it by file
>>size (largest -> smallest).  then figuring out how many cd-r's it
>>would take to cover that saveset (rough guess).  perhaps next I'll
>>round-robin save the largest files going from cd-r to cd-r (idea being
>>that once you've placed the largest files on each of the cd-r's, the
>>smaller ones will be easier to figure out where to place, in the
>>spaces on the discs that are unclaimed).

>>in the end, I'd like to have the smallest # of cd-r's with the least
>>amount of empty (wasted) space on each.

>This is _really_ _really_ _really_ hard. If you manage to come up with
>a solution then make sure you publish it.

>You are trying to solve the 'Minimum Bin Packing' problem.

>Here it is more formally :

>Given a finite set U of items (files on the disk in this case), a size  for
>each (size of the file in this case), and a positive integer bin capacity B
>(capacity of the cd-r media in this case), give the minimum number of
>bins required to pack all of U.

>The very bad news is that this problem is NP-hard, and thus not solvable in
>reasonable time with any known algorithm. Approximations exist; I don't
>know them off the top of my head, but here are some references:

>New worst-case results for the bin-packing problem
> Simchi-Levi, D. (1994),
> Naval Res. Logistics 41, 579-585.

>A simple proof of the inequality 'MFFD(L) <= 71/60 OPT(L) + 78/71, all L' for
> the MFFD bin-pack algorithm
> Yue, M. (1991),
> Technical Report RRR # 20-91, Rutcor, Rutgers Center for Operations
> Research, Rutgers University, New Jersey.

>A 71/60 theorem for bin-packing
> Johnson, D. S., and Garey, M. R. (1985),
> J. Complexity 1, 65-106.

>Algorithm textbooks should have simple algorithms that make sure
>you get a reasonable packing. Your proposed solution is one of those, and
>it is reasonable - especially since cd-r media is cheap anyway.

There is one simplification for your problem - given that you know the total
size of the backup set (say 1,000MB) and that you want it to fit on as few
CDs as possible, then you can calculate the minimum number of CDs required
(i.e. two CDs in this case).

Your algorithm then only has to search for a solution to 'how do I divide
these files among two CDs', and can stop as soon as it has a solution.

Unfortunately the space calculation will itself be very complicated - what
is the minimum size that a file can be on your CD-Rs? Does a 10 byte file
consume a 2K block on the CD-R? What is the overhead per file of the
directory?
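
As a rough illustration of what such an estimate might look like (my own
sketch, with an assumed 2048-byte sector size and an assumed average
directory-record cost; real ISO-9660 overhead depends on name lengths,
extensions, path tables and so on):

SECTOR = 2048                   # assumed logical block size
DIR_ENTRY = 64                  # assumed average directory-record size
CAPACITY = 650 * 1024 * 1024    # assumed disc capacity

def on_disc_size(file_size):
    # a 10 byte file still occupies a whole 2K sector
    return ((file_size + SECTOR - 1) // SECTOR) * SECTOR

def minimum_discs(file_sizes):
    data = sum(on_disc_size(s) for s in file_sizes)
    # crude allowance for directory blocks
    dir_bytes = ((len(file_sizes) * DIR_ENTRY + SECTOR - 1) // SECTOR) * SECTOR
    total = data + dir_bytes
    return (total + CAPACITY - 1) // CAPACITY   # e.g. ~1,000MB -> 2 discs

Treat the result as a lower bound to aim for, not an exact figure.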

Don't forget that when you write one file into a directory the packet
software has to consume a whole directory block (2K) to store the pointer to
that one file. If you are writing files in descending size order then when
you later write a second file into that directory it will have to write
another whole directory block to point to it. When you write several files
into one directory one after another I presume the packet software must
cache the directory and only write the directory block when it becomes full.
If you are splitting the contents of a directory between discs you will
certainly need to re-sort the files to be written on each disc so that all
files for one directory are written together.

These issues could make it next to impossible to exactly specify how much
space will be consumed on one CD.

Generally, all the backup programs that I have come across use a special
format on the backup medium (even if it's just something like Zip or Tar
which I think is a Unix equivalent?). They process the files in directory
order, and automatically span files as necessary onto the next disc/tape as
required. This isn't really appropriate if you're trying to create a
natively readable directory structure on your CD-Rs, though.

From a usability perspective, I think you need to put more consideration
into the restore mechanism.

If you have to restore one whole directory from your backups, you really
want to be able to do this from the minimum number of discs. This implies to
me that you need to attempt to keep each directory tree together on as few
disks as possible.


Quote:

>>for files that are larger than a single cd-r, they will obviously have
>>to be split.  that's another problem - what size and how many chunks
>>to split them into?  first idea: take the dumb approach and divide
>>them by the media size so that you fill each cd-r with as much of the
>>file as possible then span to the next one, etc.  but there are
>>probably better ideas that would allow better optimization since I
>>think the larger the chunks to place, the harder it will be to fill
>>all the gaps and leave the least amount of wasted space (just guessing
>>- maybe I'm wrong).

>>I'm not so much concerned with restore or backup speed, just
>>minimizing wasted space so that archiving will take the minimal amount
>>of physical discs.

>If you allow files to be split into pieces then it is trivial to
>get optimal packing. You just add the files one by one, and when you
>run out of room, you add the rest of the file onto the next disk.

>This doesn't affect back-up too much. It makes restoring a bit more tedious if
>the file you wish to restore happens to be split across media.

>>so, any ideas? ;-) I've not seen this done before (for freeware, at
>>least) and I'd like to make an opensource linux cd-r backup util based
>>on this idea.  and since linux can read dos/win partitions, its
>>conceivable that a bootable linux ramdisk image could be used to
>>backup/restore dos/win files..

>Backing up win files is fraught with danger due to the stupidity of files
>with two names, where the shorter names are stored in system registries on
>occasion.

>>any ideas on this subject?

>Is just writing a tar file to the CD reasonable - with no file-system? I don't
>know if that would work, but it should give you the maximum amount of space
>on the cd-r. Plus you then have a tar file to restore from, so existing
>software should grok it.

>Splitting the files amongst the cd-r's is still a problem that has to be solved.
>But using the largest-fit heuristic you mentioned above should
>give reasonable results, since there are (usually) lots of small files to pad
>with.

>--
>Sam

>Simple rule: include files should never include include files.
> --Rob Pike


 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by Andy McFadd » Mon, 20 Sep 1999 04:00:00




>>You are trying to solve the 'Minimum Bin Packing' problem.

Yes, but an *ideal* solution isn't necessary, just a good one.

Quote:>There is one simplification for your problem - given that you know the total
>size of the backup set (say 1,000MB) and that you want it to fit on as few
>CDs as possible, then you can calculate the minimum number of CDs required
>(i.e. two CDs in this case).

>Your algorithm then only has to search for a solution to 'how do I divide
>these files among two CDs', and can stop as soon as it has a solution.

I'm not sure I'd take that approach.  I'd rather fill up one disc to the
maximum possible extent, then move on to the next.

Quote:>Unfortunately the space calculation will itself be very complicated - what
>is the minimum size that a file can be on your CD-Rs? Does a 10 byte file
>consume a 2K block on the CD-R? What is the overhead per file of the
>directory?

Yes, indeed.  A close familiarity with ISO-9660 will be essential.  Of
course, the backup program will probably be generating the ISO-9660 image
anyway, so it's not redundant work.

Quote:>These issues could make it next to impossible to exactly specify how much
>space will be consumed on one CD.

I disagree.  You pick a file that looks like it'll fit, and add it.  If
you go over the amount of space available, you revert your change, and
try again with a smaller file.  If there are no smaller files, you
pronounce that disc complete, and move on.
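
In other words, something like the following loop (a sketch, not Andy's
code; image_size() stands in for whatever computes the exact size of the
ISO-9660 image for a given set of files, e.g. a dry run of the image
builder, and is assumed rather than real):

def fill_disc(candidates, image_size, capacity):
    """candidates: filenames sorted largest first.
    Returns (chosen, remaining)."""
    chosen, remaining = [], []
    for path in candidates:
        chosen.append(path)
        if image_size(chosen) > capacity:
            chosen.pop()             # revert: this file pushed us over
            remaining.append(path)   # try it again on a later disc
    return chosen, remaining

When the loop ends, the disc is "pronounced complete" and the remaining
files go through the same loop for the next disc.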

Quote:>If you have to restore one whole directory from your backups, you really
>want to be able to do this from the minimum number of discs. This implies to
>me that you need to attempt to keep each directory tree together on as few
>disks as possible.

A good point.  "Best effort" may be good enough here.  You back up things
more often than you restore them, but if that's not the case for the
expected use of this, then you have to optimize differently.

Anyway.  Assuming locality is not vital, you can probably get away by
starting out with grabbing the largest file, and working on down.  It
won't be perfect, but life is like that. :-)

The problem of dealing with files >= 650MB remains, of course.  Some
transparent file compression won't hurt, but it will make the space
calculation prohibitively slow (because you have to compress it before you
know how big it is).  You're also going to need lots of free space to do
the backup, if you're going to create the entire ~650MB disc image up
front and then write it.

Allowing the last file on the disc to span onto the next one is a MUCH
easier approach to this...

--

CD-Recordable FAQ - http://www.fadden.com/cdrfaq/ (a/k/a www.spies.com/~fadden)
Fight Internet Spam - http://spam.abuse.net/spam/ & news.admin.net-abuse.email

 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by Tim Kroese » Mon, 20 Sep 1999 04:00:00


As I recall there used to be a great program for doing exactly what you want
with floppy discs years ago.   Never split files up either, but as I recall
you had to have all the files in the same directory.    I used it to archive
downloaded zip files.   Sorry but I don't remember the name though.

Tim K




> posted:
> >On Sun, 19 Sep 1999 08:52:48 GMT, Bryan



> >>looking for ideas in an optimal solution toward solving a problem in
> >>cd-r backups (using linux, initially at least; maybe more o/s' after
> >>the general algorithm is worked out and implemented).

> >>the idea is to take a large hard disk (or partition) and break it up
> >>into directories and/or files and having the collection of these
> >>objects span several cd-r's when making a backup.  and minimizing
> >>wasted trailing space on each cd-r in the saveset.

> >>initially I'm thinking of taking a dir listing and sorting it by file
> >>size (largest -> smallest).  then figuring out how many cd-r's it
> >>would take to cover that saveset (rough guess).  perhaps next I'll
> >>round-robin save the largest files going from cd-r to cd-r (idea being
> >>that once you've placed the largest files on each of the cd-r's, the
> >>smaller ones will be easier to figure out where to place, in the
> >>spaces on the discs that are unclaimed).

> >>in the end, I'd like to have the smallest # of cd-r's with the least
> >>amount of empty (wasted) space on each.

> >This is _really_ _really_ _really_ hard. If you manage to come up with
> >a solution then make sure you publish it.

> >You are trying to solve the 'Minimum Bin Packing' problem.

> It's fair enough to say that solving to provable optimality is, um,
> impractical.

> But it shouldn't be that hard, for realistic cases, to find
> *reasonable* approximate answers.

> After all, there is ample literature out there on approximations of
> the Travelling Salesman Problem...

> It *is* fair to say that there are reasonable ways of doing
> approximate packing; I'd start with something like the following:

> --> Sort files into a list ordered by size.

> --> Greedily fill a bin with the biggest files that fit.

> --> When it fills, if it is less than {epsilon} full, record the
>     current state, forbid the last file from being dropped in, step
>     back, and add a different file and move towards optimality.

> This is essentially the way tabu search works; a decent variation
> would involve stepping back *several* steps for the given "bin."

> If we were talking about trying to allocate space optimally for a
> large number of files that average on the order of 100MB apiece, it
> would indeed be difficult to get a "high quality" answer.  But with
> the typical filesystem that has swarms of fairly small files,
> approximation should be quite good enough to get decent results.

> It's certainly likely that the last CD will not be completely full;
> with the decreasing cost of CDs, I'm not inclined to worry even about
> using an extra CD or two.  That might cost a couple of bucks for CD-Rs
> that can only be used once.  With CD-RW's the cost isn't *that* much
> more, and they're reusable a generation or three down the road...
> --
> ((LAMBDA (X) (X X)) (LAMBDA (X) (X X)))


 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by Christopher B. Brow » Tue, 21 Sep 1999 04:00:00



posted:



>>looking for ideas in an optimal solution toward solving a problem in
>>cd-r backups (using linux, initially at least; maybe more o/s' after
>>the general algorithm is worked out and implemented).

>>the idea is to take a large hard disk (or partition) and break it up
>>into directories and/or files and having the collection of these
>>objects span several cd-r's when making a backup.  and minimizing
>>wasted trailing space on each cd-r in the saveset.

>>initially I'm thinking of taking a dir listing and sorting it by file
>>size (largest -> smallest).  then figuring out how many cd-r's it
>>would take to cover that saveset (rough guess).  perhaps next I'll
>>round-robin save the largest files going from cd-r to cd-r (idea being
>>that once you've placed the largest files on each of the cd-r's, the
>>smaller ones will be easier to figure out where to place, in the
>>spaces on the discs that are unclaimed).

>>in the end, I'd like to have the smallest # of cd-r's with the least
>>amount of empty (wasted) space on each.

>This is _really_ _really_ _really_ hard. If you manage to come up with
>a solution then make sure you publish it.

>You are trying to solve the 'Minimum Bin Packing' problem.

It's fair enough to say that solving to provable optimality is, um,
impractical.

But it shouldn't be that hard, for realistic cases, to find
*reasonable* approximate answers.

After all, there is ample literature out there on approximations of
the Travelling Salesman Problem...

It *is* fair to say that there are reasonable ways of doing
approximate packing; I'd start with something like the following:

--> Sort files into a list ordered by size.

--> Greedily fill a bin with the biggest files that fit.

--> When it fills, if it is less than {epsilon} full, record the
    current state, forbid the last file from being dropped in, step
    back, and add a different file and move towards optimality.

This is essentially the way tabu search works; a decent variation
would involve stepping back *several* steps for the given "bin."
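
A small Python sketch of that loop (mine, not part of the original post;
the epsilon threshold and retry limit are assumed tuning knobs):

def fill_disc_tabu(files, capacity, epsilon=0.98, max_retries=20):
    """files: list of (name, size) sorted largest first.
    Returns the best (chosen, used) packing found for one disc."""
    tabu = set()       # files forbidden from being dropped in
    best = ([], 0)
    for _ in range(max_retries):
        chosen, used = [], 0
        for name, size in files:
            if name in tabu:
                continue
            if used + size <= capacity:
                chosen.append((name, size))
                used += size
        if used > best[1]:
            best = (chosen, used)
        if used >= epsilon * capacity or not chosen:
            break
        tabu.add(chosen[-1][0])   # forbid the last file and try again
    return best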

If we were talking about trying to allocate space optimally for a
large number of files that average on the order of 100MB apiece, it
would indeed be difficult to get a "high quality" answer.  But with
the typical filesystem that has swarms of fairly small files,
approximation should be quite good enough to get decent results.

It's certainly likely that the last CD will not be completely full;
with the decreasing cost of CDs, I'm not inclined to worry even about
using an extra CD or two.  That might cost a couple of bucks for CD-Rs
that can only be used once.  With CD-RW's the cost isn't *that* much
more, and they're reusable a generation or three down the road...
--
((LAMBDA (X) (X X)) (LAMBDA (X) (X X)))

 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by Ken Withero » Tue, 21 Sep 1999 04:00:00



> As I recall there used to be a great program for doing exactly what you want
> with floppy discs years ago.   Never split files up either, but as I recall
> you had to have all the files in the same directory.    I used it to archive
> downloaded zip files.   Sorry but I don't remember the name though.

I too used to use one of these programs, in this case FFIT. It doesn't
come with source (only a DOS binary) but I've uploaded it to
http://www.frontiernet.net/~phantoml/ffit162.zip for anyone who wants it.
 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by Tim Kroese » Tue, 21 Sep 1999 04:00:00


That's IT!

Worked great, and I bet the math is in there somewhere...

Tim K



> > As I recall there used to be a great program for doing exactly what you want
> > with floppy discs years ago.   Never split files up either, but as I recall
> > you had to have all the files in the same directory.    I used it to archive
> > downloaded zip files.   Sorry but I don't remember the name though.

> I too used to use one of these programs, in this case FFIT. It doesn't
> come with source (only a DOS binary) but I've uploaded it to
> http://www.frontiernet.net/~phantoml/ffit162.zip for anyone who wants it.

 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by Rock » Wed, 22 Sep 1999 04:00:00


Quote:> Allowing the last file on the disc to span onto the next one is a MUCH
> easier approach to this...

If you could manage this, why bother with a maximal fit calculation at all?
Get a gross estimate of space, divide by disk capacity and use that many
CDs, filling directories in order (so each is on one disk), and just span
disks when needed.
 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by Martin Payn » Thu, 23 Sep 1999 04:00:00





>>>You are trying to solve the 'Minimum Bin Packing' problem.

>Yes, but an *ideal* solution isn't necessary, just a good one.

>>There is one simplification for your problem - given that you know the total
>>size of the backup set (say 1,000MB) and that you want it to fit on as few
>>CDs as possible, then you can calculate the minimum number of CDs required
>>(i.e. two CDs in this case).

>>Your algorithm then only has to search for a solution to 'how do I divide
>>these files among two CDs', and can stop as soon as it has a solution.

>I'm not sure I'd take that approach.  I'd rather fill up one disc to the
>maximum possible extent, then move on to the next.

I was not implying that you need a perfect solution. My meaning was that you
could accept an imperfect solution as long as it didn't consume more CDs
than required - if a very simplistic algorithm splits 1,000MB of files
across three CDs then a more sophisticated algorithm is required.

If there is a way to divide the major directories among the discs and still
use the minimum number of discs then that seems the ideal solution (i.e.
calculate the size of each directory tree, and perform your packing
algorithm over these sizes first).
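
A sketch of that directory-first idea (mine; assumes the sizes come from
walking the tree with os.walk, and that each top-level subtree is treated
as a single item for the packing pass):

import os
from collections import defaultdict

def tree_sizes(root):
    """Return {top-level subdirectory: total bytes} for everything under root."""
    sizes = defaultdict(int)
    for dirpath, _, filenames in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        top = '.' if rel == '.' else rel.split(os.sep)[0]
        for name in filenames:
            try:
                sizes[top] += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass   # skip unreadable files
    return dict(sizes)

The resulting sizes can then be fed to whatever bin-packing heuristic is
used for individual files, splitting a tree into its files only when it
is larger than one disc.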


Quote:

>>Unfortunately the space calculation will itself be very complicated - what
>>is the minimum size that a file can be on your CD-Rs? Does a 10 byte file
>>consume a 2K block on the CD-R? What is the overhead per file of the
>>directory?

>Yes, indeed.  A close familiarity with ISO-9660 will be essential.  Of
>course, the backup program will probably be generating the ISO-9660 image
>anyway, so it's not redundant work.

>>These issues could make it next to impossible to exactly specify how much
>>space will be consumed on one CD.

>I disagree.  You pick a file that looks like it'll fit, and add it.  If
>you go over the amount of space available, you revert your change, and
>try again with a smaller file.  If there are no smaller files, you
>pronounce that disc complete, and move on.

OK, you're discussing an algorithmic point. I had presumed that you would be
using some generic packet writing software and writing data to the disk via
a drive letter or some such equivalent. Given that packet software could be
quite inefficient in the way it uses space and that you are at the mercy of
the particular implementation you wouldn't have enough information to
perform the calculation.

If you're prepared to get your hands dirty and write in ISO-9660 format
yourself then all of the information required to accurately calculate space
should be available to you.


>>If you have to restore one whole directory from your backups, you really
>>want to be able to do this from the minimum number of discs. This implies to
>>me that you need to attempt to keep each directory tree together on as few
>>disks as possible.

>A good point.  "Best effort" may be good enough here.  You back up things
>more often than you restore them, but if that's not the case for the
>expected use of this, then you have to optimize differently.

>Anyway.  Assuming locality is not vital, you can probably get away by
>starting out with grabbing the largest file, and working on down.  It
>won't be perfect, but life is like that. :-)

>The problem of dealing with files >= 650MB remains, of course.  Some
>transparent file compression won't hurt, but it will make the space
>calculation prohibitively slow (because you have to compress it before you
>know how big it is).  You're also going to need lots of free space to do
>the backup, if you're going to create the entire ~650MB disc image up
>front and then write it.

>Allowing the last file on the disc to span onto the next one is a MUCH
>easier approach to this...

>--

>CD-Recordable FAQ - http://www.fadden.com/cdrfaq/ (a/k/a www.spies.com/~fadden)
>Fight Internet Spam - http://spam.abuse.net/spam/ & news.admin.net-abuse.email
 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by Andy McFadd » Fri, 24 Sep 1999 04:00:00




>OK, you're discussing an algorithmic point. I had presumed that you would be
>using some generic packet writing software and writing data to the disk via
>a drive letter or some such equivalent. Given that packet software could be
>quite inefficient in the way it uses space and that you are at the mercy of
>the particular implementation you wouldn't have enough information to
>perform the calculation.

>If you're prepared to get your hands dirty and write in ISO-9660 format
>yourself then all of the information required to accurately calculate space
>should be available to you.

I believe he's writing it for Linux, so packet writing isn't yet an
option.

--

CD-Recordable FAQ - http://www.fadden.com/cdrfaq/ (a/k/a www.spies.com/~fadden)
Fight Internet Spam - http://spam.abuse.net/spam/ & news.admin.net-abuse.email

 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by bill davids » Sat, 25 Sep 1999 04:00:00



| looking for ideas in an optimal solution toward solving a problem in
| cd-r backups (using linux, initially at least; maybe more o/s' after
| the general algorithm is worked out and implemented).
|
| the idea is to take a large hard disk (or partition) and break it up
| into directories and/or files and having the collection of these
| objects span several cd-r's when making a backup.  and minimizing
| wasted trailing space on each cd-r in the saveset.

Sure, easy. I have a program which does just that. For a typical hard
disk it would take perhaps 14-16 years of CPU time to find an optimal
answer.

If you're impatient I have a little AWK program I wrote when I was
backing up a unix-pc to floppies. It won't be optimal, but it is very
good and quite fast.

Somewhere I also have a little program which is still usefully fast and
does a slightly better job than the AWK program. I'm not offering to go
through old tapes from 1994 and find it, but there is such a beast.

Finally, unless you want only to load everything from all CDs, you
probably want to make some effort at keeping all the files in a
directory together as much as possible without using more than the
minimum number of storage media. That makes partial restores easier. I
have some thoughts on that, but I've never coded it.

--

  When taking small children to a carnival, always have them go potty
*before* you let them go on the rides, and let them eat all the junk
food and candy *after*.

 
 
 

algorithm for maximal fit (cd-r backups) needed

Post by bill davids » Sat, 25 Sep 1999 04:00:00



| I've actually solved puzzles and won prizes applying good ol' Greedy,
| leaving people to wonder, ``How'd he do that?''  So I'm biased.  Yet it
| is so easy:  work from placing the largest item to the smallest.

Sure, but you don't need a lot of CPU time to do better, defined as
using one less backup medium: take the first output of greedy and apply
a bit of trial and error with tree pruning, with a limit on how close
you want to come before you stop trying.
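
For what it's worth, a sketch of that kind of bounded improvement pass
(mine, not Bill's program): take the greedy packing and try to empty the
last disc by backtracking its files into the slack on the other discs,
giving up after a fixed number of placement attempts.

def try_drop_last_disc(discs, capacity, budget=100000):
    """discs: list of lists of (name, size).
    Returns an improved packing, or the original if none found in time."""
    if len(discs) < 2:
        return discs
    *kept, last = [list(d) for d in discs]
    slack = [capacity - sum(s for _, s in d) for d in kept]
    attempts = [0]

    def place(items):
        if not items:
            return True
        if attempts[0] >= budget:
            return False
        name, size = items[0]
        for i in range(len(kept)):
            if size <= slack[i]:
                attempts[0] += 1
                kept[i].append((name, size))
                slack[i] -= size
                if place(items[1:]):
                    return True
                kept[i].pop()        # backtrack and prune this branch
                slack[i] += size
        return False

    if place(sorted(last, key=lambda f: f[1], reverse=True)):
        return kept
    return discs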

--

  When taking small children to a carnival, always have them go potty
*before* you let them go on the rides, and let them eat all the junk
food and candy *after*.

 
 
 
