Quote:> > > The ideas that keep popping up are about integrating package
> > > management at a deeper level of the OS. I see the problem as one
> > > of fan-out, in which files from a package are spread out across
> > > the system. One hard way to manage things would be to keep files
> > > in the same places they are now and track them in a database. An
> > > easier way might be to reorganize the filesystem, and keep files
> > > for a package together and access them in place.
Oh, right, the MS-DOS solution: make a directory, C:\PACKAGENAME, and
put everything in it. Then edit AUTOEXEC.BAT to add the directory to
your path, the end result of which is that $PATH has six thousand
directories in it, half of which are for packages you no longer have
around. (I'm exaggerating the problem, of course, but this is
definitely one of the (many) things I don't like about the DOS world.)
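That kind of $PATH rot is at least mechanically fixable. Here's a sketch of pruning dead and duplicate entries -- nothing here is a real tool, and the sample path fed to it at the end is made up:

```shell
#!/bin/sh
# Rebuild a PATH-style list, dropping directories that no longer
# exist and dropping duplicates. Purely illustrative.
prune_path() {
    old_IFS=$IFS; IFS=:
    new=
    for dir in $1; do
        case ":$new:" in
            *":$dir:"*) ;;                         # already seen: skip
            *) [ -d "$dir" ] && new=${new:+$new:}$dir ;;
        esac
    done
    IFS=$old_IFS
    printf '%s\n' "$new"
}

prune_path "/usr/bin:/nonexistent:/usr/bin:/bin"   # prints /usr/bin:/bin
```

Of course, this only cleans up the mess after the fact; it does nothing about the underlying one-directory-per-package sprawl.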
$PATH management is one problem, and it triples once you bring header
files and manpages into it: now your include paths and $MANPATH need
the same care. Another problem is the functional difference between
different files. For instance: /etc traditionally stores config files.
/var stores files with relatively volatile contents. You might want to
mount your root filesystem read-only -- or even share it across several
machines -- and have separate read-write partitions for /etc, /var and
/tmp. If each package has a directory tree in which it keeps its own
config files, you pretty much can't share those trees around via NFS.
You also have to dump everything to tape, not just /etc, in order to
cover what can't be trivially reinstalled (i.e. your config files).
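The read-only-root arrangement above might look like this in /etc/fstab -- device names and the NFS server are invented for illustration:

```
# illustrative /etc/fstab; devices and server name are made up
/dev/hda1        /     ext2  ro  1 1   # root stays read-only
/dev/hda2        /var  ext2  rw  1 2   # volatile files on their own r/w partition
/dev/hda3        /tmp  ext2  rw  1 2
fileserver:/usr  /usr  nfs   ro  0 0   # the shared, trivially-reinstallable part
```

(/etc itself is the awkward one to split out, since fstab has to be readable before any extra partitions are mounted.)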
Quote:> > > I'm intrigued by the second solution, in which libraries would be
> > > kept with their h-files and man pages. I think this would make
> > > it easier to keep multiple revisions of libraries on-line, and
> > > ease automation of the upgrade process.
Do you often need multiple versions of a library? For run-time libs,
yes, perhaps, so as not to break applications. That's why .so's are
versioned. But when you're using a library for _development_, i.e. you
need the .h and .3 files around, what do you use multiple versions for?
(Not a rhetorical question, BTW.) I just pick a recent stable version.
Yes, there are times when two versions of a library have significant
functional differences: Berkeley db 1.85 and 2.x, for instance. Maybe
this is what you're thinking of?
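For reference, the run-time .so versioning mentioned above usually looks like this on disk. A runnable sketch in a scratch directory -- the filenames are invented, loosely modeled on db:

```shell
#!/bin/sh
# Sketch of run-time .so versioning (all filenames invented).
# Two major versions coexist; only the unversioned dev link,
# the one 'cc -ldb' follows, has to pick a single version.
d=$(mktemp -d) && cd "$d"
touch libdb.so.1.85 libdb.so.2.7.7        # the actual libraries
ln -s libdb.so.1.85  libdb.so.1           # soname link for apps built against 1.x
ln -s libdb.so.2.7.7 libdb.so.2           # soname link for apps built against 2.x
ln -s libdb.so.2     libdb.so             # dev link: what the compiler sees
ls -l libdb.so*
```

Which is the point: run-time coexistence is cheap, but the development-time question -- whose .h and .3 files sit in the shared tree -- is the one that actually needs answering.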
Quote:> have a system of symbolic links and a file containing a list of those
> belonging to each package (which would be kept in that package's
> installation directory).
If you're going to pollute your main directory trees with symlinks all
over the place, why not pollute them with the original files? In other
words, as far as I can tell, the symlink system is very like the
database system except with an extra layer of complexity.
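For concreteness, here is the proposed symlink scheme in miniature, with a tmpdir standing in for / so the sketch is runnable; all the paths and the package name are invented:

```shell
#!/bin/sh
# The per-package-tree-plus-symlinks idea, sketched.
root=$(mktemp -d)                                 # stands in for /
mkdir -p "$root/opt/foo-1.0/bin" "$root/usr/bin"
echo 'echo hello' > "$root/opt/foo-1.0/bin/foo"            # the package's own file
ln -s "$root/opt/foo-1.0/bin/foo" "$root/usr/bin/foo"      # link into the shared tree
echo "$root/usr/bin/foo" > "$root/opt/foo-1.0/MANIFEST"    # list of links, kept with the package
# removal = delete every link in the manifest, then the tree:
xargs rm < "$root/opt/foo-1.0/MANIFEST"
rm -r "$root/opt/foo-1.0"
```

Note that every operation here -- create an entry, record it, consult the record on removal -- is exactly what a database-backed packager does with the real files, just with the symlinks added on top.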
I have to vote for the database system: the Debian Project's dpkg does
a great job of keeping track of whose files are whose, and it upgrades
packages very smoothly. (It even lets the package maintainer declare a
list of "config files" and md5s them, so it knows not to overwrite one
you've edited. Whoever came up with that concept is a genius.) I
assume (though I haven't used it) that rpm is just as smooth. As for
multiple library versions, I don't know offhand what Debian packaging
does for, say, db 1.85 and db 2.0 header file/manpage conflicts (if
there be any). With minor library versions (that is, anything the
package maintainer doesn't consider an important enough distinction),
it just doesn't let you install more than one version at a time....
This limitation, frankly, hasn't bothered me yet.
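The conffile trick reduces to a checksum comparison. A minimal sketch of the idea -- the filename and contents are invented, and this is not dpkg's actual mechanism, just the principle:

```shell
#!/bin/sh
# Record a config file's md5 at install time; at upgrade time,
# overwrite it only if the sum still matches, i.e. the admin
# never touched it.
conf=$(mktemp)
echo "Port 25" > "$conf"                        # version shipped by the package
sum_at_install=$(md5sum "$conf" | cut -d' ' -f1)

echo "Port 2525" > "$conf"                      # the admin edits it locally
sum_now=$(md5sum "$conf" | cut -d' ' -f1)

if [ "$sum_now" = "$sum_at_install" ]; then
    echo "pristine: safe to replace on upgrade"
else
    echo "locally modified: keep the admin's copy"
fi
```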
<sampo.creighton.edu ! psamuels>