Jethro Wright III wrote:
> George MacDonald wrote in message <3728CE6D.47EE...@slip.net>...
> >Jethro Wright III wrote:
> [snip]
> >Fortunately on *nix there are a number of standards that define
> >the command and library interface. This helps greatly in porting
> >code. However issues such as which desktop is used are more complicated.
> >I have an app that runs on Linux and initially I wrote an install that
> >worked for fvwm2 as it was what I used. However I just added another
> >script to install for KDE when it detects its presence. It would be
> >useful to have a tool, perhaps called desktopInstall, that is fed
> >some common info and then localizes properly for the user's desktop(s).
> >For example
> > Adding program menu items
> > Specifying what the program icon is
> > Defining where help files are located ...
> Too much trouble.
Well, for you perhaps, but it would ease the burden of those who
wish to support multiple desktops. Making things configurable in
this manner is a good thing, not a bad thing.
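As a sketch of what such a desktopInstall helper might look like (the
detection paths are the real per-user config locations for KDE and fvwm2,
but the install actions and the $1 test hook are hypothetical):

```shell
#!/bin/sh
# Hypothetical desktopInstall sketch: pick an install action based on
# which desktop the user appears to have. $1 lets you point the check
# at a different home directory for testing; defaults to $HOME.
home=${1:-$HOME}

detect_desktop() {
    # KDE keeps its per-user config under ~/.kde; fvwm2 reads ~/.fvwm2rc.
    if [ -d "$home/.kde" ]; then
        echo kde
    elif [ -f "$home/.fvwm2rc" ]; then
        echo fvwm2
    else
        echo unknown
    fi
}

case "$(detect_desktop)" in
    kde)   echo "would add a KDE menu entry, icon, and help file path" ;;
    fvwm2) echo "would patch the fvwm2 menu config" ;;
    *)     echo "no known desktop found; skipping menu setup" ;;
esac
```

The same dispatch could be extended with one branch per desktop as new
environments appear, which is the whole point of such a tool.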
> What I'm talking about is to ignore
> most of what's already there (ie. pre-installed sware) and provide
> a pre-packaged distribution that has all of the config stuf
> done when the pkg is created. E.G.:
It sounds like you're heading towards a virtual machine, which is
what Java is all about. Think of it as a spectrum: on one
end you have hand-tuned assembler, on the other you have
OO-based virtual machine environments. In between are
X11 apps, Xt apps, Motif apps, KDE, GNOME ... GNOME and
KDE are farther towards an OO VME. Each of these has
advantages and disadvantages; it's always been that
way. Much of the progression of computer science is to
build sufficiently well-designed layers that allow
one to solve problems in the language of the problem space
vs. using a machine/assembly/computer language. So you
pick the tools that solve the problem for the target
hardware/software environment. In doing so you eliminate
other choices. MS Windows targeted the PC hardware,
which brought with it near/far memory pointers,
segmented memory, programs doing their own swapping ...
despite the fact that virtual memory systems had been around
for years and were well known to provide more stable environments.
Today the Win 9x series carries this baggage.
Likewise, when you are targeting a particular hardware/software
combination (from what you say this is Linux), then you will
bind with the tools that get you there. Fortunately Linux is
based on Unix, which was exceptionally well designed. You have to
decide what your target audience is, what the hardware/software
combo is, how much work you want to put in, and what tradeoffs
you are willing to put up with. Using the higher-level environments
of GNOME/KDE buys you very rich environments to leverage off of.
But they can bind you to them if you are not careful. For many
this is a simple equation: simply live within the environment
of GNOME or KDE; either will cover 90+% of your needs. The
remainder you will have to live without.
If you want portability to both, then you have to write at
a lower level, or split your code into a GUI part and an
app part (not a bad idea anyhow). Both of these take more
work, so it's up to you to decide. Much of the beauty of
Linux is that a lot of the app part is already written
as shell-based programs; these can often have a GUI
wrapper written on top of them. I.e., a GUI app can
open a pipe to any shell-based command and read its
output.
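A minimal sketch of that wrapper pattern in plain shell (the GUI<< prefix
here just stands in for whatever the GUI front end would do with each line):

```shell
#!/bin/sh
# Sketch of the GUI-wrapper pattern: run any shell-based command,
# read its output line by line over a pipe, and hand each line to a
# front end. Here the "front end" is just a printf with a prefix.
run_and_read() {
    "$@" | while IFS= read -r line; do
        printf 'GUI<< %s\n' "$line"
    done
}

run_and_read echo "hello from the shell"
# prints: GUI<< hello from the shell
```

A real GUI app would do the same thing with popen() or its toolkit's
equivalent, filling a list or text widget instead of printing.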
> XFree86 is in place and we can safely reason that the
> purchaser of our GlobalGalactic GUI app already has that
> operating.
This is your first decision; already you rule out shell-based
users, servers, and low-end machines.
> Now, we pkg GlobalGalactic w/ KDE or (GNOME)
> pre-config'd and localization scripts, so that multiple
> users of GG can use the same setup. Now, KDE/GNOME rely on
> certain shared libs normally provided w/ the OS.
Again, more decisions, more reduction of your target space.
You rule out non-GNOME/KDE users, and if you produce executables
you rule out other architectures: SPARC, MIPS, PPC ...
This is why providing the source is so powerful.
> Simply provide
> the shared libs in the kit and replace the defalt lib paths
> w/ our localized paths.
Again, more reduction of target space. If you ship the libs,
then what happens as users migrate forward? You endanger
overwriting their newer libraries. But you can always
create statically linked apps. These are only bound to
an architecture and a kernel interface (which is more stable).
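For example, the static build is just a matter of the link flags (gg.c and
the output name are placeholders, not a real package):

```shell
# Link GG statically so the binary carries its own copies of the
# libraries and depends only on the architecture and the kernel
# interface, not on whatever shared libs the user has installed.
gcc -static -o gg gg.c
# ldd ./gg should then report "not a dynamic executable"
```

The price is a much larger binary and no benefit from library bug fixes
until you relink, which is part of the tradeoff discussed above.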
> This would be a little more costly in terms of disk
> space (but that's cheap nowadays), however the cost could be
> justified by the advantage of having pkgs that are certain
> to meet the users' expectation of being fully-operational w/
> GG and KDE/GNOME.
> The effort of developing/testing GG against different Linux
> variants is costly enuf w/o expending additional resources
> for the effort of trying to be compatible w/ infinite possible
> combinations of libs, installed pkgs, customized menus, etc.
That's why Java was created. If you take your approach you have to
start somewhere: go for the largest target for the least cost,
or, if you are time constrained, go for the most return for the
least effort. Being flexible to all environments is costly in
terms of time; it's only worth it for some apps. Sometimes
it's cheaper just to rewrite for each environment you want to
target. How many apps were tossed going from Windows 3.1 to
Windows 95? A lot of them were rewritten without the segmented
memory code, which became redundant.
> It would be doable to offer a "pioneer's" kit for GG, which
> would provide instructions for the pioneer who wants to tweak
> and play w/ things to get them just right for his/her customized
> system. Provide them w/ the specs (ie. GG needs libc5 x.1.0
> and libpng y.2.0, etc.) and scripts to get them started and
> put the burden of making it work on the power user.
This is what the Linux distributions are about: they perform the
compilations and packaging for the environment. Another approach
for you is to bind to distributions, i.e. Red Hat or SUSE ...,
then just follow their upgrade path.
People do take sources and create binary packages for different
configs. If you don't release the sources, then you have to do
all that work yourself.
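Binding to a distribution can start from something as simple as checking
for its release file (redhat-release and SuSE-release are the files those
distributions actually ship; the $1 test hook is hypothetical):

```shell
#!/bin/sh
# Hypothetical sketch: decide which distribution's packaging
# conventions an installer should follow by looking for the
# distribution's release file. $1 points the check at a test root.
root=${1:-/}

detect_distro() {
    if [ -f "$root/etc/redhat-release" ]; then
        echo redhat
    elif [ -f "$root/etc/SuSE-release" ]; then
        echo suse
    else
        echo unknown
    fi
}

detect_distro
```

An installer could then branch on the result to pick RPM vs. tarball
packaging, menu locations, and so on, and track that one upgrade path.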
> >Each desktop seems to have a different hierarchy for organizing apps,
> >perhaps a mapping could be defined. KDE does a useful default, in that
> >it wraps any existing app menu hierarchy (on Red Hat, the KDE menus
> >point to the Red Hat menus, i.e. the fvwm2 menus).
> >As for libraries, the version numbering on *nix handles most of the
> >problems with dynamic libraries. A host can have multiple versions of
> >a library if need be.
> >What one really needs is a decent mapping service to localizable
> >entities that has a standard interface. Thus when an app runs, it
> >accesses such values via its mapping, hopefully defined as data.
> I'm trying to figure out *how* to do this, in parallel
> w/ getting up to speed w/ new dev tools and a new OS.
Just start with one, say KDE, learn Qt, then write your app. After
you're done, look at GNOME and GTK, then port your app to it. Or vice
versa. If you want to support both, then start at a lower level with
X/Xt and one of the many toolkits: Motif if you want to go to workstations,
or one of the other excellent toolkits, some of which port to MS platforms.
The latter will not use all the features of KDE/GNOME, but it will run, and
you can still add it to the menus ... This will take more work than the
first option, but gives you more portability.
Or just bypass all of this and write a Java app, taking the slight
performance hit and living within its GUI limitations (what limitations
have you seen in 2.0?).
> I want
> to focus on the most important issues: becoming operational.
> I've been lurking on the various Linux NGs, bilding a documentation
> base of solutions for the common probs, learning about how development
> is done and how things work in the Linux world. Whenever I post
> something, it's a general inquiry about broader issues like this
> question about app deployment. There's *so* much to learn that
> unless one focuses on the critical path items, you'll never get
> started. Thanx for the feedback....Jet
If you know C++, I'd say go with KDE. If you prefer C, then try GNOME.
Either of these will work. You might also take the GUIs for a test drive
and decide which one you like best. If you are concerned about the KDE
open source issue, then go with GNOME; if you are loyal to the GNU
projects you should also go with GNOME.
--
We stand on the shoulders of those giants who coded before.
Build a good layer, stand strong, and prepare for the next wave.
Guide those who come after you, give them your shoulder, lend them your code.
Code well and live! - g...@slip.net (7th Coding Battalion)