> the assumption is that by using native compilers we gain more efficient
> executables. is this true? what i mean is, if we use one uniform compiler
> (the only one i know of is gcc, if you know of something else please tell me)
> will we have slower modules? also, we use external libraries which sometime
> force us to compile the whole module with the compiler the suppliers of the
> libraries used. is there any way to circumvent that?
We use the native compilers on Solaris, IRIX, etc. The only thing we use GCC for is Linux. Back
about 10 years ago, we found that GCC equalled or outperformed the
vendor implementations (some of which were still PCC based). Since
GCC was already pushing ANSI (C) compliance and some of the vendor
compilers couldn't do function prototypes yet, we used a lot of GCC.
A few years back we did the tests again. Most C implementations
were pretty good standards-wise (at least as far as the language
itself was concerned), and the performance of all the vendor compilers
far outstripped GCC. We gave up and went back to using the native
compilers. Since going to C++ we still bang our heads against many
of the compiler braindamages (which include G++'s) when it comes to
C++ implementation. We carry around the SGI STL implementation
so we have a known quantity there.
As far as the library issue goes, you're stuck. You either have
to have a KNOWN application interface that all the compiler vendors
code to (which exists for SGI and Sun, for example) or you end up
not being able to mix and match. The other option is to go to one
of the network-based APIs like CORBA (which, however, is still
more pain and suffering than it's worth in my opinion).
[ comp.std.c++ is moderated. To submit articles, try just posting with ]
[ --- Please see the FAQ before posting. --- ]
[ FAQ: http://reality.sgi.com/austern_mti/std-c++/faq.html ]