Executable getting very large

Executable getting very large

Post by Eric Burke » Mon, 17 Jun 1996 04:00:00



I've been setting up a C++ development environment for several programmers,
and we have our application divided into multiple sub-projects.  Each of
these sub-projects, usually one GUI screen, is located in its own directory.
This is all on UNIX (Sun), using SCCS for configuration management.

Each subproject is developed independently from the others, and each
of these subprojects has its own static library.  Consequently, we have
quite a few files such as libScreenA.a, libScreenB.a, etc...

The directory structure has a "master" makefile, which links all of these
library files into one executable.  The whole system is working very well
for us; however, I am concerned about the size of the resulting program.
It is around 5.5 MB now, although that does still include debugging info.

Would the resulting executable be smaller if we didn't build it from
multiple libraries?  If libraries are interdependent, we sometimes have
to list them two or three times in the LIBS section of the makefile...
does this cause extra code to be inserted into the executable, as well?

Any insight would be appreciated!
--
...
...  Eric M. Burke                 -  C++

...  http://www.i1.net/~ericb/     -  HTML
...  314-895-6250
...

 
 
 

Executable getting very large

Post by Patrick Horgan » Wed, 19 Jun 1996 04:00:00


 . . . snip . . .

Quote:

> Each subproject is developed independently from the others, and each
> of these subprojects has its own static library.  Consequently, we have
> quite a few files such as libScreenA.a, libScreenB.a, etc...

Is there some reason that you've chosen not to use dynamic libraries?
For a project this size, static libraries seem a poor choice because of
the size of the resultant footprint and because of load time.  I don't
know if you've done it before, but all you have to do is use the
appropriate flag to make sure everything is built as position-independent
code, then one more link command to put everything together in a dynamic
library.  I'd give you more detailed information, but you just said Unix
on Sun, so I don't know whether that means SunOS 4 or SunOS 5, nor what
compiler you're using.  (Hmmm... I wonder if we should have something
about static vs. dynamic libraries in the c.u.p. FAQ?)
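
In rough outline, though, the steps look something like this (a sketch only,
with made-up file names, assuming g++; Sun's own compilers spell the PIC
flag differently and use -G for the link step):

    # compile every object as position-independent code
    g++ -fPIC -c ScreenA_main.cc ScreenA_draw.cc

    # one extra link command collects the objects into a shared library
    g++ -shared -o libScreenA.so ScreenA_main.o ScreenA_draw.o

    # the application then links against the .so instead of the .a
    # (at run time the dynamic linker must be able to find the .so,
    #  e.g. via LD_LIBRARY_PATH or a run path)
    g++ -o app main.o -L. -lScreenA -lScreenB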

Quote:

> The directory structure has a "master" makefile, which links all of
> these
> library files into one executable.  The whole system is working very
> well for us, however I am concerned about the size of the resulting
> program.  It is around 5.5 MB now, although it does include debugging
> info still.

Use the size(1) command to figure out how much space is going where.
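
For instance (strip a copy so you keep the debuggable binary):

    size app                  # text/data/bss breakdown
    cp app app.stripped
    strip app.stripped        # throw away symbol and debug information
    ls -l app app.stripped    # compare the two sizes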

Quote:

> Would the resulting executable be smaller if we didn't build it from
> multiple libraries?  If libraries are interdependent, we sometimes have
> to list them two or three times in the LIBS section of the makefile...
> does this cause extra code to be inserted into the executable, as well?

If you use dynamic libraries you'll only have to list them once.

--


   Opinions mine, not my employer's except by most bizarre coincidence.

 
 
 

Executable getting very large

Post by Stephen Baynes » Fri, 21 Jun 1996 04:00:00


: I've been setting up a C++ development environment for several
: programmers,
: and we have our application divided into multiple sub-projects.  Each of
: these sub-projects, usually one GUI screen, is located in its own
: directory.
: This is all on UNIX (Sun), using SCCS for configuration management.

: Each subproject is developed independently from the others, and each
: of these subprojects has its own static library.  Consequently, we have
: quite a few files such as libScreenA.a, libScreenB.a, etc...

Sounds like a good way of doing it.  We do much the same thing, usually with
a library for each directory of source code.  We have had programs linked
from about 50 libraries on occasion.

: The directory structure has a "master" makefile, which links all of
: these
: library files into one executable.  The whole system is working very
: well for us, however I am concerned about the size of the resulting
: program.  It is around 5.5 MB now, although it does include debugging
: info still.

Use the size command, or try stripping the executable, to see how much is
debug information and how much is code.  You also need to add an estimate
for dynamic memory usage to get an idea of the memory needed at run time.

: Would the resulting executable be smaller if we didn't build it from
: multiple libraries?  If libraries are interdependent, we sometimes have
: to list them two or three times in the LIBS section of the makefile...
: does this cause extra code to be inserted into the executable, as well?

No, ld should only pull any given object in once, so the size will be the
same.  We have done this a lot and never had problems.  See what map file
options your linker offers so you can see what code is pulled in (probably
-m).  You might also investigate whether your ld has the -y <symbol> option
to trace specific symbols.
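
With gcc as the driver that might look like the following; it's only a
sketch, since the exact spelling depends on your linker and on how your
compiler driver passes options through to ld (the symbol name is made up):

    # ask ld for a memory map of the link
    g++ -o app main.o -lScreenA -lScreenB -Wl,-m > link.map

    # report every file in which a given symbol appears
    g++ -o app main.o -lScreenA -lScreenB -Wl,-y,init_screen_a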

If you wish to reduce repetitions of libraries to improve link speed,
look at the lorder command or the -u <symbol> switch to the linker.
Excessive use of -u can result in maintenance problems, so be selective.
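
The classic lorder idiom orders the archive members so a single-pass linker
can resolve everything in one scan; the -u form forces a symbol to be
undefined up front so its module is pulled in early (symbol name made up):

    # build the archive with members topologically sorted by reference
    ar cr libScreenA.a `lorder *.o | tsort`

    # or pull a specific module in early by naming one of its symbols
    cc -o app -u init_screen_b main.o -lScreenA -lScreenB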

If you have an extremely large program you might be able to improve
performance by careful ordering of the object files in the image (by
controlling the order in which they are linked) to reduce paging.  However,
Unix does not offer many tools to help you decide on and obtain a good
order, so I would not try doing anything unless you have real problems.
Even then, a better solution would be to consider whether you really need
one integrated application or whether you can spawn sub-applications from
a main one.

--

Philips Semiconductors Ltd
Southampton                                 My views are my own.
United Kingdom
 Are you using ISO8859-1? Do you see © as copyright, ÷ as division and ½ as 1/2?

 
 
 

Executable getting very large

Post by David Thomas Richard Given » Sat, 29 Jun 1996 04:00:00



Quote:>I've been setting up a C++ development environment for several
>programmers,
[...]
>The whole system is working very
>well for us, however I am concerned about the size of the resulting
>program.  It is around 5.5 MB now, although it does include debugging
>info still.
[...]
>Would the resulting executable be smaller if we didn't build it from
>multiple libraries?  If libraries are interdependent, we sometimes have
>to list them two or three times in the LIBS section of the makefile...
>does this cause extra code to be inserted into the executable, as well?

[...]

A .a library is just an archive similar to a tar file that contains
all the .o files that make up the library, and an index file. If you use
ar tv libfoo.a it'll list them all. *All* these libraries do is provide
a useful short-cut to the user so that he doesn't have to put several
hundred .o files on the command line.

The linker will check all the .o files provided for interdependency and
only include those that are used. Some (broken) linkers will only look
inside .a files once, requiring you to put them on the command line
more than once, but they won't include the same .o file in the executable
more than once.
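
You can see this for yourself with a two-minute test (made-up names):

    cc -c main.c used.c unused.c      # main() calls a function in used.c only
    ar cr libdemo.a used.o unused.o
    cc -o demo main.o libdemo.a
    nm demo | grep unused_fn          # prints nothing: unused.o was never extracted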

A 5.5 MB executable including debugging information is all right for a
large project. I did a project that gave me a 2 MB executable once;
after stripping out debug info it dropped to ~200kB. Try stripping it
and seeing what happens.

The other cause for large executables is using static libraries rather
than dynamic ones. If you have ldd on your system, try running it on
your executable to see what dynamic libraries it needs. Usual culprits
are Motif (libXm.so) and C++ runtime libraries, which tend not to be
installed properly. Motif will bloat your executable by a couple of meg
and the C++ runtime is the source of the infamous 500kB `Hello world'
program the old versions of gcc produced.
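
For example, the output looks something like this on Solaris (illustrative
only; the library versions and paths on your machine will differ):

    $ ldd ./app
            libXm.so.3 =>    /usr/dt/lib/libXm.so.3
            libXt.so.4 =>    /usr/openwin/lib/libXt.so.4
            libX11.so.4 =>   /usr/openwin/lib/libX11.so.4
            libC.so.5 =>     /usr/lib/libC.so.5
            libc.so.1 =>     /usr/lib/libc.so.1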

--
------------------- http://www-hons-cs.cs.st-and.ac.uk/~dg --------------------
   If you're up against someone more intelligent than you are, do something
    totally insane and let him think himself to death.  --- Pyanfar Chanur
---------------- Sun-Earther David Daton Given of Lochcarron ------------------

 
 
 

Executable getting very large

Post by Reto Koradi » Mon, 01 Jul 1996 04:00:00



Quote:>*All* these libraries do is provide a useful short-cut to the user so
>that he doesn't have to put several hundred .o files on the command line.

Wouldn't a .o file *always* be linked if you put it on the command line,
while the linker would only extract the files from a library that are
really needed? That's an important difference IMO.

Quote:>Some (broken) linkers will only look inside .a files once, requiring you
>to put them on the command line more than once, [..]

I wouldn't consider such linkers to be broken, that's perfectly normal
behaviour for a UNIX linker. If you need to put a library on the command
line more than once, it's a very strong indication that you have a
serious problem in your program/library hierarchy.
--

 
 
 

Executable getting very large

Post by Tim Hollebeek » Thu, 04 Jul 1996 04:00:00


: >Some (broken) linkers will only look inside .a files once, requiring you
: >to put them on the command line more than once, [..]

: I wouldn't consider such linkers to be broken, that's perfectly normal
: behaviour for a UNIX linker. If you need to put a library on the command
: line more than once, it's a very strong indication that you have a
: serious problem in your program/library hierarchy.

It's a feature, not a bug.  Consider:

cc buggy.o -ldebuglib otherstuff.o -lnormallib

Where debuglib and normallib define different versions of the same
functions.  buggy.o will use the debug versions, while the rest will
use the normal versions.

If you 'fix' my linker so that this doesn't work I'll be very unhappy.
To avoid problems under normal circumstances, you only need to follow
one simple rule: always put libraries last on the line.  Unless, of
course, you have libraries that need symbols from other libraries,
which is a bad idea anyway.

---------------------------------------------------------------------------
Tim Hollebeek         | Disclaimer :=> Everything above is a true statement,
Electron Psychologist |                for sufficiently false values of true.

----------------------| http://wfn-shop.princeton.edu/~tim (NEW! IMPROVED!)

 
 
 

Executable getting very large

Post by Andrew Gierth » Thu, 04 Jul 1996 04:00:00



>: >Some (broken) linkers will only look inside .a files once, requiring you
>: >to put them on the command line more than once, [..]

>: I wouldn't consider such linkers to be broken, that's perfectly normal
>: behaviour for a UNIX linker. If you need to put a library on the command
>: line more than once, it's a very strong indication that you have a
>: serious problem in your program/library hierarchy.

Well, this can be unavoidable in the case of mutual references.


Quote:>It's a feature, not a bug.  Consider:

>cc buggy.o -ldebuglib otherstuff.o -lnormallib

>Where debuglib and normallib define different versions of the same
>functions.  buggy.o will use the debug versions, while the rest will
>use the normal versions.

Not on any linker I know of; any module in debuglib referenced by 'buggy.o'
will get included in the link, and references to its public symbols from
'otherstuff.o' will be satisfied, causing the equivalent modules from
normallib not to be included.

Quote:>If you 'fix' my linker so that this doesn't work I'll be very unhappy.

Watch out for the AIX linker (if you haven't already met it); it breaks
almost everyone's expectations about linkers...

Quote:>To avoid problems under normal circumstances, you only need to follow
>one simple rule: always put libraries last on the line.  Unless, of
>course, you have libraries that need symbols from other libraries,
>which is a bad idea anyway.

Er... it's actually *extremely* common; one builds up higher level
libraries using lower level ones, and so forth. But it is generally true
that libraries should go last, and that references to a symbol should
appear earlier in the link order than the definition.
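
In practice that gives link lines shaped roughly like this (library names
are only illustrative):

    # objects first, then in-house libraries from highest level to lowest,
    # then the system libraries they depend on
    CC -o app main.o -lScreenA -lScreenB -lguiutils -lXm -lXt -lX11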


"Usenet is like a herd of performing elephants with diarrhea; massive,
difficult to redirect, awe-inspiring, entertaining, and a source of
mind-boggling amounts of excrement when you least expect it." [Gene Spafford]

 
 
 

Executable getting very large

Post by Reto Koradi » Fri, 05 Jul 1996 04:00:00




>>: I wouldn't consider such linkers to be broken, that's perfectly normal
>>: behaviour for a UNIX linker. If you need to put a library on the command
>>: line more than once, it's a very strong indication that you have a
>>: serious problem in your program/library hierarchy.

>Well, this can be unavoidable in the case of mutual references.

Exactly what I meant to say. If you have libraries with mutual references,
I'd get very suspicious about the program structure. It's common design
practice to have modules in a well-defined hierarchy, not modules calling
back and forth between each other.

Quote:>Watch out for the AIX linker (if you haven't already met it); it breaks
>almost everyones expectations about linkers...

Weren't we talking about UNIX? If you expect AIX to have any similarities
with UNIX, it will break many expectations... :)
--

 
 
 

Executable getting very large

Post by Stephen Baynes » Sat, 06 Jul 1996 04:00:00



: one simple rule: always put libraries last on the line.  Unless, of
: course, you have libraries that need symbols from other libraries,
: which is a bad idea anyway.

Why is it a bad idea? How do you avoid it? Do you merge libc with your
own code library so your library does not need the printf symbol from libc
when you call printf? How do you handle reusing utility functions between
different applications if you don't put them in a library that can be used
by all, or do you not use libraries for the application code?

--

Philips Semiconductors Ltd
Southampton                                 My views are my own.
United Kingdom
 Are you using ISO8859-1? Do you see © as copyright, ÷ as division and ½ as 1/2?

 
 
 

Change memory limits for large executable?

Hello,
I have a dual board Linux PC with 786M RAM and 1.6G swap, and I am
trying to run an executable which is quite large:
      text     data         bss         dec       hex  filename
    649952   283536  1383243996  1384177484  5280df4c  convec3d
(as far as I understand it, it's about 650K). When I try to start this
executable, it crashes with the error "Speicherzugriffsfehler" (Memory
access error), and I guess that it's just too large. OTOH, I think that
it could be possible to run it on the machine if I could increase the
amount of memory I may access.  So my question is how to do that.  ulimit
shows:

    ulimit -v
    unlimited

and I didn't understand what I would have to change in
/etc/security/limits.conf to make this executable runnable on my box.
Is there a way to achieve what I want, or do I maybe have to set some
g77 flag when compiling to get beyond some compiler-defined memory
barrier (I know that kind of flag from the AIX compiler)?
Any pointers are most welcome.
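
For reference, these are the knobs I know of so far; I am not sure which of
them, if any, is the one being hit, and the limits.conf lines are only my
guess at the syntax:

    ulimit -a        # show all current (soft) limits for this shell
    ulimit -d        # data segment size -- heap and bss count against this
    ulimit -s        # stack size
    ulimit -m        # resident set size

    # /etc/security/limits.conf (pam_limits) entries have the form
    #   <domain>  <type>  <item>  <value>
    # for example (illustrative only):
    #   ruedas    hard    data    unlimited
    #   ruedas    hard    as      unlimited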
--
------------------------------------------------------------------------
Thomas Ruedas
Institute of Meteorology and Geophysics, J.W.Goethe University Frankfurt

http://www.geophysik.uni-frankfurt.de/~ruedas/
------------------------------------------------------------------------
