Automatic Data Segment Exceeds 64K In Program With No Data?

Post by Timothy Pats » Fri, 23 Nov 2001 07:35:34



I'm working on a very old DOS program on Borland C++ 4.51.
The memory model is LARGE.
The platform is DOS (16-bit DPMI).

Recently, I got a linker error:
"Automatic data segment exceeds 64K"

The problem only occurs in the debug version of my program.  The
release version is fine.

After some research, I understand the nature of the problem and have
seen many solutions offered.  Things like changing compilers, memory
models, or target platforms are not options for me because the program
is large and becomes horribly broken when I do these things.

After getting no results from moving large amounts of the program's
data into far segments, I decided to try a ridiculous experiment.
I removed all data from the program.  I still get the error.

It's almost as though breaching the 64K threshold flips a switch
somewhere in my compiler that I cannot unflip no matter how much data
I remove from the program.

Someone offered the following solution - "don't compile debug
information into all modules of the program."  I went one better: I
turned off debug information for the entire program.  I
still get the error.

I know I'm not breaching the limit at this point.  It seems like I
just can't wake the compiler up to this fact now.

Anyone had this problem?

 
 
 

Automatic Data Segment Exceeds 64K In Program With No Data?

Post by Alexander Russel » Fri, 23 Nov 2001 15:17:23



> I'm working on a very old DOS program on Borland C++ 4.51.
> The memory model is LARGE.
> The platform is DOS (16-bit DPMI).

> Recently, I got a linker error:
> "Automatic data segment exceeds 64K"

> The problem only occurs in the debug version of my program.  The
> release version is fine.

> After some research, I understand the nature of the problem and have
> seen many solutions offered.  Things like changing compilers, memory
> models, or target platforms are not options for me because the program
> is large and becomes horribly broken when I do these things.

> After getting no results from moving large amounts of the program's
> data into far segments, I decided to try a ridiculous experiment.
> I removed all data from the program.  I still get the error.

> It's almost as though breaching the 64K threshold flips a switch
> somewhere in my compiler that I cannot unflip no matter how much data
> I remove from the program.

> Someone offered the following solution - "don't compile debug
> information into all modules of the program."  I went one better: I
> turned off debug information for the entire program.  I
> still get the error.

> I know I'm not breaching the limit at this point.  It seems like I
> just can't wake the compiler up to this fact now.

> Anyone had this problem?

You have to get rid of the right kind of data. Large local (automatic)
variables generally go on the stack and shouldn't cause link errors (just
a stack overflow).
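(A quick sketch of that case - the function name and size here are made up:)

void touch_buffer(void)
{
    char buf[40000];    /* automatic: carved from the stack at run time,
                           so it links fine but risks a stack overflow */

    buf[0] = 0;
}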

Large STATIC and GLOBAL variables go into their own segment. Sometimes
changing a variable from static or global to a heap allocation helps, e.g.:

#include <stdio.h>

#define BIG_NUM 30000           /* stand-in for "some big number": 30,000 ints = 60,000 bytes */

int big_array[BIG_NUM];         /* file scope: charged to the 64K data segment */

int main(void)
{
    int i;

    for (i = 0; i < BIG_NUM; i++)
        big_array[i] = i;

    return 0;
}

The above could cause a "segment exceeds 64K" error.

Change to the code below to move big_array onto the heap:

#include <stdio.h>
#include <stdlib.h>             /* malloc(), free() */

#define BIG_NUM 30000

int *big_array;                 /* only the pointer now sits in the data segment */

int main(void)
{
    int i;

    big_array = malloc(sizeof(int) * BIG_NUM);

    if (big_array)
    {
        for (i = 0; i < BIG_NUM; i++)
            big_array[i] = i;

        free(big_array);
    }
    else
        printf("out of mem\n");

    return 0;
}
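A Borland-specific footnote (a hedged sketch, not from the original post): in
the large model malloc() already hands back far pointers, but <alloc.h> also
offers farmalloc(), which takes an unsigned long and allocates from the far
heap in any memory model:

#include <stdio.h>
#include <alloc.h>              /* Borland-specific: farmalloc(), farfree() */

#define BIG_NUM 30000

int far *big_array;             /* explicitly far; only the pointer is near data */

int main(void)
{
    int i;

    /* the unsigned long argument means the request is not clipped to 16 bits */
    big_array = (int far *) farmalloc((unsigned long) BIG_NUM * sizeof(int));

    if (big_array == NULL)
    {
        printf("out of mem\n");
        return 1;
    }

    for (i = 0; i < BIG_NUM; i++)
        big_array[i] = i;

    farfree(big_array);
    return 0;
}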


 
 
 

Automatic Data Segment Exceeds 64K In Program With No Data?

Post by m.. » Fri, 23 Nov 2001 09:54:17


I don't know your code or what you're doing, but this is
almost always a sign of trying to allocate too much data
at compile/build time.

Whenever I had this message, the fixes were either:
 a) removing extraneous variables/tables
 b) allocating them dynamically

My guess is that you overlooked something -- one way to
find the culprit is to remove portions of the program
one by one until the error goes away.


> I'm working on a very old DOS program on Borland C++ 4.51.
> The memory model is LARGE.
> The platform is DOS (16-bit DPMI).
> Recently, I got a linker error:
> "Automatic data segment exceeds 64K"
> The problem only occurs in the debug version of my program.  The
> release version is fine.
> After some research, I understand the nature of the problem and have
> seen many solutions offered.  Things like changing compilers, memory
> models, or target platforms are not options for me because the program
> is large and becomes horribly broken when I do these things.
> After getting no results from moving large amounts of the program's
> data into far segments, I decided to try a ridiculous experiment.
> I removed all data from the program.  I still get the error.
> It's almost as though breaching the 64K threshold flips a switch
> somewhere in my compiler that I cannot unflip no matter how much data
> I remove from the program.
> Someone offered the following solution - "don't compile debug
> information into all modules of the program."  I went one better: I
> turned off debug information for the entire program.  I
> still get the error.
> I know I'm not breaching the limit at this point.  It seems like I
> just can't wake the compiler up to this fact now.
> Anyone had this problem?

--
*** This space is for rent ***
 
 
 

Automatic Data Segment Exceeds 64K In Program With No Data?

Post by Matthew J. Mine » Fri, 23 Nov 2001 12:56:05


Based on your comment that it only fails in debug mode, my guess is that the
number of symbols (variables, functions, etc.) has finally exceeded the
limit (64K?) for the symbol table size. Making things "far" or dynamic, or
removing data (e.g., array elements) without removing variables, will not
change the number of symbols in the program.
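(For reference, "making things far" in Borland C is a declaration along the
lines of the hedged sketch below - the name is made up. It moves the bytes
out of the 64K DGROUP segment, but as noted it still contributes one symbol:)

static int far big_table[20000];    /* 40,000 bytes in a far segment of its own */
/* static int big_table[20000];        ...would be charged to DGROUP instead    */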

Suggestion: compile only the necessary modules in debug mode and compile the
rest in release form - that will put fewer entries in the symbol table. Another
choice offered by some compilers is to compile with line numbers only. That
is sufficient for statement tracing during debugging, though not for
displaying variables.
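(A hedged sketch of that per-module control, assuming Borland's #pragma
option extension accepts these toggles at the top of a source file - worth
checking against the 4.51 docs:)

#pragma option -v-      /* full source-level debug info off for this module */
#pragma option -y       /* keep line-number info only: enough for tracing   */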

You alluded to a large program - another choice may be to divide the program
into more manageable chunks, with some of them (functions) enabled for
debugging (with full symbol tables) and the rest in release form (no symbol
table).

Good luck.


> I'm working on a very old DOS program on Borland C++ 4.51.
> The memory model is LARGE.
> The platform is DOS (16-bit DPMI).

> Recently, I got a linker error:
> "Automatic data segment exceeds 64K"

> The problem only occurs in the debug version of my program.  The
> release version is fine.

> After some research, I understand the nature of the problem and have
> seen many solutions offered.  Things like changing compilers, memory
> models, or target platforms are not options for me because the program
> is large and becomes horribly broken when I do these things.

> After getting no results from moving large amounts of the program's
> data into far segments, I decided to try a ridiculous experiment.
> I removed all data from the program.  I still get the error.

> It's almost as though breaching the 64K threshold flips a switch
> somewhere in my compiler that I cannot unflip no matter how much data
> I remove from the program.

> Someone offered the following solution - "don't compile debug
> information into all modules of the program."  I went one better: I
> turned off debug information for the entire program.  I
> still get the error.

> I know I'm not breaching the limit at this point.  It seems like I
> just can't wake the compiler up to this fact now.

> Anyone had this problem?