Using large amounts of memory in the kernel

Post by Shipman, Jeffrey » Thu, 18 Jul 2002 00:30:13



I've got a hash table of packet manipulation information
I need to use inside a module in my kernel. The problem
is that this hash table is around 2MB. I'm trying to figure
out ways to shrink this table, but I'm coming up short on
ideas. What would be a good way to allocate enough
memory to store all of this information?

Jeff Shipman - CCD
Sandia National Laboratories
(505) 844-1158 / MS-1372


Using large amounts of memory in the kernel

Post by Stephen Frost » Thu, 18 Jul 2002 00:40:08



Quote:> I've got a hash table of packet manipulation information
> I need to use inside a module in my kernel. The problem
> is that this hash table is around 2MB. I'm trying to figure
> out ways to shrink this table, but I'm coming up short on
> ideas. What would be a good way to allocate enough
> memory to store all of this information?

At a guess I'd say vmalloc...

        Stephen
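
For reference, a minimal sketch of the vmalloc approach suggested above,
assuming a standard module init/exit pair. The names pkt_hash_table and
PKT_TABLE_SIZE are made up for illustration; the size comes from the 2MB
figure in the original post.

/* Minimal sketch: allocating a ~2MB table with vmalloc in a module.
 * pkt_hash_table and PKT_TABLE_SIZE are illustrative names. */
#include <linux/module.h>
#include <linux/init.h>
#include <linux/vmalloc.h>
#include <linux/string.h>

#define PKT_TABLE_SIZE (2 * 1024 * 1024)  /* ~2MB, per the original post */

static void *pkt_hash_table;

static int __init pkt_table_init(void)
{
        /* vmalloc returns virtually contiguous memory, so a large
         * request can succeed even when kmalloc (which needs
         * physically contiguous pages) would fail. */
        pkt_hash_table = vmalloc(PKT_TABLE_SIZE);
        if (!pkt_hash_table)
                return -ENOMEM;
        memset(pkt_hash_table, 0, PKT_TABLE_SIZE);
        return 0;
}

static void __exit pkt_table_exit(void)
{
        vfree(pkt_hash_table);
}

module_init(pkt_table_init);
module_exit(pkt_table_exit);
MODULE_LICENSE("GPL");

The tradeoff is that vmalloc memory is virtually but not physically
contiguous, so access goes through extra page-table mappings and is
slightly slower than kmalloc memory, but it can satisfy requests far
larger than kmalloc's small contiguous-chunk limit.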


1. Trouble using large amount of memory with Redhat 7

I have a K7 AMD 1.2 GHz machine with 1.5 GB of RAM on an ASUS motherboard,
running Redhat 7.  I use the Portland Fortran compiler (and sometimes
g77 and gcc).  I have found that when I write a simple program I can only
dimension a complex array to about 72000000 elements before the job core
dumps (a job size of about 549M, as verified by computation and by running
top).  The Portland people point to the OS and say that perhaps I can
recompile the kernel.  Okay ... so I've checked the Redhat site and searched
newsgroups, etc.   The hint I have found is that my stack size is 8M in the
kernel.  I can certainly recompile ... but I am trying to figure out whether
this will solve the problem.  I have noted that g77 has a different limit
(somewhat lower ... the Portland guy said, however, that they have no such
limitation in their compiler).  Also, a coworker uses a Lahey compiler on a
Win98 machine with 512M of memory.  He can dimension the array mentioned
above to 110000000 elements, or about 840M.

Can anyone point me in the right direction and explain what is happening?  I
have written a small C program which pulls all of the resource limits; all
are set to unlimited except the stack and pipe.  Top and other means of
examining memory indicate that all of the memory is recognized by the
system ... so I am inclined to agree with the Portland guy.

Thanks for the help,
J Montgomery
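
The 8M stack limit is the usual cause of core dumps like this when a
compiler places large automatic arrays on the stack. Below is a minimal
sketch of checking and raising that limit with getrlimit/setrlimit; it is a
guess at the kind of "small c code" mentioned above, not the poster's
actual program.

/* Sketch: inspect the stack resource limit, then raise the soft
 * limit to the hard limit. Illustrative only. */
#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
        struct rlimit rl;

        if (getrlimit(RLIMIT_STACK, &rl) != 0) {
                perror("getrlimit");
                return 1;
        }
        printf("stack soft limit: %llu, hard limit: %llu\n",
               (unsigned long long)rl.rlim_cur,
               (unsigned long long)rl.rlim_max);

        /* Large stack-allocated Fortran arrays need this headroom. */
        rl.rlim_cur = rl.rlim_max;
        if (setrlimit(RLIMIT_STACK, &rl) != 0) {
                perror("setrlimit");
                return 1;
        }
        return 0;
}

Since setrlimit only affects the calling process and its children, the
practical shell equivalent is raising the limit with "ulimit -s" (bash)
before launching the job, which avoids recompiling the kernel entirely.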

2. Large Amount of Kernel Memory on 2.4.16 Consumed by Kiobufs

3. Alloc and lock down large amounts of memory

4. Allocating LARGE amounts of memory on an HP...?

5. can remap_page_range() map large amounts of memory

6. Allocating large amounts of Memory

7. Using Linux with Dual PIIs, AGP Video, Large Memory, Large HDs and On-Board SCSI

8. Programs for checking amount of memory used?