problem allocating large amount of memory

Roland Krause rokrau at yahoo.com
Wed Dec 3 21:24:47 EST 2003


Hi all,
I am trying to allocate a contiguous chunk of memory of more than
2 GBytes using malloc().

My system is a Microway dual Athlon node with 4 GB of physical RAM. The
kernel identifies itself as Redhat-2.4.20 (the box runs RH-9) and has
been compiled with the CONFIG_HIGHMEM4G and CONFIG_HIGHMEM options
turned on.

Here is what I _am_ able to do. Using a little test program I have
written, I can get pretty much 3 GB of memory allocated in chunks: the
largest chunk is 2.143 GBytes, then one of 0.939 GBytes, and finally
some smaller chunks of 10 MBytes each. So the total amount of memory I
can get is close enough to the promised 3G/1G split, which is well
documented on the net.
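
For the curious, here is a minimal sketch of such a probe (not my exact
program, but the same idea): binary-search the largest malloc() that
still succeeds, keep the block, and repeat until only small chunks are
left. The memset() is there so the pages really get touched and not
just reserved.

/* Sketch of the probing test: find the largest malloc() that succeeds,
 * keep it, and repeat until only small chunks remain. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static size_t largest_chunk(void **keep)
{
    size_t lo = 0, hi = (size_t)3 << 30;      /* search up to 3 GB */

    while (hi - lo > (1 << 20)) {             /* 1 MB resolution */
        size_t mid = lo + (hi - lo) / 2;
        void *p = malloc(mid);
        if (p) {
            free(p);
            lo = mid;
        } else {
            hi = mid;
        }
    }
    *keep = malloc(lo);
    if (*keep)
        memset(*keep, 0, lo);                 /* touch the pages */
    return *keep ? lo : 0;
}

int main(void)
{
    size_t total = 0, got;
    void *p;

    while ((got = largest_chunk(&p)) >= ((size_t)10 << 20)) {  /* stop below 10 MB */
        total += got;
        printf("chunk %.3f GB, total %.3f GB\n",
               got / 1073741824.0, total / 1073741824.0);
        /* the block is intentionally never freed, so the next probe
           only sees what is left */
    }
    return 0;
}

The binary search assumes that if malloc(n) fails then any larger
request fails too, which seems close enough to true for this purpose.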

What I am currently not able to do is get the ~2.95 GB all at once.
"But I must have it all."

I have already set the overcommit_memory kernel parameter
(/proc/sys/vm/overcommit_memory) to 1, but that doesn't seem to change
anything.

Also, does anyone have experience with the various large-memory kernel
patches out there (Ingo Molnar's 4G/4G split or IBM's 3.5G/0.5G hack)?

I would be very grateful for any kind of advice regarding this problem.
I am certain that other people here must have run into the same problem.


Best regards,
Roland



