To store data from a precalculation on about 1100 images and 830000 features, I'm trying to allocate about 3.5 GB of memory, but the allocation does not seem to work. I'm writing in C, so I was using malloc, but I've tried 'new' as well and get the same problem. I'm running on a 64-bit Linux server with 64 GB of physical memory.
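In outline, the allocation looks roughly like this (a simplified sketch with illustrative names, not the actual code referenced below):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* Illustrative numbers: ~1100 images x ~830000 float features,
           which comes to roughly 3.5 GB. */
        size_t num_images   = 1100;
        size_t num_features = 830000;
        size_t total_size   = num_images * num_features * sizeof(float);

        float *data = malloc(total_size);
        if (data == NULL) {
            fprintf(stderr, "malloc of %zu bytes failed\n", total_size);
            return 1;
        }

        /* ... fill and use the buffer ... */

        free(data);
        return 0;
    }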
Elsewhere I've read that the maximum allocation size in bytes is theoretically determined by the size of size_t, which in my case is 8 bytes, via the formula 2^(8*sizeof(size_t)) - 1. That is 2^64 - 1 bytes, which should be more than enough. I've checked the actual values (of _total_size and total_size in the code below) being passed to malloc/new with gdb, and they are correct.

Yet for some reason the complete amount of memory is not allocated: only about 2^31.005 bytes seem to be usable, which is not enough. There are no warnings or exceptions either, not even when I later write into the unallocated part. Only when I try to read data back from that region do I get a segfault.
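To show the sizes involved (again just an illustrative sketch, not my actual code), here is the theoretical size_t limit from the formula above next to the amount I'm actually requesting, both of which are far above the roughly 2^31 bytes I seem to get:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Theoretical maximum a size_t can hold: 2^(8*sizeof(size_t)) - 1.
           With sizeof(size_t) == 8 this is 2^64 - 1, i.e. SIZE_MAX. */
        size_t theoretical_max = SIZE_MAX;

        /* Roughly what I'm asking malloc/new for (illustrative numbers). */
        size_t total_size = (size_t)1100 * 830000 * sizeof(float);

        printf("sizeof(size_t)      = %zu bytes\n", sizeof(size_t));
        printf("theoretical maximum = %zu bytes\n", theoretical_max);
        printf("requested size      = %zu bytes\n", total_size);

        /* For comparison, the ceiling I observe is about 2^31 = 2147483648 bytes. */
        return 0;
    }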