I'm creating a dynamically allocated char array like so:
int maxChars = 0;
//maxChars is modified with a value sometime here
char* charArray = new char[maxChars];
Now, if maxChars is below ten or so, it works: the memory is allocated and the pointer charArray is valid. However, if maxChars is above ten, charArray comes back as a bad pointer and cannot be used.
Can anyone explain to me why this is? Is there a set limit on the size of a char array or what? Is there any way to fix this, besides using a string? (Speed is a huge issue here.)
The cutoff is ten? You must be misinterpreting the problem.
You should be able to allocate well into the hundreds of thousands of bytes (or more) before you run into any sort of allocation problem, unless your computer is alarmingly low on RAM.
Not only that, but new never gives you a bad pointer: it either returns a valid pointer or throws an exception (I suppose older compilers might return a null pointer instead if allocation fails). You must be corrupting the pointer yourself somehow.
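If you want to see that for yourself, here's a minimal standalone sketch (the absurdly large request is deliberate, just to force an allocation failure):

#include <cstddef>
#include <iostream>
#include <new> // std::bad_alloc, std::nothrow

int main()
{
    const std::size_t huge = ~static_cast<std::size_t>(0) / 2;
    try
    {
        // new[] throws std::bad_alloc on failure; it does not
        // hand back a garbage pointer.
        char* p = new char[huge];
        delete[] p;
    }
    catch(const std::bad_alloc& e)
    {
        std::cout << "allocation failed: " << e.what() << '\n';
    }

    // The nothrow form returns a null pointer on failure instead.
    char* q = new(std::nothrow) char[huge];
    if(!q)
        std::cout << "nothrow new returned null\n";
    delete[] q; // deleting a null pointer is safe
}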
Well, I have four gigs, so memory isn't the problem.
The above code is almost exactly what runs in my program. There is a function that modifies the array, and it might be related to the problem.
//Convert the integer key location to a text string
void integerToKey(unsigned long long location, char* output)
{
    unsigned long long num = location;
    int length = charsetLength + 1;
    for(int i = 0; num > 0; i++)
    {
        output[i] = charset[num % (length)];
        num /= length;
    }
}
There isn't a fixed limit. It's limited by the amount of memory on the box. However, if you can't get more than 10 bytes, I'd guess that you've corrupted the heap earlier.
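And looking at integerToKey, that is very likely where the heap gets corrupted: the loop keeps writing for as long as num > 0, so if the converted value needs more digits than maxChars, it writes past the end of the buffer, and the output is never null-terminated either. On top of that, num % (charsetLength + 1) can index one slot past the end of charset. Here's a bounds-checked sketch of the same conversion (the charset contents below are just a placeholder for whatever your table actually holds):

// Placeholder charset: substitute your own table.
const char charset[] = "abcdefghijklmnopqrstuvwxyz0123456789";
const int charsetLength = sizeof(charset) - 1; // exclude the trailing '\0'

//Convert the integer key location to a text string without
//writing past the end of the caller's buffer
void integerToKey(unsigned long long location, char* output, int outputSize)
{
    unsigned long long num = location;
    int i = 0;
    // Stop one short of outputSize to leave room for the terminator.
    while(num > 0 && i < outputSize - 1)
    {
        // Modulo by charsetLength keeps the index inside charset.
        output[i++] = charset[num % charsetLength];
        num /= charsetLength;
    }
    output[i] = '\0'; // always terminate the string
}

Then allocate one extra byte for the terminator, new char[maxChars + 1], and call it as integerToKey(location, charArray, maxChars + 1).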
Can you reproduce your problem in a small standalone sample application?