#include <iostream>
#include <limits>
using namespace std;

int cinsize(istream &is);

int main()
{
    cout << "This is a test.\n";
    int x;
    while (1)
    {
        cout << "Enter something.\n";
        cin >> x;
        cout << "You entered: " << x << endl;
        cout << "cinsize(cin) = " << cinsize(cin) << "!" << endl;
        if (cin.fail())
        {
            cin.clear(cin.rdstate() ^ ios::failbit);  // toggle failbit off, keep other flags
            cin.ignore(cinsize(cin), '\n');
            if (cin.fail())
                cin.clear(cin.rdstate() ^ ios::failbit);
        }
    }
    return 0;
}

int cinsize(istream &is)
{
    char *dump;
    int limit = 602100320829408;        // note: far too large to fit in a 32-bit int
    dump = new char[limit];
    ios_base::iostate ioinfo = is.rdstate();   // save the current error state
    is.clear();
    is.read(dump, limit);               // slurp whatever is buffered
    is.clear(ioinfo);                   // restore the saved state
    delete[] dump;                      // array delete, to match new[]
    return is.gcount();                 // number of characters read() consumed
}
Is this a reasonable approach for getting the size of the cin buffer? I know limit is significantly smaller than std::numeric_limits<std::streamsize>::max(), but I couldn't get my compiler to allocate a char buffer that big.
limit is the biggest buffer I could create. Is this machine or implementation dependent? Anything else wrong with my code?
Is this seriously the only way to get the size of the buffer left in cin?
Like I said, that was the largest buffer I could feed into the read() function, and char *dump = new char[std::numeric_limits<std::streamsize>::max()] caused it to crash.
I haven't read your previous threads and probably won't have time to do so until tonight, so I'm not sure exactly what you are trying to do. The constant you have there is larger than will fit into a 32-bit integer, so it will be truncated.
The max for 32-bit is going to be 2 GB for signed and 4 GB for unsigned. You should generally use "size_t" for sizes like this; it is an unsigned integer type wide enough to hold the size of any object. long is usually still 32-bit, same as int, on 32-bit systems, and 64-bit with 64-bit compile options.
Also, you are limited by the amount of memory you actually have available. If you try to use max unsigned int, it will try to allocate 4 GB, which may or may not work depending on your system configuration at run time.