As of C++11, seekg clears the stream's eofbit on its own, so you don't always need the file.clear() first; clear() is still needed if failbit is set.
The point of clear() is that it resets the error flags of the file stream.
If a stream's failure flag is set, it won't extract any more input. One of the ways that flag gets set is by a read that hits the end of the file.
Looping on the actual extraction of the data is always the best approach: as soon as an extraction fails, the loop ends, so you never even get the chance to accidentally use bad data.
Looping on eof() instead can introduce subtle differences in behavior.
For example, compare these two programs:
mainA:
    #include <iostream>
    #include <string>
    #include <fstream>
    using namespace std;

    int main()
    {
        ifstream file("foo.txt");
        while (!file.eof())
        {
            string line;
            getline(file, line);
            cout << line << '\n';
        }
        cout << "<end of program>\n";
    }
mainB:
    #include <iostream>
    #include <string>
    #include <fstream>
    using namespace std;

    int main()
    {
        ifstream file("foo.txt");
        string line;
        while (getline(file, line))
        {
            cout << line << '\n';
        }
        cout << "<end of program>\n";
    }
If foo.txt does not end with a newline, both programs behave the same.
But if foo.txt ends with a newline, mainA prints an extra empty line while mainB does not. The reason: EOF was not tripped by the previous (successful) getline, so the loop condition in mainA still passes and the body runs once more. That final getline fails to extract anything, but nothing checks its result before line is printed, so an empty line comes out. In mainB, the failed getline simply ends the loop.
This matters even more when you're looping over something with a fixed size, like an array: going even one element past the end is a buffer overflow.