[Solved] Too much memory used?

Hi cpp-Community,
until now I've managed to solve every problem just by reading posts, but this time I can't find anyone who has had the same problem, so I hope you can help me :-)

I just want to read in a wordlist. Normally this shouldn't be a problem, but this time it is!
#include <iostream>
#include <cstdlib>
#include <string.h>
#include <fstream>
#include <windows.h>

using namespace std;

int main(int argv, char* argc[])
{
	//Info: There are 1275 Lines - The longest strings have 9 chars
	ifstream wl;
	wl.open(L"wordlist.txt");
	char buffer[1275][10];
	ZeroMemory(&buffer, sizeof(buffer));
	for (int i=0; i<1275; i++)
	{
		wl.getline(buffer[i], 9, '\n');
		cout << buffer[i] << endl;
	}
	return 0;
}


It compiles fine. But when I try to execute the program in cmd, I only see some of the words (the first 971, 6486 chars in total), and after that there are just empty lines instead of words.
Also, when I redirect the output with >> test.txt, there are 971 lines with words (in line 971 the last char of the word is missing), followed by a lot of empty lines...

To solve the problem I already tried to split the array like this:

#include <iostream>
#include <cstdlib>
#include <string.h>
#include <fstream>
#include <windows.h>

using namespace std;

int main(int argv, char* argc[])
{
	//Info: There are 1275 Lines - The longest strings have 9 chars
	ifstream wl1,wl2,wl3;
	wl1.open(L"wl1.txt");
	wl2.open(L"wl2.txt");
	//wl3.open(L"wl3.txt");
	char buffer[500][10];
	char buffer1[500][10];
	//char buffer2[275][10];
	ZeroMemory(&buffer, sizeof(buffer));
	for (int i=0; i<500; i++)
	{
		wl1.getline(buffer[i], 9, '\n');
		cout << buffer[i] << endl;
		wl2.getline(buffer[i+500], 9, '\n');
		cout << buffer[i+500] << endl;
		//wl3.getline(buffer[i+1000], 9, '\n');
	}
	//string inc = argc[1];
	return 0;
}

The compiler doesn't report any problem, but when I execute it there's a runtime error. The debugger says there is memory corruption.

Can anybody help me? I really don't know how to go on... I just want to write a program that can unscramble 10 words with the help of the wordlist. But if I can't read in the wordlist, the whole program doesn't make any sense.

I have 4 GB of RAM in my computer, so it shouldn't be a problem for the OS to give the program some more memory.

Thanks for reading and thanks in advance for helping me ;-)

Greetz from Germany!

Kevin
You shouldn't allocate such large objects on the stack, because the stack is quite small even if you have lots of RAM. Use the heap instead:

std::vector<char> buffer(1275 * 10);
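
For example, the read loop could index into that flat heap buffer like this (a minimal sketch, assuming the same "wordlist.txt" with 1275 lines of at most 9 characters, each line getting a 10-byte slot):

#include <fstream>
#include <iostream>
#include <vector>

int main()
{
	std::ifstream wl("wordlist.txt");
	std::vector<char> buffer(1275 * 10, '\0');  // heap allocation instead of char buffer[1275][10]
	for (int i = 0; i < 1275 && wl.good(); i++)
	{
		wl.getline(&buffer[i * 10], 10, '\n');  // read into the i-th 10-byte slot (9 chars + '\0')
		std::cout << &buffer[i * 10] << std::endl;
	}
	return 0;
}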
Thank you very much for your fast reply, Abramus!

I have never worked with vectors, so I'm not sure whether it's my fault or something else, but the program still doesn't work.

#include <iostream>
#include <cstdlib>
#include <string.h>
#include <fstream>
#include <windows.h>
#include <vector>

using namespace std;

int main(int argv, char* argc[])
{
	//Info: There are 1275 Lines - The longest strings have 9 chars
	ifstream wl;
	wl.open("wordlist.txt");
	vector<char*> buffer;
	char line[9];
	int i=0;
	while (wl.good())
	{
		wl.getline(line, 9);
		buffer.push_back(line);
		cout << buffer[i] << endl;
		i++;
	}
	return 0;
}


The problem is still the same... it reads 971 lines and then it stops :(

If it's wrong to use char* instead of char, please tell me how to get the lines from the wordlist into the buffer using a vector<char> buffer(1275 * 9) :)

Kevin
char* is a pointer to an array of characters. You cannot use it as the vector element type here, because in the next iteration of the loop you overwrite the buffer that the inserted pointer refers to. If you want to read exactly 9 characters in each iteration, then you can do something like this:

struct line_t
{
	char data[9];
};

...

	vector<line_t> buffer;
	line_t line;
	int i=0;
	while (wl.good())
	{
		wl.getline(line.data, 9);
		buffer.push_back(line);
		cout << buffer[i].data << endl;
		i++;
	}
Thank you again for your fast answer, Abramus.

But the problem still exists...
971 lines (or 6486 chars) and then it stops :-(

There must be another solution, because I don't think this is such a big problem...
I've read that the heap is divided into blocks of memory. Is there any way to get a block of 12 kB or something like that?

By the way: the longest words in the list consist of 9 characters. There are also words with fewer characters, but that shouldn't be a problem.

Kevin
What do you mean by "it stops"?

Can you post the contents of the file near line 971?
Normally the program should print every line of the wordlist in the cmd window, because of:
cout << buffer[i].data << endl;
Everything works until it reaches line 971. But then it stops and the program terminates as if it had already reached the end.

Here's a part of the wordlist:
midnight
dwight
hotdog
indigo
kingdom
snoopdog
gordon
phurivdli    // This is line 971 (the second 'i' of phurivdli is already missing in the console)
judith
dolphin
dolphins
hotrod
dorothy
midori


Kevin
It looks like you need 10 characters for a line (9 characters of the word + the terminating '\0'). Can you change the definition of line_t and check the program once again?
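
That change would look roughly like this (just a sketch; note that the size passed to getline should match the new array size, since getline counts the terminating '\0' in it):

struct line_t
{
	char data[10];   // 9 characters of a word + terminating '\0'
};

...

		wl.getline(line.data, 10);   // 10 leaves room for a full 9-character word plus '\0'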

BTW, it would probably be better to use vector<string>, as that way there wouldn't be any limit on the maximum word length.
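
A minimal sketch of that variant (assuming the same "wordlist.txt"), using std::getline from <string> so there is no fixed length limit per word:

#include <fstream>
#include <iostream>
#include <string>
#include <vector>

int main()
{
	std::ifstream wl("wordlist.txt");
	std::vector<std::string> buffer;
	std::string line;
	while (std::getline(wl, line))   // stops cleanly at end of file
	{
		buffer.push_back(line);
		std::cout << line << std::endl;
	}
	return 0;
}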
Oh my god!!!
You're right! Thank you very much!

phurivdli is the first word with 9 characters in the list, but there wasn't enough space in the variable. Now I've defined char data[10]; and it works :)

Thank you very much, Abramus!

Kevin
Topic archived. No new replies allowed.