fopen

I am having a hard time finding the information I need on fopen and the C style input/output to a file.

I am creating a database for product information and am going to store a large amount of data in a txt file. Each product will have its own struct that gets written to the file, the struct containing 10 or so int/double values. No strings.

The problem is that the number of values each product writes to the file can differ. That's where I am having the problem. I.E.

ProductInfo.txt
10003, 25, 192, 2, 104, 9494, 1008, 1234;
10004, 83, 124, 8, 102, 3929, 1028;
10005, 43, 123, 5, 293, 1838, 1029, 1982, 1924, 1053;
10006, 84, 152, 3, 192, 3532, 1031, 3913;
etc...


There will be around 50,000 lines in the file. I know how to read each number into a variable, but I do not know how to make the program stop once it hits the end-of-line ";" and start over on the next line, with the next product.

Also, will reading in 50k structs and keeping them in memory use up much memory in a program like this, or should I just try to find a way to search the file for what I need when I need it?
As this is a C++ forum, I have to ask: what's wrong with iostreams?

With C++ iostreams, you can create an ifstream to read each line into a string, and then use an istringstream to parse the individual lines.

50K items is nothing on computers these days.
I figured 50k wasn't much, especially with ints. Although when I try to compile an array of 100k structs, I get a compile error saying the array is too large!
// Read everything up to and including next newline
fscanf( fp, "%*[^\n]\n" );


Good grief, why not offer some actual help people? We do C just as easily as C++.

dmoore210
100k * 4 bytes == too big for contiguous memory. Use an array of arrays, each individually malloc()ed -- that allows for the OS/freestore management to spread things around so that it all fits.

To read the line, first get the entire line into a char[] or allocated char*. You can use fgets() directly or you can use something like I posted here:
http://www.cplusplus.com/forum/beginner/4174/page1.html#msg18271

Then you can use strchr() to find the semicolon (';') and replace it with a null ('\0') -- thus getting rid of it and ignoring everything on the line that follows.

After that, use strchr() or strtok() or the like to split up each field, and strtol() or the like to convert each value to an integer, which you can stick into your 2D array.

Hope this helps.
I think the compile error comes from the fact that a 100,000-element array is way too large for the stack.
You can just do int *array = malloc(100000 * sizeof(int));


I think that's bad advice, helios.

Use a vector<int>.

Except I can't tell whether he's using C or C++. His post is ambiguous.
I agree that the standard containers would be a better choice, but his post suggests C over C++.
However, I would not use a vector for the same reason that malloc(100000*sizeof(int)) would fail -- vectors use contiguous storage. Instead, use a deque<int>, which, like an array of arrays, allows the OS/freestore management to make things fit.
Anything but tiny embedded devices can allocate 400K to 800K of contiguous memory with no problem. Helios is correct. The issue is almost certainly that he is trying to allocate it all on the stack. Vector solves the problem by allocating it on the heap.
I am using mostly C with a little C++ for flavor ;)

Yeah, I'm experimenting with a few different ways to achieve what I need. The main reason I was creating such a large array is that the file contains ALL the product information to date: the serial number of each product, then all the subsequent data needed to calibrate the product, etc. (typically around 10 or so ints' and doubles' worth of data).

Is there a way -- while reading a file -- to insert new data in the middle of the file so that everything below it gets moved down? That would prevent me having to load such huge arrays of data. I would just create a few search functions to search through the file until I find what I need, make any changes to the specific line(s), and add lines if necessary.
Is there a way -- while reading a file -- to insert new data in the middle of the file so that everything below it gets moved down?
That's not appending, and no, there's no way.

Sounds to me like what you need is a database. Why not take a look at SQLite?
A std::deque or std::list is ideal for what you want to do with your existing data. Load it all into memory, modify it in memory, then write it all back out to file.

I also like helios's suggestion that you use a SQL database. It requires a little work to set it up, but you can store much more usefully organized data. You will have to decide whether or not that is overkill for your needs.

Hope this helps.
Yeah, it's a bit overkill, but what's wrong with going over the top if it works and works well? I'd also like to experiment with SQLite, mostly because I've never really messed around with any other libraries before (except PDCurses).

Which, actually, I'm having trouble with. I can't seem to get any SQLite programs to run. I'm not sure how to link the SQLite library into my Code::Blocks project...