Help -- something faster than getline() ??
-
Hi -- I find the getline() function to be extremely slow when reading data from a text file. I tried replacing getline with a homemade function called getChunk (shown below), but it's just as slow.

    void getChunk(std::istream &in, std::string &s, const char terminator)
    {
        s.erase(s.begin(), s.end()) ;
        s.reserve(10) ;
        std::string::value_type ch ;
        while (in.get(ch) && ch != terminator)
            s.insert(s.end(), ch) ;
    }

For context, here's the core of my program:

    rawData.reserve(numberRecords) ;
    int counter = 0 ;
    do
    {
        counter = counter + 1 ;
        std::vector<std::string> record ;
        record.reserve(numberVars) ;
        for (int i = 0 ; i < numberVars ; i++)
        {
            // read the next field into record with getChunk
        }
        rawData.push_back(record) ;
        std::vector<std::string>::iterator j ;
        j = record.begin() ;
        record.erase(j, j + numberVars) ;
    } while (counter < numberRecords) ;
    numberRecords = rawData.size() ;

Can anyone help me out here?

Thanks, Kim Larsen
-
webHamlet wrote:
while (in.get(ch) && ch != terminator)
It seems you are reading the data one byte at a time, which will slow things down no matter what else you do. If I were you, I would read a large chunk (5-10 KB, maybe) of data from the file into a memory buffer, and then read and interpret the text data from that buffer.
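Something along these lines, as a rough sketch -- the 8 KB buffer size, the "data.txt" file name, and the tab/newline terminators are my assumptions, not from your post:

    #include <fstream>
    #include <iostream>
    #include <string>
    #include <vector>

    int main()
    {
        std::ifstream in("data.txt", std::ios::binary);   // file name assumed
        std::vector<char> buffer(8 * 1024);               // one 8 KB chunk at a time
        std::vector<std::string> fields;                  // parsed fields
        std::string field;                                // field being assembled

        // Pull the file in big chunks, then scan each chunk in memory.
        while (in.read(&buffer[0], buffer.size()) || in.gcount() > 0)
        {
            std::streamsize n = in.gcount();              // bytes actually read
            for (std::streamsize i = 0; i < n; ++i)
            {
                char ch = buffer[i];
                if (ch == '\t' || ch == '\n')             // terminator: end the field
                {
                    fields.push_back(field);
                    field.clear();
                }
                else
                    field += ch;                          // still inside a field
            }
        }
        if (!field.empty())                               // trailing field with no terminator
            fields.push_back(field);

        std::cout << fields.size() << " fields read\n";
    }

One big read is far cheaper than thousands of one-byte get() calls, so this alone usually makes a noticeable difference.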
"Do first things first, and second things not at all." — Peter Drucker.
-
Are you sure that it's the file I/O that's the problem? Maybe you should, for testing purposes, isolate the file reading from the storing of the data, and then compare the file read times.
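For instance, a quick timing harness like this (a minimal sketch; the "data.txt" name and the use of std::clock are just my assumptions for illustration):

    #include <ctime>
    #include <fstream>
    #include <iostream>
    #include <string>

    int main()
    {
        std::ifstream in("data.txt");    // assumed file name
        std::string line;
        long lines = 0;

        std::clock_t start = std::clock();
        while (std::getline(in, line))   // read only: no parsing, no storing
            ++lines;
        std::clock_t stop = std::clock();

        std::cout << lines << " lines in "
                  << double(stop - start) / CLOCKS_PER_SEC << " s\n";
    }

If the read-only pass is already slow, the bottleneck really is the I/O; if it's fast, look at how the records are being stored instead.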
-
Hi. You could try reading the file using the old f-functions, like fopen, fgets, and fclose. Read the data into a char[] and then assign it to the string. It might help. In principle:

    char buffer[300];
    std::string s;
    FILE *file = fopen("filename", "rt");
    if (file != NULL)
    {
        while (fgets(buffer, 300, file) != NULL)
        {
            s = buffer;   // ..... save it here
        }
        fclose(file);
    }
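This tends to help because fgets pulls a whole line out of stdio's internal buffer in one call, instead of fetching characters one at a time through the stream. Just be aware that fgets will split any line longer than the buffer across two reads, so size the buffer for your longest expected line.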