Time of reading text file
-
Hi, I wrote a program on the Windows CE platform (CPU: ARM, 427 MHz). When the program reads a text file (about 700 KB, 1000 lines) from flash memory, it takes about 25 seconds to load the file. Is there a way to decrease the reading time?
file.Open(fileName, CFile::typeBinary | CFile::modeRead);
while ( UNReadString(&file, strLine) ) {
    //strLine.TrimLeft(); strLine.TrimRight();
    strLine.Trim();
    strNum = strLine.Tokenize(TEXT("\t"), tokenPos);
    strEng = strLine.Tokenize(TEXT("\t"), tokenPos); // Get English text
    strCh  = strLine.Tokenize(TEXT("\t"), tokenPos);
    strRus = strLine.Tokenize(TEXT("\t"), tokenPos);
    handleStr();
    tokenPos = 0;
}

BOOL UNReadString(CFile *iFile, CString &strReturn)
{
    TCHAR tc;
    CString strBuff = _T("");
    strReturn = _T("");
    while ( sizeof(TCHAR) == iFile->Read(&tc, sizeof(TCHAR)) ) // EOF reached?
    {
        strBuff = tc;
        if (_T("\n") != strBuff) {
            strReturn += strBuff;
        } else {
            return TRUE; // Reached "\n", 1 line added to strReturn
        }
    }
    return FALSE;
}
-
econy wrote:
file.Open(fileName, CFile::typeBinary | CFile::modeRead);
Why do you open a text file in binary mode? You will also find that your use of Trim and Tokenize on each line adds considerably to the time. -
The file contains East Asian text, which is why it is opened in binary mode. I tried commenting out the Tokenize block, but there was no improvement in the load time.
-
Sorry, I was not clear; I meant that I could not find a definition for that function in MSDN, so I do not know what it does. Is it a standard part of Windows CE, and if so, where is the documentation for it?
-
The first thing you've got to do is work out whether it's I/O bound or whether it's the data processing that's slowing things down. You might find that CFile isn't particularly well tuned for reading from flash and isn't buffering enough.

If it turns out that the I/O is causing the problem, find the lowest-level function you can call, measure the raw I/O speed of your device for a range of buffer sizes, and then implement something that hits the sweet spot in terms of buffer size. It's a shame you're using MFC classes and not standard library ones, as it's a lot easier to replace the underlying buffering and I/O mechanism, but such is life! Another thing to try is reading more data in your while loop (say 512 characters at a time) and manually parcelling out the data, rather than letting CFile::Read do it one character at a time.

If it turns out that the I/O is fast enough, then have a look at what you're doing with the data. If you've got a lot of string handling, you might find loads of temporaries flying about (especially if you're using VC++ 2008 or earlier), and there's loads of hidden memory manipulation that can slow things down. In that case, look at reducing the number of temporaries you need by reusing objects and exploiting things like NRVO.

Anyway, the main thing is to find out where the bottleneck is, then do something about it.
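The chunked-read suggestion above can be sketched with the standard library (which also sidesteps the one-Read-per-character pattern in the original code). This is a minimal sketch, not the poster's actual code: it assumes a narrow-character file and a hypothetical 64 KB buffer; the right buffer size for a given flash device has to be measured, as the answer says.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Read the whole file in large chunks and split it into lines in memory,
// instead of issuing one tiny read per character. One fread call per 64 KB
// replaces tens of thousands of per-character reads.
std::vector<std::string> ReadLinesBuffered(const char* path)
{
    std::vector<std::string> lines;
    std::FILE* f = std::fopen(path, "rb");
    if (!f) return lines;

    std::string current;
    char buf[64 * 1024];              // assumed buffer size -- tune per device
    size_t n;
    while ((n = std::fread(buf, 1, sizeof(buf), f)) > 0) {
        for (size_t i = 0; i < n; ++i) {
            if (buf[i] == '\n') {     // line complete
                if (!current.empty() && current.back() == '\r')
                    current.pop_back();   // strip the CR of a CRLF ending
                lines.push_back(current);
                current.clear();
            } else {
                current += buf[i];
            }
        }
    }
    if (!current.empty())
        lines.push_back(current);     // final line without a trailing '\n'
    std::fclose(f);
    return lines;
}
```

The same idea carries over to CFile: call Read with a large buffer and scan for newlines yourself, keeping any partial line at the end of one buffer as the prefix of the next.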