Memory leak
-
Hi, I've designed a program that runs as a server. When a connection request comes in, the program opens a thread that handles the commands for that connection. This new thread constructs a class that holds all the logic for the thread, and it also constructs a class that handles the connection events (sending information, encrypting it, and so on). The thread also receives a pointer to a "database server" (a class that handles all commands for the database) and passes it to the constructed class. When a connection is made, a Session struct is also constructed (this holds all info about the connection: login name, login password, and so on). The session is allocated with "new", and when the thread ends the session is "delete"d so it doesn't remain in memory. If I run this program and log in and out a few hundred times, the memory taken by the server seems to keep growing... What can I do to prevent this? When the thread ends, is the constructed class fully removed from memory, or do I have to delete it myself? Thread code:
```cpp
UINT ClientThreadRecv(LPVOID Client)
{
    CCrypt Crypt;                        // class for decrypting info
    char Block[BlockSize];               // memory block for receiving data
    CString Data, TData;                 // the final variable for storing the received data
    int loc;
    int namelen = sizeof(sockaddr_in);
    sockaddr_in name;
    char *ip;
    SOCKET client = (SOCKET)Client;      // socket that the thread handles
    getpeername(client, (sockaddr *)&name, &namelen);
    ip = inet_ntoa(name.sin_addr);       // client ip

    CGebruikersStruct *Session = new CGebruikersStruct; // the session, later on stored in a linked list, that's why the new
    Session->client = client;
    Session->LoginNaam = "";
    Session->LoginPass = "";
    Session->LoggedIn = FALSE;
    Session->Rechten = 0;
    Session->IPadres = ip;

    CClientSocket ClientSocket;          // global point for socket info / sending data
    ClientSocket.Session = Session;
    ClientSocket.DatabaseServer = DatabaseServer; // the pointer for the database
    CClientClass ClientClass;
    ClientClass.ClientSocket = &ClientSocket;
    ClientSocket.ConnectionBegin();      // a function that tells the client the server is ready

    while (true)
    {
        loc = recv(client, Block, BlockSize - 1, 0); // receiving the data (leave room for the terminator)
        if ((loc == -1) || (loc == 0))   // end thread if socket is disconnected
        {
            break;
        }
        Block[loc] = 0;
        Crypt.DeCrypt(Block, Block);     // decrypting the received data
        TData += Block;                  // add all data into 1 big v
```
Sorry, I found the memory leak. It wasn't the code in the thread; it was a piece of code in the check for the right IP. I used a linked list to check whether the IP that tries to log in has been banned from the server. It turned out I kept deleting the IP from the database even when it wasn't in the database. This also explains why the log-on procedure took about 20 to 40 ms, while I'm only working with linked lists. I changed the code to delete the IP from the database only if it is found in the linked list (a mirror of the database), and now my logon procedure takes < 5 ms (the computer says it takes 0 ms so... ). []D [] []D []