Can this be the cause of a memory leak?
-
for (int i = 0; i < 3; i++)
{
    vector<thing*>* Nodes = new vector<thing*>();
    thing* Athing = new thing();
    Nodes->push_back(Athing);
    // do stuff
}

Do I need to delete the vector once the work is done, or is this not required? How about the things stored inside the container: do I need to delete those too, or is calling clear() enough?

for (int i = 0; i < 3; i++)
{
    vector<thing*>* Nodes = new vector<thing*>();
    thing* Athing = new thing();
    Nodes->push_back(Athing);
    // do stuff
    Nodes->clear();
    delete Nodes;
}

-
Calin Negru wrote:
do I need to delete the vector once the work is done?

for (int i = 0; i < 3; i++)
{
    vector<thing*>* Nodes = new vector<thing*>();
    thing* Athing = new thing();
    Nodes->push_back(Athing);
    // do stuff
    Nodes->clear();
    delete Nodes;
}
Yes, you do. Since you created the vector with new, you need to delete it once the work is done.
-
On top of what Victor said, look into using std::unique_ptr instead of "naked" pointers. Something like this:

auto Nodes = new std::vector<std::unique_ptr<thing>>();
Nodes->push_back(std::make_unique<thing>());
// do stuff
delete Nodes;

Edit: It is a bit unusual to "new" vectors. Given they can grow dynamically, in most cases you would write something like:

std::vector<std::unique_ptr<thing>> nodes;
nodes.push_back(std::make_unique<thing>( /* constructor params go here */ ));
/* nodes[0] is a (unique) pointer to a thing */
// do more stuff

When nodes goes out of scope, all the things get deleted automatically.

Mircea
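To see the whole idea end to end, here is a minimal, self-contained sketch (it assumes the same thing class from the question, given a printing constructor and destructor so the cleanup is visible, and it needs C++14 for std::make_unique):

#include <iostream>
#include <memory>
#include <vector>

class thing
{
public:
    thing()  { std::cout << "thing ctor\n"; }
    ~thing() { std::cout << "thing dtor\n"; }
};

int main()
{
    for (int i = 0; i < 3; i++)
    {
        // the vector lives on the stack; each thing is owned by a unique_ptr
        std::vector<std::unique_ptr<thing>> nodes;
        nodes.push_back(std::make_unique<thing>());
        // do stuff
    }   // nodes goes out of scope here: the things and the vector are cleaned up
}

Running it should print a matching "thing dtor" for every "thing ctor", with no delete anywhere in sight.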
-
Yes, you have to explicitly delete the vectors you allocated using new. But that's not enough. Try running the following code:

#include <iostream>
#include <vector>

using namespace std;

class thing
{
public:
    thing()  { cout << "thing ctor\n"; }
    ~thing() { cout << "thing dtor\n"; }
};

int main()
{
    for (int i = 0; i < 3; i++)
    {
        vector<thing*>* Nodes = new vector<thing*>();
        thing* Athing = new thing();
        Nodes->push_back(Athing);
        // do stuff
        Nodes->clear();   // removes the pointers, but does not delete the things
        delete Nodes;     // frees the vector, not what its elements point to
    }
}

"In testa che avete, Signor di Ceprano?" -- Rigoletto
-
Quote:
Yes, you do. Since you created the vector with new, you need to delete it once the work is done.
What happens if I don't delete the vector? Will that cause a memory leak, or is it just allocated memory that is not used and takes extra space? The program doesn't break with an error when I use source code version No. 1. I have unexpected behavior somewhere else in my program and I was wondering if this could be the cause of it.
-
With a memory leak, the memory footprint of the running process increases over time. If the conditions that lead to the leak are encountered often enough, the process will eventually run out of available memory, usually causing an exception of some sort. Unless you're in a very specialized environment, the memory associated with the process gets released when it exits. That means that the overall memory on the system doesn't get incrementally consumed over time. So you really only run the risk of the one process running out of memory, not the system as a whole. I hope I've explained that clearly enough.

Some system calls (and some user-written functions!) use this to their advantage. On the first call they will allocate some memory, and then reuse it on successive calls. With no cleanup routine, they just rely on the program exit to do the memory release. That's why you might get notice of memory in use at exit when running a program under a memory diagnostic tool like Valgrind. When you trace back to where the memory was allocated, it might be inside something like fopen().

"A little song, a little dance, a little seltzer down your pants" Chuckles the clown
-
Thanks k5054, I think I understand. What I described in the first example is a memory leak, but it is probably not causing problems elsewhere.
-
Thank you guys for your feedback. I think I understand what a memory leak is now.
-
Quote:
Thanks k5054, I think I understand. What I described in the first example is a memory leak, but it is probably not causing problems elsewhere.
If by "not causing problems elsewhere" you mean it's not affecting other processes, that's mostly true. You can, of course, run into an Out Of Memory (OOM) situation, where all the RAM and swap is marked as "in use", and Bad Things start happening. Assuming you've got a 64 bit OS with lots of RAM and swap configured, (heck, even a 32 bit OS with good Virtual Memory), that's only likely to happen if you've got a *lot* of memory allocated. As a rule of thumb, you should clean up memory when it's no longer needed. Think of it like craftsmanship. A piece of Faberge jewelry shows attention to detail from both the back and the front. Freeing up unused memory is part of the attention to detail, just like closing files after use, for example.
"A little song, a little dance, a little seltzer down your pants" Chuckles the clown
-
If by "not causing problems elsewhere" you mean it's not affecting other processes, that's mostly true. You can, of course, run into an Out Of Memory (OOM) situation, where all the RAM and swap is marked as "in use", and Bad Things start happening. Assuming you've got a 64 bit OS with lots of RAM and swap configured, (heck, even a 32 bit OS with good Virtual Memory), that's only likely to happen if you've got a *lot* of memory allocated. As a rule of thumb, you should clean up memory when it's no longer needed. Think of it like craftsmanship. A piece of Faberge jewelry shows attention to detail from both the back and the front. Freeing up unused memory is part of the attention to detail, just like closing files after use, for example.
"A little song, a little dance, a little seltzer down your pants" Chuckles the clown
Quote:
If by "not causing problems elsewhere" you mean it's not affecting other processes, that's mostly true
I mean it's not like when you write to an array out of bounds and while doing so you're overwriting stuff that has nothing to do with the array.
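For contrast, here is a tiny sketch of the out-of-bounds case (the write is undefined behavior, so what actually gets trampled depends on the compiler and the memory layout, but this is the kind of bug that corrupts unrelated data rather than just wasting memory):

#include <iostream>

int main()
{
    int values[3] = { 1, 2, 3 };
    int unrelated = 42;

    // Out-of-bounds write: index 3 is one past the end of 'values'.
    // This is undefined behavior; on many builds it silently overwrites
    // memory that belongs to something else entirely, which is why the
    // symptoms tend to appear far away from the buggy line.
    values[3] = 999;

    std::cout << unrelated << '\n';   // might print 42, might print 999, might crash
}

A leaked vector, by comparison, only wastes memory; it never scribbles over other data, so it rarely explains unexpected behavior elsewhere.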
-
Ah, memory leaks (and small buffer overruns). I would point out that your example is fairly obvious: if you have a long running task, and this code is in some sort of processing loop, you'll see it quickly. What will really bite you in the a$$ are the small leaks. A byte here, a byte there.

I live in the embedded world, where customers forget our equipment was installed under their production line 10 years ago. The engineer responsible either died, retired, or moved on to another company. I'm not being morbid, I have stories I could tell you :) The group I work in is a decent group of smart people. Sadly, they never let us into the field to see how the product is actually used. The few times I've seen examples, they always shock me with an "I didn't see that coming" sort of response. What amazes me is that if your customer never turns off your machine, why not set up an area where a test unit runs forever? I guess it falls under diminishing returns.
Charlie Gilley “They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759 Has never been more appropriate.