Application losing a CPU while running
-
Hi, I'm having some problems. We have an application that we made, and when it runs on a dual-CPU machine, we lose one CPU after a while. After checking, we found out that the process affinity had been changed. We can set it back using Task Manager, only to lose it again later. Nowhere in our code do we set the process affinity, and it seems to happen only when the CPUs are running at 100%. Can Windows (this was on Windows 2000, Service Pack 3) decide on its own to change an application's CPU access? Thanks, Jason Phillips
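For what it's worth, restoring the mask from code instead of Task Manager is straightforward with the Win32 GetProcessAffinityMask/SetProcessAffinityMask calls. Here is a minimal watchdog sketch; the 5-second poll interval is arbitrary, and it assumes the goal is simply to get the full system mask back:

#include <windows.h>
#include <stdio.h>

int main()
{
    HANDLE self = GetCurrentProcess();
    DWORD_PTR desiredMask = 0, systemMask = 0;

    // Capture the full system mask once; on a dual-CPU box this is 0x3.
    if (!GetProcessAffinityMask(self, &desiredMask, &systemMask))
        return 1;
    desiredMask = systemMask;  // we want to keep access to every CPU

    for (;;)
    {
        DWORD_PTR currentMask = 0, ignored = 0;
        Sleep(5000);  // arbitrary poll interval

        if (!GetProcessAffinityMask(self, &currentMask, &ignored))
            continue;

        // If something external shrank our affinity, log it and put it back.
        if (currentMask != desiredMask)
        {
            printf("Affinity changed to 0x%lx, restoring 0x%lx\n",
                   (unsigned long)currentMask, (unsigned long)desiredMask);
            SetProcessAffinityMask(self, desiredMask);
        }
    }
    return 0;
}

This doesn't explain *why* the mask is changing, of course, but logging each change with a timestamp could help correlate it with whatever else is happening on the box at 100% CPU.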
-
I would worry if you have both CPUs at 100% on a dual machine. I would not be surprised if the OS stepped in and changed the settings to make sure your app did not kill the box. Is this Win2k Server or Pro?
Paul Watson wrote: "At the end of the day it is what you produce that counts, not how many doctorates you have on the wall."
George Carlin wrote: "Don't sweat the petty things, and don't pet the sweaty things."