GTX 1650 video card + Blue Iris + CodeProject.AI - no GPU mode, only CPU?
-
I had Object Detection (.NET) working with the integrated GPU previously, so I switched to a GTX 1650 video card. I removed the .NET module and went with the 6.2 one. I also ran the CUDA installer and the batch file installer per the CodeProject.AI download page. I've restarted the PC and I can't get it to go into GPU mode. Any thoughts out there?
I had moved this thread to the other forum, but for some reason I can't find it, so here is the solution that worked for me:
1. Uninstall CodeProject.AI
2. Uninstall CUDA 11.7
3. Uninstall the graphics card drivers
4. Reboot
5. Install the 516.94 graphics drivers
6. Install CUDA 11.7.0
7. Install CodeProject.AI 2.0.8
8. Run the cuDNN .bat file (https://www.codeproject.com/ai/docs/faq/gpu.html)
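For anyone retracing those steps: here's a small stdlib-only sanity check I'd run after the installs (a sketch, assuming a standard Windows CUDA 11.7 install, which sets the CUDA_PATH environment variable; the cuDNN DLL name and location are assumptions based on cuDNN 8 being dropped into CUDA's bin folder):

```python
# Check what the CUDA installer is supposed to have left behind on Windows.
# CUDA_PATH normally points at something like
#   C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7
import os

cuda_path = os.environ.get("CUDA_PATH")
print("CUDA_PATH:", cuda_path or "not set - CUDA toolkit install incomplete?")
if cuda_path:
    # Assumption: the cuDNN .bat from the FAQ copies cudnn64_8.dll into CUDA's bin
    dll = os.path.join(cuda_path, "bin", "cudnn64_8.dll")
    print("cuDNN DLL present:", os.path.isfile(dll))
```

If CUDA_PATH is missing or the DLL check prints False, the toolkit/cuDNN half of the reinstall is the place to look before blaming CodeProject.AI.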
-
One suggestion, but first a question: does anything else on your box use the GPU in this expected manner? If this is a testing-the-waters type of scenario, wouldn't it be better to get tried-and-true stuff running first? I have no peers here on CP and see myself as somewhat of a pariah when I post, but I'm sure that, somewhere in the expanse of time between using the CPU and using the GPU in realtime computer use (1995-present), sticking to letting the CPU enslave the GPU is in the questioner's best interests.
-
Well, I can use the 1650 as the primary GPU without issue. In this situation I'm using the built-in one for the LCD. Both drivers show up in Device Manager, and I had no issues with the two CUDA installers either.
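To confirm the 1650 is actually reachable from code (not just visible in Device Manager), a quick sketch that shells out to nvidia-smi, which the NVIDIA driver install normally puts on PATH (that PATH entry is an assumption about this box):

```python
# Ask the NVIDIA driver which GPUs it can see. If the 1650 is missing here
# while the integrated GPU drives the LCD, the driver install is the likely
# culprit rather than CodeProject.AI itself.
import shutil
import subprocess

smi = shutil.which("nvidia-smi")
if smi is None:
    print("nvidia-smi not on PATH - NVIDIA driver install looks broken")
else:
    out = subprocess.run(
        [smi, "--query-gpu=name,driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout.strip())  # e.g. "NVIDIA GeForce GTX 1650, 516.94"
```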
-
Perhaps ask your original question here then: CodeProject.AI Discussions[^]. I've observed that reposting on CP will result in boiled tempers, so, since you're moving to a better venue, perhaps adding some keywords to better explain AND keeping your grammar right (like capitalizing your first-person pronouns) will help someone there come to the rescue.
-
Ah, I don't know how I ended up in the wrong forum again; thanks. And who uses caps for pronouns in 2023, especially from a phone? :P :)
-
Quote: "does anything else on your box use the GPU in this expected manner?"
Yes ... without doing anything, my Windows C# UWP app utilizes the GPU; only Edge and VS are also using it.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I