I've posted in the CodeProject.AI forum, thank you. How do I delete this post? I cannot see any option that allows me to delete it.
Zz0K2 – Posts
CUDA Errors with YOLOv5 Object Detection CPAI 2.5.x / Blue Iris – Seeking Insights
I'm reaching out to share a perplexing issue I've encountered with the integration of CPAI with my BI setup, hoping to find out whether anyone else has experienced something similar or could offer any insights. The problem first manifested around 01:45 am on 14/02/2024 and, despite troubleshooting efforts, it recurred this morning, indicating a persistent underlying issue.

Initially, the system logs from 14th February showed an error related to CUDA, specifically mentioning "an illegal memory access was encountered". This issue caused a loop of errors until a system reboot was performed at 9:06 am. Here is the exact log entry for reference:
2024-02-14 01:39:57: Object Detection (YOLOv5 6.2): Retrieved objectdetection_queue command 'custom' in Object Detection (YOLOv5 6.2)
2024-02-14 01:39:57: Object Detection (YOLOv5 6.2): Detecting using ipcam-combined in Object Detection (YOLOv5 6.2)
2024-02-14 01:39:57: Response received (#reqid 85bde494-89d3-429d-a21b-c10b9430c5a8 for command custom)
2024-02-14 01:39:57: Object Detection (YOLOv5 6.2): [RuntimeError] : Traceback (most recent call last):
File "C:\Program Files\CodeProject\AI\modules\ObjectDetectionYOLOv5-6.2\detect.py", line 141, in do_detection
det = detector(img, size=640)
File "C:\Program Files\CodeProject\AI\runtimes\bin\windows\python37\venv\lib\site-packages\torch\nn\modules\module.py", line 1190, in _call_impl
return forward_call(*input, **kwargs)
File "C:\Program Files\CodeProject\AI\runtimes\bin\windows\python37\venv\lib\site-packages\torch\autograd\grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "C:\Program Files\CodeProject\AI\runtimes\bin\windows\python37\venv\lib\site-packages\yolov5\models\common.py", line 669, in forward
with dt[0]:
File "C:\Program Files\CodeProject\AI\runtimes\bin\windows\python37\venv\lib\site-packages\yolov5\utils\general.py", line 158, in __enter__
self.start = self.time()
File "C:\Program Files\CodeProject\AI\runtimes\bin\windows\python37\venv\lib\site-packages\yolov5\utils\general.py", line 167, in time
torch.cuda.synchronize()
File "C:\Program Files\CodeProject\AI\runtimes\bin\windows\python37\venv\lib\site-packages\torch\cuda\__init__.py", line 566, in synchronize
return torch._C._cuda_synchronize()
RuntimeError: CUDA error: an illegal memory access was encountered
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
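In case it helps anyone debugging the same thing: because CUDA kernel errors are reported asynchronously, the traceback above may not point at the call that actually failed. Below is a minimal diagnostic sketch, assuming the pip "yolov5" package and a CUDA-enabled PyTorch build similar to the module's stack; the weights and image paths are placeholders rather than anything from my CPAI configuration. Setting CUDA_LAUNCH_BLOCKING=1 makes the error surface at the offending kernel launch instead of at a later synchronize().

# Diagnostic sketch: force synchronous CUDA error reporting so an
# "illegal memory access" is raised at the offending kernel launch rather
# than at a later torch.cuda.synchronize() call.
# Assumptions: pip "yolov5" package + CUDA-enabled PyTorch; the weights and
# image paths below are placeholders.
import os

# Must be set before any CUDA work happens in torch.
os.environ["CUDA_LAUNCH_BLOCKING"] = "1"

import torch
import yolov5  # pip install yolov5

def run_once(weights: str = "yolov5s.pt", image: str = "snapshot.jpg") -> None:
    model = yolov5.load(weights)  # placeholder weights file
    model.to("cuda" if torch.cuda.is_available() else "cpu")
    try:
        results = model(image, size=640)  # same 640 px size detect.py uses
        results.print()
    except RuntimeError as err:
        # With CUDA_LAUNCH_BLOCKING=1 the traceback points at the real call.
        print(f"CUDA failure surfaced at the offending call: {err}")

if __name__ == "__main__":
    run_once()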
GPU Query for CPAI & Blue Iris

Hello everyone, I'm new to this forum and I'm currently in the midst of a decision-making process regarding the optimal GPU for running CPAI on Blue Iris, particularly for a single-camera setup at my home. The primary purpose is to detect human presence while effectively filtering out false triggers due to weather conditions.

At present, I own an RTX 3080, which I'm considering repurposing for this task, especially since I'm contemplating an upgrade to one of the RTX 4000-series Super cards in the near future. However, I'm deliberating whether the 3080 might be overkill for my specific requirements. After substantial research and discussions on the Blue Iris forum, the consensus appears to be in favour of the GTX 1650. It's been recommended as a more than adequate solution for CPAI, offering sufficient processing speed while maintaining lower power consumption.

My current setup relies solely on CPU processing, resulting in a latency of about 120-200 ms for image processing. In contrast, I came across a post on the Blue Iris forum indicating that the GTX 1650 could potentially reduce this latency to around 30 ms. This substantial improvement naturally piques my interest. However, I can't help but wonder about the potential benefits of deploying my RTX 3080 for this purpose. Would there be a significant advantage in terms of processing speed or efficiency? I've noticed that a fellow member, MikeLud, is conducting tests with an RTX 4090, which adds another layer of curiosity regarding the performance spectrum of these GPUs.

While I'm currently leaning towards the GTX 1650, primarily due to its power efficiency and seemingly adequate capabilities for my needs, I'm eager to hear your thoughts and experiences. Has anyone here used an RTX 3080/3080 Ti for a similar setup? If so, what were your observations regarding latency, power consumption, and overall performance? Your insights and any additional information you can provide would be greatly appreciated. I'm looking to make the most informed decision to ensure efficient and effective surveillance at my home.

Thank you in advance for your valuable input!

Best regards ;)
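P.S. For anyone wanting to put numbers on the CPU-vs-GPU comparison on their own hardware, here is a rough benchmark sketch. It times plain PyTorch YOLOv5 inference only, not the full CPAI/Blue Iris pipeline, so treat the results as indicative; the model choice (yolov5s via torch.hub) and the image path are placeholder assumptions.

# Rough latency benchmark sketch: CPU vs GPU YOLOv5 inference.
# Measures raw PyTorch inference only, not the CPAI/Blue Iris pipeline;
# the hub model and image path are illustrative assumptions.
import time

import torch

def benchmark(device: str, runs: int = 50, image: str = "snapshot.jpg") -> float:
    model = torch.hub.load("ultralytics/yolov5", "yolov5s")  # downloads yolov5s
    model.to(device)

    # Warm-up pass so CUDA context creation / autotuning isn't timed.
    model(image, size=640)

    start = time.perf_counter()
    for _ in range(runs):
        model(image, size=640)
        if device == "cuda":
            torch.cuda.synchronize()  # wait for the GPU before reading the clock
    return (time.perf_counter() - start) / runs * 1000  # ms per image

if __name__ == "__main__":
    print(f"CPU : {benchmark('cpu'):.1f} ms/image")
    if torch.cuda.is_available():
        print(f"GPU : {benchmark('cuda'):.1f} ms/image")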