[Detector Support]: Openvino and Intel integrated NPU support #13248
-
Describe the problem you are having:
Hello, I have been using Frigate 0.14 with YoloNAS-M (256x256) with great satisfaction. For my use case, I have no false positives and an inference speed below 10 ms on an Intel Core Ultra 7 155H with the iArc GPU. I have been exploring the possibility of using the NPU to offload this task from the GPU.
1. I have successfully downloaded the NPU drivers from:
2. I have an initialized VPU with:
3. I also have an accel0 device passthrough working:
4. When I try to set the detector device to NPU in the Frigate config, I get the following message (see log below):
5. However, when looking inside the container, the plugin seems present:
I observe the same behavior when using: model:
Has anyone tried the integrated Intel NPU with OpenVINO before? Many thanks, Philippe
Version: 0.14.0-da913d8
Frigate config file:
mqtt:
host: 192.168.5.252
port: 1883
user: xxxx
password: xxxx
topic_prefix: frigate
cameras:
Cam_Garage: # <------ Name the camera
ui:
order: 3
ffmpeg:
hwaccel_args: preset-vaapi
inputs:
- path:
rtsp://xxxx:xxxx@192.168.5.204:554/axis-media/media.amp?videocodec=h264&resolution=800x450&fps=5
roles:
- detect
- path:
rtsp://xxxx:xxxx@192.168.5.204:554/axis-media/media.amp?videocodec=h264&resolution=1920x1080&fps=5
roles:
- record
zones:
entire_garage:
inertia: 3
coordinates: 0,450,800,450,800,187,528,50,284,21,0,55
detect:
enabled: true
width: 800
height: 450
fps: 5
motion:
mask: 0.356,0.043,0.663,0.109,1,0.403,1,0,0,0,0,0.117
improve_contrast: true
objects:
track:
- person
- cat
- dog
filters:
person:
threshold: 0.65 # 0.70 for Yolov8s_srt_100eV3. 0.6 for Yolov8s
min_score: 0.55
min_ratio: 0.20
max_ratio: 0.8
snapshots:
required_zones:
- entire_garage
enabled: true
bounding_box: true
retain:
default: 90
record:
enabled: true
retain:
days: 0
# mode: active_objects
events:
retain:
default: 30
mode: active_objects
review:
alerts:
required_zones:
- entire_garage
Cam_Garden: # <------ Name the camera
ui:
order: 1
ffmpeg:
hwaccel_args: preset-vaapi
inputs:
- path:
rtsp://xxxx:xxxx@192.168.5.202:554/axis-media/media.amp?videocodec=h265&resolution=800x600&fps=5
roles:
- detect
- path:
rtsp://xxxx:xxxx@192.168.5.202:554/axis-media/media.amp?videocodec=h265&resolution=1600x1200&fps=5
roles:
- record
zones:
entire_garden:
inertia: 3
coordinates: 0,1,0.71,1,0.835,0.8,0.963,0.474,0.57,0.267,0.287,0.15,0,0.307
detect:
enabled: true
width: 800
height: 600
fps: 5
motion:
mask: 0.29,0.141,0.58,0.264,0.695,0.322,1,0.485,1,0,0.278,0,0,0,0,0.296,0.162,0.212
improve_contrast: true
threshold: 30
contour_area: 10
objects:
track:
- person
- cat
- dog
filters:
person:
threshold: 0.65
min_score: 0.55
min_area: 250
max_area: 10000
snapshots:
required_zones:
- entire_garden
enabled: true
bounding_box: true
retain:
default: 90
record:
enabled: true
retain:
days: 0
# mode: active_objects
events:
retain:
default: 30
mode: active_objects
review:
alerts:
required_zones: entire_garden
Cam_Cave: # <------ Name the camera
ui:
order: 2
ffmpeg:
hwaccel_args: preset-vaapi
inputs:
- path:
rtsp://xxxx:xxxx@192.168.5.203:554/axis-media/media.amp?videocodec=h265&resolution=800x600&fps=5
roles:
- detect
- path:
rtsp://xxxx:xxxx@192.168.5.203:554/axis-media/media.amp?videocodec=h265&resolution=1600x1200&fps=5
roles:
- record
zones:
entire_cave:
inertia: 3
coordinates: 246,600,800,600,756,600,720,349,516,208,223,100,102,108,139,300,105,315,138,405
detect:
enabled: true
width: 800
height: 600
fps: 5
motion:
mask:
- 156,281,135,163,279,162,462,211,800,365,800,0,0,0,0,331
improve_contrast: true
threshold: 20
objects:
track:
- person
- cat
- dog
filters:
person:
threshold: 0.65
min_score: 0.55
min_area: 200
max_area: 14000
mask:
0.17,0.272,0.369,0.278,0.372,0.339,0.384,0.339,0.387,0.289,0.573,0.355,1,0.606,1,0,0,0.002,0,0.552,0.196,0.466
snapshots:
required_zones:
- entire_cave
enabled: true
bounding_box: true
retain:
default: 90
record:
enabled: true
retain:
days: 0
# mode: active_objects
events:
retain:
default: 30
mode: active_objects
review:
alerts:
required_zones:
- entire_cave
Cam_Entree: # <------ Name the camera
ui:
order: 4
ffmpeg:
hwaccel_args: preset-vaapi
inputs:
- path:
rtsp://xxxx:xxxx@192.168.5.201:554/axis-media/media.amp?videocodec=h264&resolution=640x480&fps=5
roles:
- detect
- path:
rtsp://xxxx:xxxx@192.168.5.201:554/axis-media/media.amp?videocodec=h264&resolution=640x480&fps=5
roles:
- record
detect:
enabled: true
width: 640
height: 480
fps: 5
objects:
track:
- person
- cat
- dog
filters:
person:
threshold: 0.65
min_score: 0.55
mask: 1,0.357,1,0,0,0,0,0.321,0.593,0.325,0.601,0.346
snapshots:
enabled: true
bounding_box: true
retain:
default: 90
record:
enabled: true
retain:
days: 0
# mode: active_objects
events:
retain:
default: 30
mode: active_objects
motion:
mask: 0.001,0.002,0.999,0.001,0.999,0.355,0.602,0.346,0.592,0.322,0,0.32
threshold: 40
contour_area: 10
improve_contrast: true
detectors:
ov:
type: openvino
device: NPU
model:
path: /models/yolo_nas_m_256.onnx
model:
model_type: yolonas
width: 256 # <--- should match whatever was set in notebook
height: 256 # <--- should match whatever was set in notebook
input_tensor: nchw
input_pixel_format: bgr
labelmap_path: /models/coco_80cl.txt
version: 0.14
camera_groups: {}
docker-compose file or Docker CLI command:
version: "3.9"
services:
frigate:
container_name: frigate
privileged: true # this may not be necessary for all setups
restart: unless-stopped
image: ghcr.io/blakeblackshear/frigate:0.14.0
shm_size: "256mb" # update for your cameras based on calculation above
devices:
- /dev/bus/usb:/dev/bus/usb # passes the USB Coral, needs to be modified for other versions
- /dev/dri
- /dev/accel
volumes:
- /etc/localtime:/etc/localtime:ro
- /root/frigate/config/:/config/
- /root/frigate:/media/frigate
- /root/frigate/models:/models
- type: tmpfs # Optional: 1GB of memory, reduces SSD/SD Card wear
target: /tmp/cache
tmpfs:
size: 1000000000
ports:
- "15000:5000"
- "19350:1935" # RTMP feeds
environment:
PLUS_API_KEY: xxxx
NEOReadDebugKeys: 1
OverrideGpuAddressSpace: 48
Relevant log output:
2024-08-21 09:28:04.483711629 Process detector:ov:
2024-08-21 09:28:04.484450278 Traceback (most recent call last):
2024-08-21 09:28:04.484477079 File "/usr/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
2024-08-21 09:28:04.484479530 self.run()
2024-08-21 09:28:04.484480738 File "/usr/lib/python3.9/multiprocessing/process.py", line 108, in run
2024-08-21 09:28:04.484485230 self._target(*self._args, **self._kwargs)
2024-08-21 09:28:04.484487045 File "/opt/frigate/frigate/object_detection.py", line 102, in run_detector
2024-08-21 09:28:04.484500125 object_detector = LocalObjectDetector(detector_config=detector_config)
2024-08-21 09:28:04.484501349 File "/opt/frigate/frigate/object_detection.py", line 53, in __init__
2024-08-21 09:28:04.484502754 self.detect_api = create_detector(detector_config)
2024-08-21 09:28:04.484503632 File "/opt/frigate/frigate/detectors/__init__.py", line 18, in create_detector
2024-08-21 09:28:04.484504447 return api(detector_config)
2024-08-21 09:28:04.484505400 File "/opt/frigate/frigate/detectors/plugins/openvino.py", line 43, in __init__
2024-08-21 09:28:04.484519659 self.interpreter = self.ov_core.compile_model(
2024-08-21 09:28:04.484520992 File "/usr/local/lib/python3.9/dist-packages/openvino/runtime/ie_api.py", line 521, in compile_model
2024-08-21 09:28:04.484531392 super().compile_model(model, device_name, {} if config is None else config),
2024-08-21 09:28:04.484540486 RuntimeError: Exception from src/inference/src/cpp/core.cpp:123:
2024-08-21 09:28:04.484542622 Cannot load library '/usr/local/lib/python3.9/dist-packages/openvino/libs/libopenvino_intel_npu_plugin.so': libze_loader.so.1: cannot open shared object file: No such file or directory
Operating system: Proxmox
Install method: Docker Compose
Object Detector: OpenVino
Any other information that may be helpful: Asus NUC Intel Core Ultra 7 155H with iArc GPU.
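The traceback above points at a missing Level Zero loader (libze_loader.so.1) rather than at the NPU plugin itself. A quick way to see what the container can actually use is sketched below; this is a minimal illustration only, and it assumes the container is named frigate as in the compose file above.

# Check that the NPU device node made it into the container
docker exec frigate ls -l /dev/accel/
# Check whether the Level Zero loader that the NPU plugin needs is resolvable
docker exec frigate ldconfig -p | grep libze_loader
# Ask OpenVINO which devices it can actually enumerate
docker exec frigate python3 -c "from openvino.runtime import Core; print(Core().available_devices)"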
Replies: 6 comments 8 replies
-
Bumping this one up...
-
Hi guys. I solved this issue by installing the NPU drivers from Intel and updating OpenVINO to the latest version using pip, and I even got the NPU working once. Unfortunately I didn't take any screenshots to show how it works. The CPU is an Ultra 5 125H. I installed all the drivers inside the Docker container but didn't figure out how to commit the changes and launch it as the HA add-on, so after a container restart I just lost all progress. In my opinion it would be great to add NPU support: it is really easy, works out of the box, and the NPU can be really useful for some specific cases.
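For keeping in-container changes across restarts, one rough option (a sketch only, and not how the HA add-on is normally managed) is to snapshot the modified container into a local image and run that instead of the upstream tag. The image name frigate-npu:local below is illustrative.

# Snapshot the running, already-modified container as a local image
docker commit frigate frigate-npu:local
# Then point whatever launches Frigate at frigate-npu:local instead of
# ghcr.io/blakeblackshear/frigate:0.14.0. A derived image built from a Dockerfile
# that installs the NPU packages is the cleaner long-term approach.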
-
I'm having the same issue: ASUS NUC 14 Pro with an Intel Core Ultra 7 155H processor.
frigate log
frigate config
docker compose yaml
I seem to have no problem accessing libopenvino_intel_npu_plugin.so from inside the container (docker exec).
Is anyone else having the same issue and has found a solution?
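When the plugin file itself is present but detection still fails, the missing piece is usually one of the plugin's own shared-library dependencies rather than the plugin. A minimal check, a sketch only (the path is the one from the log in the original post and may differ on other Frigate versions):

# List unresolved dependencies of the NPU plugin; libze_loader.so.1 showing up
# as "not found" would match the RuntimeError in the original post
docker exec frigate ldd /usr/local/lib/python3.9/dist-packages/openvino/libs/libopenvino_intel_npu_plugin.so | grep "not found"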
-
So I can confirm that the Intel NPU works fine for me. Here are the steps for how I got it working.
I'm not a pro at GitHub, but I'll try to create a pull request to add the Intel NPU drivers to the dev container, because all it takes is installing one package from Debian Bookworm and four packages from Intel. The main two problems I faced are:
It also seems that the detector CPU load shows incorrect data, because htop shows the host load at no more than 18%. It would be awesome to add NPU support, because it has real potential for some cases, along with the feature mentioned elsewhere to allow using a separate detector for a specific camera.
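For reference, the package installation mentioned above would look roughly like the sketch below. The exact package names and versions are assumptions based on the intel/linux-npu-driver releases and change between driver versions, so treat this as an outline rather than exact commands.

# Inside the Frigate container (or a derived image):
apt-get update && apt-get install -y libtbb12    # dependency available from Debian Bookworm
# .deb files downloaded from https://github.com/intel/linux-npu-driver/releases
# and https://github.com/oneapi-src/level-zero/releases (names are assumptions)
dpkg -i intel-driver-compiler-npu_*.deb intel-fw-npu_*.deb intel-level-zero-npu_*.deb level-zero_*.deb
ldconfig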
-
I was able to install the Intel NPU drivers on my host (Proxmox) and pass the device through using the Proxmox GUI device passthrough (/dev/accel/accel0), and I also added it to my container config (a - /dev/accel/accel0 entry under devices:). I have no way to confirm the % utilization when using the NPU argument in the Frigate configuration, but inference speed is around 18 ms. If I use the GPU argument, it goes down to 8 ms. If I use the CPU (I have a Core Ultra 125H), it goes to 4 ms inference time, but the Proxmox GUI CPU utilization percentage increases quite a lot, from 3-4% to 10-15%, with 12 cores allocated to the Frigate container. I have yet to plug in a smart plug to check how many watts my Beelink SEi14 consumes using the NPU vs. the GPU, but looking at inference times alone makes me believe that I don't have to install the Intel NPU drivers inside the container to make it work: it is enough to pass through the hardware and configure it in the Frigate container's config.yml.
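To confirm the host side of the passthrough before looking inside the container, a small sketch (intel_vpu is the kernel driver that exposes /dev/accel on these CPUs; run this on the Proxmox host):

# The intel_vpu kernel module should be loaded and expose the accel node being passed through
lsmod | grep intel_vpu
ls -l /dev/accel/accel0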
-
Here's how I made the integrated NPU on my Intel Core Ultra 5 245K work with Frigate. Other distros should work as well, assuming intel_vpu is supported by the kernel and you have a means of installing the intel-fw-npu package.
DISCLAIMER: Tested inside of an LXC container on Proxmox VE 8.3.4, with an Intel Core Ultra 5 245K, kernel 6.8.12-8-pve, Frigate 0.15.0 release, Intel Linux NPU Driver v1.13.0. I also tested this on Ubuntu 24.04 (skipping over Docker nested inside an LXC container) on the same hardware and it worked the same. I only tested the default SSDlite MobileNet V2 model. See below for my Frigate config.yml file.
DISCLAIMER 2: This changes files inside of the Frigate container image. This means that any re-creation of the container or image update will erase these changes, and you'll need to redo this process, starting from step 3. Hopefully Frigate will be updated in the near future to support Arrow Lake NPUs out of the box without needing this to be done. I would have tried to make my own image, but I'm currently unable to figure out how to recreate the image builder process.
These are the steps I needed to perform:
--- End of steps and additional notes ---
I realize these steps are similar to what has been posted before, but only on 100-series Intel processors. It seems the currently distributed OpenVINO version works for that generation of product but not for Arrow Lake/200-series. I do plan to test switching from the
I had tried both only updating OpenVINO inside the container and only installing the NPU packages inside the container, and each of them by itself would not work. Updating only OpenVINO returned device errors, while installing only the NPU packages kept returning the constant "no available devices" error.
In my testing, the inference time on the Core Ultra 5 245K iGPU is ~4 ms, while the NPU returns times in the range of 12-20 ms. In my setup I'm not noticing a drastic enough difference in performance to be concerned, but it's worth noting. Power consumption for the whole system (measured from the wall with a Sonoff S31 running ESPHome) decreased by 2-3 watts, down from just shy of 60 watts to almost as low as 55 watts. I am willing to put that power usage difference down to simple fluctuations through the day.
In my setup I'm choosing to dedicate the NPU entirely to Frigate inference, so any other applications that run with OpenVINO use the iGPU (such as Immich with its smart search and facial recognition features). I haven't tried any of the semantic search or generative AI features in Frigate, so I'm not sure how they will behave when run on an NPU.
Here is my proof that OpenVINO is making use of the NPU for inference. Note that the
My camera setup:
EDIT: Forgot to mention that the NPU
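Of the two halves described above, the OpenVINO update is the part done with pip inside the container. A hedged sketch follows; upgrading to the latest release with no version pin is an assumption, so pick whichever OpenVINO version supports your NPU generation.

# Inside the Frigate container, after the NPU driver packages are installed
python3 -m pip install --upgrade openvino
# Re-check that the NPU is now enumerated
python3 -c "import openvino as ov; print(ov.Core().available_devices)"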