Frigate is NVR software with motion detection, object detection, recording, etc… It has matured a lot over the past couple of years and I’m really happy with it.

I’ve been running Frigate for a while, but with version 0.17.0 it sounded like things had changed enough for me to update how I do things. I’m writing all of the following in case anyone else is in the same boat. There’s a lot to read, but hopefully it helps make sense of the options.

Keeping my camera feeds the same, I was interested in switching my object detector from a Google Coral to the embedded graphics in my 13th gen Intel CPU. The main reason was that the Google Coral was flaky and I was having to reboot all the time. Maybe that’s because I run Frigate in a virtual machine in Proxmox, so the Coral has to be passed through to the VM? Not sure.

I also wanted to figure out how to get the camera streams to work better in Home Assistant.

Switching from Google Coral to OpenVINO

This was relatively straightforward. I mostly followed these directions and ended up with:

detectors:  
  ov:  
    type: openvino  
    device: GPU  

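For the GPU device to show up to OpenVINO, the container also needs access to the host’s render node. This is a sketch of the Docker Compose addition I’d expect, assuming the usual Intel iGPU device path (check /dev/dri on your host to confirm):

```yaml
services:
  frigate:
    # ...rest of the frigate service unchanged...
    devices:
      # Pass the Intel iGPU render node through so OpenVINO's GPU
      # device is visible inside the container. renderD128 is the
      # typical first render node, but yours may differ.
      - /dev/dri/renderD128
```
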
Switching from the default to YOLOv9

Frigate comes with some default ability to detect objects such as person and car. I kept hearing that YOLOv9 was more accurate, and they even got YOLOv9 working with Google Coral devices, just with a limited set of objects. So, I wanted to switch.

This took me a minute to wrap my head around since it’s not enabled out of the box.

I added the following to my config based on these directions:

model:  
  model_type: yolo-generic  
  width: 320 # <--- should match the imgsize set during model export  
  height: 320 # <--- should match the imgsize set during model export  
  input_tensor: nchw  
  input_dtype: float  
  path: /config/model_cache/yolo.onnx  
  labelmap_path: /labelmap/coco-80.txt  

… except for me the yolo file is called yolov9-t-320.onnx instead of yolo.onnx… but I could have just as easily renamed the file.

That brings us to the next part – how to get the yolo.onnx file. It’s a bit buried in the documentation, but I ran the commands provided here. I just copied the whole block of provided commands and ran them all at once. The result is an .onnx file in whatever folder you’re currently in.

The .onnx file needs to be copied to /config/model_cache/, wherever that might be based on your Docker Compose.
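Where /config actually lives depends on your volume mapping. A hypothetical Compose fragment (the host path here is just an example):

```yaml
services:
  frigate:
    volumes:
      # Whatever host path you map to /config is where
      # model_cache/ ends up.
      - /opt/frigate/config:/config
```

With a mapping like that, the exported file would land at /opt/frigate/config/model_cache/yolo.onnx on the host.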

That made me wonder about the other file, coco-80.txt. Well, it turns out coco-80.txt is already included inside the container, so nothing to do there. That file is handy though, because it lists 80 possible things that you can track. Here’s the list on github.
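Those label names are what you’d reference under `objects` in the Frigate config to limit what gets tracked. A minimal sketch using a few labels from that list:

```yaml
objects:
  track:
    - person
    - car
    - dog
```
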

I won’t go over the rest of the camera/motion configuration, because if you’re doing this then you definitely need to dive into the documentation for a bunch of other stuff.

Making the streams work in Home Assistant

I’ve had the Frigate integration running in Home Assistant for a long time, but clicking on the cameras only showed a still frame, and no video would play.

Home Assistant is not on the same host as Frigate, by the way; otherwise I’d have an easier time with this. But that’s not how mine is set up.

It turns out my problem was caused by using go2rtc in my Frigate setup. go2rtc is great and acts as a re-streamer, which can reduce bandwidth, something that matters especially for WiFi cameras. But it’s optional, and I learned that I don’t want it.

go2rtc should work with Home Assistant if they’re both running on the same host (same IP address), or if you run the Docker stack with network_mode: host so it has full access to everything. I tried doing that, but for some reason Frigate got into a boot loop, so I changed it back to the bridge network that I had previously.
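For reference, host networking in Compose replaces the published ports entirely; if you try this, the ports section should come out. A rough sketch:

```yaml
services:
  frigate:
    network_mode: host
    # No ports: section here -- with host networking the container
    # shares the host's network stack, so port mappings don't apply
    # (Compose may warn or error if both are present).
```
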

The reason for this, apparently, is that go2rtc needs more ports than the ones the documentation says to publish in Docker. Maybe it uses random ports or some other network magic. I’m not sure.

The downside of not having go2rtc is that the camera feeds in the Frigate UI are limited to 720p. I can live with that. The feeds in Home Assistant are still full quality, and recordings are still full quality.

By removing go2rtc from my config, Home Assistant now streams directly from the cameras themselves instead of looking for the go2rtc restream. You may have to click “Reconfigure” in the Home Assistant integration for the API to catch up.
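For anyone doing the same, a Frigate camera block without go2rtc just points its inputs at the camera’s RTSP URLs directly. A hypothetical example (camera name, address, credentials, and stream paths are all made up):

```yaml
cameras:
  front_door:
    ffmpeg:
      inputs:
        # Full-resolution main stream, used for recordings
        - path: rtsp://user:pass@192.168.1.50:554/stream1
          roles:
            - record
        # Lower-resolution substream, used for detection
        - path: rtsp://user:pass@192.168.1.50:554/stream2
          roles:
            - detect
```
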

Hope this helps. If not, sorry you had to read all of this.

  • CmdrShepard49@sh.itjust.works (6 hours ago)

    I also have frigate on proxmox with a Google coral but mine has been rock solid. The only difference is that I use an LXC instead of a VM. I recall there being more issues passing hardware to VMs in Proxmox since they don’t like to share.

  • frongt@lemmy.zip (7 hours ago)

    Yeah you probably need to pass the tpu to the VM directly. But openvino on CPU has been just fine for me.

    Although I’ve noticed in 0.17, it’s started complaining that ov takes a long time, with an absurdly large value in ms. Nothing seems to be broken, and restarting the container clears it.

  • jake_jake_jake_@lemmy.world (7 hours ago)

    I use Frigate and HomeAssistant on different hosts, and the only port allowed from Frigate to HomeAssistant is the non-auth API port. For normal users using Frigate, I use an Oauth2-proxy instance on the same host (same compose) as Frigate, tied to a third host with Keycloak. go2rtc is on the Frigate host, but it only talks to Frigate and the cameras themselves. You can also access go2rtc from outside if you want the streams directly, but your HomeAssistant does not need to. I find this better than hitting the cameras directly, as their processing is not really meant for a whole bunch of streams at once.

    I followed docs for the HomeAssistant to Frigate stuff with the GIF notifications and it is working fine. I also use the Frigate integration (using HACS) so maybe there is a lot done for me.