The Qwen3.5 models are still the best local models I’ve used, so I’m excited to see how this updated version performs.

    • MalReynolds@slrpnk.net · edited 4 hours ago
      Had a quick squiz at it, and if it meets your needs I’d just wait, unless you want to get into the lower levels of things (and run Linux). Keep in mind llama-swap is just an inference server; you’ll still need something like Open WebUI on top of it for a chat interface, etc.
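
      For what it’s worth, a minimal sketch of that pairing: running Open WebUI in Docker and pointing it at llama-swap’s OpenAI-compatible endpoint. The port 8080 for llama-swap is an assumption (it depends on your own config), and the volume/container names are just illustrative:

      ```shell
      # Run Open WebUI and point it at llama-swap's OpenAI-compatible API.
      # Assumes llama-swap is already listening on the host at port 8080
      # (adjust to whatever your llama-swap config actually uses).
      docker run -d -p 3000:8080 \
        -e OPENAI_API_BASE_URL=http://host.docker.internal:8080/v1 \
        --add-host=host.docker.internal:host-gateway \
        -v open-webui:/app/backend/data \
        --name open-webui \
        ghcr.io/open-webui/open-webui:main
      ```

      After that, the chat UI is at http://localhost:3000 and llama-swap handles loading/swapping the actual models behind the scenes.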