Qwen3.6-35B-A3B released (huggingface.co)
TheCornCollector@piefed.zip to LocalLLaMA@sh.itjust.works, English · edited 7 hours ago
The Qwen3.5 models are still the best local models I’ve used, so I’m excited to see how this updated version performs.
altphoto@lemmy.today · 5 hours ago
Okay, thanks for the help! I'll give llama-swap a shot.
MalReynolds@slrpnk.net · edited 4 hours ago
Had a quick squiz at it, and if it meets your needs I'd just wait, unless you want to get into the lower levels of things (and run Linux). Note that llama-swap is just an inference server; you'll still need something like Open WebUI on top of it for a chat interface.
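For anyone trying the llama-swap route, a minimal sketch of what a `config.yaml` might look like, assuming a local GGUF of the model from this post. The model name, file path, and `llama-server` location are placeholders, and the `models`/`cmd`/`${PORT}` layout is based on the llama-swap README, so check the current docs before relying on it:

```yaml
# Hypothetical llama-swap config.yaml sketch -- not a drop-in config.
# Model name and path are placeholders; adjust to your setup.
models:
  "qwen3.6-35b-a3b":
    # llama-swap launches this command on demand and substitutes
    # ${PORT} with the port it assigns to the backend.
    cmd: >
      llama-server
      --model /path/to/Qwen3.6-35B-A3B.gguf
      --port ${PORT}
```

With something like this running, a chat frontend such as Open WebUI would then be pointed at llama-swap's OpenAI-compatible endpoint (e.g. an `/v1` URL on whatever host/port llama-swap is listening on) rather than at `llama-server` directly.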