TheCornCollector@piefed.zip to LocalLLaMA@sh.itjust.works · English · edited 2 days ago
Qwen3.6 27B released (huggingface.co)
Abrinoxus@thelemmy.club · edited 3 hours ago
KoboldCpp is an easy way to get into running local LLMs: they have executables for Linux, macOS, and Windows on their GitHub, and a simple GUI for loading a model.
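For reference, launching it from the command line looks roughly like this (the model path is a placeholder, and the exact flags may differ by version, so check `--help` on your build):

```shell
# Hypothetical example: start KoboldCpp serving a GGUF model on port 5001.
# The model filename is a placeholder; verify flags with ./koboldcpp --help.
./koboldcpp --model ./models/some-model.gguf --port 5001
```

Once it's up, the built-in web UI is available in your browser on that port.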
sexual_tomato@lemmy.dbzer0.com · 1 hour ago
I use Ollama to run it, LiteLLM to put an OpenAI-compatible API in front of it, and then use it via any app that can talk to an OpenAI API.
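To illustrate what "talking to an OpenAI API" means here, a minimal sketch of building a chat-completions request against a local proxy (the base URL and model name `qwen3` are placeholders; adjust them to whatever your LiteLLM/Ollama setup exposes):

```python
# Sketch: call a local OpenAI-compatible endpoint using only the stdlib.
# Assumes a proxy (e.g. LiteLLM in front of Ollama) at http://localhost:4000/v1;
# the URL and model name are placeholders, not defaults from either project.
import json
import urllib.request

def build_chat_request(base_url, model, prompt):
    """Build an OpenAI-style /chat/completions request (not yet sent)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("http://localhost:4000/v1", "qwen3", "Hello!")
# resp = urllib.request.urlopen(req)  # uncomment once the proxy is running
```

Any client that speaks this request shape (chat UI, editor plugin, script) works the same way against the local proxy as it would against a hosted API.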