Olá
Elsewhere, I’ve been building a behaviour-shaping harness for local LLMs. Somewhere in that process I thought, “well, why not share what the voices inside my head are saying”.
With that energy in mind, may I present Clanker Adjacent (name chosen because apparently I sound like a clanker - thanks, Lemmy! https://lemmy.world/post/43503268/22321124)
I’m going for a long-form, conversational tone on LLM nerd-core topics; or at least the ones that float my boat. If that’s something that interests you, cool. If not, cool.
PS: I promise the next post will be “Show me your 80085”.
PPS: Not a drive by. I lurk here and get the shit kicked out of me over on /c/technology
That looks interesting. Any guides for running this in a docker compose stack with Ollama and Open WebUI? I want to experiment on a 6th-gen i5 mini PC.
Noob here.
Done
I’ll give you the noob-safe walk-through, assuming you’re starting from zero:
- Install Docker Desktop (or Docker Engine + Compose plugin).
- Clone the repo:
git clone https://codeberg.org/BobbyLLM/llama-conductor.git
- Enter the folder and copy the env template:
cp docker.env.example .env (Windows: copy manually)
- Start the core stack:
docker compose up -d
- If you also want Open WebUI:
docker compose --profile webui up -d
Included files:
- docker-compose.yml
- docker.env.example
- docker/router_config.docker.yaml
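Since you asked about a stack with Ollama and Open WebUI, here is a rough sketch of how the three pieces could be wired together. The Ollama and Open WebUI images, ports, and the OLLAMA_BASE_URL variable are their standard documented ones; everything about the conductor service (build context, service name) is my assumption — the repo’s shipped docker-compose.yml is the source of truth.

```yaml
# Sketch only -- the conductor service details are assumptions;
# the docker-compose.yml shipped in the repo is the real reference.
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama-data:/root/.ollama
    ports:
      - "11434:11434"          # Ollama's default API port

  conductor:
    build: .                   # assumption: build from the cloned repo root
    env_file: .env
    depends_on:
      - ollama

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    profiles: ["webui"]        # only starts with: docker compose --profile webui up -d
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"            # Open WebUI listens on 8080 inside the container
    depends_on:
      - ollama

volumes:
  ollama-data:
```

The `profiles: ["webui"]` line is what makes the `--profile webui` flag in the steps above optional: without the flag, compose skips that service entirely.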
Noob-safe note for older hardware:
- Use smaller models first (I’ve given you the exact ones I use as examples).
- You can point multiple roles to one model initially.
- Add bigger/specialized models later once stable.
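To make the “point multiple roles to one model” idea concrete, something like the shape below — the key names and model name here are made up purely for illustration; the shipped docker/router_config.docker.yaml is the real reference for the actual schema:

```yaml
# Illustrative only -- these key names are invented for the example;
# see docker/router_config.docker.yaml for the real config format.
roles:
  summarizer: qwen2.5-3b-instruct   # all three roles share one small model
  planner:    qwen2.5-3b-instruct   # ...until the box proves it can handle more
  coder:      qwen2.5-3b-instruct
```

Once that runs comfortably on the i5, swap individual roles over to bigger or specialized models one at a time.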
Docs:
- README has Docker Compose quickstart
- FAQ has Docker + Docker Compose section with command examples
Yes, if you mean llama-conductor, it works with Open WebUI, and I’ve run it with OWUI before. I don’t currently have a ready-made Docker Compose stack to share, though.
https://github.com/BobbyLLM/llama-conductor#quickstart-first-time-recommended
There are more fine-grained instructions in the FAQ:
https://github.com/BobbyLLM/llama-conductor/blob/main/FAQ.md#technical-setup
PS: it will work fine on your i5. I tested it the other week on an i5-4785T with no dramas.
PPS: I will try to get some help setting up a docker compose over the weekend. I run bare metal, so it will be a bit of a learning curve. Keep an eye on the FAQ / What’s New (I’ll announce it there if I manage to figure it out).

