I’m connecting from my phone to llama.cpp on my laptop via Tailscale, but when the laptop sleeps I can’t reach it anymore.

What are yall using for this? Thanks!

  • hendrik@palaver.p3x.de
    4 hours ago

    I use my homeserver for it. It’s located in the broom closet and on 24/7. But there are ways to do it with a laptop. You can inhibit standby and let it run continuously. Or configure Wake-on-LAN and wake it up before you use it… I mean, a switched-off computer obviously can’t do any computation.
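    For the Wake-on-LAN route, the magic packet is simple enough to send yourself from a small script (e.g. from another always-on box on the same LAN). A minimal sketch, assuming the laptop's NIC has WoL enabled in firmware/OS; the MAC address below is a placeholder:

    ```python
    import socket

    def build_magic_packet(mac: str) -> bytes:
        """A WoL magic packet is 6 x 0xFF followed by the target MAC repeated 16 times."""
        mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
        if len(mac_bytes) != 6:
            raise ValueError("expected a 6-byte MAC address")
        return b"\xff" * 6 + mac_bytes * 16

    def send_magic_packet(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
        """Broadcast the magic packet over UDP (port 9 is the conventional discard port)."""
        packet = build_magic_packet(mac)
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(packet, (broadcast, port))

    # Placeholder MAC — replace with the laptop's wired NIC address:
    # send_magic_packet("aa:bb:cc:dd:ee:ff")
    ```

    Caveat: the packet has to reach the laptop's local broadcast domain, so sending it straight over Tailscale from the phone usually won't work on its own — you'd relay it through another device on the home LAN. WoL is also far more reliable over Ethernet than Wi-Fi.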