I’m connecting to llama.cpp on my laptop from my phone via Tailscale, but when my laptop sleeps I can’t access it from my phone anymore.

What are y’all using for this? Thanks!

  • tal@lemmy.today · 7 hours ago

    When your laptop is sleeping, it can’t be doing computation. I mean, that’s not specific to LLMs. You’re going to have to configure your OS not to sleep if you want to remotely access it, or to sleep only under conditions that don’t arise for you (e.g. “don’t sleep if the lid is closed while the laptop is plugged in” or whatever).
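
    As a sketch of what that looks like (the exact knobs depend on your OS and desktop environment, so treat these as starting points rather than a definitive recipe):

    ```shell
    # Linux with systemd-logind: keep running when the lid is closed on AC power.
    # Edit /etc/systemd/logind.conf and set:
    #   HandleLidSwitchExternalPower=ignore
    # then apply it:
    sudo systemctl restart systemd-logind

    # macOS: never let the system sleep while plugged in
    # (-c applies the setting only to charger/AC power)
    sudo pmset -c sleep 0
    ```

    On Windows the equivalent lives under Power Options (“When I close the lid” → “Do nothing” while plugged in).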

    The only exception I can think of would be if you (a) have it connected to wired Ethernet and (b) the laptop can do Wake-on-LAN; in that scenario, you could rig something up where it wakes up when something tries to contact it. In practice, you probably just don’t want to have it sleeping.
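
    If you do go the Wake-on-LAN route, the “magic packet” format is simple enough to send yourself: 6 bytes of 0xFF followed by the target’s MAC address repeated 16 times, broadcast over UDP. One caveat: the broadcast has to originate on the laptop’s own LAN, so in practice you’d trigger it from some always-on device on that network (a Pi, a router, etc.) rather than directly over Tailscale. A minimal sketch (the MAC address is a placeholder):

    ```python
    import socket

    def make_magic_packet(mac: str) -> bytes:
        # Magic packet: 6 bytes of 0xFF, then the MAC repeated 16 times.
        mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
        if len(mac_bytes) != 6:
            raise ValueError("MAC must be 6 bytes")
        return b"\xff" * 6 + mac_bytes * 16

    def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
        # Broadcast the packet on the local subnet (UDP port 9 is conventional).
        packet = make_magic_packet(mac)
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(packet, (broadcast, port))

    # Usage (placeholder MAC — use your laptop's wired NIC's address):
    # send_wol("aa:bb:cc:dd:ee:ff")
    ```

    The laptop’s BIOS/UEFI and its Ethernet driver both need WoL enabled for this to actually wake it.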