I’m connecting to llama.cpp on my laptop from my phone via Tailscale, but when my laptop sleeps I can’t access it from my phone anymore.
What are y’all using for this? Thanks!
I use my homeserver for it. It’s located in the broom closet and on 24/7. But there are ways to do it with a laptop. You can inhibit standby and let it run continuously. Or configure Wake-on-LAN and wake it up before you use it… I mean, a switched-off computer obviously can’t do any computation.
Disable sleep/hibernate on your laptop. Disable sleep/hibernate/poweroff when lid is closed. If you’re going to treat your laptop as a server, probably better to configure it as a server.
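To sketch what “configure it as a server” might look like on a systemd-based Linux distro: logind controls the lid and idle behavior, so a drop-in config like the one below (file path and option names are from systemd-logind; adjust for your distro, and restart `systemd-logind` or reboot afterwards) keeps the laptop awake with the lid closed.

```ini
# /etc/systemd/logind.conf.d/laptop-server.conf
# Treat the laptop like a headless server: ignore the lid and never idle-suspend.
[Login]
HandleLidSwitch=ignore
HandleLidSwitchExternalPower=ignore
IdleAction=ignore
```

You’d still want to disable suspend/hibernate in your desktop environment’s power settings too, since those can act independently of logind.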
When your laptop is sleeping, it can’t be doing computation. I mean, that’s not specific to LLMs. You’re going to have to set it in your OS to not sleep if you want to remotely access it, or to sleep under conditions that don’t arise for you (e.g. “don’t sleep if the lid is closed while the laptop is plugged in” or whatever).
The only exception I can think of would be if you (a) have it connected to wired Ethernet and (b) the laptop can do Wake-on-LAN; in that scenario, you could rig something up where it wakes up when something tries to contact it. In practice, you probably just don’t want to have it sleeping.
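To illustrate the Wake-on-LAN idea: waking the machine is just broadcasting a “magic packet” (6 bytes of 0xFF followed by the target’s MAC address repeated 16 times) on the LAN. A minimal sketch in Python, using only the standard library (the MAC address in the usage note is a placeholder; WoL also has to be enabled in the laptop’s firmware/NIC settings, and the sender has to be on the same network segment):

```python
import socket


def build_magic_packet(mac: str) -> bytes:
    """Magic packet format: 6 bytes of 0xFF, then the target MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16


def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet over UDP (port 9, the 'discard' port, is conventional)."""
    packet = build_magic_packet(mac)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))
```

Usage would be something like `send_wol("aa:bb:cc:dd:ee:ff")` with your laptop’s wired NIC address, run from another always-on device on the LAN before you want to connect.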




