Recently a user commented on one of my posts, claiming that Qwen secretly sends information over the internet even when run locally.
Is there any privacy concern that locally run models could share your conversations or data? And what if they can connect to the internet via a tool or MCP?


As far as I know, any GGUF file should be completely safe on its own: it's just data (model weights and metadata), not an executable program. There were some bugs and security vulnerabilities in llama.cpp early on, but those have been fixed, and I think the project has a good track record overall.
Issues might come later, if you run agents on top of the model and give them access to your computer or the internet. But you don't have to do that. If you just talk to it, I don't see any reason to be alarmed, other than the usual stuff: keep using your own brain once in a while, and don't blindly trust what AI chatbots tell you; they give inaccurate information all the time 😅