I built a note-taking app because the one I wanted didn’t exist. Clean UI, local .md files, no cloud, no account.

Built with Rust + Tauri 2.0 + SvelteKit. Full-text search powered by Tantivy. Graph view, AI writing tools (bring your own key), Obsidian import, version history.

Available for Linux (AppImage, APT, AUR), Windows, and macOS. Source: https://codeberg.org/ArkHost/HelixNotes

  • teawrecks@sopuli.xyz · 6 hours ago

    I see on the page it says you can bring an Anthropic or OpenAI key. Can I also point it at my own locally hosted model?

    • ArkHost@lemmy.world (OP) · 6 hours ago

      Not at this moment. Which local model would you like to see as an additional option?

      • teawrecks@sopuli.xyz · 6 hours ago

        I don’t know what is typical, but when I use AI locally I’ve been running llama.cpp with models grabbed from Hugging Face (e.g. QwenCoder). Then in my VS Code plugin (RooCode) I use the “OpenAI compatible” option to point it at my local server.

        Not sure how hard that would be to get working, but my hope is that an “OpenAI compatible” option would cover it.
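
For context, the “OpenAI compatible” pattern described above mostly comes down to sending the standard chat-completions JSON to a different base URL. A minimal sketch (the port assumes llama.cpp’s `llama-server` default of 8080, and the model name is illustrative, not something the app defines):

```python
import json
import urllib.request

# llama.cpp's llama-server exposes an OpenAI-style /v1/chat/completions
# endpoint, so "OpenAI compatible" really just means swapping the base URL.
BASE_URL = "http://localhost:8080/v1"  # assumed default llama-server port

def build_chat_request(prompt: str, model: str = "qwen2.5-coder") -> urllib.request.Request:
    """Build the same request an OpenAI client would send, aimed at a local server.

    The model name is illustrative; llama-server typically serves whatever
    GGUF file it was started with regardless of this field.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer no-key",  # local servers usually ignore the key
        },
        method="POST",
    )

req = build_chat_request("Summarize this note in one sentence.")
print(req.full_url)  # → http://localhost:8080/v1/chat/completions
# To actually send it (with llama-server running): urllib.request.urlopen(req)
```

Because the request shape is identical to the hosted APIs, a client that already supports a user-supplied base URL can target llama.cpp, Ollama, or any other OpenAI-compatible server without new code paths.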