cross-posted from: https://lemmy.ml/post/43700680

“A terminal tool that right-sizes LLM models to your system’s RAM, CPU, and GPU. Detects your hardware, scores each model across quality, speed, fit, and context dimensions, and tells you which ones will actually run well on your machine.”

  • toothbrush@lemmy.blahaj.zone · 4 points · 2 hours ago

    I would greatly prefer a simple calculator app or website that you give a model and it tells you how it would run, but I suppose in the era of vibecoding, making small, functional, well-designed software has become a distant dream.

    (and the readme tells you to curl and execute a shell script? no thanks)
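    For reference, the calculation such a site would do can be sketched roughly like this (my own back-of-the-envelope assumption of the math, not what this tool actually scores):

    ```python
    # Rough sketch: estimate the memory needed to load an LLM's weights
    # at a given quantization. The 20% overhead factor for runtime
    # buffers and KV cache is an assumption, not a measured value.

    def estimate_model_memory_gb(params_billion: float,
                                 bits_per_weight: float = 4.0,
                                 overhead: float = 1.2) -> float:
        """Weights-only estimate, padded by an assumed runtime overhead."""
        weight_bytes = params_billion * 1e9 * bits_per_weight / 8
        return weight_bytes * overhead / 1e9

    # A 7B model at 4-bit quantization comes out to roughly 4.2 GB,
    # so it fits on an 8 GB GPU with room for context.
    print(round(estimate_model_memory_gb(7), 1))
    ```

    Anything more precise has to account for context length and the model's attention layout, which is presumably what the tool's "fit" and "context" scores are doing.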

  • lime!@feddit.nu · 11 points · 3 hours ago

    feels like this could be made a lot safer by just making it a website where you enter your specs

    • vermaterc@lemmy.ml OP · 7 points · 3 hours ago

      True, I don’t like the curl [something] | sh pattern for installation. Running it is like giving a random guy from the internet control of your PC so he can download some binaries. I’m seeing this trend more and more in GitHub repos.
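      A safer alternative is to fetch the script to a file, verify and read it, and only then run it. A minimal sketch (the curl URL is hypothetical; the stand-in file takes its place so the rest is runnable):

      ```shell
      # Instead of `curl ... | sh`: download first, verify, inspect, then run.
      # curl -fsSL -o install.sh https://example.com/install.sh   # hypothetical URL

      printf 'echo install ok\n' > install.sh   # stand-in for the downloaded script

      # Compare against the checksum the project publishes, if it publishes one.
      expected=$(sha256sum install.sh | awk '{print $1}')
      echo "$expected  install.sh" | sha256sum -c -

      # Actually read what it does before executing it.
      cat install.sh

      sh install.sh
      ```

      It's more typing, but it turns "pipe a stranger's code into your shell" into "run a script you've at least looked at".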

      • lime!@feddit.nu · 4 points · 3 hours ago

        It’s just going to get more common with vibe coding taking over, since if you ask a model to write a readme with an installation section, that’s what you’ll get.