• 0 Posts
  • 24 Comments
Joined 2 years ago
Cake day: July 1st, 2023



  • SGforce@lemmy.ca to LocalLLaMA@sh.itjust.works · My AI Skeptic Friends Are All Nuts
    edited 1 month ago

    Local models are not capable of coding yet, despite what the benchmarks say. Even when they understand what you’re trying to do, they spew out so many syntax errors and tool-calling problems that it’s a complete waste of time. But if you’re using an API, then I don’t see why you’d pick one editor over another. They differ in implementation but generally pull off the same things.





  • Space Empires V

    It’s very old, unfinished, and jank as fuck. The AI was never very good and could be steamrolled easily with the right tech tree. But those first few turns, exploring and setting up colonies without knowing exactly which tech your nearest rivals had or whether they were planning an invasion, were always very fun. Then it would turn into a tedious logistics game of moving your fleets around or decommissioning ships that took the majority of the game to build.

    Also, Space Rangers 2.

    It’s like an amalgam of an arcadey space shooter that is somehow turn-based, plus a space-RPG text adventure. It was always very buggy, with a UI that is ugly as hell.












  • The technology for quantisation has improved a lot this past year, making very small quants viable for some uses. I think the general consensus is that an 8-bit quant is nearly identical to the full model, and a 6-bit quant can feel so close that you may not even notice any loss of quality.

    Going smaller than that is where the real trade-off occurs. 2–3-bit quants of much larger models can absolutely surprise you, though they will probably be inconsistent.

    So it comes down to the task you’re trying to accomplish. If it’s programming related, go 6-bit and up for consistency, with the largest coding model you can fit. If it’s creative writing or something similar, a much lower quant of a larger model is the way to go, in my opinion.
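
    For a rough sense of why bit-width decides what fits on your hardware, here’s a back-of-the-envelope sketch (my own illustration, not a real quant-format calculator; it assumes size ≈ parameters × bits ÷ 8 and ignores KV cache, activation memory, and the mixed bit-widths real GGUF quants use):

    ```python
    # Rough memory estimate for a quantized model.
    # Assumption: size ≈ parameter_count * bits_per_weight / 8 bytes.
    # Real quant files differ somewhat (metadata, mixed-precision layers).

    def approx_size_gib(params_billion: float, bits_per_weight: float) -> float:
        """Approximate in-memory size of a quantized model, in GiB."""
        bytes_total = params_billion * 1e9 * bits_per_weight / 8
        return bytes_total / 2**30

    for bits in (16, 8, 6, 3):
        print(f"70B model at {bits}-bit: ~{approx_size_gib(70, bits):.0f} GiB")
    ```

    Under that assumption, a 70B model drops from roughly 130 GiB at 16-bit to about 25 GiB at 3-bit, which is why a heavily quantized large model can fit where a lightly quantized one can’t.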