Yes, this is a recipe for extremely slow inference: I’m running a 2013 Mac Pro with 128 GB of RAM. I’m not optimizing for speed, I’m optimizing for aesthetics and intelligence :)

Anyway, what model would you recommend? I’m looking for something general-purpose but with solid programming skills. Ideally abliterated as well — I’m running this locally, so I might as well have all the freedoms. Thanks for the tips!

  • humanspiral@lemmy.ca
    12 days ago

On principle, I didn’t like running a model that will refuse to talk about things China doesn’t like.

A good way to define a political issue is that there are at least two sides to a narrative. You can’t use an LLM to decide which side to favour, just as you can’t really use Wikipedia for that either. It takes deep expertise and an open mind to determine which side is more likely to contain more truth.

You may or may not seek confirmation of your political views, but the media you follow should do that more than an LLM does; arguably, a better LLM is one that avoids confirming or denying your views at all.