• 0 Posts
  • 26 Comments
Joined 1 year ago
Cake day: June 26th, 2023


  • This might be happening because of the ‘elegant’ (incredibly hacky) way OpenAI encodes multiple languages into their models. Instead of using all character sets, they apply a modulo operator to each character so that every Unicode character is represented within a small range of values. On the back end, it somehow detects which language is being spoken and uses that character set for the response. Seeing as the last line seems to be the same mathematical expression you asked about, my guess is that your equation just happened to line up perfectly with some sentence that would make sense in the weird language.
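
    A minimal Python sketch of the wrap-around scheme described above, purely to illustrate the speculation; the constant `RANGE_SIZE` and the helper `fold_codepoint` are made-up names, and this is not how OpenAI's tokenizers actually work.

```python
# Illustrative only: the modulo "folding" scheme speculated about above.
# RANGE_SIZE and fold_codepoint are hypothetical names, not anything OpenAI uses.
RANGE_SIZE = 256  # assumed size of the small value range


def fold_codepoint(ch: str) -> int:
    """Map any Unicode character onto a small range via modulo."""
    return ord(ch) % RANGE_SIZE


def fold_text(text: str) -> list[int]:
    return [fold_codepoint(ch) for ch in text]


if __name__ == "__main__":
    # Different characters can collide onto the same folded value, which is
    # why the hypothetical back end would need to guess the language.
    print(fold_text("2+2"))        # ASCII input
    print(fold_text("ありがとう"))  # non-ASCII input folded into the same range
```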


  • stingpie@lemmy.world to Programmer Humor@lemmy.ml · True Story
    8 months ago

    If C++/C were real languages for real programming, they’d enforce unreadability in the compiler.

    No sane language designer would say “It is imperative that you write the most unreadable code possible,” then write a compiler that says “oh, your code doesn’t triple-dereference pointers? lol lmao that rocks.”

    They have played you all for fools.



  • Recently, I’ve just given up trying to use CUDA for machine learning. Instead, I’ve been using (relatively) CPU-intensive activation functions & architecture to make up the difference. It hasn’t worked, but I can at least consistently inch forward.
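
    A rough sketch of the kind of CPU-only setup described above, assuming NumPy and a deliberately "heavy" activation function; the comment doesn't say which framework, activation, or architecture was actually used, so every name and size here is illustrative.

```python
import numpy as np


def heavy_activation(x: np.ndarray) -> np.ndarray:
    """A deliberately CPU-heavy activation (hypothetical example, not the one
    the comment refers to): a smooth mix of tanh and a rational term."""
    return np.tanh(x) + x / (1.0 + np.abs(x) ** 3)


# Tiny forward pass on CPU with made-up layer sizes.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 128))
w1 = rng.standard_normal((128, 64)) * 0.1
w2 = rng.standard_normal((64, 10)) * 0.1

hidden = heavy_activation(x @ w1)   # element-wise, runs entirely on the CPU
logits = hidden @ w2
print(logits.shape)                 # (32, 10)
```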


  • I’m not sure I understand your argument. Are you saying that the emulated processor executes instructions while the SoC doesn’t? Every instruction that goes to the x86 is broken down into several SoC instructions, which the SoC executes in order to emulate what an x86 would do. Saying that the emulated x86 is booting/running Linux, but the SoC is not, is like saying that computers can’t run Java code, only the JVM.
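
    A toy fetch-decode-execute loop makes that point concrete: each "guest" instruction is carried out by several host-level operations, yet it is still the guest program that runs and produces the result. The opcodes and program here are invented for illustration, not a real x86 emulator.

```python
# Toy emulator: each guest instruction is implemented by several host
# operations, just as an SoC executes many native instructions per emulated
# x86 instruction. The opcodes and the sample program are made up.
def run(program):
    regs = {"a": 0, "b": 0}
    pc = 0
    while pc < len(program):
        op, *args = program[pc]          # fetch
        if op == "load":                 # decode + execute
            reg, value = args
            regs[reg] = value
        elif op == "add":
            dst, src = args
            regs[dst] = regs[dst] + regs[src]
        elif op == "halt":
            break
        pc += 1                          # advance the guest program counter
    return regs


guest_program = [
    ("load", "a", 2),
    ("load", "b", 3),
    ("add", "a", "b"),
    ("halt",),
]
print(run(guest_program))  # {'a': 5, 'b': 3} -- the guest program's result
```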