🦀Rust🦀 https://typst.app 💚

  • 0 Posts
  • 47 Comments
Joined 1 year ago
Cake day: July 1st, 2023





  • The last part was wrapped in a spoiler and placed under the post scriptum to indicate that it’s not important and that you don’t have to read it if you don’t want to. I added it just to educate a bit more, since I had already started the “this is wrong and this is right” conversation. To be honest, I hate that that OS has had so many name changes that everyone is just left confused in the end. Some still say OS X or whatever else.


  • What is bad about it? It wasn’t an offensive statement; it was stating the fact that the person was new to the whole “MB vs. MiB” saga, which has been an ongoing issue for at least 10 years, and that I envy/pity the “cruel world that they are in” (where everyone uses JEDEC units while IEC ones should be used instead).

    If that is the only reason you started calling other people names and downvoting all of their comments, then you’re overreacting. The person I talked with didn’t even mention it. I heard that phrase in some movie or something.



  • Windows and MacOS use the abbreviation “MB” referring to the binary units, correct?

    Yes. I’m only sure about the first one; I didn’t test myself whether macOS uses powers of 2 or 10 under the hood of its “MB”. You can open the properties of something big, try converting the raw number of bytes with /1024^n and /1000^n, and compare the results.
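    A minimal sketch of that check, assuming a hypothetical raw byte count taken from a file’s properties dialog: divide by 1000^n and by 1024^n and see which result matches what the OS displays.

```rust
// Sketch: given a raw byte count, compute what an OS would display as "MB"
// depending on whether it divides by 1000^2 (decimal) or 1024^2 (binary).
fn reported_mb(bytes: u64, base: u64) -> f64 {
    bytes as f64 / (base * base) as f64
}

fn main() {
    let raw_bytes: u64 = 1_048_576; // hypothetical size from a properties dialog
    println!("decimal: {:.2} MB", reported_mb(raw_bytes, 1000)); // prints 1.05
    println!("binary:  {:.2} MB", reported_mb(raw_bytes, 1024)); // prints 1.00
}
```

    If the number on screen matches the 1024-based result, the OS is using binary units behind the “MB” label.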

    How come these big OS’s use another unit than the one these large international bodies recognize?

    Legacy, legacy everywhere (IMO). And of course they don’t want to confuse their precious users who don’t know any better. Changing it would also break scripts that rely on that specific output. The GNU C library also uses JEDEC units by default, hence Flatpak and other software do too.

    On a side note, I’ve always found it weird that HDDs and SSDs are/were sold as 128 GB, 256 GB, 512 GB, etc. when those figures refer to decimal units.

    It is weird for everyone, because we mainly count digital information sizes in multiples of 2. I haven’t investigated why they use powers of 10, but I’ve seen that some other hardware also uses decimal units (I think at least in RAM marketing, while JEDEC is used, intentionally or not, for CPU cache sizes). I had a link where the RAM thing was lightly addressed, but I couldn’t find it.

    spoiler

    P.S. it’s “OSes” and “macOS” BTW.



  • If you’re not aware, it was called MB because of JEDEC, back when IEC units hadn’t been invented yet. IEC units were introduced to remove the double meaning of JEDEC units (decimal and binary): IEC units carry only the binary meaning, which is why they’re superior. If you convert 1000 kB to 1 MB, then use MB; but if you convert 1024 KiB to 1 MiB, you should be using MiB. It’s all about getting the point across, and JEDEC units aren’t good at it.
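    The rule above can be sketched as two hypothetical formatters: divide by powers of 1000 and label with SI units (kB, MB), or divide by powers of 1024 and label with IEC units (KiB, MiB) — never mix the two.

```rust
// Format a byte count with SI prefixes: divide by powers of 1000.
fn format_si(bytes: u64) -> String {
    const UNITS: [&str; 4] = ["B", "kB", "MB", "GB"];
    let mut v = bytes as f64;
    let mut i = 0;
    while v >= 1000.0 && i < UNITS.len() - 1 {
        v /= 1000.0;
        i += 1;
    }
    format!("{:.2} {}", v, UNITS[i])
}

// Format a byte count with IEC prefixes: divide by powers of 1024.
fn format_iec(bytes: u64) -> String {
    const UNITS: [&str; 4] = ["B", "KiB", "MiB", "GiB"];
    let mut v = bytes as f64;
    let mut i = 0;
    while v >= 1024.0 && i < UNITS.len() - 1 {
        v /= 1024.0;
        i += 1;
    }
    format!("{:.2} {}", v, UNITS[i])
}

fn main() {
    println!("{}", format_si(1_000_000));  // prints "1.00 MB"
    println!("{}", format_iec(1_048_576)); // prints "1.00 MiB"
}
```

    The same byte count comes out differently under each scheme — 1,048,576 bytes is “1.00 MiB” but “1.05 MB” — which is exactly the ambiguity IEC prefixes were created to remove.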







  • Andrew@mander.xyz to Programmer Humor@lemmy.ml · Hashtag noob life
    edited 10 months ago

    All this multiplatform stuff is bullshit in my experience. The dotnet CLI is slow, files still use CRLF line endings, and I remember CLI autocompletion wasn’t great either. C# has only one working LSP server implementation, and it sucked in both VS Code and Neovim. It’s kind of like the Java DX, except with Java the DX is equally bad on every OS. Maybe I like C# more than Java as a language, but I hate everything else around it. I also hate Java, BTW.

    Also, .NET wasn’t always open source, so it has a proprietary history (Java has less of the same).