• 0 Posts
  • 76 Comments
Joined 2 years ago
Cake day: August 8th, 2023

  • I’ve been using Linux with Nvidia for 10 years and it’s been a constant dumpster fire. The driver has caused issues that whole time, especially during updates. Currently my problem is that the entirety of Wayland, including all open programs, crashes when I run out of VRAM, because the Linux Nvidia driver cannot fall back to system RAM. It’s making my gaming experience very frustrating.






  • As long as it’s a bit of a sandbox: hell yeah. But there needs to be stuff happening, things to do. I love games like GTA, Cyberpunk, Just Cause, and Stalker because you can just roam the world and have random stuff happen. Sometimes I don’t want a goal, just a sandbox to create my own stories in.


  • I haven’t tried them, so I can’t judge, but I’m afraid I’ll run into issues when I have to go off the beaten path. Inevitably I’ll have to do something hacky to fix some obscure piece of software that the distro maintainers didn’t think of, and that’s already a big pain today. In such a strict setting it will be even more difficult: there will be no documentation and probably no guide or questions/answers on any forum either.

    I’d be willing to try it for a productivity setup if I needed a reinstall, but not for my main PC because I just rely on too many hacks to get shit working.






  • We played this game called Icarus last weekend because of a free weekend. It was okay for me, but I also have a pretty high-end PC driving a 1080p monitor. Even for me the game was quite janky, and for my friends with older hardware it wasn’t a good time. One friend’s microphone randomly turned into a max-volume noise generator multiple times while playing, something that had never happened before. Another (who plays on Linux) got constant crashes and weird behaviour.

    After that disappointment we went back in time to The Showdown Effect for the first time in years, and it was still as hilarious as ever. Apparently there’s an updated free-to-play version now (called Reloaded or something?), so we’ll have to check that out. Would recommend it if you’re looking for some mayhem with friends.

    Edit: oh yeah, I also bought Grid Legends because it’s on a big sale and I like racing games. The driving physics don’t annoy me like the ones in The Crew or Forza, so I’m having a good time with it so far.




  • I was just about to post the same thing. I’ve been using Linux for almost 10 years and never really understood the folder layout in this much detail. My reasoning was always that /lib was more system-wide and /usr/lib was for stuff installed just for me. That never made sense though, since there’s only one /usr and not one for every user. But I never really thought about it further, I just let it be.


  • Sometimes I look at the memes around here and wonder wtf y’all are doing. Neither my code nor the code at the place I work is perfect, but I don’t think I’ve ever seen a merge do this. Maybe some of the most diverged merges temporarily had a lot of errors because of some refactoring, but then they were just a few find-and-replaces away from being fixed again. And those were merges where multiple teams had been working on both the original and the fork for years, and even then it was usually pretty okay.



  • Machine learning and compression have always been closely tied together. It’s trying to learn the “rules” that describe the data rather than memorizing all the data.

    I remember implementing a paper older than me in our “Information Theory” course at university. It treated building a decision tree as compression: the cost of the message is the encoded tree itself plus all the exceptions to it (the examples the tree misclassifies). If a node increased the overall message size, it was simply pruned. That way you never draw conclusions from very little data and only encode the big patterns in the data.

    Fundamentally it is just compression, just a much better method of compression than all the models we had before.

    EDIT: The paper I’m talking about is “Inferring decision trees using the minimum description length principle” - J. Ross Quinlan & Ronald L. Rivest
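
    Very roughly, the pruning criterion works like the sketch below. This is just a toy Python illustration of the MDL idea, not the paper’s actual coding scheme; the node fields and bit costs are made up for the example.

        import math

        def exceptions_bits(errors, samples):
            # Bits needed to point out which of the samples are misclassified
            # (log2 of the number of ways to choose the error positions).
            if errors == 0:
                return 0.0
            return math.log2(math.comb(samples, errors))

        def message_length(node):
            # Total cost in bits: describe the (sub)tree, plus its exceptions.
            if node["leaf"]:
                # 1 bit says "leaf", the rest lists its misclassified examples.
                return 1 + exceptions_bits(node["errors"], node["samples"])
            # 1 bit says "split", some bits name the attribute, then both children.
            return (1 + node["attr_bits"]
                    + sum(message_length(c) for c in node["children"]))

        def prune(node):
            # Bottom-up: collapse a subtree into a leaf whenever that makes
            # the overall message shorter.
            if node["leaf"]:
                return node
            node["children"] = [prune(c) for c in node["children"]]
            as_leaf = {"leaf": True,
                       "errors": node["errors_as_leaf"],
                       "samples": node["samples"]}
            return as_leaf if message_length(as_leaf) <= message_length(node) else node

    The point is just that a split only survives if the bits it costs to describe are paid back by the exceptions it removes.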


  • I’m on Arch (actually a converted Antergos) and I have an NVIDIA card as well. My first attempt a few months ago was horrible, bricking my system and requiring a bootable USB and a whole evening to get Linux working again.

    My second attempt was recent and went a lot better. X11 no longer seems to work, so I’m kinda stuck with Wayland, but it feels snappy as long as my second monitor is disconnected. I’ve yet to try some gaming. My main monitor is a VRR 144Hz panel with garbage-tier HDR. The HDR worked out of the box on KDE Plasma, with the same shitty quality as on Windows, so I immediately turned it off again. When my second monitor is connected I get terrible hitching: every second or so the screen just freezes for hundreds of milliseconds. Something about it (1280x1024, 75Hz, DVI) must not make Wayland happy. No settings seem to change anything; only physically disconnecting the monitor helps.