• 5 Posts
  • 164 Comments
Joined 3 years ago
Cake day: January 23rd, 2022

  • When does Debian update a package? And how does it decide when to?

    Both of these are answered in depth on Debian’s releases page, but the short answer is:

    Debian developers work in a repo called “unstable” or “sid,” and you can get those packages if you so desire. They will be the most up to date, but also the most likely to introduce breaking changes.

    When the devs decide these packages are “stable enough” (i.e., breaking changes are highly unlikely), they get moved into “testing” (the release-candidate repo), where users can do QA for the community. Testing is the repo for the next version of Debian.

    When the release cycle hits the ~1.5-year mark, Debian maintainers introduce a series of incremental “freezes,” whereby new versions of packages slowly stop being accepted into the testing repo. You can see a table that explains each freeze milestone for Trixie (Debian 13) here.

    After all the freezes have gone into effect, Debian migrates the current Testing version (currently Trixie, Debian 13) into the new Stable and demotes the current Stable to old-stable. Then the cycle begins again.

    As for upgrades to packages in the stable/old-stable repos: see the other comments here. The gist is that they will not accept any changes other than security patches and minor bug fixes, except for business-critical software that cannot feasibly be patched that way (e.g. Firefox).
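    If you want to see this suite structure for yourself, Debian’s python-apt bindings can report which suite each available version of a package comes from. A minimal sketch, assuming python3-apt is installed and using firefox-esr purely as an example package:

        import apt  # python3-apt, Debian's own bindings for the package cache

        cache = apt.Cache()
        pkg = cache["firefox-esr"]  # example package; any available package name works

        # Every available version is tagged with the archive (suite) it comes from,
        # e.g. "stable", "stable-security", "testing", or "unstable"
        for version in pkg.versions:
            suites = {origin.archive for origin in version.origins}
            print(version.version, "from", ", ".join(sorted(suites)))

    You’ll only see the suites your sources.list actually points at, which is rather the point: what lands on your machine is decided by the suite you track, not by the package itself.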




  • The point of security isn’t just protecting yourself from the threats you’re aware of. Maybe there’s a compromise in your distro’s password hashing, maybe your password sucks, maybe there’s a kernel compromise. Maybe the torrent client isn’t a direct route to root, but one step in a convoluted chain of attack. Maybe there are “zero days” that are only called such because the clear web hasn’t been made aware yet, but they’re floating around on the dark web already. Maybe your passwords get leaked by a flaw in Lemmy’s security.

    You don’t know how much you don’t know, so you should be implementing as many good security practices as you can. It’s called the “Swiss cheese” model of security: you stack enough layers that the holes in one layer are covered by a different layer.

    Plus, keeping strong security measures in place for something that’s almost always internet-connected is a good idea regardless of how cautious you think you’re being. It’s why modern web browsers are basically their own VM inside your PC these days, and it’s why torrent clients shouldn’t have access to anything besides the download/upload folders and whatever minimal set of network permissions they need.





    Debian Testing has far more current packages, and is generally fairly stable. Debian Unstable is a rolling release, and the name is mostly a misnomer (though it is subject to massive changes at a moment’s notice).

    Fedora is like Debian Testing: a good middle ground between current and stable.

    I hear lots of good things about Nix, but I still haven’t tried it. It seems to be the perfect blend of non-breaking and most up-to-date.

    I’ll just add this: don’t believe everything you hear. Distro wars result in rhetoric that’s blown way out of proportion. Arch isn’t breaking down more often than a Cybertruck, and Debian isn’t so old that it yearns for the performance of Windows Vista.

    Arch breaks, but so does anything that pushes updates at the drop of a hat; it’s unlikely to brick your PC, and you’ll usually just need to reconfigure some settings.

    Debian’s primary goal is stability, which means the version numbers don’t look as big on paper; for that, you should be playing Cookie Clicker instead of micromanaging the world’s most powerful web browser.

    Try things out for yourself and see what fits; anyone who says otherwise is just trying to program you into joining their culture war.



  • You intentionally do not want people that you consider “below” you to use Linux or even be present in your communities.

    No, but I do want my communities to stay on-topic and not be derailed by Discourse™

    Who I consider beneath me is wholly unrelated to their ability to use a computer, and entirely related to their ability to engage with others in a mature fashion, especially those they disagree with.

    Most people use computers to get something done. Be it development, gaming, consuming multimedia, or just “web browsing”

    I realize most people use computers for more than web-browsing, but ask anybody who games, uses multimedia software, or develops how often they have issues with their workflow.

    (which you intentionally use to degrade people “just” doing that)

    No I don’t. Can you quote where I did so, or is it just a vibe you got when reading in the pretentious dickwad tone you seem to be projecting onto me?

    But stop trying to gatekeep people out of it

    I’m not; you’re projecting that onto me again. If you want to use Linux, use Linux. Come here and talk about how you use Linux, or ask whatever questions about Linux you want. If you don’t want to use Linux, or don’t want to talk about Linux, take it to the appropriate community.

    If keeping communities on-topic and troll-free is “gatekeeping,” then I don’t give a fuck how you feel about it.


  • I don’t think we do, but that’s a feature, not a bug. Here’s why:

    1. There was a great post a few days ago about how Linux is a digital 3rd Space. It’s about spending time cultivating the system and building a relationship with it, instead of expecting it to be transparent while you use it. This creates a positive relationship with your computer and OS, seeing it as more a labor of love than an impediment to being as productive as possible (the capitalist mindset).

    2. Nothing “just works.” That’s a marketing phrase. Windows and Mac only “just work” if the most you ever do is web browsing and note-taking in Notepad. Anything else and you run into cognitive dissonance: do you hold onto the delusion at the price of doing what you’re trying to do, or accept that these systems aren’t as good as their marketing? The same thread I mentioned earlier talked about how we give Linux more lenience because of the relationship we have with it, instead of seeing it as just a tool for productivity.

    3. Having a barrier of entry keeps general purpose communities like this from being flooded with off-topic discourse that achieves nothing. And no, I’m not just talking about the Yahoo Answers-level questions like “how to change volume Linux???” Think stuff like “What’s the most stargender-friendly Linux distro?” and “How do we make Linux profitable?” and “what Linux distro would Daddy Trump use?” and “where my other Linux simping /pol/t*rds at (socialist Stallman****rs BTFO)???” Even if there is absolutely perfect moderation and you never see these posts directly, these people would still be coming in and finding ways that skirt the rules to inject this discourse into these communities; and instead of being dismissed as trolls, there would be many, many people who think we should hear them out (or at least defend their right to Free Speech).

    4. Finally, it already “just works” for the aforementioned note-taking and web browsing. The only thing stopping more not-so-tech-savvy people is that it’s not the de facto pre-installed OS on the PC you pick up from Best Buy (and not Walmart, because you want people to think you’re tech-savvy, so you go to the place with a dedicated “Geek Squad”). The only way it starts competing with Windows in this domain is through marketing agreements with mainstream hardware manufacturers (like Dell and HP); this means that the organization responsible for representing Linux would need the money to make such agreements… Which would mean turning it into a for-profit OS. Which would necessitate closing the source. Which would mean it just becomes another proprietary OS that stands for all that Linux is against.






    You’ve defined yourself into an impossible bind: you want something extremely portable and universal but with a small disk footprint, and you want it to be general-purpose and versatile.

    The problem is that to be universal and general-purpose, you need a lot of libraries to interact with whatever kinds of systems you might run it on (and the peculiarities of each), and you need libraries that perform whatever kinds of interactions with those systems you specify.

    E.g., under the hood, Python’s open("<filename>", 'r') ends up as a system call to the kernel. But is that kernel Linux? BSD? Windows NT? Android? Mach?

    What if you want your script to run a CLI command in a subshell? Should it call “cmd”? Or “sh”? Or “powershell”? Okay, okay, now all you need it to do is show the contents of a file… but is the command “cat” or “type” or “Get-Content”?
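    To make that concrete, here’s a minimal sketch (my own illustration, not anyone’s library) of what “just show a file” already costs you once you try to stay portable; the file name is a placeholder:

        import platform
        import subprocess

        def show_file(path: str) -> None:
            """Print a file's contents using whatever the host OS provides."""
            system = platform.system()
            if system == "Windows":
                # "type" is a cmd built-in, so it has to go through cmd /c;
                # PowerShell would want Get-Content instead
                subprocess.run(["cmd", "/c", "type", path], check=True)
            elif system in ("Linux", "Darwin", "FreeBSD"):
                # POSIX-ish systems ship "cat"
                subprocess.run(["cat", path], check=True)
            else:
                raise OSError(f"no idea how to show a file on {system}")

        show_file("notes.txt")  # placeholder file name

    And that’s before path separators, file encodings, or hosts where neither shell exists; each of those branches is exactly the kind of thing a library normally hides for you.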

    Or maybe you want to do more than simple read/write to files and string operations. Want to have graphics? That’s a library. Want serialization for data? That’s a library. Want to read from spreadsheets? That’s a library. Want to parse XML? That’s a library.

    So you’re looking at a single binary that’s several GBs in size, either as a standalone or a self-extracting installer.

    Okay, maybe you’ll only ever need a small subset of libraries (basic arithmetic, string manipulation, and file ops, all on standard glibc/GNU systems, of course), so it’s not really “general purpose” anymore. So you find one that’s small, but it doesn’t completely fit your use case (for example, it can’t parse UCI config files); you find another that does what you need, but also way too much, and it has a huge footprint; you find that perfect medium, and it has a small, niche userbase… so the documentation is meager and it’s not easy to learn.

    At this point you realize that any language that’s both easy to learn and powerful enough to manage all instances of some vague notion of “computer” will necessarily evolve to being general purpose. And being general purpose requires dependencies. And dependencies reduce portability.

    At this point your options are: make your own language and interpreter that does exactly what you want and nothing more (so all the dependencies can be compiled in), or decide which criteria you are willing to compromise on.



    I have a Libre Computer Le Potato, a Pinebook, and a PinePhone. They’re fine for most of my use cases, but they don’t handle games too well. They’re also not great for VMs or emulation, and no chance in hell would I use any of them for my home media server.

    That being said, I’m starting to see ARM CPU desktops in my feeds, and I think one of those would be fine for everything but gaming (which is more an issue of the availability of native binaries than of outright performance). TBH, at that price point, with off-chip memory and GPU, I don’t see much reason to go with ARM; maybe the extra cores, but I can’t imagine they keep much of the power efficiency that makes ARM SoCs appealing.


    I’ve been running Debian stable on my decade-old desktop for about 3 years, and on my IdeaPad that’s just as old for about 5. During that time I’ve had an update break something only once, and it was the Nvidia driver that did it. A patch was released within three days.

    Debian epitomizes OS transparency for me. Sure, I can still customize the hell out of it and turn it into a frankenix machine, but if I don’t want to, I can be blissfully unaware of how my OS works and focus only on important computing tasks (like mindlessly scrolling Lemmy at 2 am).


    I use virt-manager. It worked better than VirtualBox did at the time (back when v6.1 was still the main release branch), it’s easier, and it doesn’t involve hitching yourself to Oracle.

    VMware may be “free,” but it ain’t free. And if you don’t care about software freedom, why choose Linux over Windows or macOS? Also, Workstation Player lacks so much functionality that it isn’t much good as a hypervisor: only one VM can be powered on at a time, and the configuration options are severely limited. Plus, the documentation is mediocre compared to the official virt-manager docs.
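    If you’d rather script things than click through a GUI, virt-manager’s backend (libvirt) also ships Python bindings. A rough sketch, assuming the python3-libvirt bindings, a local qemu:///system connection, and a VM actually named “debian12” (all of which are assumptions on my part):

        import libvirt  # python3-libvirt / libvirt-python bindings

        # Connect to the same local hypervisor that virt-manager manages
        conn = libvirt.open("qemu:///system")

        # List every defined VM and whether it's running; several can be up at once
        for dom in conn.listAllDomains():
            state, _reason = dom.state()
            print(dom.name(), "running" if state == libvirt.VIR_DOMAIN_RUNNING else "not running")

        # Start one VM by name ("debian12" is a placeholder)
        dom = conn.lookupByName("debian12")
        if not dom.isActive():
            dom.create()

        conn.close()

    The same API is what virt-manager itself talks to, so most of what you can click in the GUI you can also automate this way.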