• 1 Post
• 10 Comments
Joined 1 year ago · Cake day: June 15th, 2023

  • dragontamer@lemmy.world to Anime@lemmy.ml · Any Christian anime out there?
    4 points · edited 11 months ago

    Chrono Crusade is maybe the closest you’ll get

    Chrono Crusade is a Miko x Yokai adventure story except the Miko is a Nun and the Yokai is a “Demon”.

    Mikos are known to use archery to defeat Yokai. No wait, it’s a nun, and Westerners use guns, right? And they meet all kinds of Yokai… like good yokai, bad yokai, friendly yokai. No wait, Western culture doesn’t have Yokai, they call them Demons. So good demons, bad demons, friendly demons…

    Nothing wrong with that :-) Switching up the setting is fine, but it’s not especially “Christian” in theme.


  • dragontamer@lemmy.world to Anime@lemmy.ml · Any Christian anime out there?
    18 up / 2 down · edited 11 months ago

    You expect a country of Buddhists and Shintoists to make a Christian anime?

    There’s Evangelion, which is closer to how we Westerners take Buddhist holy texts and sprinkle random techno-mumbo-jumbo into our SciFi stories (ex: Stargate).

    Similarly, Angel Beats takes the Christian idea of Purgatory, except it’s all based on Buddhist / Zen enlightenment, because Japan doesn’t get Christianity lol.

    There is Death Note, where some characters are explicitly Catholic and a few Christian references pop up kind of elegantly (L washes the feet of Light Yagami, for example). I wouldn’t call the plot especially ‘Christian’ otherwise, but it deserves mention as an anime that at least got the meaning behind the Christian references it chose.

    Digimon Season 1 might be the closest thing to a Christian anime. Kids get powers as they invoke Virtues like Courage or Faith. Two Digimon turn into literal Angels, and one of the bad guys is ‘Devimon’, who is weak to the angel Digimon. Season 4 has a fallen angel named Lucifer (well… Lucemon. All Digimon have a -mon on the end) who goes evil as well. Season 4 is more random / less elegant with its Christian references, as Cherubimon and Seraphimon (angels) don’t really represent Christian ideas anymore IMO.

    But Season 1 of Digimon is the closest you’ll get IMO.

    But this is really rare. Christianity is treated as an exotic religion / weird character trait more often than not (much like random Buddhists show up in Hollywood to round out a cast or add a bit of exotic flair).


    I dunno. Hellsing Ultimate? An Irish Catholic priest is pissed off at a Protestant vampire who works for the Queen in the 1990s… is referencing ‘The Troubles’ too soon?

    Obviously not a Christian story given the huge amounts of hyperviolence. But the Christian references were at least kinda-sorta correct…


  • That’s not what storage engineers mean when they say “bitrot”.

    “Bitrot”, in the scope of ZFS and Btrfs, means a hard drive’s stored “0” getting randomly flipped to “1” (or vice versa). It is a well-known problem and can happen within months. A 20 TB drive is a collection of 160 trillion bits, so there’s a high chance that at least some of those bits will flip over a period of double-digit months.

    Each problem has a solution. In this case, bitrot is “solved” by the above procedure because:

    1. Bitrot usually doesn’t happen within single-digit months, so regular scrubs every ~6 months nearly guarantee that any bitrot you find will be limited in scope: a few flipped bits at most.

    2. Filesystems like ZFS and Btrfs are designed to handle many, many bits of bitrot safely (every block is checksummed, and redundant copies or parity let bad blocks be rebuilt).

    3. Scrubbing is the process of reading every block, checking it against its checksum, and restoring any data where bitrot is detected (toy sketch below).

    Of course, if hard drives are of noticeably worse quality than expected (ex: you get a large number of failures in a short time frame), or if you’re not using the right filesystem, or if you go too long between checks (ex: taking 25 months to scrub for bitrot instead of just 6), then you might lose data. But we can only plan for the “expected” kinds of bitrot: the kinds that happen within 25 months, or 50 months, or so.

    If you’ve gotten screwed by a hard drive (or SSD) that bitrots away in like 5 days or something awful (maybe someone dropped the hard drive and the head scratched a ton of the data away), then there’s nothing you can really do about that.
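
    To make the detection idea concrete, here’s a toy Python sketch of what a scrub does. This is not how ZFS actually implements it (ZFS checksums at the block level and repairs bad blocks from mirror/parity copies automatically); it just shows why re-reading data against known-good checksums catches flipped bits. The paths are made up.

    ```python
    import hashlib
    import json
    from pathlib import Path

    def sha256(path: Path) -> str:
        # One flipped bit anywhere in the file changes the whole digest.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def record(root: Path, manifest: Path) -> None:
        # Snapshot the "known good" checksums (ZFS does this per block, on write).
        digests = {str(p): sha256(p) for p in root.rglob("*") if p.is_file()}
        manifest.write_text(json.dumps(digests, indent=2))

    def scrub(manifest: Path) -> None:
        # Re-read everything and compare: a scrub, minus the automatic repair.
        for name, expected in json.loads(manifest.read_text()).items():
            if sha256(Path(name)) != expected:
                print(f"bitrot detected: {name}")  # ZFS would rebuild this from redundancy

    record(Path("/tank/data"), Path("manifest.json"))  # once, after writing the data
    scrub(Path("manifest.json"))                       # then every ~6 months
    ```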


  • If you have a NAS, then just put iSCSI disks on the NAS, and network-share those iSCSI fake-disks to your mini-PCs.

    iSCSI is “pretend to be a hard drive over the network”. The iSCSI volume can sit “on top of” ZFS or Btrfs, meaning your scrubs / scans will still fix any issues. So your mini-PC can have a small local C: drive, but keep most of its data on a D: drive that is really an iSCSI volume served by the NAS.

    iSCSI is very low-level. Windows literally thinks it’s dealing with a (slow) hard drive over the network. As such, it works even in complex situations like Steam installations, albeit at network speeds (every read has to go to the NAS and back) rather than direct-attached hard drive (or SSD) speeds.


    Bitrot is a solved problem. It is solved by using bitrot-resilient filesystems with regular scans / scrubs. You build everything on top of solved problems, so that you never have to worry about the problem ever again.
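
    For flavor, here’s roughly what the NAS side looks like on a ZFS box, sketched in Python. The pool name (tank), the size, and the zvol name are all placeholders, and the actual iSCSI export step depends on your NAS (most NAS distros wrap it in a point-and-click screen; bare Linux typically uses targetcli).

    ```python
    import subprocess

    # Carve a sparse 500 GB zvol out of the pool. A zvol is a block device backed
    # by ZFS, so it inherits checksumming and scrub/repair like any other ZFS data.
    subprocess.run(["zfs", "create", "-s", "-V", "500G", "tank/minipc-d-drive"], check=True)

    # The block device now exists at /dev/zvol/tank/minipc-d-drive. Export it as an
    # iSCSI target with your NAS's tooling, then on the Windows mini-PC connect with
    # the built-in iSCSI Initiator and format the new "disk" as D:.
    print("zvol ready: /dev/zvol/tank/minipc-d-drive")
    ```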



  • Wait, what’s wrong with issuing “ZFS Scan” every 3 to 6 months or so? If it detects bitrot, it immediately fixes it. As long as the bitrot wasn’t too much, most of your data should be fixed. EDIT: I’m a dumb-dumb. The term was “ZFS scrub”, not scan.

    If you’re playing with multiple computers, “choosing” one to be the NAS and being extremely careful with the data it’s storing makes sense. Regularly scanning all files and attempting repairs (which is just a few clicks with most NAS software) is incredibly easy, and could probably be automated.
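
    It really is scriptable. Here’s a minimal sketch, assuming a pool named tank (a placeholder) and a box with the ZFS tools installed; cron it every ~6 months, though many NAS distros already schedule scrubs for you.

    ```python
    import subprocess

    POOL = "tank"  # placeholder pool name

    # Kick off a scrub: ZFS re-reads every block, verifies its checksum, and
    # repairs anything fixable from redundancy (mirror / raidz copies).
    subprocess.run(["zpool", "scrub", POOL], check=True)

    # Later (scrubs on big pools take hours), see what it found.
    status = subprocess.run(["zpool", "status", POOL],
                            capture_output=True, text=True, check=True).stdout
    if "errors: No known data errors" not in status:
        print(f"WARNING: pool {POOL} reported errors -- investigate!")  # hook real alerting here
    ```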


  • dragontamer@lemmy.world to Linux@lemmy.ml · *Permanently Deleted*
    1 point · edited 1 year ago

    Honestly, Docker solves these problems in a lot of cases in practice.

    It’s kinda stupid that so many dependencies need to be kept track of that it’s easier to spin up a (VM-like) environment just to run Linux binaries properly. But… it does work. With a bit more spit-shine / polish, Docker is probably the way to move forward on this issue.

    But Docker is just not there yet. Because Linux is open source, there are no license penalties for carrying an entire Linux distro around with your binaries. And who cares if a binary is like 4GB? Does it work? Yes. Does it work across many versions of Linux? Yes (well… for the right versions of Linux with the right versions of Docker, but yes!! It works).

    Give Docker a bit more long-term stability and compatibility, and I think we’re getting somewhere with this. Hard drives these days are $150 for 12TB and SSDs are $80 for 2TB; we can afford to store those fat binaries, as inefficient as it all feels.


    I did have a throwaway line about musl problems, but honestly, we already have incredibly fat Docker images lying around everywhere. Why are the OSS guys trying to save like 100MB here and there when no one actually cares? Just use glibc, and stop adding incompatibilities for, honestly, tiny amounts of space savings.


  • dragontamer@lemmy.world to Linux@lemmy.ml · *Permanently Deleted*
    15 up / 5 down · edited 1 year ago

    Because it isn’t inferior.

    Ubuntu can barely run programs from 5 years ago; backwards compatibility is terrible. Red Hat was doing well, but it just shit the bed. To have any degree of consistency, you need to wrap every app inside a Docker image and carry all of its dependencies with it (but this leads to obscure musl bugs in practice, because musl has different bugs than glibc).

    For better or worse, Windows programs that depend on kernel32.dll (at the C/C++ level) have deployed consistently since the early 1990s and rarely break. C# programs have had a good measure of stability. DirectX 9, 10, 11, and 12 all made major changes to how the hardware works, and yet everything automatically keeps functioning on Windows. You can play StarCraft from 1998 without any problems despite it being a DirectX 6 game.

    Switch back over to Ubuntu land, and Wayland is… maybe working? Eventually? Good luck reaching back to programs with X.org or pre-systemd dependencies.


    Windows is definitely a better experience than Ubuntu. I think Red Hat has the right idea, but IBM is seemingly killing all the goodwill built up around Red Hat and CentOS. SUSE Linux is probably our best bet moving forward as a platform that cares about binary stability.

    The Windows networking stack is also far superior for organizations. Samba on Linux works best if you have… a Windows Server instance holding the Group Policies and ACLs on a centralized server. Yes, $1000 software works better than $0 software. Windows Server is expensive, but it’s what organizations need to handle ~50 to ~1000 computers inside a typical office building.

    Good luck deploying basic security measures in an IT department with Linux alone. The only hope, in my experience, is to buy Windows Server and then run Samba alongside it (dealing with Samba bugs as appropriate). I’m not sure I ever got a Linux-as-Windows-server setup working well. It’s not like the Linux development community understands what an ACL is in practice.