  • He shipped enough clunkers (and terrible design decisions) that I never bought the mythification of Jobs.

    In any case, the Deck is a different beast. For one, it’s the second attempt. Remember Steam Machines? But also, it’s very much an iteration on pre-existing products, where its biggest asset is having an endless budget and first-party control of the platform, letting Valve use scale for a pricing advantage.

    It does prove that the system itself is not the problem, in case we hadn’t picked up on that with Android and ChromeOS. The issue is having a do-everything free system where some of the do-everything requires you to intervene. That’s not how most people use Windows (or Android, or ChromeOS), and it’s definitely not how you use any part of SteamOS unless you want to tinker past the official support, either. That’s the big lesson, I think. Valve isn’t even trying to push Linux, beyond their Microsoft blood feud. As with Google, it’s just a convenient stepping stone in their product design.

    What the mainline Linux developer community can learn from it, IMO, is that coupling the software and the hardware very closely matters for onboarding, and Linux should find a way to do that in more product categories, even if it’s by partnering with manufacturers that won’t do it themselves.



  • I genuinely think Linux misses a beat by not having a widely available distro that is a) very closely tied to specific hardware and b) mostly focused on web browsing and media watching. It’s kinda nuts and a knock on Linux devs that Google is running away with that segment through both Android and ChromeOS. My parents aren’t on Windows anymore but for convenience purposes the device that does that for them is a Samsung tablet.


  • I keep trying to explain how Linux advocacy gets the challenges of mainstream Linux usage wrong and, while I appreciate the fresh take here, I’m afraid that’s still the case.

    Effectively this guide is: lightly compromise your Windows experience for a while until you’re ready, followed by “here’s a bunch of alien concepts you don’t know or care about”, which actively disproves the idea that it’s all about the app alternatives.

    I understand why this doesn’t read that way to the “community”, but parse it as an outsider for a moment. What’s a snap? Why are they bad? Why would I hate updates? Aren’t updates automatic as they are in Windows? Why would I ever pick the hardware-incompatible distros? What’s the tradeoff supposed to be, does that imply there is a downside to Mint over Ubuntu? It sure feels like I need to think about this picking a distro thing a lot more than the headline suggested. Also, what’s a DE and how is that different to a distro? Did they just say I need a virtual machine to test these DE things before I can find one that works? WTF is that about?

    Look, I keep trying to articulate the key misunderstanding and it’s genuinely hard. I think the best way to put it is that all these “switch to Linux, it’s fun!” guides are trying to onboard users to a world of fun tinkering as a hobby. And that’s great, it IS fun to tinker as a hobby, for some people. But that’s not the reason people use Windows.

    If you’re on Windows and mildly frustrated about whatever MS is doing that week, the thing you want is a one-button install that does everything for you, works the first time and requires zero tinkering. App substitutes are whatever; UI changes and different choices in different DEs are trivial to adapt to (honestly, it’s all mostly Windows-like or Mac-like, and normies clearly don’t particularly struggle with that). But if you’re out there introducing even a hint of an argument about multiple technical choices, competing standards for app packages or VMs being used to test out different desktop environments, you’re kinda missing the point of what’s keeping the average user from stepping away from their mainstream commercial OS.

    In fairness, this isn’t the guide’s fault, it’s all intrinsic to the Linux desktop ecosystem. It IS more cumbersome and convoluted from that perspective. If you ask me, the real advice for a Windows user who’s considering switching is: get a device that comes with a dedicated Linux setup out of the box. Seriously, go get a Steam Deck, a System76 laptop, a Raspberry Pi or whatever else you can find out there that has some flavor of Linux built specifically for it, and use that for a bit. That bypasses 100% of this crap and just works out of the box, the way Android or ChromeOS work out of the box. You’ll get to know whether that’s for you much quicker, more organically and with much less of a hassle that way… at the cost of needing new hardware. But hey, on the plus side, new hardware!


  • I’ll take Persona, although it’s been way too many games with the same setup. Ditto for the Trails series.

    Honestly, I don’t think it got any better than ATB systems in FF 6 and 7. Everybody else is either riffing on those or spending so much money they think they can’t be those and need to be Devil May Cry instead.


  • For sure, it’s a bit of technical curiosity and an opportunity for tinkering.

    And given the absolute flood of misinformation around and about machine learning and “AI”, I also find it to be a hygiene thing to be able to identify bullshit from both the corporate camp and the terminally online criticism. Because man, do people say a lot of wild stuff that doesn’t make sense about this subject. Looking under the hood seems like a good thing to do.


  • Yeah, the smaller alternatives start at 14 GB, so they do fit in the 24 GB of the 4090, but I think that’s all heavily quantized, plus it still runs like ass (rough napkin math at the end of this comment).

    Whatever, this is all just hobbyist curiosity stuff. My experience is that running these raw locally is not very useful in any case. People underestimate how heavy the commercial options are, how much additional work goes into them beyond the model, or both.
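
    For anyone curious, the back-of-the-envelope version in throwaway Python; the fixed 2 GB overhead for context and runtime is my own rough assumption, not a measured number:

        # Rough VRAM needed to hold a model's weights: params * bits / 8,
        # plus a fudge factor for the KV cache and runtime overhead.
        def vram_gb(params_billions, bits_per_weight, overhead_gb=2.0):
            return params_billions * bits_per_weight / 8 + overhead_gb

        print(vram_gb(70, 16))  # ~142 GB: no consumer card comes close
        print(vram_gb(70, 4))   # ~37 GB: still too big for a 24 GB 4090
        print(vram_gb(8, 4))    # ~6 GB: fits, because it's squeezed hard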


  • There are ways to bring the models down in size at the cost of accuracy, and I believe you can trade off performance to split them across the GPU and the CPU (there’s a sketch of what I mean at the end of this comment).

    Honestly, the times I’ve tried the biggest things out there out of curiosity, it was a fun experiment but not a practical application, unless you are in urgent need of a weirdly taciturn space heater for some reason.
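
    To make the GPU/CPU split concrete, here’s a sketch with llama-cpp-python; the file name and layer count are placeholders, tune them to whatever model and VRAM you actually have:

        # Partial offload: n_gpu_layers layers live in VRAM, the rest run
        # on the CPU. More layers on the GPU is faster but uses more VRAM.
        from llama_cpp import Llama

        llm = Llama(
            model_path="models/some-quantized-model.gguf",  # hypothetical file
            n_gpu_layers=24,  # lower this if you run out of VRAM
            n_ctx=4096,
        )
        out = llm("Q: Why is the sky blue? A:", max_tokens=64)
        print(out["choices"][0]["text"])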


  • I mean, from what I can tell we still don’t, at least as home users. The full-size model won’t fit on any consumer hardware. Even with a top-of-the-line 4090 GPU you’re limited to the 8B model if you want to run it offline, and that still charts lower than the last-gen 70B model.

    Still cool to have it be available, though.





  • MudMan@fedia.io to Gaming@beehaw.org · Let's discuss: Deus Ex · 3 months ago

    Hah. I almost wrote that I also think the two Ultima Undergrounds are better than Deus Ex despite being much older and having an objectively very clumsy interface. Then I thought that’d get us in the weeds and pull us too far back, so I took it out.

    Look, yeah, Deus Ex rolled in elements from CRPGs and had good production values for the time. But all those things were nothing new for an RPG, they were just new for a shooter. Baldur’s Gate and Fallout were a few years old. The entire Ultima franchise had been messing around with procedural, simulated worlds for almost a decade at that point, which in the 90s was a technological eon.

    And yeah, System Shock had created a template for a shooter RPG, they just applied it to a lone survivor dungeon crawly horror thing, rather than try to marry it to the narrative elements of NPC-focused CRPGs, which is admittedly a lot more complicated. And Deus Ex was fully voiced and had… well, a semblance of cutscenes. In context it’s hilariously naive compared to what Japanese devs were doing in Metal Gear or Final Fantasy, but it was a lot for western PC game standards.

    But it wasn’t… great to play? I don’t know what to tell you. Thief and Hitman had both nailed the clockwork living stage thing, and at the time I was more than happy to give up the Matrix-at-home narrative and the DnD-style questing for that. The pitch was compelling, but it didn’t necessarily make for a great playable experience against its peers.

    I didn’t hate it or anything. I spent quite a bit of time messing with it. That corny main theme still pops up in my head with no effort on demand. I spent more time using it as a benchmark than Unreal, which I also thought wasn’t a great game.

    Also, while I’m here pissing people off, can we all agree that “immersive sim” is a terrible name for a genre? What exactly is “simulated”? Why is it immersive? Immersive as opposed to what? At the time we tended to lump them in with stealth games, so the name is just an attempt to reverse engineer a genre name by using loose words that weren’t already taken, and I hate it. See also: character action game. Which action games do NOT have characters?

    Man, I am a grumpy old fart today.


  • The closest thing we had was the System Shock duology, since both games predate Deus Ex. Deus Ex was basically accessible System Shock. Having dialogue trees and NPCs without losing the open-ended nature of System Shock’s more dungeon crawl-y approach was the real selling point. Well, that and the trenchcoats and shades. The Matrix was such a big deal.

    But even then, each of those elements was already present in different mixes in several late-90s games. Deus Ex by some counts was one of the early culminations of the genre-blending “everything game” we were all chasing during the 90s. The other was probably GTA 3. I think both of those are fine and they are certainly important games, but I never enjoyed playing them as much as less zeitgeist-y games that were around at the same time. I did spend a lot of time getting Deus Ex to look as pretty as possible, but I certainly didn’t finish it and, like a lot of people, I mostly ran around Liberty Island a bunch.

    I played more Thief 2 that year, honestly. I played WAY more Hitman than Deus Ex that year. I certainly thought System Shock 2 was better. Deus Ex is a big, ambitious, important game, for sure, but I never felt it quite stuck the landing when playing it, even at the time.



  • Kind of overrated? I mean, it was cool to see a bit more of a palatable cinematic presentation in real time to go along with the late 90s PC jank, and that theme did kick ass, but it’s less groundbreaking in context than I think people give it credit for. And it doesn’t hold up nearly as well as System Shock 2, in my book.


  • I guess that depends on the use case and how frequently both machines are running simultaneously. Like I said, that reasoning makes a lot of sense if you have a bunch of users coming and going, but the OP is saying it’s two instances at most, so… I don’t know if the math makes virtualization more efficient. It’d probably be more efficient by the dollar, if the server is constantly rendering something in the background and you’re only sapping whatever performance you need to run games when you’re playing.

    But the physical space thing is debatable, I think. This sounds like a chonker of a setup either way, and nothing is keeping you from stacking or rack-mounting two PCs, either. Plus if that’s the concern you can go with very space-efficient alternatives, including gaming laptops. I’ve done that before for that reason.

    I suppose that’s why PC building as a hobbyist is fun: there are a lot of knobs to tweak to balance price, performance, power consumption and whatever else.


  • OK, yeah, that makes sense. And it IS pretty unique, to have a multi-GPU system available at home but just idling when not at work. I think I’d still try to build a standalone second machine for that second user, though. You can then focus on making the big boy accessible from wherever you want to use it for gaming, which seems like a much more manageable, much less finicky challenge. That second computer would probably end up being relatively inexpensive to match the average use case for half of the big server thing. Definitely much less of a hassle. I’ve even had a gaming laptop serve that kind of purpose just because I needed a portable workstation with a GPU anyway, so it could double as a desktop replacement for gaming with someone else at home, but of course that depends on your needs.

    And in that scenario you could also just run all that LLM/SD stuff in the background and make it accessible across your network; I think that’s pretty trivial whether it’s inside a VM or running directly on the same environment as everything else as a background process (there’s a little client sketch below). Trivial compared to a fully virtualized gaming computer sharing a pool of GPUs, anyway.

    Feel free to tell us where you land, it certainly seems like a fun, quirky setup either way.
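
    And to show how trivial the network part is, a client sketch assuming an Ollama-style server on the big machine; the LAN address and model name are placeholders, and the server would need to listen on 0.0.0.0 instead of localhost:

        # Query an LLM server running on another machine on the LAN.
        import json
        import urllib.request

        req = urllib.request.Request(
            "http://192.168.1.50:11434/api/generate",  # hypothetical address
            data=json.dumps({
                "model": "llama3",   # whatever model the server has loaded
                "prompt": "Say hello.",
                "stream": False,     # one JSON blob instead of a stream
            }).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            print(json.loads(resp.read())["response"])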


  • Yeah, but if you’re this deep into the self-hosting rabbit hole, what circumstances lead to having an extra GPU lying around without an extra everything else, even if it’s relatively underpowered? You’ll probably be able to upgrade it later by recycling whatever is in your nice PC next time you upgrade something.

    At this point most of my household is running some frankenstein of phased-out parts just to justify my main build. It’s a bit of a problem, actually.


  • OK, but why?

    Well, for fun and as a cool hobby project, I get that. That is enough to justify it, like any other crazy hobbyist project. Don’t let me stop you.

    But in the spirit of practicality and speaking hypothetically: Why set it up that way?

    For self-hosting, why not build a few standalone machines and run off those instead? The reason to do this large scale is optimizing resources so you can assign a smaller pool of hardware to users as they need it, right? For a home set of two or three users you’d probably notice the fluctuations in performance caused by sharing the resources on the gaming VMs, and it would cost you the same or more than building a couple of reasonable gaming systems and a home server/NAS for the rest. Way less, I bet, if you’re smart about upgrades and hand-me-downs.