- cross-posted to:
- gaming@lemmy.ml
I’d rather see consoles be limited to what they can handle than see a game limited for everyone because of what a single console can handle.
I want this game to be huge and look beautiful. If my PC can handle 60fps, I don’t want to be locked to 30fps because that’s all an Xbox can handle. And if I want to play it on an Xbox, I don’t want it to be a blurry mess just to get 60fps; I want it to look as good as it possibly can. Especially in a game like this, where the visuals do a great majority of the storytelling when it comes to exploration and finding new things.
Good then, because this is quite literally what Bethesda is doing with Starfield.
Right, that was sort of the point of the post. People will complain about 30fps on console but this is the correct way to develop games.
As someone who doesn’t mind 30fps, there shouldn’t be games running at 30 on new gen hardware anymore lol
This right here. As a 40+ gamer, I don’t mind 30fps. Been dealing with lower fps for a long, long time and it’s fine for me. But that just seems like an unreasonably low expectation for AAA video games these days.
What’s really weird to me is the hard 30fps cap. Why not have at least an option to disable the cap and let VRR do its job?
Not a deal breaker for this kind of game, but a 60fps performance mode on Series X at 1080p would’ve been a nice option.
Playing TOTK right now on Switch, and it really proves how great games can overcome technical limitations. A masterpiece at 30fps is still a masterpiece. Here’s hoping Starfield can deliver as a great game first and foremost.
If the bottleneck is something like AI or physics calculations on the CPU then lowering the rendering resolution won’t help achieve a higher framerate unfortunately.
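The CPU-bound point is easy to illustrate with a toy model (all numbers here are invented for illustration): a frame can’t finish until both the CPU work (AI, physics, simulation) and the GPU work (rendering) are done, and dropping the resolution only shrinks the GPU side.

```python
# Toy frame-time model: the slower of CPU and GPU work sets the pace.
# All millisecond figures below are made up for illustration.
def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    """A frame is gated by whichever side finishes last."""
    return max(cpu_ms, gpu_ms)

cpu_ms = 25.0        # heavy AI/physics load; unaffected by resolution
gpu_4k_ms = 30.0     # hypothetical render cost at 4K
gpu_1080p_ms = 12.0  # render cost drops a lot at 1080p

print(frame_time_ms(cpu_ms, gpu_4k_ms))     # 30.0 ms -> ~33 fps, GPU-bound
print(frame_time_ms(cpu_ms, gpu_1080p_ms))  # 25.0 ms -> only 40 fps, now CPU-bound
```

Halving the render resolution took the GPU cost from 30 ms to 12 ms, but the frame time only fell to the 25 ms CPU floor, so you still can’t hit a stable 60fps (16.7 ms) without cutting the simulation cost itself.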
I suspect most games shipping this gen without 60 FPS modes are CPU bound.
That’s a great point.
This isn’t surprising. Todd Howard already stated that given the choice between fidelity or framerate they would choose fidelity every time. It’s disappointing that he thinks that’s still what people want in 2023, but it’s not surprising.
At least give a performance/fidelity toggle like many games. Especially with how similar the Xbox architecture is to Windows, I’ve always wondered why devs can’t use some of the same tools to give console players more graphical options.
I’m curious about this kind of thing from an engine and console architecture perspective. Any gamedevs able to shed some light?
I work in the industry, but not directly on low-level engine implementation details. Personally, my gut thinking is that the Creation Engine is falling behind in terms of modern asset streaming techniques.
In an imaginary world where I’ve pored over Bethesda’s engine source for days, I wonder if I might discover that:

- Asset formats and/or orchestration code used for asset streaming in the Creation Engine are not optimized to a degree where scene graphs can be effectively culled based on camera frustum or player proximity without noticeable dips in frame-time. It simply takes too long to pause actor simulations, or too long to stream assets back into memory and reintroduce objects to the scene graph.
- Virtualized geometry or other magical low-overhead auto-LOD solutions aren’t in place. As far as I understand it, efforts like Nanite in UE5 were an enormous engineering investment for Epic, and unless Bethesda has a massive engine team of their own (they don’t), they simply won’t be able to benefit from an in-house equivalent.

Ultimately, I do think the lack of innovation in the Creation Engine is due to internal technical targets being established as “30FPS is good enough”, with frame times below 33ms being viewed as “for those PC gamers with power to spare.”
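To make the proximity-culling idea concrete, here’s a minimal sketch (every name, type, and threshold is hypothetical, invented for illustration; no real engine works this simply). The point is that the distance check itself is trivial; the expensive part in a real engine is the pause/stream-in work it triggers.

```python
import math
from dataclasses import dataclass

@dataclass
class SceneObject:
    # Hypothetical stand-in for an actor in a scene graph.
    name: str
    x: float
    y: float
    z: float
    active: bool = True  # whether the actor is simulated and rendered

# Hypothetical streaming radius; a real engine would tune this per area.
STREAM_RADIUS = 100.0

def cull_by_proximity(objects, player_pos):
    """Activate only objects within STREAM_RADIUS of the player.

    In a real engine, flipping `active` would kick off the costly work:
    pausing simulation, evicting or streaming assets, updating the scene
    graph -- and doing that within a 16-33 ms frame budget is the hard part.
    """
    for obj in objects:
        dist = math.dist((obj.x, obj.y, obj.z), player_pos)
        obj.active = dist <= STREAM_RADIUS
    return [o for o in objects if o.active]

objs = [SceneObject("near_npc", 10, 0, 0), SceneObject("far_npc", 500, 0, 0)]
print([o.name for o in cull_by_proximity(objs, (0, 0, 0))])  # ['near_npc']
```

Frustum culling is analogous, just with a plane-containment test against the camera’s view volume instead of a distance test.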