• polygon@beehaw.org · 15 points · 1 year ago

    I’d rather see consoles be limited to what they can handle than a game to be limited for everyone because of what a single console can handle.

    I want this game to be huge and look beautiful. If my PC can handle 60fps, I don’t want to be locked to 30fps because that’s all an Xbox can handle. And if I want to play it on an Xbox, I don’t want it to be a blurry mess just to hit 60fps; I want it to look as good as it possibly can. Especially in a game like this, where the visuals do the great majority of the storytelling when it comes to exploration and finding new things.

      • polygon@beehaw.org · 1 point · 1 year ago

        Right, that was sort of the point of the post. People will complain about 30fps on console but this is the correct way to develop games.

  • ThatGuy@lemmy.world · 12 points · 1 year ago

    As someone who doesn’t mind 30fps, there shouldn’t be games running at 30 on new-gen hardware anymore lol

    • kris40k@beehaw.org · 3 points · 1 year ago

      This right here. As a 40+ gamer, I don’t mind 30fps; I’ve been dealing with lower frame rates for a long, long time and it’s fine for me. But that just seems like an unreasonably low expectation for AAA games these days.

  • puck@lemmy.world · 7 points · 1 year ago

    Not a deal breaker for this kind of game, but a 60fps performance mode on Series X at 1080p would’ve been a nice option.

    Playing TOTK right now on Switch, and it really proves how great games can overcome technical limitations. A masterpiece at 30fps is still a masterpiece. Here’s hoping Starfield can deliver as a great game first and foremost.

    • beefcat@beehaw.org · 8 points · 1 year ago

      If the bottleneck is something like AI or physics calculations on the CPU, then lowering the rendering resolution won’t help achieve a higher framerate, unfortunately.

      I suspect most games shipping this gen without 60 FPS modes are CPU bound.
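
      To make the CPU-bound point concrete, here’s a toy frame-time model; every number in it is invented for illustration, not a measurement from any real game. A frame can’t be presented until both the CPU’s simulation work and the GPU’s rendering work are done, so the frame time is roughly the larger of the two, and scaling the resolution down only shrinks the GPU side.

      ```cpp
      #include <algorithm>
      #include <cstdio>

      int main() {
          const double cpu_ms = 28.0;         // hypothetical per-frame simulation cost (AI, physics, scripts)
          const double gpu_ms_native = 22.0;  // hypothetical render cost at native resolution

          for (double scale : {1.0, 0.75, 0.5}) {
              // Rendering cost shrinks roughly with pixel count; the CPU cost does not.
              double gpu_ms = gpu_ms_native * scale * scale;
              double frame_ms = std::max(cpu_ms, gpu_ms);
              std::printf("res scale %.2f -> GPU %.1f ms, frame %.1f ms (%.0f fps)\n",
                          scale, gpu_ms, frame_ms, 1000.0 / frame_ms);
          }
          // The frame never gets under ~28 ms (~35 fps) no matter how far the
          // resolution drops, because the CPU side is the bottleneck.
      }
      ```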

  • Butterbee (She/Her)@beehaw.org · 6 points · 1 year ago

    This isn’t surprising. Todd Howard already stated that given the choice between fidelity or framerate they would choose fidelity every time. It’s disappointing that he thinks that’s still what people want in 2023, but it’s not surprising.

    • Domiku@beehaw.org · 4 points · 1 year ago

      At least give us a performance/fidelity toggle like many games do. Especially with how similar the Xbox architecture is to Windows, I’ve always wondered why devs can’t use some of the same tools to give console players more graphical options.
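
      For what it’s worth, a toggle like that usually just maps each mode onto a preset bundle of settings. A minimal sketch of the idea, with names and numbers invented for illustration rather than taken from any real engine:

      ```cpp
      #include <cstdio>

      // Hypothetical settings a console title might expose; the fields and
      // values are made up to show the shape of the idea.
      struct RenderSettings {
          int   target_fps;
          float resolution_scale;  // fraction of native output resolution
          int   shadow_quality;    // 0 = low ... 2 = high
          bool  volumetrics;
      };

      enum class Mode { Fidelity, Performance };

      RenderSettings settingsFor(Mode mode) {
          switch (mode) {
              case Mode::Performance:
                  return {60, 0.75f, 1, false};  // trade pixels and effects for frame rate
              case Mode::Fidelity:
              default:
                  return {30, 1.0f, 2, true};    // full resolution and effects at 30fps
          }
      }

      int main() {
          RenderSettings s = settingsFor(Mode::Performance);
          std::printf("target %d fps, %.0f%% resolution, shadows %d, volumetrics %s\n",
                      s.target_fps, s.resolution_scale * 100.0f, s.shadow_quality,
                      s.volumetrics ? "on" : "off");
      }
      ```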

  • smolgumball@lemmy.world · 4 points · 1 year ago

    I’m curious about this kind of thing from an engine and console architecture perspective. Any gamedevs able to shed some light?

    I work in the industry, but not directly on low-level engine implementation details. Personally, my gut thinking is that the Creation Engine is falling behind in terms of modern asset streaming techniques.

    Similarly, I wonder if a lack of strong virtualized geometry / just-in-time LOD generation tech could be a huge bottleneck?

    From what I understand, efforts like Nanite in UE5 were an enormous engineering investment for Epic, and unless Bethesda has a massive engine team of their own (they don’t), they simply won’t be able to build an in-house equivalent.
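
    To spell out what the non-Nanite world tends to look like: engines without virtualized geometry usually pick between a handful of hand-authored LOD meshes by distance, roughly like the toy sketch below (mesh names and thresholds are invented):

    ```cpp
    #include <cstdio>

    // Toy discrete LOD selection, the kind of hand-authored scheme used in
    // engines without virtualized geometry; names and distances are made up.
    struct LodLevel { const char* mesh; float max_distance; };

    const LodLevel kLods[] = {
        {"rock_lod0", 25.0f},   // full-detail mesh up close
        {"rock_lod1", 100.0f},  // reduced mesh at mid range
        {"rock_lod2", 400.0f},  // billboard/impostor far away
    };

    const char* pickLod(float distance) {
        for (const LodLevel& lod : kLods)
            if (distance <= lod.max_distance) return lod.mesh;
        return nullptr;  // beyond the last threshold: cull the object entirely
    }

    int main() {
        for (float d : {10.0f, 80.0f, 300.0f, 1000.0f}) {
            const char* mesh = pickLod(d);
            std::printf("distance %6.1f -> %s\n", d, mesh ? mesh : "(culled)");
        }
    }
    ```

    Nanite-style virtualized geometry replaces those discrete, artist-authored steps with automatic, fine-grained detail selection per cluster of triangles, which gives a sense of why it was such a large engineering investment.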

    Ultimately, I do think the lack of innovation in the Creation Engine is due to internal technical targets being established as “30FPS is good enough”, with frame times below 33ms being viewed as “for those PC gamers with power to spare.”

  • smolgumball@lemmy.world · 1 point · 1 year ago

    I’m curious about this kind of thing from an engine and console architecture perspective. Any gamedevs able to shed some light?

    I work in the industry, but not directly on low-level engine implementation details. Personally, my gut thinking is that the Creation Engine is falling behind in terms of modern asset streaming techniques.

    In an imaginary world where I’ve pored over Bethesda’s engine source for days, I wonder if I might discover that:

    • The asset formats and/or orchestration code used for asset streaming in the Creation Engine aren’t optimized to the point where the scene graph can be culled by camera frustum or player proximity without noticeable frame-time spikes: it simply takes too long to pause actor simulation, or too long to stream assets back into memory and reintroduce objects to the scene graph (see the rough sketch after this list)

    • Virtualized geometry or other magical low-overhead auto-LOD solutions aren’t in place. As far as I understand it, efforts like Nanite in UE5 were an enormous engineering investment for Epic, and unless Bethesda has a massive engine team of their own (they don’t), they simply won’t be able to build an in-house equivalent
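
    To make the first bullet concrete, here’s the very rough sketch of proximity-driven cell streaming referenced above; it assumes nothing about the Creation Engine’s actual data structures, and every name and number is invented. The point is that the “stream in/out” transitions are exactly where frame-time spikes would show up if the asset formats or orchestration code are too slow:

    ```cpp
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Rough sketch of proximity-driven streaming; cell names and radii are made up.
    struct Cell {
        const char* name;
        float x, z;     // cell centre in world space
        bool resident;  // assets loaded and actors simulating?
    };

    void updateStreaming(std::vector<Cell>& cells, float px, float pz, float radius) {
        for (Cell& cell : cells) {
            float dx = cell.x - px, dz = cell.z - pz;
            bool wanted = std::sqrt(dx * dx + dz * dz) <= radius;
            if (wanted && !cell.resident) {
                cell.resident = true;   // a real engine would kick off async asset loads here
                std::printf("stream in  %s\n", cell.name);
            } else if (!wanted && cell.resident) {
                cell.resident = false;  // ...and here it would pause actors and release memory
                std::printf("stream out %s\n", cell.name);
            }
        }
    }

    int main() {
        std::vector<Cell> cells = {
            {"cell_0_0", 0.0f, 0.0f, false},
            {"cell_1_0", 128.0f, 0.0f, false},
            {"cell_2_0", 256.0f, 0.0f, false},
        };
        updateStreaming(cells, 0.0f, 0.0f, 150.0f);    // player near the origin
        updateStreaming(cells, 256.0f, 0.0f, 150.0f);  // player has moved two cells over
    }
    ```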