It’s BS though. People with top-of-the-line hardware are having issues. Those systems don’t underperform because the game is advanced or anything like that – the game underperforms because it is a new release that is poorly optimized. It’s also expected, because it’s running on a senior citizen of a game engine that likely needs more than a few nudges.
Todd Howard forgets that PC users see this shit all the time, and it’s pretty obvious with this one. Hoping to see talk of optimization in a coming patch instead.
Edit: a good example – not hitting 60fps in New Atlantis, but concurrently, CPU usage in the 50s and GPU usage in the 70s. That’s a sign of poor optimization.
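The logic in that edit can be sketched as a rough heuristic. The function and thresholds below are purely illustrative (not from any real profiler or API): if FPS is below target while neither the CPU nor the GPU is near saturation, the limiter is likely the engine itself (threading, draw-call submission, streaming), not the hardware.

```python
# Rough bottleneck heuristic. All values are illustrative samples,
# not real profiler output.

def diagnose(fps: float, target_fps: float, cpu_util: float, gpu_util: float) -> str:
    if fps >= target_fps:
        return "hitting target"
    if gpu_util >= 0.95:
        return "GPU-bound: lower resolution/settings or upgrade GPU"
    if cpu_util >= 0.95:
        return "CPU-bound: lower sim/crowd settings or upgrade CPU"
    # Neither component saturated, yet FPS is low -> poor utilization.
    return "engine-bound: hardware idle while FPS is low (poor optimization)"

# The New Atlantis numbers from the comment above:
print(diagnose(fps=55, target_fps=60, cpu_util=0.55, gpu_util=0.75))
```

One caveat: aggregate CPU usage in the 50s can hide a single saturated main thread, so "engine-bound" here really means "main-thread or scheduling limited" – which is still an optimization problem, just a more specific one.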
I’m starting to think that maybe, just maybe, brute-forcing a 26-year-old engine that makes Skyrim have a stroke if you try to play above 30fps isn’t a good idea
Is it actually the same engine?
No. I’m not a fan of the game personally, but a quick search shows they’re using Creation Engine 2, which is a newer version of their engine.
They could have called it Creation Engine 129030129784.32985 for all that it matters. It’s just a name for an engine update, as they do for every new game. They didn’t rewrite it from scratch; that would be a billion-dollar venture.
From what I’ve read it’s the exact same engine as FO4 with better lighting (and of course, as with every new game, some improvements locally relevant to the gameplay).
But, fundamentally, underneath the fancy lights, it’s still the same engine. That explains the 2008-esque animations, the bugs, the performance issues, and the general flatness of the game. It can’t be more than “Skyrim in Space” because that’s what it technically is. As if putting a 2 after the name makes it a new engine. It’s just a new iteration of the same old engine that runs Fallout 3, Skyrim, and Fallout 4.
I’ll see if I can find it when I’m at my PC, but in an interview a dev said it was still using significant amounts of code from their Gamebryo engine from ’97.
What game engine is 26 years old other than the Unreal engine?
Edit: stepped on some toes I guess lmfao
Gamebryo, the base of the Creation Engine that Bethesda used for this
Ah okay. Thank you for the actual answer
My friend and I were just discussing the likelihood that some hardware producers pay game devs to purposely output bad optimizations so users are encouraged to spend more on upgrades.
In this case, you get Starfield free with the purchase of select AMD CPUs or GPUs.
But it’s weird for Todd Howard to come out with this push now, because it’s in response to those already playing the game.
I mean, that’s probably why he would make the push. The bait’s in the mouth (people have the game), then comes the pull of the hook (they have to upgrade to try to handle its poor optimization, fulfilling the benefit of AMD backing them). And Beth doesn’t lose anything if it’s too frustrating and people stop playing over it, because they already have the money.
EDIT: Admittedly I keep forgetting that Game Pass is a thing, but maybe even that doesn’t really matter to Microsoft if it got people onto Game Pass or something? That makes my earlier point a bit shakier.
Yeah, MS wins either way, so long as people still want to play the game.
…like not launching with DLSS. What a weird oversight.
AMD is the official sponsor. That’s the one thing that wasn’t a surprise.
It’s not an oversight, they were paid to not include DLSS.
While I’m no fan of paid sponsorships holding back good games, this is untrue.
Neither Nvidia nor AMD blocks their partner devs from supporting competing tech in their games. They just won’t help them get it working, and obviously the other side won’t either, since that dev is sponsored. There are some games out there that support both, some of them even partnered.
So yes, it’s bullshit. But it’s not “literally paid” bullshit. Bethesda could have gone the extra mile, and didn’t.
AMD blocks partners from implementing DLSS. You’re probably right that it’s not paid bullshit as the payout isn’t monetary. But it’s still being blocked due to the partnership.
This is hardly the first game to do this. Jedi: Survivor and RE4 have the same problem: AMD sponsored, FSR2 only. The work required to implement FSR2 or DLSS is basically the same (motion data). That’s why DLSS mods were immediately available.
Since FSR2 was released, not a single AMD-sponsored game has had DLSS added. Even games built in engines like Unreal, where all the dev has to do is include the plugin.
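The "basically the same work" claim above can be sketched as code. Both temporal upscalers consume roughly the same per-frame inputs (low-res color, depth, motion vectors, camera jitter), so an engine that feeds one can feed the other through a thin abstraction. The class and method names below are hypothetical – neither AMD’s nor Nvidia’s real SDK looks like this – this only illustrates why the integration cost is similar:

```python
# Illustrative sketch: FSR2 and DLSS consume the same per-frame inputs,
# so a renderer that produces them once can swap backends trivially.
# All names here are made up for illustration, not real SDK APIs.
from dataclasses import dataclass

@dataclass
class FrameInputs:
    color: object            # low-resolution rendered frame
    depth: object            # depth buffer
    motion_vectors: object   # per-pixel motion data
    jitter: tuple            # sub-pixel camera jitter offset

class Upscaler:
    def evaluate(self, inputs: FrameInputs):
        raise NotImplementedError

class FSR2(Upscaler):
    def evaluate(self, inputs):
        # Real FSR2 would dispatch its temporal-upscale passes here.
        return ("fsr2", inputs.motion_vectors)

class DLSS(Upscaler):
    def evaluate(self, inputs):
        # Real DLSS would run its ML-based upscale here.
        return ("dlss", inputs.motion_vectors)

def render_frame(upscaler: Upscaler, inputs: FrameInputs):
    # The engine-side plumbing is identical for either backend.
    return upscaler.evaluate(inputs)
```

Once the engine produces `FrameInputs`, swapping backends is a one-line change, which is consistent with modders dropping DLSS into FSR2-only games almost immediately.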
Literally not the case here, as evidenced by public communications.
Yes, it is the case. Companies lie all the time.
Is there actual evidence for AMD blocking DLSS?
And no, AMD being a sponsor is not sufficient evidence.
To be more accurate, they were paid to include AMD optimization instead of DLSS.
Why upgrade when I will just pick it up on the PS7, 10 years from now, along with the Skyrim bundle.
I expected this once everyone kept buying into Nvidia’s DLSS.
Nvidia and DLSS will be required to get titles to run decently.
Minimal game optimization will be done on the majority of future titles.
Fml
Minimal game optimization will be done on the majority of future titles.
That’s more optimisation than we get now
I haven’t played Starfield yet, but many of the recent headliner releases have been performance hogs. It’s not unreasonable to expect people to either play at lower settings or upgrade if they want to run the best possible setup. That’s why there are performance sliders in most games. When you need a 3080 to run minimum settings, that’s when you start running into trouble (👀ksp 2)
At the same time my 3080 runs these games just fine with 60-90 fps at 4k with high settings. Don’t need more than that for games that aren’t competitive.
Man, that’s why armored core blew me away. Completed the whole game, at launch, maximum settings and I don’t recall a single frame drop. 3060, with very mediocre other hardware. I know there’s a lot to be said about map sizes and instanced missions, but with as fantastic as that game looks and plays…
Same happened with Doom Eternal. The graphics were a show stopper when the game came out and the game didn’t even stutter. It’s so well optimized that I’m told you can even play it with integrated graphics.
It’s almost like having a giant open world comes with some massive drawbacks. I’m pretty fatigued over open world games tho so that may just be me.
Frankly, open world sucks. I played Far Cry 2 sometime last year because one of my friends spoke so highly of it, and I spent more time driving around than actually shooting anything. It served no purpose other than wasting the player’s time. Missions were rather basic too. And nothing in the reviews of more modern examples suggests anything has changed.
I have a 3060Ti and play most games on max settings. There is the occasional game that explodes if I do that but otherwise GPU power is out ahead of decently optimized games (probably because gaming is now no longer the driving factor for GPU performance).
Makes my decision to not buy it even easier.
What made you previously decide not to buy it?
I own many games that I impulse buy, but find out that I don’t care for. That gets expensive.
Now I’m much more selective, and tend to wait until the game’s been out long enough to get patches, updates, and reviews.
Add my lack of interest in any Todd Howard product until ES6, which I may not live long enough to play (boomer puke here), as well as the offhanded arrogance of his ‘upgrade your PC’ statement, and that about covers why I’ve decided not to buy Starfield.
Starfield also requires an SSD, a first for a modern triple-A PC game.
I recall the same being said about Cyberpunk 2077, and I’m not sure that was the first either.
Cyberpunk doesn’t require an SSD; it had “SSD recommended” under its storage requirements, but not required. Starfield lists it as a requirement.
Cyberpunk also has a “HDD mode” in its options.
Because you load every time you walk through a door.
I stand corrected.
To be fair, Cyberpunk 2077 came out at the peak of Covid GPU scarcity. I was still gaming on a GTX 1080 at its release, and the only way I could have a decent experience was running it at 50% resolution scale with 100% sharpening.
BG3 has the same too.
Y’all are surprised the boss of a AAA studio suggested you buy hardware from companies he has a deeply vested interest in?
It’s all one big circle jerk of companies and anyone buying “cutting edge” gets what they deserve.
You’re the product in more ways than one
You’re literally the consumer in this instance. The game is the product. The computer is the product.
Its on Game Pass, Todd. If it doesn’t run well I’ll just not play Skyrim-Space Edition.
My partner who is interested has a PS5 and an older PC. If her PC doesn’t run it, she’ll probably just keep playing Stardew Valley. Honestly it’s not like anyone is going to really be talking about Starfield in a month or two except ridiculous ship builds on social media.
I bought a new PC just to play Starfield (and BG3 with less issues).
It looks alright overall. But it’s pretty crazy that even 30xx cards can’t run it well (I had a 1070 though).
I did a CPU/mobo/RAM upgrade for it – but I was quite overdue.
It looks alright overall.
That’s the thing. It looks alright, but it’s not the next-gen beauty fest that they want people to think it is. Plenty look better and run better. I enjoy the game, but the whole argument that it’s a graphical standout doesn’t really hold water.
I read this Todd like John Connor says it in T2. “She’s not my mother, Todd.”
I have an i9-13900K, a Radeon 7900 XTX, and 64GB RAM, and I had to refund it on Steam because it would keep crashing to desktop every few minutes. Sometimes I wouldn’t even get past the Bethesda intro logo before crashing. Very frustrating experience, to say the least.
I mean, the game definitely runs like shit but if you keep crashing that sounds like a you problem. My 7600x/6700XT/32GB DDR5 build hasn’t crashed once in 15 hours of playtime and I’ve heard a ton of complaints about the game but barely any about crashing.
I have an i7-10700K/32GB RAM/3080 Ti – playing the game at 4K with all settings maxed (without motion blur, of course), and with almost 80 hours into the game, I have yet to have a single crash or performance issue.
Only realized people were having issues when I saw posts and performance mods popping up.
Oh, only a 7900xtx? lol
Not that I’ll be buying it anytime soon but if the hardware specifications I’ve read are true, no graphics card is worth €500+ to play a game. This is bonkers.
If he’s telling us this, does that mean we get to bill him for the upgrade?
What Todd Howard is being a dipshit tool again? I’m shocked…shocked I tell you…
I’m a little shocked. Normally it’s Hines who gets caught with his foot that deep in his own mouth.
Runs fine for me. 5600X, RTX 3080 @ 1440p high-ultra settings native.
Same here except I use a 6600 xt, which isn’t anywhere near as good as your GPU. I’m running medium settings at 4k and it’s fine. It even runs on the Steam Deck, although the graphics are not so good on there. Still, it’s playable and I will probably play there when it’s convenient.
IMO, ultra settings are for people with new, high end hardware and to future proof a game for at least a couple years. It’s not for people running a 2-3 year old rig with a 1080p GPU. Medium and high settings are generally good. Ultra is just like bonus mode for hardcore enthusiasts.
Yeah, the reason why I mentioned my experience is because I’m finding people with better specs complaining and I’m like if we just turned the FPS counter off and enjoyed the game, I’m sure we’d barely notice it dips below 60 at times.
Wish my computer weren’t dead, so I could at least try to play it. Although my 2070 wouldn’t have survived. It runs nice on my Series X, but I hate playing this type of game with a controller.
I’m a PC gamer who likes playing with controllers generally (from the couch), but damn, I hate the way they adapted run and walk to the left analog stick. Feels horrible. I wish I just had autorun and could hold a button to walk. The key binding shuts off even if I try and force it with Steam controller config, because the game doesn’t technically support split inputs.
Just upgrade your PC 4head