this post was submitted on 10 Apr 2024
212 points (96.5% liked)
PCGaming
Likely less of a hardware thing and more of a "made with a ten-year-old engine that was discontinued seven years ago" thing.
Originally I was gonna say Unreal Engine 5 only takes a smallish performance hit when Lumen global illumination is enabled, but when I looked up the Helldivers engine, they outright cite its discontinuation as it "not being able to compete against UE and Unity."
Edit: the engine is Autodesk Stingray, since I failed to name it.
Edit 2: apparently Helldivers 2 was in production since before Stingray was discontinued, but still, why'd they try to put GI in such an old engine? Did nobody consider the stress a new lighting model would put on an engine that barely predates the whole idea of GI?
I've got a 5800X3D with a 7900 XT on Arch Linux (btw), running KDE Plasma on Wayland. The only two games that have locked up my system in the year since I've owned the video card are Jedi Survivor and Helldivers 2, so my assumption is that they share some kind of bad call to the card somewhere.
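If anyone wants to check whether lockups like that are actually GPU hangs rather than something else, a rough sketch (assuming a systemd-based setup like Arch, and the amdgpu driver for a 7900 XT) is to grep the kernel log from the boot where it happened for amdgpu reset/timeout messages:

```shell
#!/bin/sh
# Sketch: look for amdgpu ring timeouts / GPU resets in the kernel log.
# "-b -1" reads the previous boot, useful after a hard lockup forced a reboot.
# The pattern is an illustrative guess at common amdgpu error wording, not exhaustive.
journalctl -k -b -1 | grep -iE 'amdgpu.*(timeout|reset|fault)'
```

If that turns up ring timeouts at the moment the game froze, it points at the driver/game interaction rather than, say, a CPU or memory issue.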
And yeah, as far as the Stingray stuff goes, you're not the first person to question their commitment to sticking with it.