this post was submitted on 31 Jul 2023
63 points (93.2% liked)

top 43 comments
[–] LouNeko@lemmy.world 23 points 1 year ago (4 children)

I want to rant real quick.

I want to preface this by saying I'm not a game developer, but I have played a fair share of Unreal Engine games, and my honest opinion as a consumer is that the engine is a plague, especially in the indie game world. Show me one second of gameplay of any game and I could tell you with 100% certainty whether it's an Unreal Engine game or not. And the main issue isn't the engine itself; I bet it's a fine engine that can do everything a developer needs it to do.
The main issue in gaming in general, but most noticeably in UE, is the absolutely horrible TAA (temporal anti-aliasing). Somehow we went from crisp, sharp-looking games in 2010 to absolutely blurry messes today. UE is the biggest offender: every single one of its games uses TAA as its main AA method, and only with the sharpening filter turned up to 100 is it barely serviceable. On top of the blurriness you get visual artifacts, especially in picture-in-picture (PiP) rendering, so forget realistic scopes, mirrors, or particle effects. And if you switch to any other AA method, every character's hair looks like an unacceptable flickering wiremesh. We always see these tech demos of amazing lighting and huge open landscapes rendered in real time with UE, but it all amounts to nothing if everything is blurred beyond recognition.
The second biggest gripe is the abysmal performance. Sure, if a game looks good you can expect it to be a bit more demanding on the hardware side. But thanks to TAA, no UE game actually looks good, so you're just left with the hardware demands. In the past, if your PC couldn't handle a game at max settings, you toned them down a little and voilà, your game ran well. That is next to impossible with UE. I have three UE games that I regularly play, and the difference between lowest and max settings in all of them is ~5 FPS. So your game looks like a PS2 game and you get barely any performance gain. Awesome, good job, UE. Not to mention that in an attempt to maximize "performance", most NPCs farther than 50 m away are rendered at 5 FPS, which looks really good on those big open landscapes with amazing lighting.

I am sure all of those problems are solved when the engine is in the hands of a talented developer who knows what they're doing and puts value on visual clarity and performance. But that is not what the vast majority of UE developers do. UE feels to me like a modular package: you just slap things together and it supposedly works. But you can't expect to create art by just slapping things together. It also feels like UE is trying to become a jack of all trades but master of none, appealing to the broadest market so Epic can cash in on all that licensing money.

[–] AProfessional@lemmy.world 6 points 1 year ago (3 children)

These are real issues but what is the alternative?

Most other engines are not better. Creating a new engine is very expensive, takes time, and is risky.

[–] LouNeko@lemmy.world 5 points 1 year ago (3 children)

It seems like Unity is the go-to engine for 2D applications, but I'm always surprised how much developers can squeeze out of it for 3D games. Konami could get their heads out of their asses and sell the Fox Engine, or make it publicly available, since they aren't using it anymore. CryEngine has always looked stellar and is available for licensing.

I just don't understand: is Unreal Engine so much cheaper and better for development than any alternative? Is Epic's support better than any competitor's? Why does it seem like every second indie or AA title uses UE?

We also see more and more developers switching to UE for sequels even when they already have a working engine (Insurgency ran on Source; Insurgency: Sandstorm runs on Unreal).

[–] MossBear@lemmy.world 8 points 1 year ago (1 children)

Our studio uses Godot. It's fantastic and open source.

[–] LouNeko@lemmy.world 4 points 1 year ago (1 children)

I just watched the 2022 desktop/console showcase; out of all of those games, I've only played Brotato and Cassette Beasts (Switch). It looks very clean, but so far it mostly focuses on 2D and 2.5D games. I also saw a VR game in the showcase. Looks very interesting.

[–] MossBear@lemmy.world 4 points 1 year ago

It is great for 2D. The 3D is definitely getting there, but it's not on the same level as Unreal or Unity yet. Within a year or two, though, I think it should at least catch up to Unity. We're super pleased with it.

[–] Coelacanth@feddit.nu 3 points 1 year ago (1 children)

CDPR are switching from their proprietary Red Engine to UE5 as well.

[–] scrubbles@poptalk.scrubbles.tech 6 points 1 year ago (1 children)

Yeah, anyone who says that studios should just develop their own engine, or that it's not that hard, should look at Cyberpunk. Most of its bugs were engine-related, and all of its performance woes were too.

I'm actually sad; it ended up being a fine engine after they spent a year fixing it up, and it'd be nice to have some more alternatives to Unreal.

[–] Coelacanth@feddit.nu 5 points 1 year ago (1 children)

I completely agree, on both counts. I'm sad about the demise of Red Engine too, especially since the look of Cyberpunk was one of the things they nailed. Not just graphically, but things like small character movement animations during dialogues and facial expressions.

I'm fearful that the upcoming Cyberpunk 2 (when it releases in 10 years) will lose a lot of identity by being Unrealified.

[–] Marsupial@quokk.au 2 points 1 year ago (1 children)

Those character animations are an engine-agnostic problem. That's on the art department; any engine can handle them with ease.

[–] Coelacanth@feddit.nu 2 points 1 year ago

That's possible. I know rigging for facial expressions used to be a big thing and varied a lot between engines, but at this point perhaps every option is at a sufficient level for it not to matter.

[–] beefcat@lemmy.world 2 points 1 year ago (1 children)

Unreal is way more versatile and easier to use than CryEngine, and a lot more capable for AAA game development than Unity. Looking at UE5, none of these alternatives have equivalents for features like Nanite or Lumen.

[–] LouNeko@lemmy.world 1 points 1 year ago (1 children)

I saw the presentation of Nanite and Lumen a month ago, and they seem like very interesting technologies. I still haven't seen a game implement Nanite and get a significant performance boost, though. Lumen is more of a filmmaker's tool, since lighting in games is often preferred to be stylized rather than realistic. But this also brings up another issue with UE: the constant updates distract developers from actually fulfilling their vision and finishing the game. Early Access titles often stall development to update to a new engine version and implement new technologies instead of providing content and bugfixes. And if you don't update to the new engine version, the community whines about it, so the devs have no choice. The versatility has its price; it's like UE is trying to become a jack of all trades but master of none in an effort to provide everybody with a platform.

[–] beefcat@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Lumen is not a filmmakers tool. Fortnite is already using it in production on current gen consoles, and Immortals of Aveum will be using it exclusively when it launches later this month.

Nanite is about eliminating LoD pop-in without a performance penalty. I wouldn't expect games to run faster, only look better.
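For anyone unfamiliar with what "LoD pop-in" is, here's a minimal sketch of the traditional discrete-LOD scheme Nanite replaces (the distance thresholds and triangle counts are made up for illustration). The "pop" is the whole mesh swapping in a single frame when the camera crosses a threshold:

```python
# Hypothetical discrete LOD table: (max distance in meters, triangle count).
LODS = [(25.0, 50_000), (75.0, 12_000), (200.0, 3_000), (float("inf"), 500)]

def select_lod(distance_m: float) -> int:
    """Pick the first (finest) LOD whose range covers this distance."""
    for i, (max_dist, _tris) in enumerate(LODS):
        if distance_m <= max_dist:
            return i
    return len(LODS) - 1

# Moving the camera from 74m to 76m swaps the whole mesh in one frame:
print(select_lod(74.0), select_lod(76.0))  # -> 1 2, a visible "pop"
```

Nanite sidesteps this by streaming clusters of triangles and choosing detail per cluster from screen-space error every frame, so detail changes continuously instead of in visible steps.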

[–] Vengefu1Tuna@lemmy.world 4 points 1 year ago

Another big factor is developer engine knowledge. It's expensive to train developers on a new or unpopular engine when you can hire plenty of devs who are already familiar with a popular engine like Unreal. 343i continues to have this issue with Halo Infinite running on their Slipspace engine, which is why (IIRC) they're switching to Unreal for future games.

[–] beefcat@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

There isn’t a great alternative. SSAA is way too expensive, and old anti-aliasing techniques do not work well with shader-heavy games or really fine detail.

The fucktaa crowd would rather just live with really nasty shimmering and other artifacts of aliasing, or they have obnoxiously expensive setups that can drive SSAA or displays with really high pixel densities. Personally I think they’re crazy. I find most TAA implementations look way better on my 27” 1440p monitor than no AA.

[–] lustyargonian@lemm.ee 5 points 1 year ago (1 children)

Lemmy needs its own /r/fucktaa

[–] LouNeko@lemmy.world 4 points 1 year ago (1 children)

I wonder why exactly somebody decided that the search for the perfect AA method had to stop at TAA. We went from jagged edges to edge detection and oversampling (MSAA) being the standard from roughly 2000 to 2012, but people were unsatisfied with the performance hit, so we needed a lighter method. So we got post-processing AA: SMAA, which is a scam and does absolutely nothing, or FXAA, which simply applies a blur filter to edges. Not the most elegant solutions, but they'll do if you can't afford MSAA. Then TAA came around the corner, and I don't even know how it manages to look so bad, because it sounds fine on paper. Using multiple frames to detect differences in contrast and then smoothing out those differences seems like an OK alternative (there's a rough sketch of the idea at the end of this comment), but it should never have become the main AA method.
I honestly expected the AA journey to end with 4K resolution becoming the standard. AA is mostly a matter of pixel density over viewing distance. Mobile games mostly have no AA because their pixel density is ridiculous; console games also rarely have AA because you sit ten feet away from the screen. PC is the only outlier, but it certainly has the spare power to run at higher resolutions than consoles. Somewhere along the way, though, Nvidia decided to go all-in on raytracing and dynamic resolution instead of raw 4K performance. And Nvidia basically dictates where the gaming industry goes.
So I honestly blame Nvidia for this whole mess, and most people can agree that Nvidia has dropped the ball over the last couple of years. Their flagship cards cost more than all the consoles from Sony, Microsoft, and Nintendo combined. They cost more than mid-to-high-range gaming laptops. And the raw power gain has been something like 80% over the last 10 years, because they put all their R&D into gimmicks.
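As promised, a rough sketch of the multi-frame smoothing idea, reduced to a single pixel (real TAA also jitters the camera, reprojects the history buffer with motion vectors, and clamps it against neighboring pixels; the blend weight is a typical but made-up value). It also shows exactly where the blur comes from: each new frame contributes only a small fraction, so fine, fast-changing detail gets averaged away.

```python
def taa_resolve(history: float, current: float, alpha: float = 0.1) -> float:
    """Exponential moving average across frames -- the heart of TAA.

    alpha is the weight of the *new* frame; small values mean more
    smoothing (and more blur), large values mean more visible aliasing.
    """
    return alpha * current + (1.0 - alpha) * history

# A one-pixel-thin feature that flickers between 0 and 1 every frame
# (classic wiremesh shimmer) settles near 0.5:
value = 0.0
for frame in range(60):
    value = taa_resolve(value, float(frame % 2))
print(round(value, 2))  # ~0.5 -- the flicker is gone, but so is the detail
```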

[–] fushuan@lemm.ee 2 points 1 year ago (2 children)

I get quite good AA by rendering the game at 4K and letting the graphics card downscale it to the screen's 1080p resolution. No AA needed; it looks fine.

[–] LouNeko@lemmy.world 2 points 1 year ago (1 children)

That is basically MSAA without the edge detection. Rendering at 4K and downscaling is the dirtiest but most effective AA method. But downscaling the whole screen also applies to UI elements, which often results in tiny blurry fonts if the UI isn't scaled appropriately. More and more games have started adding a render-resolution scale option that goes beyond 100% without affecting the UI. Downscaling also causes latency issues: I can run Metal Gear Solid V at a stable 60 FPS at 4K, but the display latency is very noticeable compared to 1440p at 60.
I miss the time when you could just disable a game's native AA and force MSAA through the Nvidia Control Panel. But most newer titles don't accept Nvidia's override, especially Unreal games.

[–] beefcat@lemmy.world 1 points 1 year ago* (last edited 1 year ago) (1 children)

MSAA only samples the geometry multiple times, not the whole scene. It doesn't work very well in games with a lot of shaders and other post-process work, which is basically every game made in the last decade.

What GP is describing is SSAA (supersampled anti-aliasing).
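For illustration, the resolve step of SSAA is just a box filter: render at 2x the resolution in each axis, then average every 2x2 block down to one pixel. A minimal sketch with a single grayscale channel:

```python
def downscale_2x(img: list[list[float]]) -> list[list[float]]:
    """Average each 2x2 block of a 2x-supersampled image into one pixel."""
    return [
        [(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4.0
         for x in range(0, len(img[0]), 2)]
        for y in range(0, len(img), 2)
    ]

# A hard edge that cuts through a 2x2 block comes out as a half-tone pixel:
hi_res = [[0.0, 1.0, 1.0, 1.0]] * 4  # 4x4 "4K" tile with a jagged edge
print(downscale_2x(hi_res))          # [[0.5, 1.0], [0.5, 1.0]]
```

And unlike MSAA, every one of those four samples ran the full pixel shader, which is why the cost scales so brutally.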

[–] LouNeko@lemmy.world 1 points 1 year ago

That's what I meant by edge detection. I think part of the downfall of MSAA in modern gaming is foliage. Nowadays every field in a video game is filled with lush grass, and the same goes for trees and bushes; they aren't flat textures or low-poly models anymore. Most engines use completely different rendering methods for foliage to get thousands of swaying leaves and blades of grass on screen with minimal performance impact. But having to detect all the edges of every single blade of grass and apply oversampling to them would make any game run at single-digit framerates. There are certainly a few other things that GPUs have to render in bulk to justify novel rendering methods, but foliage is by far the best example. So I can understand why post-processing AA is easier to implement. But is TAA really the best we can do? Especially because things like swaying grass become a green blob through TAA; slow, fine movement like swaying is really the bane of temporal sampling.

[–] beefcat@lemmy.world 1 points 1 year ago (1 children)

That is an insanely expensive solution to this problem. You are cutting performance by 75% or more to make it possible, meaning your 30 FPS game could be doing 120 if you stuck to native 1080p.
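The arithmetic, for what it's worth: 3840 × 2160 = 8,294,400 pixels versus 1920 × 1080 = 2,073,600, exactly 4×, so a fill-rate-bound game renders at roughly a quarter of the frame rate.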

[–] fushuan@lemm.ee 1 points 1 year ago

That's the thing, my game is running at 60+, and I don't need more.

In any case, new graphics cards are built to run games at 4K, so having a 1080p screen, with which I'm content, is a godsend performance-wise; it lets me do stuff like this with no practical performance loss.

[–] Psythik@lemm.ee 3 points 1 year ago* (last edited 1 year ago) (3 children)

Well, there's always DLSS and FSR. I don't even use AA anymore because DLSS Balanced looks so much better than even native resolution with 8x MSAA.

[–] beefcat@lemmy.world 2 points 1 year ago (1 children)

MSAA doesn’t do anything for modern games because just about every surface has multiple pixel shaders applied on top. This is why few games bother to support it.

[–] LouNeko@lemmy.world 1 points 1 year ago

Yes, I agree. I wrote another comment about how I think the prevalence of realistic foliage in modern games might have been the biggest factor in MSAA's abandonment.

[–] Dyf_Tfh@lemmy.sdf.org 2 points 1 year ago

DLSS, and especially FSR, are basically TAA repurposed for upscaling.

But contrary to the vast majority of TAA implementations, they are actually good.

[–] LouNeko@lemmy.world 1 points 1 year ago

I wish I could experience DLSS. I'm still rocking a 1080 Ti, so no DLSS for me, only FSR. But, in my opinion, FSR is such a visual downgrade for a minuscule performance boost. Especially in PvP games, where you can get killed by a single pixel, playing at a curbed resolution is a dealbreaker. I've heard DLSS looks a lot better than FSR, but I'm going to run the 1080 Ti till it dies, since it still runs nearly everything maxed out at 1440p.

[–] Treczoks@lemmy.world 2 points 1 year ago (1 children)

It gets even worse when non-game applications use these frameworks designed for games. Like Stud.io, the virtual LEGO building CAD: even if you don't touch the thing, it still renders 60 frames per second. Whenever I use it, the fans run high even when it's idling. And don't even think of running it on a battery-powered laptop...

[–] Katana314@lemmy.world 5 points 1 year ago (1 children)

I could be wrong, but with a lot of complaints like that, the issue ends up being at least partially the fault of the studio using the engine, for being too lazy to adjust Unreal's defaults. I'd be surprised if it doesn't let you turn off continuous rendering and preserve the current image on screen.
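The fix is the classic difference between a game loop and an editor loop. A minimal sketch of the two in generic terms, not Unreal's actual API (whose exact settings I don't know offhand):

```python
import queue
import time

events: "queue.Queue[str]" = queue.Queue()  # stand-in for OS input events

def render() -> None:
    print("frame rendered")

def continuous_loop() -> None:
    """Game-style loop: redraws 60 times a second even when nothing changes."""
    while True:
        render()
        time.sleep(1 / 60)  # burns CPU/GPU forever, hence the fans

def on_demand_loop() -> None:
    """Editor-style loop: blocks until an event arrives, then redraws once."""
    while True:
        event = events.get()  # sleeps here, costing nothing while idle
        if event == "quit":
            break
        render()  # redraw only because something actually changed

if __name__ == "__main__":
    events.put("rotate-model")
    events.put("quit")
    on_demand_loop()  # renders exactly one frame, then exits
```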

[–] Treczoks@lemmy.world 3 points 1 year ago

I don't know if they're using Unreal or Unity or whatever, but it really sucks, and it's also so f-ed up that it won't run properly in a Wine environment under Linux.

[–] GreenDust@lemmings.world 7 points 1 year ago (1 children)

Wasn't Satisfactory upgraded to UE5?

[–] astropenguin5@lemmy.world 4 points 1 year ago (1 children)

Yep, although it's still on the experimental branch, which is Update 8. Interestingly, the list seems to focus more on upcoming titles than already-released ones.

Source because I looked it up to be sure: https://store.steampowered.com/news/app/526870/view/3690183765419759383?l=english

[–] Kolanaki@yiffit.net 1 points 1 year ago (2 children)

Any benefits aside from maybe better graphical features? Is there improved performance, maybe? My only issue with the game is that it gets a little slow after you're self-sufficient, even with optimal factories. Not that that's unexpected in a game like this, but every little bit would help!

[–] LouNeko@lemmy.world 2 points 1 year ago (1 children)

Apparently there are performance increases because the simulation of distant objects has been optimized. Here's the Devs talking about it.

[–] Piemanding@sh.itjust.works 1 points 1 year ago* (last edited 1 year ago)

Didn't UE4 have shader loading issues too? Maybe it was UE3. Many UE games suffer from microstutters because of this.

[–] astropenguin5@lemmy.world 1 points 1 year ago

I've not tried it yet myself; I'd check the Satisfactory forums on Steam, here, or Reddit if you really want to see discussion of it.

[–] Carighan@lemmy.world 3 points 1 year ago

Oh, I did not know Avowed is going to use UE5. Interesting. I think I might have to upgrade my machine soon. 🙈 Also, Hellblade 2 cannot release soon enough!

[–] Coelacanth@feddit.nu 1 points 1 year ago (3 children)

I remember watching a video on the Unrealification of modern gaming and the prospect of every release coming with that Unreal Engine look. Does anyone else know which one I'm talking about? I can't seem to find it.

[–] BreadGar@lemmy.ca 2 points 1 year ago

That's pretty much like the early Unreal games, which were all green/brown in color.

[–] hellishharlot@lemmy.world 2 points 1 year ago

It's not that dissimilar from the period of Unity3D games that all shared pretty much the same effects.

[–] beefcat@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

That “look” has more to do with studios just using the standard shaders and default settings that come with Unreal. Using a different engine wouldn’t really solve this, as they would probably just lean on whatever that engine’s defaults are. Any studio that wants to can write their own shaders to give their game a more unique look.

The engine-inherited problems that stick out to me are traversal stutter and shader-compilation stutter. These are both products of engine limitations that are difficult for developers to work around.