[–] kleeon@hexbear.net 7 points 1 day ago* (last edited 1 day ago)

We agree there'd be some graphical compromise, maybe we're just disagreeing over the scale of that compromise and the hardware that would be permissible in this scenario.

I was imagining something that looks and plays almost exactly like Witcher 3, just at a lower resolution, which doesn't seem possible to me. I'd say if you simplify the graphics (replace dynamic lighting with pre-baked lightmaps, do Gouraud shading instead of PBR, remove most post-processing effects, etc.) and make all the models look like Half-Life assets, you could definitely render something at an acceptable frame rate. However, asset streaming would still be a huge issue for a large and detailed open world like Witcher 3's, considering the memory limits and slow disk reads of the era.

[–] kleeon@hexbear.net 11 points 1 day ago (2 children)

In my experience, writing straightforward C code gives you basically the same thing as well-written asm would. I'm not saying modern compilers are perfect, but they generally do the right thing (or it's very easy to coax them into doing the right thing). They usually vectorize properly, they know how to avoid obvious pipeline stalls, etc. These days order-of-magnitude performance improvements generally don't come from hand-written assembly but from using better algorithms and data structures

Also, you could look at what games of that era actually looked like. Something like Quake 2 already pushed contemporary hardware to its limits, and I wouldn't argue it's a poorly optimized game

[–] kleeon@hexbear.net 17 points 1 day ago (5 children)

I'd say two things:

  1. Modern compilers are way better at generating code than they were in the 90s, so hand-rolled assembly is not that useful anymore

  2. The vast majority of frame time in a game like Witcher 3 comes from rendering, not gameplay code, and you can't really write rendering code in asm since GPU vendors don't let you program their hardware directly. Also, running it on a 25-year-old GPU would require way more compromises than just a lower resolution, because '98-era GPUs don't even have basic things like vertex buffers or programmable shaders

I think you could make Witcher 3 run on a computer from 2008 if you optimized it properly, but that's about it