Dying 1080 noises.
I just went from a 3570k to a 12600k and will run this 1080 till it can't run no more.
I still game on a 768p monitor with my 750 Ti and I'm very happy :)
I bought a 4090 just to run LLMs and Stable Diffusion, with some occasional gaming. But if you're just using it for ML, get whatever is cheaper (ironically, I found the 4090 cheaper than the 3090 when shopping around).
The 7900 XTX recently got support for Stable Diffusion and LLMs. On paper it's faster than the RTX 4090 for FP16 compute, and it did seem faster in my experience, comparing a rented RTX 4090 on RunPod against my own 7900 XTX: 14 seconds (RTX 4090) vs. 6 seconds (7900 XTX).
The 7900 XTX is an option if you want something ~$1000 cheaper than the RTX 4090 with similarly sized VRAM and comparable performance.
I'm doing summer research with a focus on ML. I just built my computer and picked AMD because of the price, but did not know that Nvidia was the one to pick at the moment if that's what I wanted it for. I don't know enough about hardware and could use the school labs anyway, but I should have done better research (ironic, heh).
How much ML training will you do, and what kind of models? Are you just a hobbyist, or are you a student or researcher in ML?
If the former, you may be better served by renting a machine for training instead. Vast.ai is one such service for this and you can rent machines with a 4090 for something like 50 cents an hour. For hobbyist stuff this usually ends up cheaper than buying a whole card, especially if you find out that you need multiple GPUs to train the model effectively.
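To put that rental rate in perspective, here's a quick break-even sketch. The ~$0.50/hr figure is the one mentioned above; the $1600 card price is just my rough assumption for a 4090's street price:

```python
# Rough rent-vs-buy break-even for a 4090.
# card_price is an assumed street price; rate_per_hour is the
# ballpark Vast.ai rate mentioned above.
card_price = 1600.00   # USD (assumption)
rate_per_hour = 0.50   # USD per GPU-hour

break_even_hours = card_price / rate_per_hour
print(f"Renting is cheaper until ~{break_even_hours:.0f} GPU-hours")
# 3200 GPU-hours is over four months of training 24/7 —
# far more than most hobbyist projects ever use.
```

And that's before counting the rest of the machine, electricity, or the option of renting multi-GPU nodes when a single card isn't enough.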
If you're a researcher, though, a 3090 might be a good buy. IMO the gains from a 4090 won't be too crazy unless you're doing specific mixed-precision stuff (the newer-gen tensor cores support more data types). Be aware that the large models that necessitate 24GB of VRAM usually require many GPUs to train successfully in a reasonable amount of time, so a large-VRAM GPU is more useful for quick development and debugging than for training large models, in which case the 4090 wouldn't be all that much better.
I game at 3440x1440 ultrawide and upgraded from a 3090 to a 4090. The 4090 is significantly faster and smoother. And DLSS 3 frame interpolation is no joke: in Hogwarts Legacy with every setting cranked up and max ray tracing, turning on DLSS 3 took me from around 80fps with noticeable 1% low stutters to being pegged at the 144fps limiter I set in game. Smooth as butter.
Also, I mess around with some AI things like Stable Diffusion, and it's much faster for that as well. As much as I hate the term "future proof", the 4090 is more worth it in that regard, IMO.
upgrade your monitor
of all the possible choices out there, the 3090 is pretty ass but the 4090 is actually one of the best. But not on 1080p, you ought to have ditched that back when you got that 2080
Even if it's “just” to get a notably higher refresh rate. If you're considering 4090 kind of prices, a lovely higher-refresh-rate 1440p monitor would be a great sweet spot to consider.
Though I'd maybe say different if it's a business expense to earn you revenue and gaming is only a lighter touch.
a lovely higher refresh rate 1440p monitor would be a great sweet spot to consider.
nah, screw that, a 4090 is wasted on anything not 4K in my opinion
Sincerely, why? I ran 1440p 165Hz with my 3070 with mediocre success, and even with my 4090, games like Jedi: Fallen Order and Valhalla hitch and stutter occasionally. Paired with a 7900X I was expecting better, and I feel like a lot of people get hung up on the numbers.
Monitor upgrades are one of the big reasons why I'm still rocking a 1070ti. If I buy an expensive new video card then I also need to upgrade my ultra wide and I'm not ready for that yet.