this post was submitted on 26 Jun 2023

I have an RTX 2080 Ti.

I still play at 1080p 60 Hz, and the 2080 Ti is plenty for that. But I'm looking to train some ML models, and its 11 GB of VRAM is limiting there.
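As a rough back-of-the-envelope sketch of why 11 GB fills up fast (the 1-billion-parameter size and the fp32-Adam assumptions here are purely illustrative, and activation memory is ignored):

```python
def training_vram_gb(n_params, bytes_per_param=4, optimizer_states=2):
    """Rough VRAM for weights + gradients + Adam's two moment buffers,
    all in fp32. Ignores activations, which often dominate in practice."""
    tensors = 1 + 1 + optimizer_states  # weights, grads, Adam m and v
    return n_params * bytes_per_param * tensors / 1024**3

# A 1B-parameter model trained in fp32 with Adam:
print(round(training_vram_gb(1_000_000_000), 1))  # ~14.9 GB, already past 11 GB
```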

So I plan to buy a new card. I also don't want an ML-only GPU, since I don't want to maintain two GPUs.

Since I'm upgrading, I need to think about future compatibility. At some point I'll move to at least 2K, though I'm still not convinced 4K offers any perceivable benefit.

Given all this, I wanted to check with folks who have either card: should I consider the 4090?

[–] KingRandomGuy@kbin.social 2 points 1 year ago

How much ML training will you do, and what kind of models? Are you just a hobbyist, or are you a student or researcher in ML?

If the former, you may be better served by renting a machine for training instead. Vast.ai is one such service, and you can rent machines with a 4090 for something like 50 cents an hour. For hobbyist workloads this usually ends up cheaper than buying a whole card, especially if you find out you need multiple GPUs to train the model effectively.
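The rent-vs-buy break-even is simple arithmetic. Using the ~$0.50/hour figure from the comment and a rough $1,600 street price for a 4090 (the price is my assumption, not from the comment):

```python
# Hypothetical numbers: rental rate per the comment, card price assumed.
rental_per_hour = 0.50   # USD/hour for a rented 4090
card_price = 1600.00     # assumed USD street price for a 4090

break_even_hours = card_price / rental_per_hour
print(break_even_hours)  # 3200.0 hours of training before buying wins
```

That's several months of continuous training before owning the card pays off, before counting electricity or the rest of the machine.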

If you're a researcher, though, a 3090 might be a good buy. IMO the gains from a 4090 won't be too crazy unless you're doing specific mixed-precision work (the newer-generation tensor cores support more data types). Be aware that the large models that need 24 GB of VRAM usually require many GPUs to train in a reasonable amount of time, so a large-VRAM card is more useful for quick development and debugging than for actually training large models, and for that purpose the 4090 wouldn't be all that much better.
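For context on the mixed-precision point, here's a minimal sketch of an automatic-mixed-precision training step in PyTorch (the tiny linear model and hyperparameters are placeholders; it falls back to CPU if no GPU is present, and a real fp16 setup would also use a `GradScaler`):

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(512, 10).to(device)
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
x = torch.randn(32, 512, device=device)
y = torch.randint(0, 10, (32,), device=device)

# bfloat16 autocast runs on both recent GPUs and CPU; matmuls execute
# in reduced precision while master weights stay in fp32.
with torch.autocast(device_type=device, dtype=torch.bfloat16):
    loss = torch.nn.functional.cross_entropy(model(x), y)
loss.backward()
opt.step()
print(loss.item())
```

The memory and throughput savings from reduced precision are one of the main reasons the tensor-core generation matters at all for this decision.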