this post was submitted on 26 Jun 2023
Gaming
How much ML training will you do, and what kind of models? Are you just a hobbyist, or are you a student or researcher in ML?
If the former, you may be better served by renting a machine for training instead. Vast.ai is one such service, and you can rent machines with a 4090 for around 50 cents an hour. For hobbyist work this usually ends up cheaper than buying a whole card, especially if you find out you need multiple GPUs to train your model effectively.
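To put rough numbers on the rent-vs-buy question: the break-even point is just the card price divided by the hourly rental rate. The figures below are assumptions for illustration (a 4090 at roughly $1600 and the ~$0.50/hour rate mentioned above), not quotes:

```python
# Rough break-even estimate: renting vs. buying a GPU for training.
# Both figures are illustrative assumptions, not real quotes.
CARD_PRICE_USD = 1600.0      # assumed purchase price of an RTX 4090
RENTAL_USD_PER_HOUR = 0.50   # assumed Vast.ai-style hourly rate

def break_even_hours(card_price: float, hourly_rate: float) -> float:
    """Hours of rented GPU time that cost as much as buying the card outright."""
    return card_price / hourly_rate

hours = break_even_hours(CARD_PRICE_USD, RENTAL_USD_PER_HOUR)
print(f"Break-even after {hours:.0f} GPU-hours (~{hours / 24:.0f} days of 24/7 use)")
```

Under those assumptions you'd need over 3000 hours of training before buying pays off, which most hobbyists never hit.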
If you're a researcher, though, a 3090 might be a good buy. IMO the gains from a 4090 won't be huge unless you're doing specific mixed-precision work (the newer-generation tensor cores support more data types). Be aware that the large models that call for 24GB of VRAM usually need many GPUs to train in a reasonable amount of time, so a large-VRAM GPU is more useful for quick development and debugging than for actually training large models, and for that use the 4090 wouldn't be much better.
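A back-of-the-envelope calculation shows why 24GB fills up fast during training. Assuming a standard mixed-precision Adam setup (fp16 weights and gradients at 2 bytes each, plus fp32 master weights and two fp32 optimizer moments at 4 bytes each, so ~16 bytes per parameter before activations), you get:

```python
# Back-of-the-envelope training VRAM estimate for mixed-precision Adam.
# Assumption: fp16 weights + fp16 grads (2 bytes each) plus fp32 master
# weights and two fp32 Adam moments (4 bytes each) = 16 bytes/param.
# Activations and framework overhead come on top of this.
def training_vram_gb(n_params: float) -> float:
    bytes_per_param = 2 + 2 + 4 + 4 + 4  # weights, grads, master, m, v
    return n_params * bytes_per_param / 1e9

for n in (1e9, 7e9):
    print(f"{n / 1e9:.0f}B params -> ~{training_vram_gb(n):.0f} GB before activations")
```

Even a 1B-parameter model already wants ~16GB just for weights and optimizer state, and a 7B model is far beyond any single consumer card, which is why 24GB mostly buys you debugging headroom rather than the ability to train big models alone.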