this post was submitted on 03 Dec 2024
113 points (96.7% liked)

Linux Gaming

top 24 comments
[–] sugar_in_your_tea@sh.itjust.works 26 points 2 weeks ago (2 children)

I'm interested in benchmarks to compare to my current RX 6650 XT, which is pretty similar to the 4060.

It has 12GB of VRAM, which might be enough to mess around with smaller LLMs, but I really wish they'd make a high-VRAM variant for enthusiasts (say, 24GB?).
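As a rough sanity check on what fits: weight memory is roughly parameter count times bytes per weight, plus some overhead for the KV cache and activations. A back-of-envelope sketch (the 20% overhead factor is my assumption, not a measured figure):

```python
def vram_gb(params_billions, bytes_per_weight, overhead=1.2):
    """Rough VRAM estimate in GB: weights at the given quantization
    width, plus ~20% for KV cache and activations (assumed)."""
    return round(params_billions * bytes_per_weight * overhead, 1)

print(vram_gb(7, 2.0))   # 7B model at fp16  -> 16.8 (won't fit in 12GB)
print(vram_gb(7, 0.5))   # 7B model at 4-bit -> 4.2  (fits easily)
print(vram_gb(13, 0.5))  # 13B at 4-bit      -> 7.8  (also fits)
```

By this estimate, 12GB comfortably covers quantized 7B-13B models, while 24GB would open up quantized models in the ~30B range.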

That said, with Gelsinger retiring, I'll probably wait until the next CEO is picked to hear whether they'll continue developing their GPUs. I'd really rather not buy into a dead-end product, even if it has FOSS drivers.

[–] circuitfarmer@lemmy.sdf.org 10 points 2 weeks ago (2 children)

12GB of VRAM in 2024 just seems like a misstep. Intel isn't alone in that, but it's really annoying they didn't just drop at least another 4GB in there, considering how much more attractive it would have made this card.

[–] Montagge@lemmy.zip 16 points 2 weeks ago (2 children)

And here I am with 8GB in 2024 lol

[–] circuitfarmer@lemmy.sdf.org 8 points 2 weeks ago

The industry as a whole has really dragged ass on VRAM. Obviously it keeps their margins higher, but for a card targeting anything above 1080p, 16GB should be mandatory.

Hell, with 8GB you can run out of VRAM even at 1080p, depending on what you play (e.g. flight sims).

[–] Blackmist@feddit.uk 2 points 2 weeks ago

That's why I avoided the 4060s and stuck with my 1060 for now.

[–] sugar_in_your_tea@sh.itjust.works 1 points 2 weeks ago (1 children)

I doubt it would cost them a ton either, and it would be a great marketing tactic. In fact, they could pair it w/ a release of their own LLM that's tuned to run on those cards. It wouldn't get their foot in the door of the commercial AI space, but it could get your average gamer interested in playing with it.

[–] dabaldeagul@feddit.nl 2 points 2 weeks ago (1 children)

It wouldn't cost much, but this way they can release a "pro" card with double the VRAM for 5x the price.

I doubt they will. Intel has proven to be incompetent at taking advantage of opportunities. They missed:

  • mobile revolution - waited to see if the iPhone would pan out
  • GPU - completely missed the crypto mining boom and COVID supply crunch
  • AI - nothing on the market

They need a compelling GPU, since the market is moving away from CPUs as the high-margin product in a PC and the datacenter. If they produced an AI-capable chip at reasonable prices, they could get real-world testing before they launch something for datacenters. But no, it seems like they're content missing this boat too, even when the price of admission is only a higher-memory SKU...

[–] DarkThoughts@fedia.io 3 points 2 weeks ago (1 children)

Got the same card, and you can definitely run smaller models on 8GB. There's no need to pay 200-300 bucks for a 4GB RAM upgrade, though. Might be a nice card for people on the lower end, but not in our case. But yeah, I'd really like more VRAM too, especially with how expensive the higher-end cards get - which AMD won't even bother with anymore anyway. Really hoping for something with 16+ GB for a decent price.

[–] sugar_in_your_tea@sh.itjust.works 1 points 2 weeks ago* (last edited 2 weeks ago)

Yeah, I really don't need anything higher than 6700/7700 XT performance, and my 6650 XT is still more than sufficient for the games I play. All I really need is more VRAM.

If Intel sold that, I'd probably upgrade. But yeah, 12GB isn't quite enough to really make it make sense; the things I can run on 12GB aren't meaningfully different from the things I can run on 8GB.

[–] RonnyZittledong@lemmy.world 9 points 2 weeks ago (2 children)

I keep eyeing these Arc cards to possibly put into my Plex system. They are looking pretty juicy.

[–] hessnake@lemmy.world 5 points 2 weeks ago (1 children)

I've got an A380 in my Jellyfin server and it's a beast

[–] DarkThoughts@fedia.io 3 points 2 weeks ago (1 children)

Weren't Intel cards super inefficient in terms of power draw for their performance?

[–] Chewy7324@discuss.tchncs.de 2 points 2 weeks ago (1 children)

It likely depends on how much they pay for power and how many users they serve.

E.g. I'd really like AV1 support on my server (it helps with my slow upload), but the cost of powering a dedicated GPU is unacceptable in my country. The few transcoding streams I'd theoretically need in a worst-case scenario are more than met with an iGPU.
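For what it's worth, Arc's AV1 encoder is exposed through Quick Sync in ffmpeg. A hypothetical invocation, sketched here as a Python argv list (the exact flags depend on your ffmpeg build, and the filenames and quality value are placeholders):

```python
# Hypothetical ffmpeg command for hardware AV1 transcoding on an Intel GPU
# via Quick Sync; assumes an ffmpeg build with QSV support.
cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",          # decode on the GPU as well
    "-i", "input.mkv",          # placeholder input file
    "-c:v", "av1_qsv",          # Quick Sync AV1 encoder
    "-global_quality", "28",    # quality-based rate control (placeholder)
    "-c:a", "copy",             # pass audio through untouched
    "output.mkv",
]
print(" ".join(cmd))
```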

[–] sugar_in_your_tea@sh.itjust.works 1 points 2 weeks ago (1 children)

Yup. In my area, power cost is a non-issue. I pay $0.12-0.13/kWh, so my concerns around power usage are only because I don't want to be wasteful (our energy largely comes from coal and natural gas). So I wouldn't buy the A380; even though the cost doesn't matter much, it's just too wasteful.

This new set of cards seems to be a lot more power efficient though, so maybe they're worth a look if you need something for transcoding.

[–] Chewy7324@discuss.tchncs.de 2 points 2 weeks ago (1 children)

I've just looked it up, and the A380 seems to only draw ~17W at idle. That's better than I thought, but still 2-3 times that of an HDD.

I wonder whether the new generation will lower the idle power usage too, or only the performance per watt.

[–] sugar_in_your_tea@sh.itjust.works 2 points 2 weeks ago* (last edited 2 weeks ago)

Yeah, my NAS uses something like 50W (measured at the wall) with two HDDs, a SATA SSD, a Ryzen 1700, and an old GPU (a 750 Ti; the board won't boot without graphics). I haven't measured everything independently, but online sources say 6W for the GPU. So the A380 would be 3x higher. That doesn't matter too much in my area, but it's still extra power draw.

Hopefully the new gen is close to that 6W figure.
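The dollar difference between those idle figures is easy to work out: idle watts, times hours in a year, times the electricity rate. A quick sketch at a $0.13/kWh rate (rates obviously vary by region):

```python
def yearly_cost_usd(idle_watts, usd_per_kwh=0.13):
    """Cost of 24/7 idle draw over one year."""
    return round(idle_watts / 1000 * 24 * 365 * usd_per_kwh, 2)

print(yearly_cost_usd(6))   # old GPU at ~6W idle -> 6.83 ($/yr)
print(yearly_cost_usd(17))  # A380 at ~17W idle   -> 19.36 ($/yr)
```

So at that rate the gap is roughly a dozen dollars a year; at much higher European rates the same math tilts further against the dedicated card.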

[–] Blackmist@feddit.uk 2 points 2 weeks ago

Sounds like overkill tbh. Can't you run those things on a £200 mini PC?

[–] jet@hackertalks.com 5 points 2 weeks ago

Needs virtio support to really be a top seller!

[–] just_another_person@lemmy.world 3 points 2 weeks ago

Need benchmarks and TDP to even start thinking about trying one of these. AMD is years ahead and I've never had an issue.

[–] commander@lemmy.world 3 points 2 weeks ago

A B770, or a hypothetical B9XX, is what I'm looking for. Waiting on Phoronix benchmarks, since not many outlets do Linux benchmarks. An 8700-8800 XT or a B770-B9XX for me next year.

[–] mox@lemmy.sdf.org 3 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

If this turns out to be a solid performer, the price could make it the best midrange value since AMD's Polaris (RX 480). I hope Intel's build quality has improved since the A770.

https://www.youtube.com/watch?v=N371iMe_nfA

[–] DarkThoughts@fedia.io 1 points 2 weeks ago

If that were some OEM design for a prebuilt retail PC, fine. But fuck off with shit like glued-on backplates on dedicated GPUs you buy.

[–] WhiskyTangoFoxtrot@lemmy.world 2 points 2 weeks ago

So is this the one with the telephone sanitizers?