this post was submitted on 15 Apr 2025
33 points (94.6% liked)
Privacy
you can also, like, run it locally, which will send precisely jack shit to anyone else
True, although I lack the 768 GB of RAM to run the full model.
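The 768 GB figure roughly checks out if the model being discussed is DeepSeek R1 (an assumption; the thread never names it): a minimal back-of-envelope sketch, assuming 671B parameters at FP8 precision.

```python
# Back-of-envelope RAM estimate for running the full model locally,
# ASSUMING it is DeepSeek R1 (671B parameters, FP8 weights).
params_billions = 671        # assumed total parameter count
bytes_per_param = 1          # FP8 quantization: 1 byte per weight
weights_gb = params_billions * bytes_per_param
print(weights_gb)            # ~671 GB for weights alone
```

Weights alone land around 671 GB; KV cache and runtime overhead push the practical requirement toward figures like 768 GB.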
so don't run the full model, then; the distilled models are still fine
or are we really going to act like children who want their new shiny toy and won't accept anything less?
Yes, I do want the best; suppose I'm a child.
I'd rather spend pennies sending technical logs to a third-party inference provider than run a cut-down model at still-quite-slow speeds on my own hardware.