this post was submitted on 27 Aug 2024
339 points (90.3% liked)

Technology

59314 readers
4603 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
[–] tal@lemmy.today 1 points 2 months ago (1 children)

and run local LLMs.

Honestly, I think that for many people on a laptop or phone, doing LLM stuff remotely makes way more sense; it's just too power-intensive to run much of it on battery. That doesn't mean giving up control of the hardware -- I keep a machine with a beefy GPU connected to the network and can use it remotely. Something like Stable Diffusion normally needs only pretty limited bandwidth to use that way.

If people really need to do a bunch of local LLM work -- say they have a hefty power source but no connectivity, or they're running software that has to move a lot of data back and forth to the LLM hardware -- I'd consider lugging around a small headless LLM box with a beefy GPU alongside a laptop, plugging the box into the laptop via Ethernet or whatnot, and doing the LLM work on the headless box. Laptops just aren't a great form factor for heavy crunching: they have limited ability to dissipate heat and tight space constraints to work with.
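The "beefy GPU box on the network" setup above can be sketched in a few lines. This is a minimal, hedged example assuming the box runs an OpenAI-compatible HTTP server (tools like llama.cpp's `llama-server` or Ollama expose one); the host address and model name here are placeholders, not anything from the comment:

```python
# Sketch: query a self-hosted LLM over the LAN instead of running it on the
# laptop. Assumes an OpenAI-compatible chat endpoint is listening on the GPU
# box; HOST and MODEL are hypothetical and must match your own setup.
import json
import urllib.request

HOST = "http://192.168.1.50:8080"  # assumption: the GPU box's LAN address
MODEL = "local-model"              # assumption: whatever model the server loads

def build_request(prompt: str) -> dict:
    """Build a chat-completion payload; only this small JSON travels over the wire."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }

def ask(prompt: str) -> str:
    """POST the prompt to the headless box and return the generated text."""
    req = urllib.request.Request(
        f"{HOST}/v1/chat/completions",
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Why offload LLM inference from a laptop?"))
```

The point the comment makes holds here: the laptop only ships a prompt and receives text, so the bandwidth needed is tiny compared to the compute happening on the GPU box.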

[–] areyouevenreal@lemm.ee 1 points 2 months ago

Yeah, it's easier to do it on a desktop or over a network -- that's what I was trying to imply, although having an NPU can help. Regardless, I'd rather use my own server than something like ChatGPT.