this post was submitted on 04 May 2024
cley_faye@lemmy.world 5 points 6 months ago

Not to mention that you can run the exact same thing on the phone you already have.

Their device does not have any hardware specific to its use case. Even if Google and Apple never improve their own solutions, soon enough someone is bound to offer an "assistant AI app" with a subscription, proxying OpenAI requests and using the touchscreen, camera, microphone and speaker that are already there, instead of making you buy a new set of them.

FlyingSquid@lemmy.world 6 points 6 months ago

The "AI" in the R1 is utter shit. Wired eviscerated it in a review.

https://www.wired.com/review/rabbit-r1/

ChaoticNeutralCzech@feddit.de 2 points 6 months ago

It is somewhat OK considering it's a free app.

FlyingSquid@lemmy.world 2 points 6 months ago

You could say the same about Siri, which is also utter shit.

ChaoticNeutralCzech@feddit.de 2 points 6 months ago

And yet, for both of them you are supposed to pay for an overpriced device. At least you can pirate the R1 app.

Knock_Knock_Lemmy_In@lemmy.world 2 points 6 months ago

I think there may be a market for an LLM that runs locally and privately incorporates personal data.

cley_faye@lemmy.world 2 points 6 months ago

Yes, there is. And yes, it would be huge. I know a lot of people who are staying away from all this until the privacy issues are resolved (there are other issues, but at this point, the cat is out of the bag).

But running large models locally requires a ton of resources. It may become a reality in the future, but in the meantime, letting more, smaller providers offer a service (plus a self-hosted option for corporations and enthusiasts) is far better in terms of resource usage. And it's already a thing; what needs work now is improving the UI and integrations.
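The nice part is that the self-hosted route doesn't even need a new client stack: many local servers (llama.cpp's server, Ollama, and others) expose an OpenAI-compatible chat endpoint, so "move away from the third party" can be as small as pointing the same request at localhost. A minimal sketch, assuming such a server is running on port 8080 (the model name and port are placeholders for whatever your setup uses):

```python
import json
import urllib.request

def build_chat_request(base_url, prompt, model="local-model"):
    """Build an OpenAI-style chat completion request aimed at a
    self-hosted server. The /v1/chat/completions path and payload
    shape follow the OpenAI chat API convention, which many local
    servers mirror (check your server's docs)."""
    payload = {
        "model": model,  # placeholder; local servers often ignore or remap this
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Nothing leaves your machine: the request targets localhost.
req = build_chat_request("http://localhost:8080", "Summarize my notes on backups.")
print(req.full_url)
```

Sending it with `urllib.request.urlopen(req)` would return the usual chat-completion JSON, but the point is just that the wire format is the same whether the endpoint is a cloud provider or your own box.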

In fact, very far from the "impressive" world of generated text and pictures, using LLMs and integrations (or whatever it's called) to build a sort of documentation index that you can query in natural language is a very interesting tool, useful both to individuals and in corporate environments. And some projects are already looking that way.

I'm not holding my breath for portable, good, customized large models (if only for the economics of energy consumption) but moving away from "everything goes to a third party service provider" is a great goal.