this post was submitted on 27 Nov 2024
206 points (94.0% liked)

Firefox

17957 readers

A place to discuss the news and latest developments on the open-source browser Firefox

founded 4 years ago

They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

[–] LWD@lemm.ee 10 points 2 days ago (4 children)

It is a sidebar that sends a query from your browser directly to a server run by a giant corporation like Google or OpenAI, consumes an excessive amount of carbon and water, then sends back a response that may or may not be true (because AI is incapable of doing anything but generating what it thinks you want to see).

Not only is it unethical in my opinion, it's also ridiculously rudimentary...

[–] TheMachineStops@discuss.tchncs.de 3 points 2 days ago* (last edited 2 days ago) (3 children)

It gives you many options for what to use; you can use Llama, which runs offline. It needs to be enabled through about:config > browser.ml.chat.hideLocalhost.
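For anyone who wants to try it, here's a sketch of the relevant about:config prefs. This assumes you already have a local OpenAI-compatible server (e.g. llamafile or Ollama) listening on localhost:8080 — the browser.ml.chat.provider pref and the port number are my assumptions, not something confirmed in this thread:

```
// about:config — a sketch, assuming a local server on port 8080
browser.ml.chat.hideLocalhost = false             // un-hides the localhost option mentioned above
browser.ml.chat.provider = http://localhost:8080  // assumed pref pointing the sidebar at your own server
```

After flipping these, the AI sidebar should list your local endpoint as a provider instead of only the cloud services.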

[–] Swedneck@discuss.tchncs.de 5 points 1 day ago (1 children)

and thus is unavailable to anyone who isn't a power user, as they will never see a comment like this and about:config would fill them with dread

[–] TheMachineStops@discuss.tchncs.de 4 points 1 day ago* (last edited 1 day ago)

Lol, that is certainly true, and you would also need to set it up manually, which even power users might struggle with. Thankfully there is an easy-to-follow guide here: https://ai-guide.future.mozilla.org/content/running-llms-locally/.
