this post was submitted on 17 Mar 2024
127 points (90.4% liked)

Open Source

33166 readers
428 users here now

All about open source! Feel free to ask questions, share news, and post interesting stuff!

Community icon from opensource.org, but we are not affiliated with them.

founded 5 years ago
[–] independantiste@sh.itjust.works 55 points 11 months ago* (last edited 11 months ago) (3 children)

Having contributors sign a CLA is always very sus, and I think it indicates the project owners have some plan to monetize it even though it's currently under the AGPLv3. Their core values of no dark patterns and whatnot seem like a sales pitch rather than an actual motivation/principle, especially when you see that they're a bootstrapped startup.

[–] silas@programming.dev 12 points 11 months ago

Thanks for pointing that out—looks like they're working on a Server Suite. I'd guess they'll try to monetize that but leave the personal desktop version free.

[–] yogthos@lemmy.ml 5 points 11 months ago (1 children)

I mean anybody can fork it and keep developing it without a CLA under AGPL3.

[–] wiki_me@lemmy.ml 7 points 11 months ago (1 children)

Yeah, it's easy to fall into a negativity bias instead of doing a risk-benefit analysis. The company could be investing money and resources that would otherwise be missing from open source projects, especially professional work by non-programmers (e.g. UX researchers), which is something open source projects usually lack.

You could probably figure it out by going over the contributions.

[–] independantiste@sh.itjust.works 3 points 11 months ago

Of course I'm not against software being open source, and I much prefer this approach of companies making their software open source; it's the CLA that really bothers me. I like companies contributing to the FOSS ecosystem. What I don't like is companies trying to benefit from free contributions, and companies having the ability to relicense the code those contributors wrote.

[–] ___@lemm.ee 1 points 11 months ago* (last edited 11 months ago) (1 children)

I’m starting to come around to big corps running their custom enhanced versions while feeding their open source counterparts with the last gen weights. As much as I love open source, people need to eat.

As was mentioned, if they start doing something egregious, they’re not the only game in town, and can also be forked. Love it or hate it, a big corp sponsor makes Joe six-pack feel a little more secure in using a product.

[–] Kindness@lemmy.ml 3 points 11 months ago

Free as in freedom, not free as in beer.

GPLv3 allows you to sell your work for money, but you still have to hand the code over to the customers who bought it. You buy our product, you own it, as is. Do whatever you like with it, but if you sell a derivative, you'd better cough up the new code to whoever bought it.

[–] Showroom7561@lemmy.ca 22 points 11 months ago

> that runs 100% offline on your computer.

Goddamn, that's wonderful!

[–] silas@programming.dev 19 points 11 months ago (3 children)

Does this differ from Ollama + Open WebUI in any way?

[–] circuscritic@lemmy.ca 9 points 11 months ago* (last edited 11 months ago) (2 children)

Depends. Are either of those companies bootstrapping a for-profit startup and trying to dupe people into contributing free labor prior to their inevitable rug pull/switcheroo?

[–] silas@programming.dev 2 points 11 months ago

OK, I tried it out, and as of now Jan has a better UI/UX IMO (easier to install and use), but Open WebUI seems to have more features, like document/image processing.

[–] priapus@sh.itjust.works 2 points 11 months ago

This is a desktop application, not something you need to host.

[–] xigoi@lemmy.sdf.org 17 points 11 months ago (1 children)

"100% Open Source"

[links to two proprietary services]

Why are so many projects like this?

[–] yogthos@lemmy.ml 3 points 11 months ago

I imagine it's because a lot of people don't have the hardware that can run models locally. I do wish they didn't bake those in though.

[–] Wes_Dev@lemmy.ml 12 points 11 months ago* (last edited 11 months ago) (1 children)

Other offline tools I've found:

GPT4All

RWKV-Runner

[–] frogbellyratbone_@hexbear.net 3 points 11 months ago (1 children)

any feelings on what you like best / works best?

[–] Wes_Dev@lemmy.ml 2 points 11 months ago

They all work well enough on my weak machine with an RX580.

Buuuuuuuuuut, RWKV has some kind of optimization going that makes it two or three times faster at generating output. The problem is that you have to be more aware of the order of your input. It has a hard time going backwards to a previous sentence, for example.

So you'd want to say things like "In the next sentence, identify the subject." and not "Identify the subject in the previous text."

[–] natecheese@kbin.melroy.org 7 points 11 months ago

Anyone interested in a local llm should check out Llamafile from Mozilla.
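For reference, running a llamafile follows a simple pattern: one self-contained file bundles the model weights and the inference engine. The URL and file name below are illustrative assumptions; pick a current download from the llamafile project's README.

```shell
# Illustrative llamafile URL -- the exact model/file name is an assumption;
# check the llamafile README for current downloads.
LLAMAFILE_URL="https://huggingface.co/jartine/llava-v1.5-7b-llamafile/resolve/main/llava-v1.5-7b-q4.llamafile"
FILE="${LLAMAFILE_URL##*/}"   # strip the path -> llava-v1.5-7b-q4.llamafile

# -c resumes a partial download instead of starting over
wget -c --tries=1 --timeout=15 "$LLAMAFILE_URL" || echo "download failed; re-run to resume"
chmod +x "$FILE" 2>/dev/null || true
# ./"$FILE"                   # serves a local chat UI (default: http://localhost:8080)
```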

[–] JackGreenEarth@lemm.ee 7 points 11 months ago (3 children)

I've been using Jan for a while now. It's great!

[–] Churbleyimyam@lemm.ee 4 points 11 months ago (3 children)

Would you say it's noob-friendly?

[–] priapus@sh.itjust.works 3 points 11 months ago

It's extremely noob friendly. You really don't need any prior knowledge to start using this.

[–] Kindness@lemmy.ml 1 points 11 months ago

Very. Just have a good enough internet connection and the hardware to download and run models (4–41 GB each). Note that interrupted downloads must start over. Otherwise, find the model's source, use wget, and download it to the correct folder.
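A minimal sketch of that wget fallback, assuming a Hugging Face model URL and a Jan model folder (both are placeholders; check where your Jan install actually keeps its models):

```shell
# Assumed example URL and folder -- substitute your own.
MODEL_URL="https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-GGUF/resolve/main/mistral-7b-instruct-v0.2.Q4_K_M.gguf"
MODEL_DIR="$HOME/jan/models/mistral-ins-7b-q4"

mkdir -p "$MODEL_DIR"
# -c resumes a partial download, so an interrupted 4-41 GB file
# doesn't have to start over
wget -c --tries=1 --timeout=15 -P "$MODEL_DIR" "$MODEL_URL" || echo "download interrupted; re-run to resume"
```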

[–] Showroom7561@lemmy.ca 3 points 11 months ago (3 children)

Is there a model you prefer? I've been throwing the exact same question to different models and they seem to all give a very similar answer.

Also, how is it getting certain information if it's all offline? For example, I asked it to recommend some bike products, and it gave very specific brands and models.

[–] joeldebruijn@lemmy.ml 2 points 11 months ago (1 children)

Train it online. Use it offline.

[–] Showroom7561@lemmy.ca 1 points 11 months ago (1 children)

That's crazy impressive, though. I've been playing with it more, and it's very specific about certain things. I guess you can hold a lot of data in the GB of space these models use.

[–] joeldebruijn@lemmy.ml 2 points 11 months ago (1 children)

Agree, no small feat. Two caveats tho:

  • These models prioritize plausibility over factual correctness, so verification is often needed.
  • Data from after the training material was created is absent, of course.
[–] Showroom7561@lemmy.ca 3 points 11 months ago

> These models prioritize plausibility over factual correctness, so verification is often needed.

100%. I was telling my wife that anyone who knows a subject can easily point out the inaccuracies in the output from any of these models.

But if you don't know about a subject, the AI gives you an answer that seems like it could be right. Scary to see where this technology takes us, especially when the majority easily digests information without verifying any of it.

[–] silas@programming.dev 2 points 11 months ago

Trinity stood out the most to me, it seems to have less unnecessary fluff

[–] JackGreenEarth@lemm.ee 1 points 11 months ago

I use Stealth or Starling, usually.

[–] nutbutter@discuss.tchncs.de 3 points 11 months ago (1 children)

Is it better than GPT4All? Do they provide their own model(s) or do we have to download it from other sources?

[–] JackGreenEarth@lemm.ee 2 points 11 months ago

They provide a hub of models. In my case it was better than GPT4All because it didn't crash, and I also think it has a nicer user interface.

[–] jeena@jemmy.jeena.net 7 points 11 months ago (2 children)

I'm in the process of installing https://github.com/imartinez/privateGPT will check this one out afterwards.

[–] jeena@jemmy.jeena.net 2 points 11 months ago

The biggest difference seems to be that you can let privateGPT analyze your own files. I didn't see that functionality in Jan.

[–] jeena@jemmy.jeena.net 1 points 11 months ago (1 children)

One difference is that Jan is incredibly easy to install: just download the AppImage, make it executable, and start it.
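The whole install really is that three-step AppImage pattern. The file name below is a placeholder; download the current AppImage from Jan's GitHub releases page first:

```shell
# Generic AppImage runner -- make the file executable, then start it.
run_appimage() {
  chmod +x "$1"   # mark as executable
  "./$1"          # launch it
}

# Usage (after downloading from https://github.com/janhq/jan/releases):
#   run_appimage jan-linux-x86_64.AppImage
```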

[–] chebra@mstdn.io 1 points 11 months ago (2 children)

@jeena And absolutely nothing can go wrong by downloading random files from the internet based on contemporary hype, making them executable and starting them...

[–] xigoi@lemmy.sdf.org 4 points 11 months ago (1 children)

As opposed to cloning a random repository and running make or something?

[–] chebra@mstdn.io 1 points 11 months ago (1 children)
[–] xigoi@lemmy.sdf.org 2 points 11 months ago (1 children)

How else would you install something that doesn’t happen to be in your favorite package manager?

[–] Churbleyimyam@lemm.ee 5 points 11 months ago

This looks very cool, especially the part about being able to use it on consumer-grade laptops. Will try it out when I get a chance.

[–] Aria@lemmygrad.ml 4 points 11 months ago (2 children)

So what exactly is this? Open-source ChatGPT alternatives have existed before and alongside ChatGPT the entire time, in the form of downloading oobabooga or a different interface and grabbing an open source model from Hugging Face. They aren't competitive because users don't have terabytes of VRAM or AI accelerators.

[–] Schlemmy@lemmy.ml 3 points 11 months ago* (last edited 11 months ago) (1 children)

Edit: spelling. The Facebook LLM is pretty decent and has a huge amount of tokens. You can install it locally and feed your own data into the model so it will become tailor made.

[–] Kindness@lemmy.ml 5 points 11 months ago (2 children)

I think you mean tailor. As in, clothes fitted to you.

[–] AeroLemming@lemm.ee 3 points 11 months ago (2 children)

No, some guy named Taylor has to train the model for you.

[–] Kindness@lemmy.ml 3 points 11 months ago

Lol. My mistake.

[–] Schlemmy@lemmy.ml 1 points 11 months ago

Good boy Taylor.

[–] Schlemmy@lemmy.ml 1 points 11 months ago

Exactly, my auto carrot likes Taylor.

[–] xigoi@lemmy.sdf.org 1 points 11 months ago (1 children)

What are the hardware requirements?
