this post was submitted on 05 Mar 2024
62 points (94.3% liked)

AI

4568 readers
79 users here now

Artificial intelligence (AI) is intelligence demonstrated by machines, unlike the natural intelligence displayed by humans and animals, which involves consciousness and emotionality. The distinction between the former and the latter categories is often revealed by the acronym chosen.

founded 4 years ago
top 16 comments
[–] original_reader@lemm.ee 12 points 1 year ago
[–] desmosthenes@lemmy.world 8 points 1 year ago

it works surprisingly well on Apple ARM architecture; responses are typically near-instant

[–] gandalf_der_12te@feddit.de 7 points 1 year ago* (last edited 1 year ago) (1 children)

Where did this come from? Did OpenAI finally release the source code? And where does the training data come from? Is the training data public domain/appropriately licensed?

[–] desmosthenes@lemmy.world 1 points 1 year ago

that's all on the site; OpenAI offers an API for access to its models

[–] PlutoniumAcid@lemmy.world 7 points 1 year ago (1 children)

There's a long list of models to choose from. How do I pick one? What are the differences and benefits/drawbacks?

[–] desmosthenes@lemmy.world 2 points 1 year ago

it's gonna take experimentation; there's a list of all the models in the app and on the site, and maybe a little googling. You can still use OpenAI too. Mistral is solid overall, though, and good for programming

[–] Empricorn@feddit.nl 6 points 1 year ago (3 children)

Is running using a GPU a bad thing? I'm new to this...

[–] Fisch@lemmy.ml 13 points 1 year ago

No, a GPU would be ideal, but not everyone has one, especially one with enough VRAM. I have an AMD card with 12 GB of VRAM and I can run 7B–13B models, but even 7B models (which seem to be the smallest that are still good) use a little more than 8 GB of VRAM, and most people probably have an Nvidia card with 8 GB or less. 13B models come very close to using the full 12 GB.
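Those VRAM figures can be sanity-checked with back-of-the-envelope math: the weights take roughly (parameter count × bits per weight ÷ 8) bytes, plus extra for the KV cache and activations. A rough sketch, assuming a ~20% overhead factor (the exact overhead depends on context length and runtime):

```python
def vram_estimate_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough VRAM estimate: weight storage plus ~20% for KV cache/activations."""
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb * overhead

# A 7B model at 8-bit quantization lands a little above 8 GB,
# consistent with the comment above:
print(round(vram_estimate_gb(7, 8), 1))   # 8.4
# A 13B model needs ~6-bit quantization to squeeze close to a 12 GB card:
print(round(vram_estimate_gb(13, 6), 1))  # 11.7
```

This is only an estimate; quantized formats mix precisions per layer, so real files vary.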

[–] DoYouNot@lemmy.world 9 points 1 year ago

Not everyone has a dedicated GPU, I would guess. GPUs are good at doing tensor calculations, but they're not the only way.

[–] Sabata11792@kbin.social 2 points 1 year ago

It's better if you have a good GPU, but it won't run on a card older than the last few years. It can also run on the CPU, but that's much slower.

[–] DoucheBagMcSwag@lemmy.dbzer0.com -1 points 1 year ago (3 children)

Uh... how are they going to pay for the server load?

[–] sane@feddit.de 10 points 1 year ago (1 children)
[–] null@slrpnk.net 4 points 1 year ago

What server load?

[–] hswolf@lemmy.world 2 points 1 year ago (1 children)

you just need to pay your energy bill on time, that's all

I just read that it's local