this post was submitted on 13 Mar 2024
89 points (88.7% liked)

Asklemmy


I saw Generative AI for Beginners from Microsoft on GitHub. I've looked at https://fmhy.pages.dev/ai but I'm not sure what I'm really looking for.

I write fiction, and I want a chatbot that will function like ChatGPT 3.5 but won't shut down if things get bloody or sexy, as they so often do.

You know "ready, aim, fire"? I'm in the AIM stage.

all 32 comments
[–] CetaceanNeeded@lemmy.world 31 points 8 months ago (1 children)

I've been using GPT4All on my laptop, mostly with 7B models due to my RAM limitations, and I'm amazed how good some of them are.

It's been really easy to use. There are models you can download from within the UI, or you can get adventurous and download them from elsewhere; they just need to be in the .gguf format. I get most of mine from TheBloke on Hugging Face.

So far my favourite has been solar-10.7B-instruct-v1.0-uncensored; it has been astonishingly good.
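
If you'd rather script the manual download than click through a browser, this is roughly the idea (just a sketch; the repo and file names below are from memory, so double-check them on TheBloke's Hugging Face page, and point local_dir at wherever your GPT4All looks for models):

```python
# Rough sketch: pull a .gguf quant from Hugging Face so GPT4All can load it.
# Assumes `pip install huggingface_hub`; repo/file names are examples only.
import os
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="TheBloke/SOLAR-10.7B-Instruct-v1.0-uncensored-GGUF",  # example repo
    filename="solar-10.7b-instruct-v1.0-uncensored.Q4_K_M.gguf",   # example quant
    local_dir=os.path.expanduser("~/Models"),                      # your GPT4All model folder
)
print("Downloaded to", model_path)
```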

[–] neidu2@feddit.nl 5 points 8 months ago (2 children)

Oooh, do tell me more, please. I've been toying with the idea of setting up gpt4all myself, but I haven't really had the time to look into it very much yet. I have a couple of questions, though:

  • I guess it's safe to assume that it runs on Linux?
  • Is it possible, with some scripting, to provide additional training data, such as connecting it with a Wikipedia crawler?
  • By combining it with some script-foo, can I have it also look up stuff for me on the fly, for example "extract THIS kind of information from THAT site"?
[–] CetaceanNeeded@lemmy.world 3 points 8 months ago

Yes, it runs on Linux; my laptop is running Manjaro and I installed it from the AUR. I'm not sure if the scripting is possible. There is an OpenAI-compatible web API you can turn on, so it might be possible through that, but you would probably have to feed the content of the site in with the prompt. I'm not sure there's a better way, and I guess that sort of behaviour is a bit out of scope for GPT4All.
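
If you do go the API route, the gist would be something like this (a rough sketch, not tested on your setup: I believe the local server defaults to port 4891, but check the server settings in the app, and use the model name exactly as the UI shows it):

```python
# Sketch: feed a scraped page to GPT4All through its OpenAI-compatible
# local API server (enable the server in GPT4All's settings first).
# Port 4891 is assumed to be the default; adjust if yours differs.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4891/v1", api_key="not-needed")

page_text = open("scraped_page.txt").read()  # whatever your crawler saved

reply = client.chat.completions.create(
    model="solar-10.7b-instruct-v1.0-uncensored",  # name as shown in the UI
    messages=[{
        "role": "user",
        "content": f"Extract the character names mentioned in this page:\n\n{page_text}",
    }],
)
print(reply.choices[0].message.content)
```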

There is a local documents feature that lets it read text files on your machine that you've given it specific access to, but I think it's fairly limited in what it can do.

[–] zeluko@kbin.social 2 points 8 months ago* (last edited 8 months ago) (1 children)

The GPT services out there use something called 'tools'.
Tools are presented to the model, and the model can 'call' one with arguments; the tool then extracts some data and feeds it into the context for the model to continue with.
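
A rough sketch of what that looks like with an OpenAI-style API (the wikipedia_lookup tool here is made up for illustration; you would implement it yourself and feed its result back as a follow-up message):

```python
# Sketch: present a hypothetical "tool" to a model via an OpenAI-style chat API.
# Works against OpenAI itself or any local server that supports tool calling.
from openai import OpenAI

client = OpenAI()  # or pass base_url="http://localhost:.../v1" for a local server

tools = [{
    "type": "function",
    "function": {
        "name": "wikipedia_lookup",  # hypothetical tool you would implement
        "description": "Fetch the summary of a Wikipedia article by title",
        "parameters": {
            "type": "object",
            "properties": {"title": {"type": "string"}},
            "required": ["title"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any model trained for tool calling
    messages=[{"role": "user", "content": "Who designed the Eiffel Tower?"}],
    tools=tools,
)
# If the model chose to call the tool, the call and its arguments show up here:
print(response.choices[0].message.tool_calls)
```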

I've found that the models which can run on a normal PC (or even a laptop) are okay, but not super great (around or a bit worse than ChatGPT 3).
The good stuff (e.g. Nous-Capybara 31B or the Mistral/Mixtral ones) needs some more memory and compute.

[–] db0@lemmy.dbzer0.com 26 points 8 months ago (1 children)

Check out the AI Horde: https://aihorde.net, or the direct LLM frontend at https://lite.koboldai.net. Free, FOSS, and crowdsourced, with uncensored models that won't ever be rugpulled.

[–] Melatonin@lemmy.dbzer0.com 9 points 8 months ago (1 children)

I ought to have known you'd have a good answer! Thank you!

[–] db0@lemmy.dbzer0.com 6 points 8 months ago

Spread the word!

[–] blargbluuk@sh.itjust.works 10 points 8 months ago* (last edited 8 months ago) (1 children)

LM Studio is one of the most user-friendly ways to play around with LLMs, imo. You can run some of the smaller models without too much memory (it will be slow without a decent GPU, though).

[–] PeepinGoodArgs@reddthat.com 2 points 8 months ago

Seconding LM Studio. It's been my go-to for a bit now.

[–] SpaceNoodle@lemmy.world 7 points 8 months ago (1 children)

Oh, I thought you were asking how to just ... write using your laptop.

[–] Melatonin@lemmy.dbzer0.com 5 points 8 months ago

That's a discussion for future generations, but one that will be had, I'm sure.

[–] wyre@lemmy.world 6 points 8 months ago (1 children)

I've been playing a bit with llama2 in Ollama, and it doesn't have any restrictions. Perhaps using Ollama to run models locally would solve some problems for you?

[–] carl_dungeon@lemmy.world 4 points 8 months ago

Yeah, there are a bunch of uncensored models on Ollama. It's stupid easy to use!

[–] Ziggurat@sh.itjust.works 4 points 8 months ago

I installed GPT4All (https://gpt4all.io/index.html); it's a bit slow but runs on CPU. If you have a powerful GPU and like to play with technology, I've heard good things about text-generation-webui (https://github.com/oobabooga/text-generation-webui), which wants to be the AUTOMATIC1111 of language models, but I haven't used it yet.

[–] GammaGames@beehaw.org 4 points 8 months ago

From what I've heard, Mistral is what you'd want for generating explicit content. Not sure what you'd want to run locally, but be warned that it's slooooww unless you've got a beefy laptop.

[–] wathek@discuss.online 3 points 8 months ago* (last edited 8 months ago) (1 children)

I would look into NovelAI for writing; it's quite specifically for that. It's a paid service similar to ChatGPT, but it's uncensored and private.

You can run your own lightweight LLM on a laptop but the output will be useless. Good output requires big boy compute.

If you do want to run it on your own hardware, look into Ollama. There are also options to run your own LLM in the cloud, with a not-too-difficult setup process for non-techies.

Frankly, I'd find the right LLM for your needs and just pay for it per month (maybe NovelAI, maybe something else), but ChatGPT is not great for creative fiction.

[–] Melatonin@lemmy.dbzer0.com 3 points 8 months ago (2 children)

I got a little TOO MUCH involvement from NovelAI. I guess I want suggestion help, idea-spitballing help, but what I'm looking for is specialized.

I want my AI to stay on the shelf with my thesaurus until I'm ready to use it.

[–] lazylion_ca@lemmy.ca 3 points 8 months ago (1 children)

What does too much involvement mean? As someone looking to get into this myself I'm curious about your experience.

[–] Melatonin@lemmy.dbzer0.com 3 points 8 months ago

The two things that stand out to me are

  1. Creating backstories, characterizations, etc.: all that stuff they tell you to do in novel-writing class that you sometimes might do, or, if you're me, you don't.

  2. The AI likes to just start writing narratively, taking on the job of crafting the words itself. That's not what I'M looking for. I want a sounding board to bounce ideas off, something to suggest alternative directions and possible motives, and to help me brainstorm.

Point two has always been a problem for me, because they tell me I'm autistic. All I know is that I have some coping mechanisms I've had to use over the years; it wasn't really a thing when I was a kid. But I don't see alternatives very well, so having an outside entity suggest things that hadn't occurred to me gives me more choice in plot lines and decision points. I write OK alone, but I'm excited to see what AI can help me do.

[–] wathek@discuss.online 2 points 8 months ago

Interesting, I'm vaguely interested in this too. I have half of a world written that I maybe want to turn into a game (probably not, but I'm having fun). I have the hardware to turn what I have into an embedding for an open model, and the hardware to run it. So that's the way I would go about it, though I can't vouch for how helpful it would be (yet).

[–] Mahlzeit@feddit.de 2 points 7 months ago

You should probably hook up with the SillyTavern crowd. It's a frontend to chat with LLMs that will do what you want. Its main purpose is chat role-play. You can assign a persona to the LLM and ST will handle the prompt to make it work. It also handles jailbreaks if you want to use one of the big ones (no idea if it works well). You can also connect to other services that run open models, including aihorde.

https://github.com/SillyTavern/SillyTavern

https://www.reddit.com/r/SillyTavernAI/


If you want to host your own model you can find more help here:

https://www.reddit.com/r/LocalLLaMA/

!localllama@sh.itjust.works

[–] turkishdelight@lemmy.ml 2 points 8 months ago

Ollama helps you easily run LLMs locally: https://ollama.com/

I'm running llama2-uncensored on my laptop with 8GB of memory.
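
Once the model is pulled, talking to it from a script is just a local HTTP call. A minimal sketch, assuming Ollama's default port of 11434 and that you've already run `ollama pull llama2-uncensored`:

```python
# Minimal sketch: ask a locally running Ollama model for brainstorming help.
# Assumes the Ollama server is on its default port (11434) and the
# llama2-uncensored model has already been pulled.
import json
import urllib.request

payload = json.dumps({
    "model": "llama2-uncensored",
    "prompt": "Suggest three possible motives for the villain in my heist story.",
    "stream": False,
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```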

[–] lurkerlady@hexbear.net 2 points 8 months ago

gpt4all is the easiest way in, hands down, no contest

[–] Harbinger01173430@lemmy.world 2 points 8 months ago (1 children)

Why not use your own organic talents to write something like millions of other fanfiction writers...?

[–] Melatonin@lemmy.dbzer0.com 5 points 7 months ago

I've been doing that for 40+ years. I just want to try using AI as a partner to bounce ideas back and forth. As I explained in another post, I have trouble generating alternatives once I've created an idea. It's nice to have a means to get outside myself.

I intend to do the writing; it's the planning and plotting I'm using AI for.

[–] Vampire@hexbear.net 1 points 8 months ago (1 children)

There's a Reddit forum called LocalLLaMA.

[–] Fisch@lemmy.ml 3 points 8 months ago
[–] Schlemmy@lemmy.ml 1 points 7 months ago (2 children)
[–] PipedLinkBot@feddit.rocks 1 points 7 months ago

Here is an alternative Piped link(s):

https://piped.video/WxYC9-hBM_g?feature=shared

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[–] Melatonin@lemmy.dbzer0.com 1 points 7 months ago (1 children)

Whew, that guy has too much energy. He reminds me of the Minecraft videos my kids watch.

Slower.

Calmer.

Better.

[–] Schlemmy@lemmy.ml 1 points 7 months ago

He's energized. You could watch at half speed πŸ˜