This is about as good as NovelAI text generation and I can run it locally fairly well :P

https://koboldai.net/

@beardalaxy@gameliberty.club Yeah, I wasn't able to run the heavier models, only the ones around 16 GB. The nice thing is it can offload a lot of the model to RAM and disk instead of VRAM, but even with that it only lets me load the 16 GB ones, which still seems fairly good

@beardalaxy@gameliberty.club Also they have some weird cluster service you can use for free, but I think you can only use one model with that

@fuggy i'm gonna' start off with an 8GB one and see how it goes!

@beardalaxy@gameliberty.club I have 32 GB of RAM and 8 GB of VRAM, so if you have the same setup you can load models like that, just mess with the "caching settings" (I think that's how they're labeled) as you load them

@beardalaxy@gameliberty.club Even when I loaded the 32 GB+ models entirely to disk, it completely filled my RAM, so I guess it doesn't load everything to disk, or it has to stage at least some things through RAM
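(The split being described above, where as many layers as possible go to VRAM, the overflow goes to RAM, and anything left goes to disk, can be sketched with some back-of-the-envelope arithmetic. This is just an illustration of the idea, not KoboldAI's actual loading code; the function name and the per-layer size are made up for the example.)

```python
def split_layers(n_layers: int, layer_gb: float, vram_gb: float, ram_gb: float):
    """Toy estimate of how model layers could be spread across
    VRAM, RAM, and disk. Fills VRAM first, then RAM, then disk."""
    gpu_layers = min(n_layers, int(vram_gb // layer_gb))
    cpu_layers = min(n_layers - gpu_layers, int(ram_gb // layer_gb))
    disk_layers = n_layers - gpu_layers - cpu_layers
    return gpu_layers, cpu_layers, disk_layers

# A hypothetical 16 GB model (32 layers of 0.5 GB) on the setup above
# (8 GB VRAM, 32 GB RAM): 16 layers fit in VRAM, the rest fit in RAM,
# so nothing has to spill to disk.
print(split_layers(32, 0.5, 8, 32))  # → (16, 16, 0)
```

This also matches the observation above: even layers destined for disk typically get read through RAM while loading, so a big model can still fill RAM temporarily.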
