This is about as good as NovelAI text generation and I can run it locally fairly well :P
https://koboldai.net/
@fuggy it runs locally? siiiiick.
@beardalaxy@gameliberty.club Even when I loaded the 32GB+ models from disk, they completely filled my RAM, so I guess it doesn't serve everything straight from disk or VRAM; it has to load at least some of the model into RAM.
@beardalaxy@gameliberty.club I have 32GB of RAM and 8GB of VRAM, so if you have the same setup you can load models like that. Just mess with the "caching settings" (I think that's how they're labeled) as you load them.