I will mainly use it for help in coding and general questions.
Looks interesting
It's great and has active development. Hit up the Discord, lots of helpful folks in there
Check out Ollama.
Thanks
Ollama is great!
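If it helps, here's a rough sketch of asking a locally running Ollama instance a coding question from Python. This assumes Ollama is already serving on its default port (11434) and that you've pulled a model beforehand; the model name is just a placeholder, swap in whatever you actually use.

```python
import json
import urllib.request

# Minimal sketch: send a coding question to a local Ollama server.
# Assumes Ollama is running on its default port and the model has
# already been pulled (e.g. with `ollama pull`); model name is a placeholder.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = json.dumps({
    "model": "llama3",  # placeholder, use whatever model you've pulled
    "prompt": "Write a Python function that reverses a string.",
    "stream": False,    # return one JSON object instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    answer = json.loads(resp.read())
    print(answer["response"])
```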
None, don’t host it yourself.
You will need a ton of RAM and a really powerful PC. Additionally, your electricity bill will go up significantly.
You do know this is selfhosted, right? People tell us that what we do is "not a good idea" all the time.
I'm all for selfhosting, but LLMs are just stupid. But go for it, you do you, and enjoy the ride ♥
Agreed, people selfhost to save money on subscriptions, but then spend on a 4090 and tons of energy to use an LLM from time to time
Text Generation WebUI is feature-rich; you download the models yourself.
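For what it's worth, here's a sketch of querying it from a script, assuming you've launched the server with its OpenAI-compatible API enabled and it's listening on the default port (both of those are assumptions about your setup), with a model already downloaded and loaded in the UI.

```python
import json
import urllib.request

# Minimal sketch: talk to a local Text Generation WebUI instance through its
# OpenAI-compatible chat endpoint. Assumes the API is enabled and on the
# default port; adjust the URL if your setup differs.
API_URL = "http://localhost:5000/v1/chat/completions"

payload = json.dumps({
    "messages": [
        {"role": "user", "content": "Explain what a Python decorator is."}
    ],
    "max_tokens": 300,
}).encode("utf-8")

req = urllib.request.Request(
    API_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())
    print(reply["choices"][0]["message"]["content"])
```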