Hacker News

This business venture was completely dependent on a single third-party service - risky!

Why didn't the author host his own uncensored open-source LLM in the cloud?



I will try to make it open source.


Have you considered running your own uncensored LLaMA2 instance via something like llama.cpp on a cloud for hire like, say, Azure?

Some examples - there's probably a whole zoo of these:

https://ollama.ai/blog/run-llama2-uncensored-locally

https://huggingface.co/TheBloke/Luna-AI-Llama2-Uncensored-GG...

https://ollama.ai/library/llama2-uncensored
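
For the ollama route specifically, the flow is just pull-then-run. A minimal sketch (assuming ollama is installed and the `llama2-uncensored` model name from the linked library page; the prompt is made up):

```shell
# Sketch: run an uncensored LLaMA2 locally via ollama.
# Assumes ollama is installed (https://ollama.ai); skips gracefully if not.
if command -v ollama >/dev/null 2>&1; then
  # Download the model weights from the ollama library (several GB on first run).
  ollama pull llama2-uncensored
  # One-shot prompt; ollama also exposes an HTTP API on localhost:11434.
  ollama run llama2-uncensored "Say hello in one short sentence."
  status="ran"
else
  echo "ollama not installed; see https://ollama.ai for setup"
  status="skipped"
fi
```

The same model files (e.g. TheBloke's GGUF quantizations on Hugging Face) can also be loaded directly with llama.cpp if you want more control over quantization and hardware.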


Thank you for providing the links, I'll take a look.


Good luck - please let us know how it worked out for you!



