Lately I’ve been using AI locally instead of cloud stuff.
No internet needed, no accounts, nothing leaving my machine. Cloud AI is cool, but you share more than you realize.
Local just feels safer to me. I'm using models like dolphin-mistral:7b, llama3.1:8b, and mistral:7b.
Unus Nemo
in reply to @liv
Yes, I only use local AI: llama.cpp (named after Meta's LLaMA models, which it was originally built to run). I've also used Ollama in the past (which is basically a user-friendly wrapper around llama.cpp), though I've mostly migrated to plain llama.cpp. I use ComfyUI too; I love playing with Stable Diffusion, and I actually make content I need with it as well 😉.