

Ollama as a general LLM server and then LLaVa as model
I host a SearXNG instance and follow the Matrix channel. Haven’t seen anything along those lines.
The AI support doesn’t hurt you if you don’t use it - and they’ve done the right thing by making sure you can do things locally instead of cloud.
Here’s what AI does for me (self-hosted, my own scripts) on NC 9:
When our phones sync photos to Nextcloud, a local LLM generates an image description plus five tags for each photo.
It is absolutely awesome.
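For anyone wanting to try the same thing: a minimal sketch of how a script like this could talk to a local Ollama instance running LLaVA. This is my own illustration, not the actual script from the comment above; the prompt wording and the `build_request` helper are made up, but the endpoint and payload shape follow Ollama's generate API (base64-encoded images for multimodal models).

```python
import base64
import json

# Default Ollama endpoint on the local machine
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(image_bytes: bytes, model: str = "llava") -> dict:
    """Build a non-streaming Ollama generate request for one photo."""
    return {
        "model": model,
        "prompt": "Describe this photo in one sentence, then give five tags.",
        # Multimodal models take base64-encoded images in the "images" field
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,
    }

payload = build_request(b"<raw image bytes here>")
print(payload["model"])

# Actually sending it requires a running Ollama with llava pulled:
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL, json.dumps(payload).encode(),
#     {"Content-Type": "application/json"})
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```

Hooking this into Nextcloud is then just a matter of watching the sync folder and writing the response back as tags.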
Snopes should take this opportunity to post once a day that the best way to know if something’s true or not is to leave Xitter.