Have you run anything locally on device?
Seriously, if you haven't:
download LM Studio from lmstudio.ai
download a couple of models
unplug your internet connection/turn off Wi-Fi
play with it. be stunned
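If you want to poke at it from code as well: LM Studio can also run a local server that speaks the OpenAI chat-completions format (it defaults to http://localhost:1234, still entirely on-device). Here's a minimal sketch, assuming the server is running with a model loaded - the model name and the question are just placeholders:

```python
import requests

# LM Studio's local server uses the OpenAI chat-completions shape.
# Default address is http://localhost:1234 - no internet involved.
URL = "http://localhost:1234/v1/chat/completions"

payload = {
    "model": "local-model",  # placeholder; LM Studio answers with whatever model you have loaded
    "messages": [
        {"role": "user", "content": "My cat has stopped eating - what could be wrong?"}
    ],
    "temperature": 0.7,
}

resp = requests.post(URL, json=payload, timeout=120)
resp.raise_for_status()

# The reply comes back in the same structure as the OpenAI API.
print(resp.json()["choices"][0]["message"]["content"])
```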
e.g., here's a fragment of a conversation with a 16 GB model running entirely on device - it was not polling the internet for any of this; it works with no internet connection at all. It's a general model (i.e. not just trained on medical data), and I've discussed all of this with the vet looking after my cat - it's legit.
This is just a fragment of a longer conversation, and the diagnostic process matches almost exactly what both the initial vet and the specialist the cat is with this evening are following.
Again, this is Google's Gemma model, a general-purpose model aimed at answering questions about ANYTHING.
You can also import documents into the LLM for summary/analysis - e.g., load a big document and ask the LLM questions about it. It will cite the relevant sections of the document in its responses. Great for condensing/summarising a heap of info you have no time to read or analyse in detail.
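The app handles the document import for you, but the same idea works over the local server if you'd rather script it. A rough sketch, assuming the document is plain text and small enough to fit in the model's context window (notes.txt is a made-up filename; LM Studio's own document feature does the chunking/retrieval, this naive version just pastes the whole text into the prompt):

```python
import requests

URL = "http://localhost:1234/v1/chat/completions"

# Read the document and paste it straight into the prompt.
with open("notes.txt", encoding="utf-8") as f:
    document = f.read()

payload = {
    "model": "local-model",  # placeholder; whatever model is loaded
    "messages": [
        {"role": "user",
         "content": "Summarise the key points of this document, quoting the "
                    "relevant sections:\n\n" + document},
    ],
}

resp = requests.post(URL, json=payload, timeout=300)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```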
Yes, there's a lot of LLM hype, and no, it's not "AI" per se. But holy crap it's powerful.
[Attachment: screenshot of the conversation fragment]