I don't consider any PC to be "AI-enabled" unless it can run a model as good as GPT-4 locally at token/s comparable to ChatGPT. That's my own personal requirement.
So probably 2-3 more years.
Unlikely. RAM bandwidth just won't be there. Maybe with a heavily quantized model.
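To see why bandwidth is the bottleneck, here's a quick back-of-envelope sketch: decoding one token in a memory-bound dense model requires streaming roughly all the weights from RAM, so tokens/s is capped at bandwidth divided by model size in bytes. All the numbers below (parameter counts, quantization level, bandwidth) are illustrative assumptions, not measurements.

```python
# Back-of-envelope: decode speed of a memory-bound dense LLM is roughly
# memory bandwidth / bytes read per token (≈ the whole model's weights).
# All figures are assumptions for illustration, not benchmarks.

def tokens_per_second(model_params_b: float,
                      bytes_per_param: float,
                      bandwidth_gb_s: float) -> float:
    """Rough upper bound on decode tokens/s for a dense model."""
    bytes_per_token = model_params_b * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / bytes_per_token

# A hypothetical GPT-4-class dense model (~1000B params) at 4-bit (0.5 B/param)
# on dual-channel DDR5 (~100 GB/s): far below ChatGPT-like speeds.
print(round(tokens_per_second(1000, 0.5, 100), 2))  # ~0.2 tok/s

# Even a 70B model at 4-bit only manages a few tok/s on the same RAM.
print(round(tokens_per_second(70, 0.5, 100), 2))  # ~2.86 tok/s
```

Heavy quantization helps linearly (fewer bytes per parameter), which is why a heavily quantized model is the only plausible path on commodity DDR5.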