That crossed my mind too. 8GB is not going to be enough for any local LLM/AI stuff. 8/256 may be fine as the minimum for a modern consumer laptop, but I'll wait for the schadenfreude when those specs are too low to effectively run Apple's future on-device LLM.
16GB is generally considered the minimum, and that's on PCs where RAM and VRAM are not unified, so on a Mac where one pool of memory serves both, 16GB might not even be enough. I really can't see Apple limiting features to 24GB RAM Macs... (fingers crossed)
As for storage: if an LLM is ~30-40GB (and that's the low end), that's a huge chunk out of a 256GB drive.
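Quick back-of-the-envelope in Python for anyone who wants to play with the numbers. The parameter counts and quantization levels below are just examples I picked, not anything Apple has announced, and it only counts the weights (no KV cache, activations, or OS overhead):

```python
def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Rough weight storage in GB: params * bits / 8, ignoring everything else."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

drive_gb = 256
ram_gb = 8  # unified memory shared between CPU, GPU, and the OS

# (model size in billions of params, quantization in bits per weight)
for params, bits in [(7, 4), (7, 16), (70, 4)]:
    size = model_size_gb(params, bits)
    print(f"{params}B @ {bits}-bit: ~{size:.0f} GB "
          f"({size / drive_gb:.0%} of a {drive_gb}GB drive, "
          f"{'fits' if size < ram_gb else 'does not fit'} in {ram_gb}GB unified RAM)")
```

A 7B model at 4-bit squeaks in at ~3.5GB, but 16-bit weights or anything in the 70B range blows straight past 8GB of RAM, and a ~35GB model really is over a tenth of the drive before you've installed anything else.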