
ksj1

macrumors 6502
Original poster
Jul 17, 2018
294
535
I'm thinking of a Mac mini-sized machine running an Apple-supported LLM, with a few efficiency cores and a huge number of GPU cores. This would be for inference with voice response, etc.

Thoughts?
 