I use Logic Pro X as a live, real-time host for virtual instruments, and multicore performance has been great since the latest update. I notice a dramatic difference in CPU load when I'm using 16-bit versions of my samples versus 24-bit. For recording I always use 24-bit, but for live use, 16-bit samples tax the CPU a lot less.
My question is: Is this performance gain contingent on me loading 16-bit source samples at the instrument level (e.g. a 16-bit soundset provided by the sample developer in Kontakt)? What about true soft-synths that are not sample-based in any way? And audio coming in on the analog inputs of my interface (MOTU Ultralite)?
I'm not interested in unpacking/repacking sample sets from 24-bit to 16-bit myself. I already use VEP6 in both live and studio applications. I'm just looking for quick and easy ways to eke out as many sounds as possible at the buffer size (512) and latency I can live with.
Thanks, everyone!