But surely the very act of taking any analogue signal, be it audio or video, and converting it to digital results in some form of loss, unless you can sample it with infinite resolution. Or am I just talking rubbish?
Technically, that's true. When you digitally sample an analog signal, some information is lost (whatever happens between samples). However, if you sample at a high enough rate, the lost information is at much higher frequencies than the ones you're interested in, and the end result is a signal that is indistinguishable from the original. That's the Nyquist–Shannon sampling theorem: sampling at a rate fs perfectly captures every frequency component below fs/2.
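You can see this in a few lines of Python. The sketch below (my own illustration, not from the thread; the sample rate and tone frequencies are arbitrary choices) samples pure sine tones at 44.1 kHz: a tone below the Nyquist limit (22050 Hz) comes through at its true frequency, while a tone above it "aliases" down to a different frequency, which is the lost-information case.

```python
import numpy as np

fs = 44100  # sample rate in Hz (CD-quality, chosen as an example)
n = fs      # one second of samples, so each FFT bin is exactly 1 Hz wide

def dominant_freq(tone_hz):
    """Sample a pure sine tone at fs and return the peak frequency (Hz)
    in its spectrum -- i.e. what the digital recording 'hears'."""
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * tone_hz * t)
    spectrum = np.abs(np.fft.rfft(x))
    return int(np.argmax(spectrum))  # bin index equals frequency in Hz here

# 1 kHz is well below Nyquist (fs/2 = 22050 Hz): it survives sampling intact.
print(dominant_freq(1000))   # 1000

# 30 kHz is above Nyquist: it aliases down to fs - 30000 = 14100 Hz.
print(dominant_freq(30000))  # 14100
```

So as long as the signal you care about lives below half the sample rate (and human hearing tops out around 20 kHz), the samples capture it exactly, which is why 44.1 kHz was chosen for CDs.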
Of course, that's not what people mean when they use the term "lossless" anyway.