(I was writing my post while Altec stepped in with the answer I was pointing toward.)
Digital equipment designers and engineers … a question:
When digital equipment of many types receives digital information, doesn't the incoming data enter a buffer before being processed and/or converted to analog? And doesn't the play-out of that buffered data, prior to conversion, depend on an internal clock in the receiving equipment?
If so, and if the incoming information was captured correctly in the first place when it was placed in the buffer, then the degree to which jitter affects the quality of the analog signal depends neither on the time-stability of the source device's digital output nor on any jitter introduced by a cable or other link. It depends instead on the quality of the output from the buffer, through processing and the DAC, governed by the clock and electronics within whatever piece of gear finally converts the digital source to analog. Can we say that if the incoming signal has less jitter than the subsequent processing and DAC, any advantage of the 'better' incoming digital stream is lost? Can we also say that if the incoming signal has more jitter than the internal electronics of the processor and DAC, then, as long as the incoming information was captured correctly, the disadvantage of the 'worse' arriving digital stream is made irrelevant?
In other words, isn't it true that, as long as any degradation of the digital signal does not exceed the receiving equipment's ability to capture the information correctly, the last piece of gear in the chain, the one that finally converts the digital information to analog, determines how much or how little jitter is introduced and what effect it has on the final signal?
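If it helps make the argument concrete, here is a minimal Python sketch of the buffer-and-local-clock idea. All names, jitter figures, and the 48 kHz rate are my own illustrative assumptions, not anyone's actual hardware: samples arrive with heavy timing jitter and sit in a buffer, but play-out is paced by the receiver's own (much steadier) clock, so the output timing jitter is set by that local clock alone, provided the buffer never runs dry or overflows.

```python
import random

SAMPLE_PERIOD = 1.0 / 48_000  # nominal sample period for 48 kHz audio


def arrival_times(n, jitter=0.3):
    """Arrival instants of incoming samples, with heavy timing jitter
    (expressed as a fraction of the nominal period)."""
    t, times = 0.0, []
    for _ in range(n):
        t += SAMPLE_PERIOD * (1 + random.uniform(-jitter, jitter))
        times.append(t)
    return times


def playout_times(n, clock_jitter=0.0001):
    """Play-out instants from the receiver's internal clock, which
    reads the buffer at its own (much steadier) pace."""
    t, times = 0.0, []
    for _ in range(n):
        t += SAMPLE_PERIOD * (1 + random.uniform(-clock_jitter, clock_jitter))
        times.append(t)
    return times


def peak_jitter(times):
    """Worst-case deviation of the inter-sample interval from nominal."""
    gaps = [b - a for a, b in zip(times, times[1:])]
    return max(abs(g - SAMPLE_PERIOD) for g in gaps)


random.seed(0)
n = 1000
incoming = peak_jitter(arrival_times(n))   # jitter on the wire
outgoing = peak_jitter(playout_times(n))   # jitter at the DAC

# So long as the data itself was captured intact, the output timing
# is decoupled from the input timing: outgoing << incoming here.
print(incoming > outgoing)  # True
```

The sketch deliberately ignores buffer under-run and over-run; those are exactly the failure modes covered by the "as long as the incoming information was captured correctly" proviso above.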