Quote:
Originally posted by psyprof1:
If the human hearing apparatus could be influenced by a phase difference of, let's say, 10 degrees at 20 KHz (which I don't think any adult raised in an industrial environment can detect at a useful level anyway), how big a difference in interconnect or speaker wire length would be required to cause that difference? Nice little calculation problem. My guess is it would be a mile or so. If that's true, a difference of a few feet, or even a hundred yards or so, between one side and the other is really negligible.
But the effects on signal level and/or frequency response might be significant. I'll bet Altec could tell us.
How a cable behaves is a complex interaction of its inductance, capacitance, and resistance with the signal frequency and the impedance presented to the line.

Suffice it to say that the extremely low output impedance of any modern solid-state amplifier swamps any significant influence from reactance, capacitive or inductive. What remains is resistance. That could affect the damping factor if the extra length of one cable added enough series resistance to pull the effective damping factor below roughly 20 at the speaker with the longer run. With any reasonably sized speaker cable, though, the two runs would have to differ by 100 feet or more before there was any measurable difference, let alone an audible one.
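To put rough numbers on that, here is a quick sketch. The figures are assumptions for illustration, not measurements: an 8-ohm speaker, an amplifier output impedance of 0.04 ohm (damping factor 200 into 8 ohms), and 12 AWG copper wire at roughly 0.0016 ohm per foot per conductor.

```python
# Assumed values for illustration only:
SPEAKER_OHMS = 8.0       # nominal 8-ohm speaker
AMP_OUTPUT_OHMS = 0.04   # typical solid-state amp (damping factor 200)
OHMS_PER_FOOT = 0.0016   # approx. 12 AWG copper, one conductor

def damping_factor(cable_feet: float) -> float:
    """Effective damping factor seen at the speaker terminals.

    The cable resistance counts twice (out and back) and adds in
    series with the amplifier's own output impedance.
    """
    r_cable = 2 * cable_feet * OHMS_PER_FOOT
    return SPEAKER_OHMS / (AMP_OUTPUT_OHMS + r_cable)

for feet in (5, 25, 100, 150):
    print(f"{feet:>4} ft run: damping factor ~ {damping_factor(feet):.0f}")
```

With these assumed numbers, a 100-foot run still leaves the effective damping factor a bit above 20, and it takes roughly 150 feet to fall clearly below it, which lines up with the "100 or more feet" figure above.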

The difference in propagation delay between two cables whose lengths differ by anything short of many miles is so small that it could be detected only in a laboratory.
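The "nice little calculation problem" from the quote can be sketched directly. Assuming a velocity factor of about 0.7 (a typical figure; the exact value varies by cable construction), the delay from a few feet of mismatch is in the nanoseconds, and the mismatch needed for a 10-degree phase shift at 20 kHz works out to a few hundred meters: less than the quoted guess of "a mile or so," but still vastly longer than any plausible difference between two runs.

```python
# Assumption: the signal travels along the cable at about 0.7c
# (a typical velocity factor; the true value depends on the cable).
C = 299_792_458.0        # speed of light, m/s
VELOCITY_FACTOR = 0.7    # assumed
V = VELOCITY_FACTOR * C  # propagation speed in the cable, m/s

# Delay difference from a 10-foot length mismatch:
mismatch_m = 10 * 0.3048
delay_s = mismatch_m / V
print(f"10 ft mismatch: {delay_s * 1e9:.1f} ns delay")

# Length mismatch needed for a 10-degree phase shift at 20 kHz:
period_s = 1 / 20_000                 # 50 microseconds per cycle
shift_s = period_s * (10 / 360)       # ~1.4 microseconds
needed_m = shift_s * V
print(f"10 deg at 20 kHz needs ~{needed_m:.0f} m "
      f"(~{needed_m / 0.3048:.0f} ft) of mismatch")
```

Either way the conclusion above stands: a few feet, or even a hundred yards, of difference is negligible on this score.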