I have always had an interest in electronics, and have long been a tinkerer. I was an EE major for a while (though I wound up in the IT profession), and I believe I have a pretty good grasp of at least the fundamentals of electronic circuits.

There's a basic fact: no wire is perfect. Any wire will have some resistance, capacitance, etc., that become part of the circuit. And those effects will be different from one type of wire to the next... the materials, the construction, the geometry all make a difference. So it's not a stretch to say that one wire can have better characteristics than another for a given application.

But are those differences audible? You bet! I've observed it myself. But that does not mean that you have to spend huge bucks to get good-sounding cables. In fact, I'm sure that there IS a lot of snake-oil in the audio cable industry.

Being a DIY-er, I constructed my speaker cables out of CAT5 network cable, based on this design. These cables are vastly better than the Monster cables I was using before, and I had spent quite a bit more on the Monsters.

Of course, you could say that my opinion is biased because I made them myself. But I took those cables over to try them on my friend's very expensive, very high-end audio system, which has comparably high-end speaker cables. My audiophile friend and I did A/B comparisons, and found that while the super-expensive cables were better, my CAT5 ones were nearly as good.

As far as (line-level) interconnects go, I haven't done much experimenting yet. But the DIY articles on TNT-Audio have some generally sane discussion of interconnects.

With my background in both computers and electronics, I have always been very skeptical that a digital audio connection can be affected by the cable. After all, "bits is bits", right? How can one CD transport sound different from another, when all it has to do is read the numbers off the CD and spit them out over the digital connection? I've done a bunch of reading, and some experimenting, on the matter.

Yes, it's jitter that is the culprit. Digital systems have a "clock", which is basically a signal that goes "on off on off on off..." regularly at a specific frequency. A "clock source" is a circuit that generates a clock. Like everything else, no clock source is perfect. Ideally, the interval of time between one "on" and the next should be exactly the same, but in practice it varies. That variation is jitter.
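To make that concrete, here's a little sketch of the idea. The numbers are purely illustrative (the 1 ns sigma is something I picked out of the air, not a real oscillator spec): an ideal 44.1 kHz clock would have identical periods, and the deviation of each real period from the ideal is the jitter, often summarized as an RMS figure.

```python
import random
import statistics

NOMINAL_PERIOD_NS = 1e9 / 44100   # ideal sample period at 44.1 kHz, in ns
JITTER_SIGMA_NS = 1.0             # illustrative jitter amount, not a real spec

random.seed(0)

# A "real" clock: each period is the nominal period plus a small random error.
periods = [random.gauss(NOMINAL_PERIOD_NS, JITTER_SIGMA_NS) for _ in range(10000)]

# RMS jitter: the standard deviation of the periods around their mean.
rms_jitter = statistics.pstdev(periods)
print(f"nominal period:      {NOMINAL_PERIOD_NS:.1f} ns")
print(f"measured RMS jitter: {rms_jitter:.2f} ns")
```

Run that and the measured RMS jitter comes out near the 1 ns we put in, which is the point: jitter is just the statistics of how far each clock period strays from the ideal.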

Ok, but what is jitter's effect on audio? Take a typical playback chain where you have a CD transport with a digital connection to a receiver. The CD transport reads the data off the CD and applies a little processing to convert the raw data into the standard digital PCM stream. That data stream goes out the jack in the standard S/PDIF format, which combines clock and data into one signal. The signal clock is derived from the master clock source in the transport, so any jitter in the master clock will also appear in the output signal.

Now the signal passes through your optical or coaxial cable. This cable, being imperfect, smears the transitions (where the signal goes from "off" to "on" and vice-versa). It's a small amount of smear: not enough to obscure the data, but enough to increase the random variation in time between one bit and the next (i.e., more jitter).

The signal then enters the receiver's digital input. A circuit called a "PLL" tracks the incoming signal and attempts to synchronize with it; this is necessary to extract the data bits. The PLL, when locked on to the incoming signal, generates a clock based on that incoming signal, and the clock is carried forward into the Digital-to-Analog Converter (DAC) along with the data. It is at THIS point that the jitter (which was present in the clock source in the transport, and which was made worse by everything in between) has its degrading effect on the resulting audio: the samples are supposed to be evenly spaced, but they aren't, so the resulting analog waveform has a different shape than it should.
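Here's a toy model of that last step (my own sketch, not how any real DAC is built): if the DAC clocks out samples at slightly wrong instants, that's equivalent to outputting slightly wrong amplitudes at the right instants. Feeding it a 10 kHz test tone with a made-up 2 ns RMS timing error shows the size of the resulting amplitude error.

```python
import math
import random

FS = 44100.0            # sample rate
FREQ = 10000.0          # 10 kHz test tone
JITTER_SIGMA_S = 2e-9   # 2 ns RMS timing error, purely illustrative

random.seed(1)
errors = []
for n in range(5000):
    t_ideal = n / FS
    t_actual = t_ideal + random.gauss(0.0, JITTER_SIGMA_S)
    # The DAC outputs the amplitude for t_ideal, but at time t_actual;
    # the difference from the ideal waveform at that moment is the error.
    ideal = math.sin(2 * math.pi * FREQ * t_ideal)
    actual = math.sin(2 * math.pi * FREQ * t_actual)
    errors.append(actual - ideal)

rms_error = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"RMS amplitude error from jitter: {rms_error:.2e}")
```

Two things fall out of this model: the error grows with both the jitter and the signal frequency (the waveform's slope), which is why jitter matters more for high-frequency content.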

One might ask, "Why can't you generate a new clock for the DAC, and throw away the jittery clock that came from the digital input?" The answer is, you can. I know of several products that do just that. But in the early days of CD transports and DACs, such a fix might have been difficult to implement, and anyway the designers probably weren't aware of the negative effects of jitter.
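Conceptually, the fix looks like this (again, my own simplification, not any specific product's design): samples arrive with jittery timing, sit briefly in a FIFO buffer, and are clocked back out by a clean local oscillator. The output timing then depends only on the local clock, not on the jittery incoming signal.

```python
import random

FS = 44100.0            # CD sample rate
JITTER_SIGMA_S = 5e-9   # illustrative input jitter (5 ns RMS)

random.seed(2)

# Jittery sample arrival times, as recovered by the PLL on the digital input.
arrivals = [n / FS + random.gauss(0.0, JITTER_SIGMA_S) for n in range(1000)]

# Clean output times: a fixed local clock, started a little late so the
# FIFO buffer never runs dry in this toy model.
outputs = [0.001 + n / FS for n in range(1000)]

in_periods = [b - a for a, b in zip(arrivals, arrivals[1:])]
out_periods = [b - a for a, b in zip(outputs, outputs[1:])]

in_spread = max(in_periods) - min(in_periods)
out_spread = max(out_periods) - min(out_periods)
print(f"input period spread:  {in_spread:.2e} s")   # jittery
print(f"output period spread: {out_spread:.2e} s")  # essentially zero
```

The catch, and the reason this isn't entirely trivial in a real product, is that the local clock never runs at exactly the same average rate as the incoming stream, so the buffer will slowly fill or drain unless the design compensates somehow; the toy model above ignores that.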

All of the previous discussion pertains to CD audio only. When it comes to Dolby Digital or DTS bitstreams, jitter should have no impact on sound quality (which means that the quality of the digital interconnect makes no difference). This is because the digital data is processed by a computer in the surround receiver, which then creates new digital audio streams to feed the DACs.

I have done some A/B tests, where I compared Dolby Digital over: A) an el-cheapo ($6) TOSlink cable, and B) a high-quality coax. I could hear no difference between the two when playing Dolby Digital or DTS. My DVD player and receiver each have both optical and coax connections, so I was able to flip back and forth instantly with the remote.

I have also compared an el-cheapo CD player with an optical out (using my $6 optical cable) with a good CD player using the high-quality coax. In that comparison, I could hear a difference, though it was subtle.