Hi John - long time no speak.
Ok, having looked at jneutron's previous diagrams and text file, I have performed a much more rigorous simulation, and the measured difference remains effectively zero.
The test circuit consisted of a pair of voltage generators, both producing 10V peak - one at 100Hz, the other at 1kHz. These feed a simple 6dB/octave crossover (1kHz) via a 'speaker cable', with each branch terminated by an 8 ohm load.
High frequency results were then filtered using a 10th order Butterworth high-pass filter (800Hz) so that the HF signal could be seen in isolation ... without the LF component making a mess of the graph and rendering accurate measurement impossible. The filter was implemented using a Laplace transfer function, should anyone be interested.
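For anyone who wants to reproduce the filtering without a simulator, a rough Python equivalent (using scipy) looks something like this. This is my quick sketch, not the actual Laplace block from the SIMetrix run, and the sample rate is simply chosen to match the simulation timestep:

```python
import numpy as np
from scipy import signal

fs = 500_000                       # Hz - matches the 2us maximum timestep
t = np.arange(fs // 20) / fs       # 50ms of signal
v = 10 * np.sin(2 * np.pi * 100 * t) + 10 * np.sin(2 * np.pi * 1000 * t)

# 10th order Butterworth high-pass, 800Hz corner. Second-order sections are
# used because a 10th order filter as a single transfer function is
# numerically fragile.
sos = signal.butter(10, 800, btype='highpass', fs=fs, output='sos')
hf_only = signal.sosfilt(sos, v)   # 1kHz passes; 100Hz ends up ~180dB down
```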
Three test runs were performed. The first used a single cable with a series resistance of 0.1 ohm, the second used a single cable of 0.05 ohm (equivalent to two parallel cable runs), and the third used two separate 0.1 ohm cables, one per crossover branch (i.e. bi-wired).
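The same comparison can be done on the back of an envelope with complex impedances, which is a useful sanity check on the simulator. The sketch below assumes a first-order 1kHz crossover into 8 ohm loads (1.27mH and 19.9uF are my assumed component values - the exact network doesn't change the outcome):

```python
# Back-of-envelope phasor check of the three cable configurations.
# Assumed first-order crossover at 1kHz into 8 ohm loads:
#   LF branch: L = 8 / (2*pi*1000) ~= 1.273mH in series with 8R
#   HF branch: C = 1 / (2*pi*1000*8) ~= 19.9uF in series with 8R
import numpy as np

R = 8.0
L = R / (2 * np.pi * 1000)
C = 1 / (2 * np.pi * 1000 * R)

def hf_load_voltage(f, v_src, r_cable, biwired):
    """|V| across the 8 ohm HF load at frequency f (ideal voltage source)."""
    w = 2 * np.pi * f
    z_lf = R + 1j * w * L            # woofer branch
    z_hf = R + 1 / (1j * w * C)      # tweeter branch
    if biwired:
        # each branch has its own cable straight back to the (ideal) amp,
        # so the LF branch cannot affect the HF branch at all
        return abs(v_src * R / (z_hf + r_cable))
    # shared cable: both branches hang off the far end of one resistance
    z_par = z_lf * z_hf / (z_lf + z_hf)
    v_node = v_src * z_par / (z_par + r_cable)
    return abs(v_node * R / z_hf)

for label, r, bi in [("single 0.1R ", 0.10, False),
                     ("single 0.05R", 0.05, False),
                     ("2 x 0.1R    ", 0.10, True)]:
    print(label, round(hf_load_voltage(1000, 10.0, r, bi), 4), "V at 1kHz")
```

With those values, the bi-wired and 50 milliohm cases agree to a fraction of a millivolt at 1kHz, and the single 100 milliohm cable shows the expected resistive loss - precisely what the simulator reported.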
John (jneutron) tells me that my initial test was flawed because I didn't include the branches (presumably the two sections of the crossover network), but this current test tells me that it doesn't make a rat's backside worth of difference.
The test run with two separate cables (bi-wired) vs. the 50 milliohm cable gave identical values of voltage at the load (to at least 3 decimal places). To minimise sampling errors in the simulator (I use SIMetrix), the time interval was set to a maximum of 2µs (roughly equivalent to a 500kHz sampling rate).
Naturally, the single 100 milliohm cable attenuated the signal by exactly the amount one would predict based on the resistance.
I suspect that we have a herring here - and it appears to me to radiate a distinct reddish hue.
While there are points that appear valid, it just doesn't work that way in practice.
If the LF current caused the HF current to be modulated as claimed, intermodulation distortion components would be the result. But we are talking about a metallic conductor at audio frequencies - intermodulation (or harmonic) distortion simply doesn't exist in a cable. We would all be in serious trouble if it did, since the entire fabric of electronics is based on the use of metallic conductors (that don't contribute distortion).
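If you doubt that, it takes two minutes to demonstrate. Push a two-tone current through an ideal (linear) resistance and look at the spectrum of the voltage across it - only the original tones are there. A purely illustrative numpy sketch:

```python
# A linear V = I*R cannot create intermodulation: the spectrum of the output
# contains only the frequencies that went in.
import numpy as np

fs = 100_000
n = fs                                        # 1 second -> 1Hz bin spacing
t = np.arange(n) / fs
i = 1.2 * np.sin(2 * np.pi * 100 * t) + 0.3 * np.sin(2 * np.pi * 1000 * t)
v = i * 0.1                                   # 100 milliohm 'cable'

spectrum = 2 * np.abs(np.fft.rfft(v)) / n
freqs = np.fft.rfftfreq(n, 1 / fs)
print(freqs[spectrum > 1e-9])   # -> [ 100. 1000.] and nothing else - no IM
```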
The 'straight wire with gain' has long been held up as the standard we aspire to, but John's analysis tries to convince us that even this is no good. Some of the magic cables have the ability (or so we are told) to make sure that HF signals pass through their designated conductors and LF signals through theirs - presumably to reduce this non-problem.
Sorry John, but you're not going to convince me on this one. While it appears 'obvious' that the summed currents will cause an additional loss, if this were the case we would be unable to use resistors in electronic circuits, because summed currents would cause major problems everywhere (a 100k resistor vs. 100 milliohms of cable, all manner of branching - usually DC plus wideband AC, for example ... not to mention modulated RF systems, etc.).
There is absolutely no evidence that signal modulation of any kind is an issue with any conductor at audio frequencies and normal audio levels. That superconducting magnetic circuits (for example) may exhibit unusual behaviour is not really at issue - odd things are bound to occur at perhaps a million amps or so. AFAIK, there are no audio components that fall into this category.
Bi-wiring can reduce crossover interactions caused by the common cable's resistance, but as I mentioned in the earlier post, there is no evidence that the effects are audible to most listeners. Using an active crossover is still the best approach anyway.
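For the record, that interaction is easy to put a number on. With the same assumed network as the sketches above and a shared 100 milliohm cable, the woofer's 100Hz current develops a voltage across the cable resistance, and a fraction of that appears at the tweeter terminals (bi-wiring removes the shared resistance entirely). Rough numbers only:

```python
# How much 100Hz 'woofer backwash' reaches the tweeter via a shared
# 100 milliohm cable? Same assumed crossover values as above.
import numpy as np

w = 2 * np.pi * 100                   # 100Hz, where the woofer current flows
z_lf = 8 + 1j * w * 1.273e-3          # woofer branch impedance (~8.04 ohm)
i_lf = 10 / z_lf                      # ~1.24A of woofer current
v_cable = i_lf * 0.1                  # ~124mV dropped across the shared cable
z_hf = 8 + 1 / (1j * w * 19.9e-6)     # tweeter branch at 100Hz (~8 - j80)
v_tweeter = v_cable * 8 / z_hf        # fraction reaching the tweeter itself
print(round(abs(v_cable), 4), round(abs(v_tweeter), 4))   # ~0.1244  ~0.0124
```

That's roughly 12mV of 100Hz on the tweeter feed, about 58dB below the 10V signal - real and measurable, but nothing to do with modulation, and hardly grounds for panic.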
Do with this red herring what you will, but I have better things to do with my time.
Nice theory though - I just hope the magic cable brigade don't get hold of jneutron's data. They'd have a field day, and would certainly be able to baffle the average customer with that lot (what a depressing thought).
Cheers, Rod
sound.au.com