Ok, here's one for all of you (first of all, let me say I don't believe a word about cable break-in, and I haven't owned any equipment good enough to be able to assess the effects of run-in, but hey, I guess that if capacitors blow over time, that means use affects them in some way, right?)
Now back to the cables...
Check this OUT: an electrical current, be it AC or DC, will cause the wire to heat up, HENCE the insulation will heat up as well. We could argue that heating the insulating material can cause its properties to change; PERHAPS they even get permanently altered, who knows... Those properties include heat storage and heat dissipation.
The resistivity of copper goes up with temperature. So as the copper heats up, some of the heat is transferred to the insulation, which now, thanks to its "changed" properties, has a better ability to dissipate heat. This in turn lets the copper run cooler, which lowers its resistivity and better preserves the purity of the signal.
I think, from the physics point of view, the argument is "reasonable". However, I believe I've never written such a bag of wank.
There is NO effect, whichever one you pick, that could ever be big enough to alter the sonic properties of the cable.
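To put a rough number on how small we're talking, here's a quick back-of-the-envelope sketch in Python. The gauge, length, temperature rise and 8 ohm load are assumptions I picked for illustration, not measurements of any real cable:

```python
# Back-of-the-envelope: how much does a few degrees of self-heating
# actually change a speaker cable's resistance and the level at the
# speaker? All the numbers below are assumptions, not measurements.
import math

RHO_20 = 1.68e-8      # resistivity of copper at 20 C, ohm*m
ALPHA = 0.00393       # temperature coefficient of copper, per C
AREA = 2.08e-6        # cross-section of 14 AWG wire, m^2 (assumed gauge)
LOOP_LEN = 6.0        # 3 m cable, out and back, metres (assumed length)
LOAD = 8.0            # nominal speaker impedance, ohms (assumed)

def cable_resistance(temp_c):
    """Loop resistance of the cable at a given temperature."""
    rho = RHO_20 * (1 + ALPHA * (temp_c - 20.0))
    return rho * LOOP_LEN / AREA

def level_drop_db(r_cable):
    """Voltage-divider loss across the cable into the load, in dB."""
    return 20 * math.log10(LOAD / (LOAD + r_cable))

r_cold = cable_resistance(20.0)   # cable at room temperature
r_warm = cable_resistance(25.0)   # cable after 5 C of self-heating
print(f"cold: {r_cold*1000:.1f} mOhm, {level_drop_db(r_cold):.4f} dB")
print(f"warm: {r_warm*1000:.1f} mOhm, {level_drop_db(r_warm):.4f} dB")
print(f"difference: {abs(level_drop_db(r_warm) - level_drop_db(r_cold)):.5f} dB")
```

Even a generous 5 °C of self-heating shifts the loop resistance by about a milliohm, which works out to roughly a thousandth of a dB at the speaker. Nobody is hearing that, warmed-up insulation or not.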
I don't believe in spending 2k on cabling either.