The premise behind using larger cable for a given application (to be on the safe side) is to overcome the losses incurred due to cable impedance. In other words, you want to make sure the cable is big enough to supply the load during peak demand and still have enough headroom for the losses that occur within the cable itself. With that in mind, I’d like to point out the difference between 10 and 12 gauge speaker cable so you can see what’s going on in your system.
I quickly picked out two Belden cables from their site for comparison: 5T00UP is a 10 gauge commercial speaker cable with 65x28 construction. 1860A is a 12 gauge audio cable with 19x25 construction. Using the stated ohmic values for these two cables, there is a 570 micro-ohm/foot difference between them (we don’t need to get into the L and C characteristics since they are virtually the same for both cables). Over a 60 foot run the difference works out to about 34 milli-ohms. If you could push an 8 ohm load to 200 watts peak, that load draws 5 amps, so the extra 34 milli-ohms of the 12 gauge run costs you roughly 171 millivolts of additional drop at that rating. And since the 200 watt load only requires 5 amps, you could go with 18 gauge and still be on the safe side (according to the US National Electrical Code) ... and I bet you wouldn’t hear the difference either.
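If you want to run the numbers yourself, here’s a quick back-of-the-envelope sketch. The per-foot resistances below are nominal 10 AWG and 12 AWG copper figures I plugged in for illustration, not the exact Belden datasheet values, so expect the output to land close to (not exactly on) the figures above.

```python
# Sanity check of the 10 AWG vs 12 AWG comparison above.
# Assumed nominal copper resistances (ohms per foot), not Belden spec sheet values.
R10_PER_FT = 0.0010   # ~10 AWG
R12_PER_FT = 0.0016   # ~12 AWG

LENGTH_FT = 60        # one-way run length used above
LOAD_OHMS = 8.0
POWER_W = 200.0       # peak power into the load

delta_per_ft = R12_PER_FT - R10_PER_FT          # resistance difference per foot
delta_run = delta_per_ft * LENGTH_FT            # extra resistance of the 12 gauge run
current = (POWER_W / LOAD_OHMS) ** 0.5          # 5 A for 200 W into 8 ohms
extra_drop = current * delta_run                # extra voltage drop across the 12 gauge cable
extra_loss = current ** 2 * delta_run           # extra power burned in the cable

print(f"difference: {delta_per_ft * 1e6:.0f} micro-ohm/ft")
print(f"over {LENGTH_FT} ft: {delta_run * 1e3:.0f} milli-ohm")
print(f"load current at {POWER_W:.0f} W into {LOAD_OHMS:.0f} ohms: {current:.1f} A")
print(f"extra drop with 12 gauge: {extra_drop * 1e3:.0f} mV")
print(f"extra loss with 12 gauge: {extra_loss:.2f} W ({100 * extra_loss / POWER_W:.2f}% of the signal)")
```

Double the run resistance for the return conductor if you like; it still works out to well under one percent of the signal either way.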
I’m not saying you should use 18 gauge wire; I’m just trying to point out the level of overkill you’re into with that really big a$$ wire. All you’re really doing is raising the price of copper on the commodities market.
Maybe I should be buying copper futures.