2. Bridging two amps together only gains you about 3 dB; that's certainly not enough to justify the risk of bridging them, in my book.
Doesn't it depend on the amp(s) in question? Theoretically, you could bridge a stereo amp and get 4x the watts. More realistically, the increase is rarely greater than 3x, and sometimes not that much. Who doubles the impedance of their speakers when switching to a bridged amplifier?
Example 1: Thousands of years ago, Carver offered an amplifier that was easily field-bridgeable: the M1.0 was rated at 200 watts (23 dBW)/channel into 8 ohms; bridged, they claimed 1000 watts (30 dBW) into 8 ohms, though distortion was not specified.
Example 2: The Adcom GFA-555se is shown on the Adcom web site as 200 watts (23 dBW)/channel into 8 ohms, or 600 watts (27.8 dBW) into 8 ohms bridged.
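As a sanity check on the numbers above, here's a quick sketch (the formula dBW = 10·log10(P/1 W) and the power figures are the only inputs; the function name is just illustrative):

```python
import math

def dbw(watts):
    """Power expressed in dBW: 10 * log10(P / 1 W)."""
    return 10 * math.log10(watts)

# Ideal bridging: output voltage doubles, so power into the same
# load quadruples -> 10*log10(4), about 6 dB.
print(f"ideal bridging gain: {10 * math.log10(4):.1f} dB")

# Example 1 (Carver M1.0): 200 W/ch -> claimed 1000 W bridged
print(f"Carver gain: {dbw(1000) - dbw(200):.1f} dB")

# Example 2 (Adcom GFA-555se): 200 W/ch -> 600 W bridged (3x power)
print(f"Adcom gain: {dbw(600) - dbw(200):.1f} dB")
```

Note how neither real-world spec hits the ideal 6 dB (4x) figure: the Adcom's 3x works out to about 4.8 dB, and even Carver's optimistic claim only reaches about 7 dB because it starts from a conservative per-channel rating.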
PROPERLY bridged, all common-mode distortion decreases, and the voltage slew rate doubles. The trouble begins with a pair of amps that aren't that good to begin with--not actually identical--and especially when a weak output section is fed by a weak power supply. The resulting bridged amplifier wets the bed when presented with a low-impedance speaker load, and promotes the idea that a bridged amp cannot have reasonable sound quality. As an aside, at no point in the whole history of Hi-Fi have there been so many 4-ohm (and lower) speakers for sale.
In short, there's little wrong--and a great deal right--with PROPERLY bridged amplifiers of suitable quality; provided they'll drive the speaker load you intend to use them with.