In the interest of answering the actual question asked:
On the back of every single device in your home theatre, near the AC inlet, is a label listing its power draw in watts.
Add up the watts drawn by all of the devices in your home theatre. Alternatively, add up the watts for only the devices you could, in a worst-case scenario, use at the same time.
Generally speaking, a 15A circuit should probably be limited to 13A, because a safety margin is ... well ... nice ... if fire is something you prefer to avoid.
But strictly speaking, assuming nothing ever fails, you could allow yourself to draw 15A provided there are no extension cords involved and the AC cords are kept reasonably short.
Assuming you live in the US or Canada, that's 13A times 120V = 1560 watts and 15A times 120V = 1800 watts *
If it's more than either of those figures (use the one based on how fire safe you want to be) you need a 20A circuit (at least). Or two 15A circuits, or some similar combination or arrangement.
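If you want to sanity-check the arithmetic, here is a small Python sketch. The device names and wattages are made up for illustration; the 13A and 15A limits are the figures above.

```python
# Hypothetical example: tally label wattages and compare them against a
# 15A/120V circuit, using the 13A "safety margin" figure discussed above.
devices = {
    "AV receiver": 450,
    "Subwoofer": 300,
    "TV": 150,
    "Blu-ray player": 25,
    "Game console": 200,
}

VOLTS = 120          # nominal North American line voltage
BREAKER_AMPS = 15    # circuit rating
SAFE_AMPS = 13       # rating minus the 2A safety margin

total_watts = sum(devices.values())
safe_watts = SAFE_AMPS * VOLTS      # 1560 W
max_watts = BREAKER_AMPS * VOLTS    # 1800 W

print(f"Total draw: {total_watts} W")
if total_watts <= safe_watts:
    print("Fits comfortably on one 15A circuit.")
elif total_watts <= max_watts:
    print("Within the breaker rating, but no safety margin left.")
else:
    print("Too much for a 15A circuit -- use a 20A circuit or split the load.")
```

With those (invented) numbers the total is 1125 W, well under the 1560 W margin, so one 15A circuit is fine.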
The reason I bring up fire safety and a 2A (15 - 2 = 13A) safety margin is that the circuit breaker that is supposed to keep you and your family safe has no idea how much power is being drawn by the devices plugged into the wall.
What it does have an idea of is how much heat is being generated by the current flowing through it. When it gets hot enough to (in theory) equate to more than 15A (or 20A, as the case may be), it trips.
Each time a breaker trips, it becomes somewhat less reliable. Breakers wear out if they are cycled a large number of times, to the point where they might not trip at the heat that corresponds with the amperage rating of the breaker.
Breakers also need a certain amount of time for the heat to build up. Brief demands may not trip it, even if the load exceeds the breaker's rating (15A, etc).
So it is always good to remind yourself that if your gear, perhaps due to a failure, demands 20A or even 30A from the 15A circuit, the electrical system will do everything it can to supply that power until the breaker trips. If the breaker doesn't trip, well, I suggest gathering up the children and getting out. That is, if you even know it's happening, which you may easily not.
* Electric utilities vary in the voltage they deliver and how well regulated it is. I know that in certain parts of North America 110V is common. Where I live, with no devices drawing from the circuit, I routinely measure 127V, which falls to somewhere between 117V and 122V when something on that circuit is turned on and drawing power.
This affects how many watts your system can support. More voltage is better, to a point. Too much voltage can stress devices on the circuit, although it also might allow your amplifier to be able to put out more power. It's better to have a nice, steady 120V than either of the extremes.
** The above calculations for power draw are somewhat of a simplification, but they work well with Home Theatre devices. If you have a device with a poor Power Factor ... Compact Fluorescent Lamps are notoriously bad ... it draws more current than the wattage printed on its label implies. But most people know not to use CFL bulbs on the same circuit as their Home Theatre or HiFi (LED lamps should also be avoided, as they can introduce noise into the AC line). A good old low-wattage tungsten bulb is best if you need supplemental lighting in your audio rack.
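To put a number on the power-factor point: real power (the watts on the label) equals volts times amps times power factor, so the same labelled wattage pulls more amps as the power factor drops. The figures below are hypothetical, just to show the shape of the effect.

```python
# Hypothetical example: current drawn by a load at different power factors.
# Real power (W) = volts * amps * power factor, so:
VOLTS = 120

def amps_drawn(label_watts, power_factor):
    """Current in amps for a given real power and power factor."""
    return label_watts / (VOLTS * power_factor)

# A 100 W load at unity power factor (e.g. a tungsten bulb):
print(round(amps_drawn(100, 1.0), 2))   # ~0.83 A
# The same 100 W at an assumed poor power factor of 0.55 (CFL-like):
print(round(amps_drawn(100, 0.55), 2))  # ~1.52 A
```

Same wattage on the label, but nearly twice the current through the breaker, which is why poor-power-factor loads eat into your circuit headroom.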