Not sure that I fully understand what you want to know, Seth=L.
For smaller power transformers (say up to 300 W), I think "full load" is calculated on the basis of 5-10% total power loss, usually divided equally between copper (winding) losses and iron (core) losses. That is the background for one way of estimating a power transformer's wattage.

If one assumes 5% total copper (winding resistance) loss, again divided equally between the primary and all secondaries, then one measures the primary winding's wire resistance. Say this is 14 ohms for a 230 V primary. The voltage drop across the primary's wire resistance can then be taken as 2.5% of 230 V, or 5.75 V. That across 14 ohms gives a maximum primary current of about 410 mA, i.e. a transformer of 230 x 0.41 = 94 W. This is approximate, since standards vary between transformer manufacturers, but it serves to illustrate the method. (One measures the primary winding resistance because, with semiconductor power supply transformers, the secondary resistance may be too low to measure accurately with an ordinary DVM.)
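In case it helps, here is a minimal sketch of that estimate in Python. It assumes the figures above: 5% total copper loss split equally between primary and secondaries (so the primary drops 2.5% of the mains voltage at full load); the function name and the 14-ohm / 230 V values are just illustrative, taken from the worked example.

```python
def estimate_va(v_primary, r_primary, copper_loss_fraction=0.05):
    """Rough full-load VA rating from measured primary winding resistance.

    Assumes copper_loss_fraction of total power is lost in the windings,
    half of it in the primary (the rule of thumb described above).
    """
    # Voltage dropped across the primary's wire resistance at full load
    primary_drop = v_primary * copper_loss_fraction / 2
    # Maximum primary current implied by that drop (Ohm's law)
    i_primary_max = primary_drop / r_primary
    # Approximate transformer rating
    return v_primary * i_primary_max

# Worked example from the text: 230 V primary measuring 14 ohms
print(f"{estimate_va(230.0, 14.0):.0f} VA")  # ~94 VA
```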
There is also a difference in iron loss between the normal E/I core, the C-core, and the toroid. The toroid's iron loss is the smallest, giving a smaller core area for the same power, thus shorter total winding lengths, less copper loss, and so on round the circle.
This comes from my general experience; others may be able to reply in greater detail (and perhaps amend the above).