A waveform has an inherent average power, but there is no standard for that level in the music world. Older CDs were mastered quite conservatively, with average power levels of around -18 dB, whereas newer CDs are mastered much more aggressively, at -12 dB or even higher. Two songs by the same artist, on different CDs, can have vastly different power levels.
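To make "average power" concrete: it is just the RMS (root-mean-square) of the samples, usually expressed in dB relative to full scale (dBFS). A minimal sketch, assuming samples normalized to the range -1.0..1.0 (the function name `rms_dbfs` is mine, not from any particular tool):

```python
import math

def rms_dbfs(samples):
    # samples: floats in [-1.0, 1.0]; average power = mean of squares,
    # RMS = square root of that, then convert to dB re: full scale
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

# sanity check: a full-scale sine has RMS = 1/sqrt(2), i.e. about -3.01 dBFS
n = 1000
sine = [math.sin(2 * math.pi * i / n) for i in range(n)]
print(round(rms_dbfs(sine), 2))  # -> -3.01
```

A quiet, conservatively mastered track sits well below full scale most of the time, so its mean square is small and the dB figure is strongly negative (around -18 dBFS); heavy compression/limiting pushes the average toward full scale, giving -12 dBFS or higher.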
You cannot know the source of a downloaded song. If it was taken from an older CD with lower power levels, or re-mastered from the original master tapes, and you compare it to a hotter newer CD, it will naturally sound lower in volume, because it IS lower in volume.
Burning a track to CD, whether it was ripped from a commercial CD, downloaded from an online music store, or ripped and then converted to a compressed format (MP3, AAC, etc.), will NOT change its average power level.
Converting to a compressed format may change the power level, but only by an infinitesimal amount. I just did that with a rock song ripped from a CD to give you a concrete example. The WAV had power levels of -16.98 dB (L) and -17.41 dB (R); after conversion to 192 kbps MP3, the power levels were -17.13 dB (L) and -17.57 dB (R).
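To put that 0.15 dB drop in perspective, you can convert the dB difference back to a linear ratio (this is just the standard dB formula applied to the figures above, nothing tool-specific):

```python
# left-channel change from the example: -16.98 dB (WAV) -> -17.13 dB (MP3)
drop_db = 17.13 - 16.98  # 0.15 dB

# power uses 10*log10, amplitude uses 20*log10, so invert accordingly
power_ratio = 10 ** (-drop_db / 10)  # fraction of original power remaining
amp_ratio = 10 ** (-drop_db / 20)    # fraction of original amplitude remaining
print(round(power_ratio, 3), round(amp_ratio, 3))  # -> 0.966 0.983
```

That is roughly a 3% drop in power, or under 2% in amplitude, far below the level anyone would hear as a volume change.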