Edit: Sorry, my last post was confusing.
What I meant to say was that the bandwidth difference between 1080i/60 and 1080p/30 or 1080p/24 is irrelevant. The source is shot either on film at 1080p/24 or on video at 1080i/60. It is then wrapped in a broadcast format of 1080i/60 (like most HDTV), 1080p/24 (like Blu-ray), or 1080p/30 (I've never actually heard of anyone using this one, but it is possible). The codecs in use are smart enough to recognize repeated fields and frames, so the duplicates don't cost extra bandwidth. Film material wrapped in 1080i/60 is exactly the same material it was before wrapping, except that the TV or STB has to reverse the 3:2 pulldown and deinterlace the film out of its wrapper. If the same 1080p/24 film content is wrapped in both 1080p/24 and 1080i/60 at the same bitrate, then, when properly deinterlaced, the quality of both is the same.
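To show what I mean by "wrapping", here's a rough sketch of the 3:2 pulldown cadence that fits 24 film frames per second into 60 interlaced fields per second. The frame labels and function name are just illustrative; the point is that every field is copied from an existing film frame, so no new picture information is created.

```python
def pulldown_3_2(film_frames):
    """Sketch of the 3:2 cadence: alternating film frames contribute
    3 fields and 2 fields, so every 4 frames become 10 fields
    (24 frames/s -> 60 fields/s). Each field is just half the lines
    of an existing frame, i.e. no new picture information."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        for _ in range(repeats):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

fields = pulldown_3_2(["A", "B", "C", "D"])
print(len(fields))  # 10 -> 4 film frames become 10 fields (24 fps -> 60 fields/s)
print(fields)       # every field references one of the original 4 frames
```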
Broadcasting in a 1080p format simply removes the deinterlacing step and makes it easier to get a good picture on cheap hardware.
This is totally different from what happens at home over an HDMI connection. At home, your Blu-ray player sends raw, uncompressed picture information, and the output rate is typically left at 60 per second (if you are in the US). So 1080i/60 takes less HDMI bandwidth than 1080p/60, because 1080p/60 carries twice as much picture data: 60 full frames per second instead of 60 half-height fields. You are sending the raw picture information without the help of a codec's compression, and if you are watching a 1080p/24 film movie, lots of those frames are duplicates. If you are cool, you can set your Blu-ray player to output 1080p/24, and the bandwidth on the HDMI cable is even less than 1080i/60 because fewer than half as many pixels per second are sent.
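For a rough sense of the numbers, here's a back-of-envelope comparison of raw active-pixel rates assuming 8 bits per channel. It ignores HDMI blanking intervals and protocol overhead, so the absolute figures aren't the exact link bandwidth, but the ratios show why 1080p/24 < 1080i/60 < 1080p/60 on the cable.

```python
WIDTH, HEIGHT = 1920, 1080
BITS_PER_PIXEL = 24  # assuming 8-bit RGB / 4:4:4 on the link

def raw_mbit_per_s(lines_per_picture, pictures_per_second):
    # Active pixels only; real HDMI timing adds blanking on top of this.
    return WIDTH * lines_per_picture * pictures_per_second * BITS_PER_PIXEL / 1e6

print("1080p/60:", raw_mbit_per_s(HEIGHT, 60))       # ~2986 Mbit/s (60 full frames)
print("1080i/60:", raw_mbit_per_s(HEIGHT // 2, 60))  # ~1493 Mbit/s (60 half-height fields)
print("1080p/24:", raw_mbit_per_s(HEIGHT, 24))       # ~1194 Mbit/s (24 full frames)
```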