There is another matter that has not been covered, and that is the fact that upscaling does not magically create more detail than is in the broadcast. A 1080p source, all else being equal, is better than a 1080i source, but upscaling (which your 1080p TV already does) is not going to put more signal information into the picture than is already in the 1080i signal.
So, if you had a receiver that converted 1080i to 1080p, it would make your TV report "1080p" for the input, but the original signal would still be only 1080i, upscaled by the receiver. Whether that would be better than what you are already doing depends entirely on the relative quality of the upscaler in the receiver versus the one in the TV. Many people waste money buying a DVD player that upscales, a receiver that upscales, and a TV that upscales. One of them does the work*, while the others do nothing, so the money spent on the idle upscalers is wasted. What is ideal is to have one great upscaler, and no others, as any others are a waste of your money.
In your particular situation, with a Blu-ray disc that is 1080p, if everything is adjusted and working properly, it should look better than any 1080i input. However, if you are far from the screen, any difference may be too small to notice. In fact, if you are far enough away, you won't see a difference between 480i and 1080p. For more on that, see:
http://www.soundandvisionmag.com/hitech/1137
The difference between a 1080i and a 1080p signal at, say, 60 Hz, is that 1080i sends half a frame (every other line) 60 times a second, while 1080p sends a full frame 60 times a second. This means you get only 30 full frames per second with 1080i @ 60 Hz, but 60 full frames per second with 1080p @ 60 Hz. So motion is going to look better with a 1080p signal than with a 1080i signal at the same frequency, because more information is sent per second. Indeed, if motion blur is a concern, a 720p signal @ 60 Hz will be better than a 1080i signal @ 60 Hz, but it will have less overall picture detail (720 lines instead of 1080). So, some TV stations broadcast in 720p because they are more concerned with avoiding blurred motion, while other stations broadcast in 1080i because they would rather have more picture detail. (You would probably notice the difference between 1080i and 720p if you used an antenna to watch TV; your cable company may convert everything to one format.)

The more picture information that is broadcast, the more bandwidth it takes up, so there is a limit to what is allowed to be broadcast. That is why you don't see everyone broadcasting 1080p. And, of course, HDTV is several years old, so it is based upon technology that is several years old.
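The frame-rate arithmetic above can be sketched in a few lines of Python. This is just a toy calculation of raw pixel rates (the function name and format table are my own; real broadcast bandwidth also depends heavily on compression):

```python
# Raw pixel throughput and full-frame rate for the formats discussed above.
# Interlaced formats send half the lines per update, so a full frame takes
# two updates (fields).

def pixel_rate(width, height, hz, interlaced):
    """Return (raw pixels sent per second, full frames per second)."""
    lines_per_update = height // 2 if interlaced else height
    full_frames = hz / 2 if interlaced else hz
    return width * lines_per_update * hz, full_frames

formats = {
    "1080i@60": (1920, 1080, 60, True),
    "1080p@60": (1920, 1080, 60, False),
    "720p@60":  (1280, 720, 60, False),
}

for name, spec in formats.items():
    pixels, frames = pixel_rate(*spec)
    print(f"{name}: {frames:.0f} full frames/s, {pixels / 1e6:.1f} Mpixel/s")
```

Running it shows why 1080p @ 60 Hz is the odd one out: it needs twice the raw pixel rate of 1080i @ 60 Hz, which is exactly the bandwidth problem mentioned above.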
____________________
*There is a way to have more than one do the work, and that is by having each one do part of the upscaling, so that, for example, the DVD player outputs 480p, which the receiver upscales to 720p, which the TV then upscales to 1080p. This is usually a very bad way to go, as each conversion only looks at what it is given, not at whatever was lost in an earlier conversion, so it is a good idea to use one great upscaler to go from the original source format to the screen's format. On rare occasions it is better to use more than one, as in a case where the DVD player has a better line doubler than the other upscalers, but usually one is better off letting a single upscaler do all the work.
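The chained-versus-single-upscaler point in the footnote can be illustrated with a toy 1-D "scan line." This is a deliberately crude sketch using plain linear interpolation as a stand-in for a real video scaler (the signal, function names, and error measure are all illustrative assumptions, far simpler than actual hardware):

```python
# Toy comparison: one direct upscale (480 -> 1080) versus a chain of two
# upscales (480 -> 720 -> 1080), measured against a known "ground truth."
import math

def resample(signal, new_len):
    """Linearly interpolate a 1-D signal to new_len samples."""
    old_len = len(signal)
    out = []
    for i in range(new_len):
        pos = i * (old_len - 1) / (new_len - 1)
        lo = int(pos)
        hi = min(lo + 1, old_len - 1)
        frac = pos - lo
        out.append(signal[lo] * (1 - frac) + signal[hi] * frac)
    return out

# Ground truth: a detailed 1080-sample line, downsampled to a 480 "source."
truth = [math.sin(i / 15) for i in range(1080)]
source = resample(truth, 480)

direct = resample(source, 1080)                  # one upscaler does it all
chained = resample(resample(source, 720), 1080)  # two upscalers in series

def err(line):
    """Mean absolute deviation from the ground-truth line."""
    return sum(abs(a - b) for a, b in zip(line, truth)) / len(truth)

print(f"direct  480->1080:       error {err(direct):.6f}")
print(f"chained 480->720->1080:  error {err(chained):.6f}")
```

The two results differ because the second upscaler in the chain treats the first one's interpolated samples as real detail, which is the footnote's point about each conversion only seeing what it is given.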