At present I'm befuddled by 1080p and 1080i.
Stogie, I like to use languages as an analogy for what you're running into, because you may be confusing a display's native resolution with its accepted resolutions.
Think of yourself as an English speaker. You speak English - it's your native language.
You don't speak French, or Spanish, or anything else.
Now, when you hear things - you don't understand anything.
HUH?
That's right! You have no ability to understand English, French, or anything else that anyone says.
So, you need an interpreter between what someone else says and your ears so that you can understand and then repeat what is being said to you.
Your best results occur when someone speaks English and then it is translated to you as English that you understand. You then repeat it, in English and everything matches up - one for one - word for word.
The hard parts come when someone speaks French, or German, or Spanish. Those languages must be converted to something you understand, but because you only speak English, some nuances - some quality of their language - are lost in translation.
Even worse, if someone speaks a language that the interpreter can't translate, you may not be able to understand or repeat that language at all!
All of this relates to displays.
Every digital display has a native resolution. It is typically listed in the specifications and is then used for marketing. Often this resolution is 1920x1080, which is (typically) called 1080p. Sometimes the resolution is 1280x720, which is 720p. Sometimes it is something else: 1366x768, 1024x768, 854x480, etc.
This native resolution is the actual number of individual physical picture elements (pixels) which make up what you are viewing. If you got out a magnifying glass, and counted, one by one, all of the dots which make up the image on a digital display from your cellular phone, to an iPad, to a 102" Panasonic plasma, you could count those dots and come up with the native resolution.
If a display is 1920x1080, then that's what it is.
Period.
It can't fill the screen with any image at all without first making that image 1920x1080. If it gets an image that is 192 pixels wide by 108 pixels tall, it must make that image ten times as large on each side... Because it is NOT the native resolution!
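To make the scaling idea concrete, here is a toy Python sketch of the simplest possible upscaler (nearest-neighbor, which just repeats pixels). Real displays use much fancier filtering, and this is my own illustration rather than how any TV's silicon actually works, but the principle is the same: every incoming image gets resized to the panel's fixed pixel grid.

```python
def upscale_nearest(image, factor):
    """Repeat each pixel `factor` times horizontally and vertically."""
    out = []
    for row in image:
        # Stretch the row sideways by repeating each pixel.
        stretched = [px for px in row for _ in range(factor)]
        # Then repeat the stretched row downward.
        out.extend([stretched] * factor)
    return out

# A tiny 2x3 "image" scaled by 10 becomes 20x30 - same picture, more dots.
tiny = [[1, 2, 3],
        [4, 5, 6]]
big = upscale_nearest(tiny, 10)
```

Scale that up and you have the 192x108 example: multiply both sides by ten and every source pixel becomes a 10x10 block on the 1920x1080 panel.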
Now, if you feed a 1920x1080 display a 1920x1080 image, it will be able to map every single pixel, one for one, to every single dot on screen! That's fantastic right?
Except, some displays (still!) do not accept 1920x1080 native resolution (1080p); they only accept 480i, 480p, 720p, and 1080i.
Wait! What's 1080i? 1080i sends only half the lines of a 1080p frame at a time. Really? Yes, really.
1920x1080 is a lot of data to be sent at once, and in the old days (when TV first came around) 480 lines of information was a lot to be sent. So, instead of sending the full image in one shot, it was broken into two different parts. Half a frame with only the odd lines was sent, then the next half frame with the even lines was sent. Odd, even, odd, even, repeat! You get the 'full' 1080 lines of information eventually, but not as a single frame.
This sounds simple enough to deal with. Just combine the two half frames into one whole frame and you have a full 1920x1080 image!
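That "just combine the two half frames" step is what video people call weave deinterlacing. A minimal sketch, using a toy 4-line example of my own rather than real 540-line fields:

```python
def weave(odd_field, even_field):
    """Interleave two half-frame fields back into one full frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

# Two 540-line fields would weave into one 1080-line frame; shown here
# with a tiny 4-line example.
full = weave(["line 1", "line 3"], ["line 2", "line 4"])
# full -> ["line 1", "line 2", "line 3", "line 4"]
```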
Well, that would be great if all displays could do this properly - many can't.
It also doesn't work when material is natively 1080i, because it's not ONE frame broken in half. It is actually two separate half frames, each captured about 1/60th of a second apart. So, if there was motion, then the difference between each pair of half frames produces a stair-stepping, or combing, effect.
It looks like this:
http://justsayyes.files.wordpress.com/2007/06/interlace1.jpg
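The same effect can be sketched in code. In this toy example of mine, a one-character "object" (X) moves right between the two fields; weaving the fields together produces the jagged comb pattern:

```python
def render(pos, width=8):
    """Draw one scanline with an 'object' X at column `pos`."""
    return "." * pos + "X" + "." * (width - pos - 1)

# Two fields captured ~1/60 s apart: the object moved between them.
field_a = [render(2)] * 2   # odd lines: object at column 2
field_b = [render(4)] * 2   # even lines: object at column 4

# Weaving the fields interleaves the two positions line by line.
combed = [line for pair in zip(field_a, field_b) for line in pair]
# ..X.....
# ....X...
# ..X.....
# ....X...
```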
A good TV can recognize this combing effect and can reduce or nearly eliminate it. That processing is called deinterlacing!
There is a ton of reading about what interlaced (480i/1080i) video is, but the bottom line is that progressive video (480p, 720p, 1080p) sends the entire frame in one shot, while interlaced video sends half a frame at a time.
Generally, these days, if a display has a native resolution of 1920x1080 and does not accept a 1080p source, then it has poor processing inside of it and should be avoided. It may do okay, and it may keep the price down, but with more and more 1080p native sources coming to market, it will become more and more important for all the displays in your home to support 1080p natively. Even some cheap 19" displays I recently bought with 1366x768 resolution supported 1080p on their inputs.