I've been trying to figure this out lately, what with all the hubbub about upscaling and so forth.
I know everybody makes a big stink about how an upconverted signal is "1080i/p" or whatever, but the original source is still just 480p. How much better can an upscaled image look when the source material only contains so much information? To me, it's like upscaling a picture in Photoshop: it's bigger, sure, but it's not really any better.
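(Just to illustrate what I mean by the Photoshop analogy, here's a toy sketch in Python of what a scaler does. It's my own made-up example, not how any particular player or TV actually does it, and real scalers use much fancier filters, but the point stands: the output is built entirely out of pixels that were already in the source.)

```python
# Toy upscaler: every output pixel is derived from nearby source pixels,
# so the picture gets bigger but no new detail is created.
# (Hypothetical example for illustration only.)

def upscale_nearest(src, out_w, out_h):
    """Nearest-neighbor upscale of a 2D list of pixel values."""
    src_h, src_w = len(src), len(src[0])
    out = []
    for y in range(out_h):
        sy = y * src_h // out_h  # map each output row back to a source row
        out.append([src[sy][x * src_w // out_w] for x in range(out_w)])
    return out

# A tiny 4x4 "frame" blown up to 8x8: bigger, but every value in the
# result already existed somewhere in the source.
tiny = [[(x + y) % 256 for x in range(4)] for y in range(4)]
big = upscale_nearest(tiny, 8, 8)
print(len(big), "x", len(big[0]))  # 8 x 8
```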
Now, I know all the ins and outs of interlacing, deinterlacing, progressive scan, and all that. Having worked directly with raw video sources in a variety of formats, I have a full understanding of how they work (which is a rarity on most of the boards I visit... hehe).
So basically, is there any real difference between "upconverted" video on an HD display and just feeding that same display a plain 480p signal and letting it do the scaling?
Bear in mind I don't have any first-hand knowledge of this... of all the fancy gadgets in my HT, my TV is still antiquated, for two reasons. One, I'm not in a position where I can easily drop a few grand on a nice big HD set, although I'm working on it (eyeballing the Westinghouse 47" 1080p). And two, I'm one of those people who doesn't like to replace something unless it's broken or otherwise useless. Since my old Sanyo still works just fine, and has surprisingly good picture quality considering its age, I'm kind of loath to replace it, since I don't really need to, ya know?
Speaking of pro-scan, is there a way to determine what kind of progressive-scan processing my DVD player and/or TV actually performs? (Once I get a TV that does it, I mean... hehe... my DVD player is pro-scan, though.) On the one hand there's "proper" pro-scan, which does inverse 3:2 pulldown or otherwise weaves the two fields back into a single full frame, and on the other there's the "crap" kind of pro-scan that simply discards one field entirely, so you end up seeing only half the vertical resolution.
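(To make the distinction concrete, here's a rough sketch of the two approaches I mean. Again, this is a made-up toy example, not any player's actual logic; real deinterlacers also have to detect cadence, handle bad edits, and so on.)

```python
# Toy illustration of the two "pro-scan" flavors (hypothetical example).

def weave(top_field, bottom_field):
    """'Proper' deinterlace for film sources: interleave the two fields
    (which came from the same film frame) back into one full-height frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # source lines 0, 2, 4, ...
        frame.append(bottom_line)  # source lines 1, 3, 5, ...
    return frame

def discard_and_double(top_field, bottom_field):
    """'Crap' deinterlace: bottom_field is simply thrown away and each line
    of the other field is repeated, so only half the vertical detail survives."""
    frame = []
    for top_line in top_field:
        frame.append(top_line)
        frame.append(top_line)  # line-doubled copy, no new information
    return frame

# Two 240-line fields from one 480-line frame (dummy data, 720 pixels per line).
top = [[y * 2] * 720 for y in range(240)]         # even source lines
bottom = [[y * 2 + 1] * 720 for y in range(240)]  # odd source lines

print(len(weave(top, bottom)))               # 480 lines, all distinct
print(len(discard_and_double(top, bottom)))  # 480 lines, but only 240 unique
```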
Feel free to get as technical as you like; I'll understand it.