Why is TRUE 1080p so hard to come by?

Hi Ho

Audioholic Samurai
Whether it's truly far superior to 1080i or not, I'm not here to debate that at the moment. What I want to know is why there are so many "1080p" TVs that cannot accept a 1080p signal! It's just plain silly to me. What's the point of making a 1080p display that cannot accept signals in its native resolution? Is there some sort of technological hurdle? Does it take immense processing power? Does it add a whole lot to the cost?

Most recently, it's HD-DVD and Blu-ray. HD-DVD outputs 1080i, while the discs are encoded at 1080p. Blu-ray, from what I can gather, will output 1080p, but it is not true 1080p: it is interlaced, then deinterlaced. That doesn't make sense to me at all.

What's up with 1080p?
 
WorldLeader

Full Audioholic
Apple's Quicktime Trailers are available in true 1080p if you need some material.

As for why it's so hard to come by, I'm not sure. :confused:
 
avliner

Audioholic Chief
Hi Ho,

Yeah, you're right! All that "novelty" sounds like it's on purpose.
Tell you what: for my part, I'll start thinking about the 1080p and Blu-ray / HD DVD stuff only when the dust settles on this subject, and that might be a 2-3 year time frame.

So far, I'm pretty happy with my "half analog, half digital" life!

Cheers / Avliner.
 
BMXTRIX

Audioholic Warlord
It seems like a reasonable question... "Why don't they just do it?"

The answer is a bit more complex, though. The deal is that with 1080p you are dealing with about 2 million pixels that must be displayed 24 to (more often) 60 times per second! That level of processing is not inexpensive, and until recently it hasn't even been necessary at the consumer level.
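To put rough numbers on that (back-of-envelope only, and assuming plain 24-bit RGB purely for illustration; real chips don't necessarily work in that format internally):

```python
# Rough math on why full 1080p/60 processing is demanding.
# Assumption: 8 bits per channel RGB (3 bytes/pixel), just for illustration.

width, height = 1920, 1080
fps = 60                       # the common worst case for displays
bytes_per_pixel = 3

pixels_per_frame = width * height                  # 2,073,600 (~2 million)
pixels_per_second = pixels_per_frame * fps         # ~124 million pixels/s
raw_bytes_per_second = pixels_per_second * bytes_per_pixel

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{pixels_per_second:,} pixels per second at {fps} fps")
print(f"~{raw_bytes_per_second / 1e6:.0f} MB/s of raw pixel data to push through")
```

That works out to roughly 373 MB/s of raw pixels, before you even get to scaling or de-interlacing, which is why a chip that does it well hasn't been cheap.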

You don't think these major manufacturers custom-make every microchip that goes into your display, do you? HECK NO! They buy in huge quantities from other companies that specialize in making chips specific to the task that needs to be performed. Video processing chips are one of those big areas - and guess what? Nobody makes a chip that does full 1080p on both ends... well, okay, now there are companies that do, and there are companies that did, but those chips weren't inexpensive.

THIS YEAR: With almost all of the new displays this year, we are finally seeing the ramp-up of 1080p processing chips that handle 1080p all the way through the system. The chips are fairly new, they are a bit more expensive than the old chips, and I don't think quality has been thoroughly reviewed on many products at this time. But we will see the newest Pioneer and Sony 1080p displays... all being rolled out as we speak... with full 1080p capability from beginning to end.

A bit of history that may put it into some perspective... About 8 to 10 years ago there wasn't much in the way of digital projectors. You had CRT units, and an image processor that would de-interlace (double) the image from 480i to 480p cost about $20,000. Nowadays, the processing on a single chip is far more capable than the $20K box of less than a decade ago, and it is basically a standard item on televisions and digital displays costing well under $1,000.

Pretty amazing when you think about it.

But, in a couple of years, when HD discs (hopefully) are all over the place, we should see that very few displays are incapable of accepting a 1080p source.

What I really am hoping for is displays that not only accept 1080p/24 but actually display it at 1080p/24 and don't convert it to 1080p/60.
 
BEACHDUD110

Enthusiast
A good test is to use your PC to test the TV: many models now come with a VGA port or DVI ports (you can probably use the HDMI port as well).

If your computer can support 1920x1200, try outputting it to the TV; many people have done that with the Westinghouse.

Hence the Westinghouse was called a 1080p LCD monitor rather than a "TV".
 
docferdie

Audioholic
Hi Ho said:
Is there some sort of technological hurdle? Does it take immense processing power?
I myself had been waiting for the Samsung 4696 or 5797, but yesterday my installer updated me that Samsung had recalled these models and cancelled pending distributor orders - something about chips failing after a few days. Hopefully the Sony Bravia 1080p TVs actually make it to the retail market.
 
Buckeyefan 1

Audioholic Ninja
BEACHDUD110 said:
A good test is to use your PC to test the TV: many models now come with a VGA port or DVI ports (you can probably use the HDMI port as well).

If your computer can support 1920x1200, try outputting it to the TV; many people have done that with the Westinghouse.

Hence the Westinghouse was called a 1080p LCD monitor rather than a "TV".
My computer has a setting for 1920x1440, and 2048x1536. What the heck is that? The second setting amounts to over 3.1 million pixels.

And BMX, why is 1080p at 24 frames per second "better" than 1080p at 60 frames per second? I thought the more fps, the better the moving image (assuming the media was recorded at 60 fps).
 
LEVESQUE

Junior Audioholic
I'm using a Gennum VXP scaler to a Sony Ruby, and with my Toshiba HD-A1 (HD-DVD player) and my Samsung BD-P1000 (Blu-ray), I get true 1080p. :D The Gennum chip is performing true inverse-telecine with 1080i signals, so the end result is 1080p.

There are not a lot of chips on the market right now performing true inverse telecine in film mode and per-pixel motion-adaptive de-interlacing in video mode: the Gennum VXP, the Realta HQV and Reon, and the new National Semiconductor chip. The top-end Lumagen is also doing it, but it's not up to the level of the others.

Almost all of the other video processors on the market are doing "bob" with HD signals and losing vertical resolution to interpolation.
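For anyone wondering what "bob" versus proper field weaving means in practice, here is a tiny illustrative sketch (not any vendor's algorithm, and it ignores cadence detection entirely): bob line-doubles a single 540-line field and throws away half the vertical detail, while inverse telecine, once it has matched the two fields that came from the same film frame, can weave them back into the full 1080-line progressive frame.

```python
import numpy as np

def bob_deinterlace(field):
    """'Bob': line-double a single field. Cheap, but only half the
    vertical resolution of the original progressive frame survives."""
    return np.repeat(field, 2, axis=0)

def weave_fields(top_field, bottom_field):
    """'Weave' (what inverse telecine does after matching the two fields
    from the same film frame): interleave the lines to rebuild the
    full-resolution progressive frame."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines came from the top field
    frame[1::2] = bottom_field   # odd lines came from the bottom field
    return frame

# Toy example: a 1080-line frame split into its two 540-line fields.
frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
top, bottom = frame[0::2], frame[1::2]

assert np.array_equal(weave_fields(top, bottom), frame)  # lossless rebuild
print(bob_deinterlace(top).shape)  # (1080, 1920), but detail is halved
```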
 
BostonMark

Audioholic
film

Buckeyefan 1 said:
My computer has a setting for 1920x1440, and 2048x1536. What the heck is that? The second setting amounts to over 3.1 million pixels.

And BMX, why is 1080p at 24 frames per second "better" than 1080p at 60 frames per second? I thought the more fps, the better the moving image (assuming the media was recorded at 60 fps).

I know I'm not BMX, but I am certain it is because FILM is shot at 24 frames per second.
 
BMXTRIX

Audioholic Warlord
BostonMark said:
I know I'm not BMX, but I am certain it is because FILM is shot at 24 frames per second.
Yuppers. It is just about reproducing the original as closely as possible. If you give me a 1080p/60 original source, then I definitely want to see it at 1080p/60 in the reproduction. But since film is (generally) not shot at anything other than 24 frames per second, that is the way I want to see it honestly and accurately reproduced. Zero judder, no repeated frames - just pure filmlike quality... at home... and BIG!
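To make the judder point concrete, here is a quick sketch of the usual 3:2 pulldown arithmetic (illustrative numbers only, not any particular display's processing): 60 / 24 = 2.5, so a 60 Hz display has to hold alternate film frames for 3 and then 2 refreshes, which makes on-screen frame times uneven. A display running at 24 Hz, or a clean multiple like 72 or 120 Hz, can give every frame the same amount of time.

```python
# 3:2 pulldown: how 24 fps film gets mapped onto a 60 Hz refresh.
def pulldown_32_hold_times_ms(num_frames, refresh_hz=60):
    """Milliseconds each film frame stays on screen under 3:2 pulldown."""
    return [(3 if i % 2 == 0 else 2) * 1000 / refresh_hz
            for i in range(num_frames)]

print(pulldown_32_hold_times_ms(4))
# Alternates ~50 ms and ~33 ms on screen -> uneven pacing, i.e. judder.

# Shown natively at 24 Hz (or a clean multiple like 72/120 Hz), every
# film frame gets the same ~41.7 ms, which is the "zero judder" case.
print(1000 / 24)
```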
 
