Shadowmonic

Banned
Hey, I'm picking out an LCD television sometime this year and I was wondering what Hz is, like 60Hz or 120Hz, and whether it's important.

Note: the television I want to get is 60Hz.
 
bhodge

Junior Audioholic
Hz represents the refresh rate, or how many times the picture is drawn per second. So 60Hz means your TV will redraw the screen 60 times a second.

Why is this important? Think of how a flip book works: a scene is drawn on each page, and the faster you flip the pages, the smoother the animation looks to your eye.

When will you notice the difference between refresh rates? The faster the motion, the more likely it is to look choppy or sluggish at a lower refresh rate, because there is more motion per frame. So the higher the refresh rate the TV can display, the smoother the picture will look. Because of how the technology works, LCDs have the lowest refresh rates; plasmas, by contrast, have a 600Hz refresh rate.

The real question is: will you notice or care? That I cannot answer for you. I'm a plasma guy through and through, so I haven't had to deal with it. The best thing is to find some movies with a lot of fast action, or sporting events, and judge for yourself whether the picture looks better or worse depending on the refresh rate. Some articles out there will tell you that a higher refresh rate doesn't always translate into better visual performance. There are several articles on this matter; just Google "lcd refresh rate". Here are a couple:
http://www.topreviewshop.com/lcd_refresh_rates_explained_240hz_vs_120hz_vs_60hz
http://www.lcdtvbuyingguide.com/lcdtv/120hz-240hz-60hz.html

So I will defer to people who have a lot of experience with LCDs, but in the end it's going to come down to how you view your TV, so the best advice is to demo it with a wide variety of things you like to watch.
 
highfigh

Seriously, I have no life.
Hey, I'm picking out an LCD television sometime this year and I was wondering what Hz is, like 60Hz or 120Hz, and whether it's important.

Note: the television I want to get is 60Hz.
Only important if you're looking at an LCD/LED TV. LED only refers to the backlighting or edge lighting; the actual image still comes from an LCD panel.
 
mtrycrafts

Seriously, I have no life.
Hey, I'm picking out an LCD television sometime this year and I was wondering what Hz is, like 60Hz or 120Hz, and whether it's important.

Note: the television I want to get is 60Hz.
One more thought to keep in mind: film in theaters, actual film, is shot at 24 frames per second. Live broadcasting is done at 60 per second. There is no new information added at higher refresh rates, but some TVs try to interpolate.
 
bhodge

Junior Audioholic
One more thought to keep in mind: film in theaters, actual film, is shot at 24 frames per second. Live broadcasting is done at 60 per second. There is no new information added at higher refresh rates, but some TVs try to interpolate.
This is a bit misleading. While it's true that higher refresh rates don't add "additional information," that does not mean there is no added benefit. On top of that, the technology behind the refresh rate is just as important as the numeric value itself.

While film is shot at 24 frames per second, that's not how it's displayed. Film projectors actually have a shutter that opens and closes several times per frame; as a result, the actual refresh rate is higher:
http://en.wikipedia.org/wiki/Refresh_rate

In addition, a TV displays content from a variety of sources, causing a mixture of frame rates, and the display must convert those frame rates into proper refresh rates. 24 frames per second does not go evenly into 60, which is what 3:2 pulldown technology is for. IMO the technology used to smooth the video is more important than the raw refresh rate, which is why reports disagree on whether a higher refresh rate is actually better.
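To make that arithmetic concrete, here is a minimal Python sketch of the standard 3:2 pulldown cadence; it's purely illustrative (the frame numbers and function name are made up for the example), not how any particular TV implements it.

Code:

# 3:2 pulldown: map 24 film frames onto 60 display refreshes per second.
# Alternating frames are held for 3 and 2 refresh periods, so each pair of
# film frames covers 5 periods and 12 pairs fill exactly 60 periods.

def pulldown_32(film_frames):
    """Expand a list of 24fps frames into a 60Hz display sequence."""
    display = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # the "3:2" cadence
        display.extend([frame] * repeats)
    return display

one_second_of_film = list(range(24))   # frame numbers 0..23
sequence = pulldown_32(one_second_of_film)
print(len(sequence))                   # 60 -> fits a 60Hz display exactly
print(sequence[:10])                   # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]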
 
bandphan

Banned
*cough*plasma*cough* not like the thread is going anywhere:rolleyes:
 
highfigh

Seriously, I have no life.
One more thought to keep in mind: film in theaters, actual film, is shot at 24 frames per second. Live broadcasting is done at 60 per second. There is no new information added at higher refresh rates, but some TVs try to interpolate.
Electronically produced video is 30 frames per second, not 60. It's 60 fields per second: 30 for luminance (the actual image) and chrominance (colors). 24 fps is from the original film rate and applies to anything larger than 16mm. The human eye is tricked into seeing a smooth image when the frame rate is over a certain number, although some people still see flickering. Frames cost money, so they went with the lowest number that looked good to most people. If they had found 30 fps to be better, it would have taken 25% more film to shoot and that not only takes more space for storage and camera size, it also means they would waste much more in the editing phase.

A refresh rate higher than 60Hz makes smooth motion possible when an object moves quickly across the screen.
 
BMXTRIX

Audioholic Warlord
You've kind of gotten answers, but not exactly.

60Hz/120Hz/240Hz are the output frame rates the televisions you are looking at are capable of displaying.

Why does it matter?

1. While most HD is presented at 720p/60Hz or 1080i/60Hz, televisions can have trouble properly showing the material, ESPECIALLY LCD displays, which can suffer from what is called image smearing. To get around this shortcoming of the display, manufacturers raise the refresh rate, which helps to significantly reduce (but not eliminate) it.

2. Perfect frames! Most movies are shot at 24 frames per second (which we could call Hz here). They are encoded onto things like Blu-ray Disc at 1080p/24Hz. 24 divides into 60... well, not evenly... 2.5 times. So video playback of film-based content can introduce a bit of stutter into smooth pans, called judder. 120Hz displays, on the other hand, divide evenly by 24 and typically (not always) repeat each frame 5 times in a process called 5:5 pulldown. This should eliminate judder (a quick sketch of the two cadences follows this list).

3. Frame interpolation. Forget 24Hz or 60Hz original material. Some TVs can now add extra frames to the original video to turn a source that was 24Hz or 60Hz into 120 distinct frames per second (Hz)! Or even 240!!! This effect looks horrible to some people while others absolutely love it. They call it very "video" looking, which is not at all accurate; I think it looks like a strobe light. It deblurs images, which is not natural to our normal eyesight, and adds extra frames. You tend to introduce artifacts and sharpness errors when using frame interpolation, but typically it can be turned off.
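Here's the sketch mentioned in point 2: a small Python comparison of how long each film frame stays on screen under 3:2 pulldown on a 60Hz panel versus 5:5 pulldown on a 120Hz panel. The cadences are the idealized textbook ones, not the behavior of any specific set.

Code:

# Per-frame hold times for a 24fps source. Uneven hold times (3/60 s vs
# 2/60 s) are what shows up as judder in slow pans; 5:5 pulldown on a
# 120Hz panel holds every frame for the same 5/120 s.

def hold_times(refresh_hz, cadence):
    """Seconds each successive film frame stays on screen for a repeat cadence."""
    return [repeats / refresh_hz for repeats in cadence]

print(hold_times(60, [3, 2, 3, 2]))    # [0.05, 0.033..., 0.05, 0.033...] -> uneven
print(hold_times(120, [5, 5, 5, 5]))   # [0.0416..., 0.0416..., ...] -> all equal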

That's about it for now. This year, as 3D televisions come to market, we will see plasmas and LCDs with a minimum 120Hz refresh rate that will also accept 120Hz sources, which is extremely rare for a display to do right now.

If I were to buy an LCD right now, I would look for a better model with 120Hz and adjustable/defeatable frame interpolation, so I could set it up the way I wanted.

If I weren't doing that, I would get a plasma instead, which will typically produce a better image for less money.
 
BMXTRIX

Audioholic Warlord
Electronically produced video is 30 frames per second, not 60.
Really? I thought most cameras shoot at 720p or 1080i, neither of which is 30 frames per second. I know a lot of TV shows are shot on film, which is typically done at 30 true fps (similar to movies at 24fps), then broadcast at 1080i/60 (most often).

It's 60 fields per second: 30 for luminance (the actual image) and chrominance (colors).
Huh? I've never heard this. I've always heard that it is either progressive (full frames) or interlaced (odd/even lines); I have never heard of a breakdown between chrominance and luminance, which would be the broadcast color space and would not be related to fields.

24 fps is from the original film rate and applies to anything larger than 16mm. The human eye is tricked into seeing a smooth image when the frame rate is over a certain number, although some people still see flickering. Frames cost money, so they went with the lowest number that looked good to most people. If they had found 30 fps to be better, it would have taken 25% more film to shoot and that not only takes more space for storage and camera size, it also means they would waste much more in the editing phase.
This is accurate, but it should be noted that the 'artists' who have owned the film industry rejected 30fps and 60fps trials because of how much better it made things look. 'Too real!' was the general statement; it took away from the artistry. I would think in the next generation or so we may see a shift towards digital 1080p/60 or higher resolutions, with higher frame rates becoming a standard. When film is 'free', there is no longer a limitation born of budgetary and forced artistic concerns. Though the art of 24fps can still be utilized by whoever so chooses.

A refresh rate higher than 60Hz makes smooth motion possible when an object moves quickly across the screen.
What's weird is that it rarely actually does. Our eyes expect blur in motion. When blur is removed, such as we get with frame interpolation turned on high, it typically looks inaccurate. Not like video, but like a video camera set to a 1/1000th of a second shutter speed. Proper 24/30/60fps image captures are exposed for as close as possible to their respective slice of a second (1/24th, 1/30th, 1/60th, etc.) with minimal lapse between. This produces very natural-looking video, but as motion increases, so does blur.

I'm not sure if I have seen 120fps video shot at a proper 120fps (1/120th of a second for each frame exposure) and shown as such, but I am sure it would look nothing like frame interpolation.

If you have information which contradicts what I've presented, please let me know as I wasn't putting it out there to call you out, but to try to figure out if there is something inaccurate about what I think I know. :D ;) :D I just want to be sure I have it right. Thanks.
 
highfigh

Seriously, I have no life.
Fields is video recording and frames is film. AFAIK, not much TV is shot on film but some movies are because it still saturates better and looks warmer than video. If you watch some movies, they add noise to make it look like film that's shown on a large screen; Transformers is a good example of this, even if the movies sucked. I think Quantum Of Solace has this, too.

Re: chrominance and luminance, videotape was here before digital video and that's where these come from. The signal still needs to conform to this in order for older displays to be able to use it. Digital or analog, the baseband video waveform to the TV needs to be the same. DVI/HDMI is another animal, and because it's digital, older TVs don't need to be considered for that format. Consider videotape: all VCRs have at least two heads on the head drum, one for luminance and one for chroma. The ones that recorded on 8-hour tapes had four, and this also allowed smooth fast forward because a head was always on the video tracks.

I never said blur doesn't appear if the frame rate is high enough; I said flickering. Back when film was first used for moving pictures, they worked with the format so it would look as good as they could make it, but they also needed to be able to afford it. Remember, this was about 125 years ago. Blur comes from motion that's significantly faster than the frame rate, so looking at that frame would show blur, too. Only high-speed video will capture clear images at most speeds for moving objects, but even at 10,000 frames/second, a bullet moving 2,700 ft/sec will be blurred if it's close enough to the lens.

Film, videotape or digital, there's no such thing as free recording media. The difference is that digital can be reused and moved around more easily.
 
James NM

Audioholic
Wow.

Turns out there's lots of useful info in this thread. Who'd u thunk it?

I guess someone should thank the OP. OTOH ...
 
BMXTRIX

Audioholic Warlord
Fields is video recording and frames is film.
Yes, but most video is shot as 60 interlaced fields per second, not 30fps, when video is used. That which is shot progressive at 720p is basically interchangeable with frames, I believe.
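If it helps, here is a toy Python sketch of that fields-versus-frames distinction; the line lists are made-up placeholders, not anything tied to a specific broadcast standard.

Code:

# Interlaced video delivers 60 fields per second, alternating odd-line and
# even-line fields. Weaving an odd field with the following even field
# rebuilds one complete frame, so 60i amounts to 30 full frames per second,
# while 720p60 simply sends 60 complete frames per second.

def weave(odd_field, even_field):
    """Interleave odd and even scan lines into one progressive frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)    # lines 1, 3, 5, ...
        frame.append(even_line)   # lines 2, 4, 6, ...
    return frame

odd = ["line 1", "line 3", "line 5"]
even = ["line 2", "line 4", "line 6"]
print(weave(odd, even))   # one full frame: ['line 1', 'line 2', ..., 'line 6']
print(60 // 2)            # 60 fields per second weave into 30 complete frames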

AFAIK, not much TV is shot on film but some movies are because it still saturates better and looks warmer than video.
Some prime-time stuff is shot on film. Friends, for example, was shot on film, and when that was done it was done at 30fps, then broadcast at 1080i/60. I don't have a list of how many are, because many are not, but I believe the bigger ones are still shot on film.

If you watch some movies, they add noise to make it look like film that's shown on a large screen- Transformers is a good example of this, even if the movies sucked. I think Quantum Of Solace has this, too.
Yes, and we are likely to see more and more moving forward. Eventually the pseudo grain may be dropped as people get used to seeing digital naturally.

Re: chrominance and luminance, videotape was here before digital video and that's where these come from.
Yes, but this has nothing to do with fields or frame rate. S-video separates the two, composite layers them, and component breaks it out further, but none of this deals with Hz or frame rates in the least, which is what I was picking up from your prior post but not from this one.

I never said blur doesn't appear if the frame rate is high enough; I said flickering. Back when film was first used for moving pictures, they worked with the format so it would look as good as they could make it, but they also needed to be able to afford it. Remember, this was about 125 years ago. Blur comes from motion that's significantly faster than the frame rate, so looking at that frame would show blur, too. Only high-speed video will capture clear images at most speeds for moving objects, but even at 10,000 frames/second, a bullet moving 2,700 ft/sec will be blurred if it's close enough to the lens.
My point being that if you move your hand in front of your face, you don't see a stop-motion hand. You see a blurry hand. Our eyes do NOT see frames; they expect to see blur. Only this crappy frame interpolation stuff actually removes blur and artificially increases shutter speed through deblurring techniques that look incredibly unnatural. Some people like it, but it is a long way from natural. They say it looks like 'video'. I say it looks like a video game, before they added blur to video games to make them look better. Blur is supposed to be a part of every single frame shot if there is any motion involved. Always will be.

Film, videotape or digital, there's no such thing as free recording media. The difference is that digital can be reused and moved around more easily.
The cost differential between digital and analog is tremendous. Phenomenally different, and dropping rapidly every year. The buy-in cost is huge, but once again, dropping all the time. Analog film is likely to be nearly gone within a generation, I expect. Our kids' kids won't know what film projectors are.
 
highfigh

Seriously, I have no life.
Re: moving your hand in front of your face: in natural light you won't see stop motion, but with lighting produced by AC voltage, you do. Our electrical system uses 60Hz and lights flicker at that rate.

The whole issue is a matter of synchronization and of converting 24/30 frames to fit a certain number of lines on a TV screen. NTSC color TV was the worst of the 3 main systems (NTSC, PAL and SECAM) because we already had an existing B&W system and the TVs in use before color came out couldn't be made obsolete, so they kept the same number of lines, unlike the rest of the world. If all lighting used DC voltage, it wouldn't matter as much where lighting is concerned.
 