Posted (edited)
Hi guys, can you compare 120Hz and 100Hz full HD LCD TVs?

And how much do they cost?

I believe 120Hz only applies to countries like the US, where their power supply frequency is 60Hz.

Our power supply frequency is 50Hz, so we can double to 100Hz.

Somebody spank me if I'm wrong. :mellow:

Edited by fossicker
Posted
I believe 120Hz only applies to countries like the US, where their power supply frequency is 60Hz.

Our power supply frequency is 50Hz, so we can double to 100Hz.

Somebody spank me if I'm wrong. :mellow:

Yeah... I don't think it really depends so much on the power supply any more. For example, all HDTVs here can show signals at both 50Hz and 60Hz. And some in the US can show 50Hz, though many only 60Hz.

It may be that 100Hz is easier to do with a 50Hz power supply than 120Hz is, though I would have thought that if you can do 60Hz you can do 120Hz.

120Hz is useful because it can display a 24fps signal cleanly by showing each frame 5 times. 100Hz is not really useful for that, though some 100Hz sets also do 96Hz, which is useful for the same reason. I think that some sets marketed as 100Hz here can also do 120Hz.
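
To put numbers on that, here's a quick illustrative Python sketch of the divisibility, nothing more:

    # Which common panel refresh rates can show 24fps film cleanly,
    # i.e. by repeating each frame a whole number of times?
    FILM_FPS = 24

    for refresh in (50, 60, 96, 100, 120):
        if refresh % FILM_FPS == 0:
            print(f"{refresh}Hz: clean, each frame shown {refresh // FILM_FPS} times")
        else:
            print(f"{refresh}Hz: not a whole multiple of 24, needs an uneven cadence")

Running it shows 96Hz and 120Hz land exactly on whole multiples of 24, while 50, 60 and 100Hz do not.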

Posted

For some reason I thought that TVs like the Sony XBRs, which are 100Hz here and 120Hz in the US, actually both do 120Hz, and the marketing just targets 100Hz for this region. The XBR is a 110-240V TV, presumably 50-60Hz, and that would correspond to 100-120Hz.

Posted

All,

This topic has recently been covered.

All of these TVs do the same thing. They show each frame 4 times.

So if playing a Blu-ray disc at 24 frame/s (24p), the display will show 96 images/second.

If watching Australian TV, there are 25 frame/s transmitted as 50 field/s, and the display will show 100 images/s (100 Hz).

If watching US TV, there are 30 frame/s and the display will show 120 images/second.

So the input frame rate is multiplied by 4!

AlanH
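
Alan's multiply-by-4 rule, as a quick illustrative Python sketch (the source labels are just the three cases from his post):

    # The rule from the post above: the display repeats every incoming
    # frame 4 times, so refresh rate = source frame rate x 4.
    REPEAT = 4

    for source, fps in (("Blu-ray 24p", 24),
                        ("Australian TV (50 field/s)", 25),
                        ("US TV", 30)):
        print(f"{source}: {fps} frame/s x {REPEAT} = {fps * REPEAT} images/s")

This prints 96, 100 and 120 images/s respectively, matching the figures in the post.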

Posted
All,

This topic has recently been covered.

All of these TVs do the same thing. They show each frame 4 times.

So if playing a Blu-ray disc at 24 frame/s (24p), the display will show 96 images/second.

If watching Australian TV, there are 25 frame/s transmitted as 50 field/s, and the display will show 100 images/s (100 Hz).

If watching US TV, there are 30 frame/s and the display will show 120 images/second.

So the input frame rate is multiplied by 4!

AlanH

So in your opinion Alan, is there any point in upgrading from a 100Hz TV to a 120Hz... or is there no noticeable difference?

Posted
So in your opinion Alan, is there any point in upgrading from a 100Hz TV to a 120Hz... or is there no noticeable difference?

Going by what Alan just said, we don't get 120Hz here, so you don't have that option.

Posted

D3CID3R,

25 does not divide into 120 a whole number of times, so if you were to repeat each frame 5 times the refresh rate would be 125 Hz.

This is why the maximum whole-number repeat is 4.

Any TVs advertised at 120 Hz are using American/Japanese copy and not copy for the rest of the world, including Australia.

So in short, no TV will run at 120 Hz in Australia unless it is playing Blu-ray at the wrong frame rate or Region 1 SD DVDs.

The difference between 100 & 120 should not be worried about anyway. (100/30 is not a whole number either)

AlanH
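
The divisibility argument above, sketched in Python (purely the multiples, nothing TV-specific):

    # Why 4x is the ceiling: a 5x repeat of 25 frame/s would need a
    # 125Hz panel, so 4x is the largest repeat that suits 24, 25 and
    # 30 frame/s sources alike.
    for repeat in (4, 5):
        rates = {fps: fps * repeat for fps in (24, 25, 30)}
        print(f"{repeat}x repeat -> {rates}")

At 4x all three sources stay at or below 120 Hz; at 5x the 25 and 30 frame/s cases overshoot to 125 and 150 Hz.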

Posted
Hi guys, can you compare 120Hz and 100Hz full HD LCD TVs?

And how much do they cost?

I was assuming he was talking about 'LCD 100Hz' vs 'LCD 120Hz'

???

Posted

Hi, I am talking about 100Hz vs 120Hz: any differences? Which would be better, etc.

But there have been good responses, which has been useful.

So is it a waste of time getting a 120Hz? Or would the picture still be better than the 100Hz?

I was assuming he was talking about 'LCD 100Hz' vs 'LCD 120Hz'

???

Posted

Both,

If you buy a set which boasts 120 Hz, that is what it will do in the USA, Canada and Japan. If it operates here it will automatically run at 100 Hz.

So any 120 Hz set is incorrectly labelled for use here.

The picture difference between the two is tiny. A very slight decrease in flicker at high brightness.

AlanH

Posted

OK, so I guess I should be looking for 100Hz and ignore the 120Hz, or does it really not matter?

Both,

If you buy a set which boasts 120 Hz, that is what it will do in the USA, Canada and Japan. If it operates here it will automatically run at 100 Hz.

So any 120 Hz set is incorrectly labelled for use here.

The picture difference between the two is tiny. A very slight decrease in flicker at high brightness.

AlanH

Posted

Like Alan said, I'm pretty sure all the TVs do a 4x frame repeat, and work out the difference between the frames to produce a smooth-looking picture, though of course a faked one.

I'm also pretty sure a 100Hz TV will play at 120Hz if you feed it an NTSC DVD or a game console at 60Hz (although the difference won't be seen, because a console can offer a full 60fps/Hz signal).

So in terms of a 120Hz and a 100Hz set, I'm pretty sure the TVs are almost identical in their processing; they just have either 120Hz firmware and labels/packaging etc., or 100Hz firmware and labels etc.

There won't be any difference between the TVs; the Hz rating is purely there to target the market the TV is in: PAL 50Hz gets a 100Hz TV, NTSC 60Hz gets a 120Hz TV.

Posted

For me, the issue is still whether the set correctly displays a 24fps signal at a multiple of 24Hz without frame interpolation. You can't assume that all 100Hz/120Hz sets will do this. There are still displays out there that will take a 24fps signal, apply 3:2 pulldown to get 60Hz and then double that to get to 120Hz. That's not as good as repeating each frame 4 times to get 96Hz or 5 times to get 120Hz. It's always worth checking the specs and reviews.
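
To make the cadence difference concrete, here's an illustrative Python sketch (the cadences are just the arithmetic described above, not any particular set's behaviour):

    # How many 120Hz refreshes each successive film frame occupies:
    #  - 3:2 pulldown to 60Hz, then doubled -> a 6:4 cadence
    #  - a plain 5:5 repeat -> every frame held equally long
    def cadence(pattern, frames=6):
        return [pattern[i % len(pattern)] for i in range(frames)]

    print("3:2 doubled:", cadence([6, 4]))  # uneven hold times -> judder
    print("5:5 repeat: ", cadence([5, 5]))  # even hold times -> smooth

Both cadences average out to 24fps (10 refreshes per 2 film frames), but only the 5:5 version holds every frame for the same length of time.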

Posted

Petercat,

To put in extra memory for storing another frame is not worth it. If the receiver is advertised as accepting 24p and you feed it via HDMI, the data will tell the TV that the source is 24p, and it will continue to repeat each frame 4 times, thus giving 96 images/s. This assumes the Blu-ray player has been set to output 24p.

The HDMI standard for interconnection carries the data about the number of pixels, interlaced/progressive and the frame rate.

Which brands and models upconvert to 30 frame/s? This is a lot more effort within the design and operation of the display, for a worse result. The best test is to watch the titles scrolling up the screen.

Specs and reviews are not always accurate.

AlanH
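
For reference, the frame rate data Alan mentions travels in the HDMI AVI InfoFrame as a CEA-861 Video Identification Code (VIC). Here is a minimal Python lookup for the common 1080-line codes; the values are quoted from memory of the CEA-861 tables, so verify them before relying on them:

    # A few CEA-861 VICs for 1920x1080, as signalled over HDMI.
    # (Quoted from memory of the spec; check before relying on them.)
    VIC_1080 = {
        5:  "1080i 59.94/60Hz",
        16: "1080p 59.94/60Hz",
        20: "1080i 50Hz",
        31: "1080p 50Hz",
        32: "1080p 23.98/24Hz",
        33: "1080p 25Hz",
        34: "1080p 29.97/30Hz",
    }

    # A player set to 24p output would signal VIC 32:
    print(VIC_1080.get(32, "unknown"))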

Posted

I know it makes more sense to just do a 4:4 or 5:5 pulldown, but if you already have the circuitry to do a 3:2 pulldown (as is present in all existing 60Hz TVs) then it may seem simpler just to use what you've got and double the refresh rate, and I gather that's what some manufacturers did, especially in the early days of 100/120Hz. There are a couple of long threads on the AVSForums about it.

Also, some of the manufacturers seemed to assume that the whole point of 100/120Hz was to interpolate generated frames and wouldn't let you turn that option off if you wanted 100/120Hz mode. The first Samsung 100/120Hz sets had this problem, though it's since been corrected in firmware. Things are getting better as manufacturers realise that there is value in unadulterated 4:4 or 5:5 pulldown, but I still think it's worth checking.

I don't think a set being advertised as 100Hz is enough to guarantee that. Although if you buy a new 100Hz set today it'll almost certainly do the right thing.

Posted
Peter,

The conversion circuitry you talk of is in the player not the TV.

AlanH

Historically that's true, but nowadays a TV that accepts a 24fps signal but only displays at 60Hz must have some conversion circuitry in it as well, no? I'm not saying it does any good as opposed to the player doing the conversion, but there are a lot of those TVs out there. The market evolution for TVs dealing with HDMI seems to have been from (1) only accept a 60i/60p source, to (2) accept a 24p source and convert to 60Hz for display, to (3) accept a 24p source and display at 48/72/96/120Hz.

Posted

Peter,

The whole idea of replaying at 24p is to match the frame rate to the speed used in the film camera. If the display converts the image to 30 frame/s then the motion becomes jerky or blurred. There has never been a need for a display to convert to 30 frame/s. If the display is fed an NTSC or an ATSC digital signal then it will show each frame 4 times. No conversion. If the signal source is PAL/SECAM or DVB then the frame rate is 25 frame/s and it will show each frame 4 times. No conversion. If the player outputs its signal at 24 frame/s it will be shown 4 times. No conversion.

This produces the best illusion of motion. Why would a manufacturer add extra circuitry to upconvert 24p when the result is worse and it is not required for any other signal? As I said, this 24p upconversion is only required for the NTSC analog TV output of the player. Region 4 SD DVD players output at 25 frame/s.

As I said, HDMI identifies the frame rate of the source. HDMI is also a two-way connection, so if the display cannot display that signal it can tell the source to change to another standard.

To conclude, TVs which accept 24p show it at either 48 or 96 images/s and no other rates!

AlanH

Posted (edited)
... The market evolution for TVs dealing with HDMI seems to have been from (1) only accept a 60i/60p source, to (2) accept a 24p source and convert to 60Hz for display, to (3) accept a 24p source and display at 48/72/96/120Hz.
Yes, I have read of situation '2': TVs that accepted a 24p source only to perform 3:2 pulldown to suit the 60fps refresh rate of the screen. This was in response to a market that perceived the capacity to accept a 24fps signal as an indicator of superior performance. Implementing 3:2 pulldown was far cheaper and easier than redesigning the set for true 96fps or 120fps performance.

I am just guessing, but I suspect that once the refresh rate gets up above 90fps it will not be all that noticeable if there is a slight irregularity in the conversion from 24fps.

For a start, most Blu-ray and HD-DVD recordings, although loosely called '24p', are actually 23.976fps, but 24.0fps is possible in the specifications, so a TV would have to be able to cope with both. I suspect the display is kept rock steady (crystal oscillator controlled) at 96fps, 100Hz, 120Hz or whatever nominal rate is chosen, and no attempt is made to synchronise it precisely to a multiple of the incoming signal frame rate, in the way the scanning of analogue television sets of old was locked to the input. Rather, if there were a slight lag in the input timing, then every few seconds the conversion might create, say, 5 output frames from the next input frame rather than 4 output frames. Conversely, if the display were slightly slow, relative to the multiple of the input signal frame rate, one output frame could be dropped every few seconds.

I'd be surprised if a human observer could detect such a slight change in the cadence. In other words:

4444444444444444444444444444444444444444445444444444444444444444444444444444444

and

4444444444444444444444444444444444444444444444444444444444444444444444444444444

and

4444444444444444444444444444444444444444443444444444444444444444444444444444444

would all look uniformly smooth, subject of course to the inherent jerkiness of film of only 24 frames per second!
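
A rough Python sketch of the numbers behind this guess, using hypothetical figures: a panel free-running at exactly 96.000Hz fed the usual 23.976fps '24p':

    # Hypothetical: free-running 96.000Hz panel vs. a ~23.976fps source.
    PANEL_HZ = 96.000
    SOURCE_FPS = 24 * 1000 / 1001      # ~23.976fps, the usual '24p'
    NEEDED_HZ = 4 * SOURCE_FPS         # ~95.904Hz for an exact 4x repeat

    surplus = PANEL_HZ - NEEDED_HZ     # extra refreshes per second
    print(f"Surplus: {surplus:.3f} refreshes/s")
    print(f"One 5x frame needed roughly every {1 / surplus:.1f} s")

On those assumed figures, one extra repeat about every ten seconds would keep the display in step.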

Manufacturers have released little or no detail to the public as to just how the frames are managed.

Edited by MLXXX
Posted

All,

All HD DTV standards are derived from 74.25 MHz.

To display full HD at 24p, dividing 74.25 MHz by 2750 gives a line frequency of 27.000 kHz.

To display full HD at 25p, dividing 74.25 MHz by 2640 gives a line frequency of 28.125 kHz.

To display full HD at 30p, dividing 74.25 MHz by 2200 gives a line frequency of 33.75 kHz.

The only modification required in the display is a change to the division ratio above.

If you divide these line frequencies by 24, 25 and 30 respectively, in each case the result is 1125. All 1920 x 1080 frames have 45 lines which do not contain picture information.

All other signal characteristics are identical.

The only thing the manufacturer needs to do is to recognise the frame rate information from the HDMI signal and use it to change the divide ratio mentioned above. This only requires a tiny piece of code in the firmware controlling the display.

AlanH
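
Alan's divisions can be checked in a few lines of Python (an illustrative re-calculation of the figures above):

    # Reproduce the division chain: 74.25 MHz pixel clock -> line
    # frequency -> total lines per frame (always 1125).
    PIXEL_CLOCK = 74.25e6  # Hz

    for fps, divisor in ((24, 2750), (25, 2640), (30, 2200)):
        line_freq = PIXEL_CLOCK / divisor      # lines per second
        print(f"{fps}p: {line_freq / 1000:.3f} kHz line rate, "
              f"{line_freq / fps:.0f} lines per frame")

All three cases come out at 1125 lines per frame, as the post says.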

Posted (edited)

As a matter of detail, I'd mention that by far the most common frame rate created by a Blu-ray player when it is set for nominal '24p' playback is not 24.000 fps but 24*1000/1001 = 23.976 fps (a vestige of the NTSC standard); and that will slightly affect the figures Alan has provided immediately above.

A point I raised in my post is whether the circuitry in a high frame rate TV is likely to use a continual phase-locked loop arrangement to lock onto the incoming frame rate, or whether the display refresh rate frequency synthesis is absolutely independent of the phase of the input, with accumulated drift in the refresh rate of the display (relative to the required multiple of the input frame rate) being cancelled by occasionally adding or dropping an output frame.

This type of detail does not come up in the broad descriptions manufacturers provide for their high refresh rate, 24p-capable TV sets.

Another situation where this issue arises, of which device is dictating the screen refresh rate and phase, the display or the signal, is using a PC desktop to display a live digital TV broadcast. I think that if the PC graphics card normally runs at, say, 49.999Hz and a particular television broadcaster is providing 50.0000001Hz, the PC rate prevails. Others may have more specific comment on how slight refresh rate discrepancies are dealt with when using a PC to display a TV broadcast where the PC desktop is set to the same nominal frame rate as the TV transmission.
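
If such a mismatch is handled by occasionally dropping or repeating a frame, the interval between corrections is just the reciprocal of the rate difference; a Python sketch with the hypothetical figures above:

    # Interval between dropped/repeated frames when a display free-runs
    # slightly off the source rate (hypothetical figures from above).
    def correction_interval(display_hz, source_hz):
        return 1.0 / abs(display_hz - source_hz)

    print(f"{correction_interval(49.999, 50.0000001):.0f} s between corrections")

With those assumed rates, one frame would be dropped roughly every 1000 seconds.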

Edited by MLXXX
Posted

MLXXX,

The 74.25 MHz is the pixel clock rate for uncompressed HD digital video. This oscillator is phase-locked to the signal.

So to run at 23.97602398 frame/s the pixel clock rate becomes 74.17582418 MHz, which is 0.099% lower than 74.25 MHz. This is well within the lock range of the phase-locked loop.

All divide ratios are the same as for 24p.

So the result on the display is a refresh rate of 95.9040959 images/s. This difference is not visible.

There is no frame rate lock required. The image has to be stored in memory. The video contains the start-of-line and start-of-frame codes, which will reset the write clocks at the start of each line and field respectively. The frequency at which they operate is derived from the pixel clock above.

There are no dropped or added pixels. The standards do vary the number of blanked pixels outside the picture area. These are not stored in the picture memory.

The read clock must be a whole-number division of the pixel clock, otherwise the memory will overflow or underflow. In the examples we have been using a 4 times refresh rate. The pixel clock is divided by 687.5 for 23.98 or 24 frame/s, 660 for 25 frame/s or 550 for 29.94/30 frame/s.

So, as I said, there is no point in converting 23.98 to 29.94 frame/s in the display unless you wish to produce NTSC.

As for graphics cards, the pixel clock must lock to a whole multiple of the incoming pixel clock rate, otherwise the picture will roll horizontally and vertically.

AlanH
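
The read-clock divisors Alan quotes also line up with 1125 total lines at a 4x refresh; an illustrative check in Python:

    # Verify: pixel clock / divisor should equal the line rate at a
    # 4x refresh, i.e. (4 * fps) * 1125 lines per frame.
    PIXEL_CLOCK = 74.25e6  # Hz (nominal; ~74.176 MHz for the 23.98 case)
    LINES_PER_FRAME = 1125

    for fps, divisor in ((24, 687.5), (25, 660), (30, 550)):
        line_rate = PIXEL_CLOCK / divisor
        expected = 4 * fps * LINES_PER_FRAME
        print(f"{fps} frame/s: {line_rate:.0f} lines/s, expected {expected}")

All three divisors reproduce the line rate for a 4x refresh exactly (108,000, 112,500 and 135,000 lines/s).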
