24 fps or off?


I have a Samsung 8500 4K player dedicated to an LG 6150 4K TV.  Should I set it to 24 fps or turn it off?
jpainter236

Well, if you are thinking of modern display devices, they will usually up-convert the frame rate to their native refresh rate. Good display devices have a native refresh rate of 240 Hz, which is an even multiple of both 24 fps and 30/60 fps signals. A 120 Hz panel also divides evenly into 24 fps, but a 60 Hz panel does not, which is not the best for movies. Older CRT projectors could run at a native 48 Hz, but nothing much lower.

The 24 fps setting keeps the natural frame rate of the movie. Otherwise, you're looking at up-converting it to 30 or 60 fps, which can cause motion artifacts (judder, etc.).

"24fps seems pretty low. Are you saying a TV that is able to display Blu Ray quality resolution may have trouble with 24fps?"

Yes, you are confusing HT (home theater) with computer graphics.  Most (basically all) films are shot at 24 fps.  To display these movies perfectly, both the source and the display need to be capable of, and set to, true 24 fps.

auxinput had a very good explanation of the technical reasons underlying this issue; much more information can be found online, of course.


24p support has to do with preserving the source material of film, as most films are shot at 24 fps. Depending on the display in question (LCD, plasma, OLED, etc.), that signal will either be left at 24 fps, be multiplied evenly by 2-3x (48 or 72 Hz, e.g. the Advance mode on the Pioneer Kuros), or be converted to 30 or 60 fps (or multiples thereof in the case of LCD displays, i.e. 120-240 Hz) using pulldown/frame-interpolation techniques, which can cause everything from excessive judder to the dreaded "soap opera effect" common to LCD/LED displays.

If your source player and TV support true 24p, leave it on (or set it to auto so that it engages for film content and disengages for video content). Hope this helps!
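To make the pulldown point concrete, here is a small sketch (my own illustration, not anything from a specific player or TV) of how many refresh cycles each film frame gets held on screen. A display can only show whole refresh cycles, so when the refresh rate is not an even multiple of 24, the hold times come out uneven, and that uneven cadence is the judder people complain about:

```python
import math

def pulldown_cadence(film_fps, display_hz, frames=4):
    """Return how many refresh ticks each of the first `frames`
    film frames is held on screen at the given refresh rate."""
    per_frame = display_hz / film_fps      # e.g. 60 / 24 = 2.5 ticks per frame
    cadence, shown = [], 0
    for i in range(1, frames + 1):
        # Cumulative ticks after frame i, snapped to the nearest whole tick
        target = math.floor(i * per_frame + 0.5)
        cadence.append(target - shown)
        shown = target
    return cadence

print(pulldown_cadence(24, 60))   # -> [3, 2, 3, 2]  classic 3:2 pulldown, uneven
print(pulldown_cadence(24, 120))  # -> [5, 5, 5, 5]  even cadence, no pulldown judder
```

On a 60 Hz panel the frames alternate between 3-tick and 2-tick holds (3:2 pulldown), while 120 Hz and 240 Hz divide 24 evenly, which is why those refresh rates can show film content without cadence judder.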

-David