Manufacturers' chosen screen resolutions and their effect on picture quality

dehlbtel

Inactive User
Joined
Dec 18, 2005
Messages
10
Reaction score
1
Following on from Hamba's thread on the poor quality of LCD/plasma screens with analogue sources (http://www.digitalworldz.co.uk/forums/showthread.php?t=113961), I would like to raise some points about the quality achievable with HD material (for which HD-Ready sets were designed -- you would have thought!!)

Resolutions currently being transmitted in digital format (in the UK):
768 x 576 (SD 4:3)
1024 x 576 (SD widescreen)
1280 x 720 (HD 720p)
1920 x 1080 (HD 1080i, future HD 1080p)

"True HD" screens are now becoming available (with an appropriate price tag) with a native resolution of 1920 x 1080 -- an obvious resolution to manufacture as the broadcast 1080i (and for some 1080p) will map without any need for scaling.

Let's be very clear about this: scaling leads to a dramatic reduction in picture quality -- a lowering of effective resolution and horrible "aliasing" effects (sloping lines look jagged). The electronics then use some "clever wizardry" to minimise how much our brains notice these jaggies by softening the focus, blurring the pixels and reducing the resolution further (anti-aliasing). The broadcasters have gone to the trouble of putting in extra resolution just for our TVs to remove it for us :) If you don't believe this is significant, get hold of some 720p content and display it in a 1280x720 window on a PC, then in a 1366x768 window (on a display that will go to 1920x1200). It is even more obvious thrown up on a projector (of the appropriate resolution), where you can get really close and "see" the pixel structure.
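The non-integer scale factor is the root of the jaggies. A quick sketch (my own illustration, not from the post) of nearest-neighbour row mapping from 720 source lines onto a 768-line panel shows the unevenness: because 768/720 = 16/15, one panel row in every sixteen repeats a source row.

```python
# Nearest-neighbour row mapping from 720 source lines to a 768-line panel.
SRC_ROWS, PANEL_ROWS = 720, 768

# Which source row each panel row samples from.
mapping = [y * SRC_ROWS // PANEL_ROWS for y in range(PANEL_ROWS)]

# Every source row is used, but 48 panel rows repeat their neighbour --
# one in every 16 -- which is what puts uneven "steps" into sloping lines.
repeats = PANEL_ROWS - len(set(mapping))
print(repeats)                # 48
print(PANEL_ROWS // repeats)  # 16
```

A real scaler interpolates rather than duplicating rows, but the 16/15 periodicity (and the blur needed to hide it) is the same.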

The obvious resolution for the cheaper HD displays would therefore be 1280 x 720, natively mapping 720p and accepting that 1080i will need scaling. How many manufacturers produce a 1280 x 720 panel? N O N E . They all make 1366 x 768 (for LCDs; plasma uses "non-square" pixels and even more electronics). The story is that this was a "standard" resolution for PC monitors, so the first panels were produced at this res to save retooling the production line and to keep the panels dual-purpose, giving economies of scale and price/profit benefits. But the TV production lines now have enough volume to stand on their own, so why do they not switch? OK, retooling a line is expensive, but the "obvious" size has about 12% fewer pixels (put another way, 1366 x 768 carries nearly 14% more), meaning a cheaper panel with higher yields as well as the leap in visual quality.
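The pixel-count saving is easy to check (my arithmetic, not the original post's):

```python
# Pixel counts for the two candidate panel sizes.
native_720p = 1280 * 720   # 921,600 pixels
panel_1366  = 1366 * 768   # 1,049,088 pixels

extra = panel_1366 - native_720p
print(round(100 * extra / native_720p, 1))  # 13.8 -- 1366x768 has ~14% more pixels
print(round(100 * extra / panel_1366, 1))   # 12.2 -- 1280x720 has ~12% fewer
```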

The fact that 720 is fairly close to 768 actually makes the aliasing worse, not better: the "steps" in sloping lines are further apart, so they are harder to mask and the masking is intrusive over a larger area. In fact, the picture would look significantly better shown unscaled with a black border (24 pixels spare top and bottom, 43 spare left and right). Apparently this is not acceptable, because the customer has bought a 32-inch set (or whatever size) and expects all 32 inches to be used to display a picture. The average customer will notice the black border but will not notice the reduction in quality, having never had anything without the effect to compare it with.

I may be very sensitive to, and critical of, this quality reduction because of my background and training, but given that we have the hi-def standards, the customer has the right to expect to actually see the quality the standard can deliver. What can be done to get the message across to the manufacturers that we do actually want non-scaled HD panels?

Yet again the customer loses out in the quality stakes (after the over-compression of digital signals giving poorer pictures than analogue broadcasts -- don't get me started on that one!!).
 

pinkhelmets

Inactive User
Joined
Aug 10, 2001
Messages
5,187
Reaction score
174
Location
essex colchester-ipswich
I don't agree that the screen manufacturers should change; it should be the broadcasters and the actual 'HD standard' that change, and it's been an argument right from the beginning, many years back.
Media centre and PC influence is the future and I'm 100% behind it. Even Sky/BT/Virgin/Microsoft and many more large companies are changing and agreeing this is the eventual way forward. I really like using the 1366 x 768 resolution on my fully functional media centre, which uses the PC-standard resolution. PC pictures have never been so good and the functionality is now the ultimate, so why should I have to accept a reduced resolution, no matter how slight? Gaming machines, home computers and other screen entertainment are just as important as TV, so why should everything now change just because the TV broadcast industry fecked up?

The PC-standard resolution is correct IMHO and always has been, and that's why it has the backing of every screen manufacturer. It is the crappy 'HD Ready' standard that was set and agreed that is totally incorrect. It should be called the "HD-consumer-rip-off standard", and the true HD standard of 1920 x 1080 should be the only correct way forward for hi-def. In the meantime, come up with a new term for the popular 'PC format' we all generally use (1366 x 768), then get that adopted as the widely accepted industry mid-standard. Broadcasts could then be made to this new 'mid-standard', everyone would understand it more easily, so no more consumer confusion and rip-off, and the whole industry would have definite lines to agree on and follow.

It's a feckin' mess, but it's all about money so nobody will change anything. You may be very critical of this subject, but in my view you blame the wrong people and sit on the wrong side of the fence to reality. We don't need to get the message across to the manufacturers; we need to get it across to the broadcast industry and to the public who blame the confusion on the screen companies. PCs, media centres, home hubs, and equipment networking & integration have more influence on our future than an outdated 720-pixel 'HD Ready' rip-off format that should never have existed.
Regards.
 