It’s funny when I think about the world my children live in – a world of games consoles, tablet computers, super-powerful desktop and laptop PCs, and high definition TVs – and compare it to the world of my own childhood, where games consoles had blocky graphics and poor sound effects, tablet computers were the stuff of science fiction, a ‘powerful’ computer would take up a whole room, and the television was a bulky box with, now that I think about it, pretty poor screen resolution.
Yes, that makes me sound like an old man, but I’m only 35… it’s just that I think technology has moved on with increasing speed in recent years.
Take televisions, for example. I only recently upgraded my DVD player to a Blu-ray device, and I’m impressed by the difference it makes when watching on a 1080i television. Details can be seen that were just blurs before. A friend once told me, “HD isn’t worth it, you can’t see the difference”, but – and I’m sure I’m not just trying to convince myself – I think it makes a real difference.
So imagine the difference pushing the resolution of the screen even higher would make. A 1080p television screen runs at a resolution of 1920 x 1080 pixels, but Sony has recently started producing a range of TVs at 3840 x 2160 pixels, which it calls 4K TVs.
The 4K TV doubles the resolution of the 1080p standard in each dimension, which means four times as many pixels overall, and I find myself wondering what difference it will make. Part of me says, “1080p is already incredibly sharp” and that an even higher resolution just isn’t needed. But another part of me recalls that I thought the telly of my childhood was pretty sharp too, and now it just looks fuzzy compared to HD. So, will Ultra HD make me think the same about today’s HD TVs? It’s hard to say without actually sitting down with a 4K TV for an extended period of time, but I have to think it will make some difference.
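The arithmetic behind that comparison is simple enough to check for yourself – a quick sketch, using only the published resolutions mentioned above:

```python
# Pixel counts for Full HD (1080p) vs Ultra HD ("4K")
full_hd = 1920 * 1080   # 2,073,600 pixels
ultra_hd = 3840 * 2160  # 8,294,400 pixels

# Doubling the resolution in each dimension quadruples the pixel count
print(ultra_hd / full_hd)  # 4.0
```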
The tricky part, I guess, will be finding the extra bandwidth to enable Ultra HD broadcasts, as a quadrupling in the number of pixels displayed brings a corresponding increase in the amount of information that must be transmitted. Sure, video compression will help alleviate that a little, but when we’re talking about super high resolutions you don’t want to compress the video stream too much, otherwise viewers will start to see compression artifacts like blocky areas or ghosting.
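To get a feel for the scale of the problem, here’s a rough back-of-the-envelope estimate of the uncompressed bandwidth involved. The figures of 24 bits per pixel (8-bit RGB) and 60 frames per second are my own illustrative assumptions, not broadcast specifications:

```python
def raw_bitrate_gbps(width, height, bits_per_pixel=24, fps=60):
    """Uncompressed video bitrate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# Full HD: roughly 3 Gbps uncompressed
print(raw_bitrate_gbps(1920, 1080))
# Ultra HD: roughly 12 Gbps uncompressed -- four times as much
print(raw_bitrate_gbps(3840, 2160))
```

Real broadcasts compress this down by a couple of orders of magnitude, but the fourfold gap between the two formats survives compression, which is exactly the bandwidth problem described above.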
The only question I have now is, how much further can televisions go? Would it be feasible to double the resolution again? Would it, in fact, make any difference to double the resolution again? I guess there must be a point beyond which the human eye just can’t perceive any more pixels (one calculation puts the resolution of the human eye at 576 megapixels), and the fact that we don’t sit right on top of our TV screens must mean we miss out on some detail. So I’m throwing this open to gather your thoughts – why not comment and let us know how far you think TVs can, or should, go? We’d love to know what you think.
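That point about viewing distance can be sketched with some simple trigonometry. A commonly quoted rough figure for normal visual acuity is that the eye resolves detail down to about one arcminute; the calculation below (my own illustrative sketch, with the 55-inch screen size and 16:9 aspect ratio as assumed parameters) estimates the distance beyond which individual pixels become indistinguishable:

```python
import math

def max_useful_distance_in(diagonal_in, horiz_pixels, aspect=(16, 9)):
    """Distance (inches) at which one pixel subtends one arcminute."""
    w, h = aspect
    screen_width = diagonal_in * w / math.hypot(w, h)  # width of the panel
    pixel_size = screen_width / horiz_pixels           # width of one pixel
    return pixel_size / math.tan(math.radians(1 / 60)) # 1 arcminute of arc

# On an assumed 55-inch screen:
print(max_useful_distance_in(55, 1920) / 12)  # 1080p: ~7.1 ft
print(max_useful_distance_in(55, 3840) / 12)  # 4K:    ~3.6 ft
```

In other words, on this rough model you’d need to sit within about three and a half feet of a 55-inch 4K screen to resolve every pixel – which suggests that doubling the resolution yet again would benefit only viewers sitting improbably close.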
While you’re mulling over your thoughts, why not check out this infographic from Sony about how TVs have progressed over the years?
Courtesy of Sony
Sponsored Post - I will receive financial payment for posting this article. Please be aware that I will never accept offers of paid posts where I am required only to give a positive opinion – objectivity is important to me and you can be sure that what I write, even in paid posts, is what I really think.