figuerres wrote:

*snip*

Apple has done what they call the "retina" display on the iPhone, the iPad and the MacBook.

The rest of the hardware vendors have not even come close to that.

Small screens like the iPhone have had 200 ppi for a while now. Time for the larger ones to catch up.


Actually, most vendors have done better than "retina" on their small screens, so I assume what you really mean is that no large monitors have reached "retina" or better yet. That includes monitors from Apple. This might help you understand why: http://www.engadget.com/2012/06/05/viewsonic-vp3280-led-4k-monitor-hands-on/
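If you're wondering where those ppi numbers come from, it's just the pixel count along the diagonal divided by the physical diagonal in inches. A quick back-of-envelope sketch (the panel sizes below are round-number examples, not the exact specs of that ViewSonic):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel diagonal over physical diagonal."""
    return hypot(width_px, height_px) / diagonal_in

# Example: a 4-inch 640x1136 phone panel vs. a 32-inch 3840x2160 monitor.
print(round(ppi(640, 1136, 4.0)))    # ~326 ppi -- well past "retina" on a phone
print(round(ppi(3840, 2160, 32.0)))  # ~138 ppi -- even 4K spread over 32" stays modest
```

Spread the same 3840x2160 over a smaller diagonal and the density climbs fast, which is why phones and tablets got there long before desktop monitors.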

Granted, that's a rather large monitor, which is why it only works out to about 150 ppi, but 4K displays are still rather expensive to make, not to mention what it takes to pump that many bits out to a display. ViewSonic claims a Core i5 can only drive still images on their 4K display; to get video you need a Core i7. Both of those problems can be overcome, and we're getting there, but don't expect it this year.
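To put the "pump that many bits" part in perspective, here's the raw, uncompressed arithmetic for a 4K panel (just sample refresh rates, nothing specific to ViewSonic's hardware):

```python
def uncompressed_gbps(width_px, height_px, bits_per_pixel=24, fps=60):
    """Raw bits per second to refresh every pixel, before any compression."""
    return width_px * height_px * bits_per_pixel * fps / 1e9

# 3840x2160 at 24-bit color
print(round(uncompressed_gbps(3840, 2160, fps=30), 1))  # ~6.0 Gbps at 30 Hz
print(round(uncompressed_gbps(3840, 2160, fps=60), 1))  # ~11.9 Gbps at 60 Hz
```

That's four times the raw pixel rate of 1080p, so it's not hard to see why decoding and pushing that many pixels taxes a mid-range CPU.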

As for movies, most are being shot with 4K cameras now, AFAIK; pretty much none are being shot on film regardless. The real issue with 4K content is just getting it to you, though it sounds like we may now have solutions for that as well. Until recently, everything I read indicated you could only fit about 15 minutes of 4K video on a Blu-ray disc. At CES, though, Red (the largest manufacturer of 4K cameras) announced REDRAY, a 4K player. It uses a codec that supposedly can fit a full 4K movie onto a Blu-ray, and movies are stored on a very large internal HD. I assume dedicated hardware is then used to process that codec fast enough to output the video to a 4K TV/display. In other words, I don't think the codec on its own is enough to solve the bandwidth issues involved in 4K processing, but it's still a huge advancement.
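The Blu-ray math is easy to sanity-check. The bitrates below are purely illustrative (I haven't seen Red publish exact numbers), but they show why a dramatically better codec changes the picture:

```python
def minutes_on_disc(disc_gb, bitrate_mbps):
    """Minutes of video that fit on a disc at a given average bitrate."""
    disc_bits = disc_gb * 8e9              # GB -> bits (decimal gigabytes)
    return disc_bits / (bitrate_mbps * 1e6) / 60

# 50 GB dual-layer Blu-ray
print(round(minutes_on_disc(50, 400)))  # ~17 min at a lightly-compressed ~400 Mbps 4K stream
print(round(minutes_on_disc(50, 50)))   # ~133 min at ~50 Mbps -- a full feature film
```

A much smaller per-second stream is also easier to decode in real time, which fits with dedicated hardware in the player doing the heavy lifting rather than the codec alone solving the bandwidth problem.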