Granted, that's a rather large monitor, which is why it's only getting 150 ppi, but 4k displays are still rather expensive to make. Not to mention what it takes to pump that many bits out to a display. ViewSonic claims a Core i5 is only able to display still images on their 4k display. To get video you need a Core i7. Both of these are things that can be overcome, and we're getting there, but don't expect it this year.
As for movies, most are being shot with 4k cameras now, AFAIK. Pretty much none are being shot on film, regardless. The real issue with 4k content is just getting it to you, though it sounds like we may have solutions for that now as well. Until recently, everything I read indicated you could only fit about 15 minutes of 4k video on a Blu-ray disc. At CES, though, Red (the largest manufacturer of 4k cameras) announced REDRAY, a 4k player. It uses a codec that can supposedly fit a full 4k movie onto a Blu-ray. Movies are stored on a very large internal HD. I assume dedicated hardware then processes that codec fast enough to output video to a 4k TV/display. In other words, I don't think the codec on its own is enough to solve the bandwidth issues involved with 4k processing, but it's still a huge advancement.
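To see why the "15 minutes on a Blu-ray" figure comes up, here's a quick back-of-envelope sketch (my own illustration, not from REDRAY's specs; the ~450 Mbit/s rate for lightly compressed 4k is an assumption):

```python
# rough math for why 4k storage/bandwidth is hard
# (illustrative only; the actual REDRAY codec details aren't public)

def raw_bitrate_gbps(width, height, bits_per_pixel, fps):
    """Uncompressed video bitrate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

def minutes_on_disc(disc_gb, avg_bitrate_mbps):
    """Playback minutes a disc of disc_gb gigabytes holds at a given average bitrate."""
    return disc_gb * 8 * 1000 / avg_bitrate_mbps / 60

# uncompressed 4k (4096x2160, 24-bit color, 24 fps): ~5.1 Gb/s off the wire
print(raw_bitrate_gbps(4096, 2160, 24, 24))

# a 50 GB dual-layer Blu-ray at an assumed ~450 Mbit/s for lightly
# compressed 4k: roughly 15 minutes, matching the figure above
print(minutes_on_disc(50, 450))
```

A modern codec that brings the average rate down by an order of magnitude is what turns those 15 minutes into a full-length movie, which is presumably what REDRAY's codec is doing.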
well, part of my issue is that there's a huge gap: common monitors are stuck at 70 to 90 ppi in general, and very few "desktop" displays go past 90 ppi.
300 ppi is a "nearly ideal goal": displays where we stop even counting the dots and just enjoy the image.
but in the meantime I would like to see desktop displays above 100 ppi become more common. laptops manage it, so why not a desktop screen?