TWC9: Visual Studio 2012 5.5 million downloads, VS2013 Virtual Launch, ScriptCS and more...


The Discussion

  • androidi

    Can someone who's tried the Oculus Rift tell me how it works in the following sense:

    If I have a PC monitor at arm's length, obviously my eyes are focusing at the distance of the monitor. Or if I watch the sky, they are focusing at "infinity".

    One of the problems with these devices is that the panel/LCD in the Rift is physically only about an inch away from your eyes, so unless there is something that two ophthalmologists I asked claimed to be impossible, your eyes are focusing just a couple of inches away (a bit like looking at one's nose). And I suspect that if there were some "magical optics" to solve this, they might need individual adjustment, possibly dedicated adjustments for each eye, at least in my case, unless one is supposed to wear corrective lenses with them as well.

    I've read hints suggesting that in some military planes and head-up displays they have somehow solved this, allowing one to focus "at the sky" while projecting the CGI onto a surface like the windshield or helmet visor. That lets you keep your focus at infinity while still being able to read what's being projected from close distance (but "at infinity", as far as focusing one's eyes goes).

    I've long wanted such technology: if one could swap between "infinite focus" and "near focus" while keeping the CGI text readable, it would solve some of the reading-related 'lazy focus' issues that come from prolonged focusing at a fixed distance, where the eye's focusing mechanism doesn't get enough practice.

     

    If Kinect's big deal is end-to-end response time/latency, I think the focusing issue above is the classic big deal with wearable computer displays (not panel resolution, which some might believe is an issue; visible spacing between pixels is another real issue, but irrelevant unless the focus issue is also solved). Has the Oculus Rift really solved that, or have I been right to ignore the hype?

  • Andrii

    Extremely useful and interesting information.
    Thanks guys!

  • Adam​Speight2008

    Larry: Just stick some large googly eyes on the large black panel of the glasses, and a bandito mustache.

  • LarryLarsen

    @AdamSpeight2008: Done. :)

    @androidi: I know what you're talking about, and I'll try to respond as best I can having used this and other HMDs, but keep in mind this isn't really my field of expertise.

    The Oculus has two thick lenses in front of the screen; that is your first line of adjustment. If I remember correctly, my Canon glasses and the VFX-1 had these lenses as well. Once you have them on, there is a single 1280x720 display in front of your face, with half devoted to each eye. You typically see this same display on the computer monitor, so you can mostly see what the wearer is seeing (though crossing your eyes, as some do, to see what they see isn't a good idea, because the pictures are reversed and the effect isn't the same). This is the first time I've seen a device do it this way rather than trying to have a monitor for each eye. It works quite well.
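    A quick sketch of that side-by-side split (plain Python for illustration only; the helper name is made up and this is not Oculus SDK code):

```python
# Hypothetical helper: carve one 1280x720 display into the
# side-by-side per-eye viewports described above.
def eye_viewports(width=1280, height=720):
    """Return (x, y, w, h) rectangles for the left and right eye."""
    half = width // 2
    return (0, 0, half, height), (half, 0, half, height)

left, right = eye_viewports()
# Each eye effectively gets its own 640x720 region of the panel.
```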

    As for focus, when you look at your monitor at your desk, your eyes are getting two slightly different pictures, and background processing in your mind stitches the two together for you. Similar thing here: the software determines how far apart your eyes are (which may need adjusting in the software) and then paints two pictures at the correct convergence distance. Your brain does the rest.
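    To make the "two pictures" idea concrete, here is a minimal sketch (plain Python, not Oculus SDK code; the 64 mm figure is just a typical average interpupillary distance, and the function name is hypothetical):

```python
# Each eye's virtual camera is shifted sideways by half the
# interpupillary distance (IPD), so the two rendered images differ
# only by that horizontal offset -- which is what produces the
# stereo disparity the brain fuses into depth.

IPD = 0.064  # assumed average human IPD in metres; adjustable per user

def eye_positions(head_pos, ipd=IPD):
    """Return (left_eye, right_eye) camera positions for a head at head_pos."""
    x, y, z = head_pos
    return (x - ipd / 2, y, z), (x + ipd / 2, y, z)

left, right = eye_positions((0.0, 1.7, 0.0))
# left/right are the head position shifted about 3.2 cm to either side
```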

    How do military helmets or Google Glass present so close to the eye without it being blurry? I'm not completely sure, but I suspect they have a lens, like the Oculus does, between the display and the glass it reflects off of.

    As for the spacing between pixels being evident, you can see this on the Dev Kit Oculus; they refer to it as the screen-door effect. There are good reports that this will be close to eliminated when the consumer model comes out with a 1080p screen. I assume you see the pixels so clearly because the stereo matching your eyes do is down to the pixel, and therefore so is the spacing, so the void of light between pixels becomes apparent at a 3D level, which may make it seem a little more obvious. I would ignore the hype; the screen-door effect is not a concern IMHO.
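    A back-of-the-envelope sketch of why more pixels helps here (the ~110-degree field-of-view figure is an assumption on my part, not from the post):

```python
# With only ~640 horizontal pixels per eye spread across a wide
# field of view, the angular pixel pitch is coarse, so the gaps
# between pixels (the "screen door") are individually resolvable.
# A 1080p panel (~960 horizontal pixels per eye) tightens the pitch.

def pixels_per_degree(h_pixels_per_eye, fov_degrees):
    """Horizontal pixels per degree of visual field for one eye."""
    return h_pixels_per_eye / fov_degrees

ppd_dev_kit = pixels_per_degree(640, 110)  # roughly 5.8 px/degree
ppd_1080p = pixels_per_degree(960, 110)    # roughly 8.7 px/degree
```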

     

  • chanmm

    Can someone let me know whether ScriptCS supports Windows RT?
