£160 if they fail outside of the warranty.
@Ray7: This is intriguing though, and the fact that they doubled the number of pixels instead of using an intermediate density makes me believe it is pretty easy to support standard and "retina" resolutions at the same time (just use four pixels instead of one for low-res content).
The biggest limitation I see is that most of the web today contains low-resolution raster images, which makes this technology superfluous. I wonder what will happen.
Probably the same thing as when TV went HD... channels and programs slowly came over. Now we have shows like The Simpsons and Family Guy being done in HD.
Which reminds me that Star Trek: TNG is coming out in HD soon on Blu-ray.
I just bought a 27-inch 1080p monitor at 1 AM this morning.
A lot of people praise the size and the ability to read documents without using a magnifying glass.
Thank you! I had forgotten. Is it true they went back to the original film master? Hopefully the sets don't look too cheesy in HD. I saw a TOS episode in HD and couldn't stop staring at the cheap cardboard everywhere.
Well, the few designers I know are pretty excited about everything except the bandwidth implications. It's a great step forward, I'm told. :-/
I thought the fan blades looked a bit odd.
Turns out they're shaped to stop the fans from generating noise at a constant, annoying pitch.
I'm not excited by these displays - all it means is that fonts are even more unreadable and even more stuff breaks than before.
I have a 1920x1080 LCD. I have to have Text Size at Medium because Large breaks in damn near everything. I also have to CTRL+ most web-pages/images because the initial display is tiny (even with the browser's size turned up).
Operating Systems and software just aren't designed for resolutions this high. I think a lot of people who claim they can read normal fonts on 1920x1080 resolution must have super-human vision or only spend 20 minutes in front of the computer at a time.
Honestly computing was better when everything was 1024x768, the only things that benefit from the resolution boom are movies and games.
But Apple just doubled the specs in both dimensions. Fonts are exactly the same size as on old MacBooks, except sharper. I'd argue that makes them more readable. It's like how the iPhone 4's screen resolution was quadrupled but everything was still the same size. It's just sharper.
Also, I have no problems reading text on my 27" 1920x1080 display at home and 24" 1920x1200 display at work. And I have regular eyes for a 30-year-old. My laptop is a different matter though, that's 1920x1080 at 13", so I have to run that at high DPI. It breaks some things, but overall it works well enough.
Not sure I follow you. They have doubled the resolution but will be displaying everything at the same size. As Sven Groot says, things will appear the same size but will be much sharper, which will make them easier to work with over long periods. If you change to a non-native resolution, the OS should still give you a display that looks sharp enough to pass for native.
It still looks like overkill to me, but better to aim higher if you can.
Then you are assuming wrong. They've doubled the resolution, but are also basically using 200% DPI settings. Everything will be the same size as before, but sharper.
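A rough way to picture the doubling (my own sketch, not Apple's terminology): layout happens in logical points, and only the backing store is multiplied by the scale factor, so nothing changes size on screen.

```typescript
// Layout is done in logical points; only the backing store is scaled.
// 'scale' would be 2 on a "retina" display, 1 on an older one.
function backingPixels(points: number, scale: number): number {
  return points * scale;
}

// A 12-point glyph is still 12 points wide either way; at scale 2 it is
// simply rendered with 24 pixels across instead of 12 -- same physical
// size, four times the pixel data.
const normal = backingPixels(12, 1); // 12 pixels
const retina = backingPixels(12, 2); // 24 pixels
```

This is why text gets sharper rather than smaller: the point size an app asks for never changes.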
I've owned an iPhone 3G (pre-retina display) and currently own an iPhone 4 (with retina display). The latter has double the resolution of the former, but everything is still the same size. And the difference in clarity, especially for text, is stunning. If screen-reading currently causes eye-strain for you, then increasing the resolution (without altering the physical font size) will only improve that.
Text actually benefits far more from high-DPI than movies or games.
"retina" display is the dumbest marketing gizmo speak since web 2.0 and podcasting.
When I heard the term for the first time, I thought they had made some sort of projector that beams the picture directly onto the eye. Then I found out that they increased the dots on the screen.
Now that hosting an mp3 on a server is podcasting, increasing the resolution is a "retina display", and adding comments on a site is web 2.0, it doesn't surprise me that uploading your documents to a 99-cent hoster is called cloud computing nowadays.
Maybe I should write a Metro IRC client and call it "Caster", and every hapless * who uses it would be "wordcasting" when typing and "cloudcasting" when sending files. IRC itself would never be mentioned in the app, the network would be called retina-net 2, and the servers "retina-clouds" ("retina 2" because it all works in f*cking REAL TIME - just like your eye movements! w00t!).
@wastingtimewithforums: I agree that retina as in Retina Display is a nonsense marketing term.
But I think you take it too far. Podcasting is a thing. When people talk about podcasts they're talking about a kind of programming delivered in a particular way. It is the modern replacement of radio shows. Streaming is used to broadly describe the delivery of MP3 (music) over the wire.
Web 2.0 typically means using AJAX/JSON to deliver content to pages in-line without requiring a page reload. It is particularly useful for operations like error checking and real-time updates of content (where the loss of the back button isn't relevant anyway).
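As a tiny illustration of that in-line update pattern (the names here are mine, purely hypothetical): a JSON fragment comes back from the server and only the affected bits of page state are patched, with no full reload.

```typescript
// Hypothetical sketch of the "Web 2.0" in-line update pattern:
// merge a JSON fragment into the current page state instead of
// reloading the whole document.
type PageState = Record<string, string>;

function applyFragment(state: PageState, fragment: PageState): PageState {
  // Only the keys present in the fragment change; the rest of the
  // "page" is untouched, so there is no full-page reload.
  return { ...state, ...fragment };
}

// In a browser this fragment would come back from an XMLHttpRequest
// (or fetch) call; here it is inlined to keep the sketch self-contained.
const page = { header: "Forum", status: "3 unread" };
const updated = applyFragment(page, { status: "0 unread" });
```

Error checking works the same way: the server validates, returns a small fragment, and only the message area of the page changes.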
Cloud is an over-used term: but it is largely driven by a very real shift in technology and trends. The problem "cloud" has is that it is sometimes used to describe the storage of data on a third-party server and other times used to describe the operation of servers as VMs. As I said, it is over-used.
Just what I said. Take decade-old tech (since when has the XMLHttpRequest object been available? IE 5?) and hype it with a BS term.
You're also clearly wrong. The hypists define just about everything as web 2.0 nowadays. From bullying on facebook to writing comments about huge knockers on youporn.
@Sven Groot: That would be great if OS X and Windows could do that, but they cannot. At least not without introducing issues.
True in the general case yeah (although I'm mostly fine running at 120DPI on my laptop, there's only one or two applications that I use that don't work right).
But Apple doubled the resolution, which makes it simple. Apps that explicitly state that they support the high resolution get to use it. Apps that don't simply get scaled 200%. That's how the iPhone did it. That's how the iPad did it. And that's how OS X is now doing it.
It means that for some time, you mostly get apps with slightly fuzzy graphics (at worst, they look the same as they did on the lower resolution display), although the text will still be super-sharp in almost all applications. But over time, more and more will support the high resolution.
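The fallback described above could be sketched like this (entirely my guess at the mechanism, not Apple's actual code): apps that declare support draw at the native scale, legacy apps draw at 1x and get magnified by the compositor.

```typescript
// Apps that declare high-resolution support draw at the display's
// native scale; legacy apps draw at 1x and the compositor magnifies
// their output, which is why they look slightly fuzzy at worst.
function renderScale(declaresHiDpi: boolean, displayScale: number): number {
  return declaresHiDpi ? displayScale : 1;
}

function compositorMagnification(declaresHiDpi: boolean, displayScale: number): number {
  // Whatever the app didn't render natively, the compositor scales up.
  return displayScale / renderScale(declaresHiDpi, displayScale);
}
```

On a 2x display, a legacy app renders at 1x and gets a 200% magnification, so it looks no worse than it did on the old panel; a high-DPI app renders at 2x and needs no magnification at all.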
Safari will do the same for the web, and simply apply a 200% zoom level to all pages. I assume there's some way for pages to provide high-res graphics if they want, though that I don't know about.
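One plausible shape for that (the naming convention here is my assumption, borrowed from the iOS "@2x" habit): pages pick a higher-resolution variant of each image when the device pixel ratio is 2 or more.

```typescript
// Hypothetical: pick an image variant based on the display's device
// pixel ratio. The "@2x" filename suffix is an assumption on my part,
// borrowed from the convention iOS apps already use.
function assetFor(base: string, devicePixelRatio: number): string {
  return devicePixelRatio >= 2
    ? base.replace(/(\.[a-z]+)$/i, "@2x$1")
    : base;
}

const standard = assetFor("images/logo.png", 1); // "images/logo.png"
const hiRes = assetFor("images/logo.png", 2);    // "images/logo@2x.png"
```

A page that doesn't bother would just have its 1x images stretched by the 200% zoom, which is exactly the slightly-fuzzy-at-worst behaviour described for legacy apps.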