xTRICKYxx - Thursday, June 6, 2013 - link
I think the days of 1366x768 displays are behind us!
hypopraxia - Thursday, June 6, 2013 - link
I sure hope so.
Gigaplex - Thursday, June 6, 2013 - link
They'll still be around for years and years at the low end of the market. Hopefully they'll soon vanish completely from mid-range gear.
tipoo - Friday, June 7, 2013 - link
Yet Apple still uses 1280x800 on laptops over 1000 dollars :/
Guspaz - Friday, June 7, 2013 - link
Apple does not ship any laptops that cost more (by default) than their retina equivalents. There are still trade-offs to high-resolution displays, cost being one of them (power being the other). I suspect Apple will migrate all their notebooks to high-DPI displays when they can (as costs go down and power efficiency goes up), but Haswell is the first generation that makes a high-DPI ultrabook (or MacBook Air) feasible.
In fact, much like Haswell has led to some high-DPI ultrabook announcements, I suspect it will lead to Apple pushing their migration to retina displays a bit further. They may introduce a 13-inch retina Air (the 11-inch is a lot less likely at this point for cost/power reasons), or they may complete the migration of one of the Pro notebooks by discontinuing the non-retina version.
gochichi - Thursday, June 6, 2013 - link
PC manufacturers will follow Apple off a cliff like idiot lemmings. 13" 1080p is enough, if not more than enough, but Apple (a premium brand, "the premium brand") decides to distinguish itself with 1600p on 13", and that's completely fine for them to do. But why oh why would lost, fragmented PC makers follow? It's ridiculous. Who's making these decisions? I swear the majority of your readers could advise these companies straight to the top.
Super pixel density is already claimed; Apple claimed it (good for them, it's neat, it's great). Let Apple have that, and claim all that is obvious, practical, useful and unclaimed: 15" 1920x1200, 15" 2560x1440, 17" 2560x1440. I'm typing this on a 13" 1080p glossy screen, and more brightness is all I want; otherwise it's freaking "perfect". I'm not a manufacturing genius, but I have to imagine that doing a screen like this one, but 17" and 1440p, can't be that hard. What am I missing? Why do we as customers/users have to live in constant frustration with these horrid company choices?
jaroche - Thursday, June 6, 2013 - link
But finally PC makers are shipping IPS high-resolution displays, thanks to the Retina Display BS.
pixelstuff - Thursday, June 6, 2013 - link
The goal should be 300 DPI displays and an OS that can draw the interface based on standard real-world dimensions (like inches or centimeters) instead of pixels. I'm all for higher DPI displays.
Guspaz - Friday, June 7, 2013 - link
It's not a simple problem to solve; if you standardize on real-world measurements ("this dialog box is 3 inches wide"), then something that works well on a 27" screen will be a horrible experience on an 11" screen.
A better solution may be to decouple render resolution and display resolution entirely. Instead of varying the display resolution, vary the render resolution. So if your display is 2560x1440, the backbuffer that you render to might be anything from 2560x1440 to 5120x2880, but a dialog box might always be 200 pixels wide. By varying the resolution of the backbuffer, you can change the size of the dialog box on-screen without losing detail (because you're always downscaling for display).
I believe Apple took an approach similar to this, and I hope Microsoft does as well.
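A minimal sketch of that decoupling in Python; the numbers are illustrative (the 200 px dialog comes from the comment above, and the scale steps are assumptions, not Apple's actual pipeline):
```python
# Sketch: decouple render (backbuffer) resolution from display resolution.
# UI elements are laid out in fixed backbuffer pixels; changing the
# backbuffer size changes their on-screen size without losing detail,
# because the final step is always a downscale to the native panel.

DISPLAY = (2560, 1440)          # native panel resolution
DIALOG_PX = 200                 # dialog width, fixed in backbuffer pixels

def backbuffer_for_scale(ui_scale: float) -> tuple[int, int]:
    """Bigger backbuffer means smaller on-screen UI (we downscale more)."""
    return (round(DISPLAY[0] * ui_scale), round(DISPLAY[1] * ui_scale))

for ui_scale in (1.0, 1.5, 2.0):
    bb = backbuffer_for_scale(ui_scale)
    # On-screen width of the dialog after downscaling backbuffer -> panel:
    on_screen = DIALOG_PX * DISPLAY[0] / bb[0]
    print(f"backbuffer {bb[0]}x{bb[1]}: {DIALOG_PX} px dialog shows as "
          f"{on_screen:.0f} native pixels wide")
```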
inighthawki - Friday, June 7, 2013 - link
Just because you're downscaling doesn't mean it will always look good. Even at a perfect 1/2 scale you will get awkward-looking text and images that still need filtering. And rendering at something like 5120x2880 just for the purpose of downscaling later is a huge waste of GPU performance and screen composition.
Guspaz - Friday, June 7, 2013 - link
Well, you give up hinting for antialiasing, but Apple doesn't do AA hinting on text anyhow. If you're doing pure supersampling, stuff is generally going to look good at these pixel densities regardless of the supersample ratio.
In terms of rendering at higher resolution being a waste of composition power, I'd argue that it's not that burdensome (remember we're not talking about rendering 3D graphics at that resolution, but doing 2D composition). It's exactly what Apple is doing in OS X, and it seems to work quite well. They're using a GT 650M in the 15" retina, which is only a bit faster than the fastest Haswell chips, and they're using an HD 4000 in the 13" retina, which is a great deal slower than the fastest Haswell chips, and things seem to be working out for them.
I think they should be given some credit for being the first company to pull off a consumer operating system with resolution-independent rendering... As a primarily PC user, I would have really liked to have something like that years ago, to deal with high res notebook displays. Windows DPI scaling never worked very well, however.
vegemeister - Monday, August 5, 2013 - link
Rendering oversized frames and rescaling them means you have to move a lot more data around in memory. This wastes energy, because high-bandwidth off-chip buses are power hogs. Also, it messes up subpixel AA unless the text rendering library knows the native resolution of the screen, so it can offset by 1/3 of a native-resolution pixel instead of 1/3 of a fake framebuffer pixel.
> I think they should be given some credit for being the first company to pull off a consumer operating system with resolution-independent rendering...
Resampling the entire desktop is not resolution-independent rendering.
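The memory-traffic point above is easy to ballpark. A rough sketch, assuming 4-byte pixels, 60 Hz, and a single full-screen copy per frame (real compositors touch each pixel more than once, so these are lower bounds):
```python
# Back-of-envelope framebuffer traffic, assuming 32-bit (4-byte) pixels,
# 60 Hz, and one full-screen copy per frame.

BYTES_PER_PX, HZ = 4, 60

def gb_per_s(w: int, h: int) -> float:
    return w * h * BYTES_PER_PX * HZ / 1e9

native = gb_per_s(2560, 1440)    # render at the panel's resolution
super2x = gb_per_s(5120, 2880)   # render 2x oversized, then downscale
print(f"native:   {native:.2f} GB/s")    # ~0.88 GB/s
print(f"2x super: {super2x:.2f} GB/s")   # ~3.54 GB/s, 4x the traffic

# The subpixel-AA point: a 1/3-native-pixel offset, expressed in
# framebuffer pixels, depends on the ratio of the two resolutions.
fb_w, native_w = 5120, 2560
offset_fb_px = (fb_w / native_w) / 3
print(f"1/3 native pixel = {offset_fb_px:.3f} framebuffer pixels")
```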
MikhailT - Friday, June 7, 2013 - link
Have you ever looked at the Retina displays? If you had, you'd understand it's not about the resolution but the pixel density. The 13" retina MacBook Pro screen is actually a "1280x800" screen in the default Retina mode. It's less space than your 13" 1080p screen, but it is twice as sharp (not exactly twice, but it feels like it).
More pixel density at the same *virtual* resolution means text becomes much sharper without getting smaller. Folks who tried a Retina display could not go back to the older 13" 1080p because text now looks blurry to them.
For example, the 15": it's a 2880x1800 panel, but in Retina mode it works as 1440x900; that's how much space we have on 15". Yes, we could switch it to use more space, but it's not as sharp.
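The arithmetic behind "Retina mode", as a quick sketch (the panel sizes are the shipping specs; the 2x factor is Apple's default HiDPI mode):
```python
# "Retina mode": the panel renders at 2x in each dimension, so the
# *virtual* desktop has a quarter of the pixel count, but every UI
# element is drawn with 4 physical pixels per point.

import math

def ppi(w: int, h: int, diag_in: float) -> float:
    return math.hypot(w, h) / diag_in

panel = (2880, 1800)                       # 15" rMBP physical pixels
points = (panel[0] // 2, panel[1] // 2)    # virtual desktop in points
print(points)                              # (1440, 900)
print(f'{ppi(*panel, 15.4):.0f} PPI physical')                   # ~221 PPI
print(f'{ppi(1920, 1080, 13.3):.0f} PPI on a 13" 1080p panel')   # ~166 PPI
```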
MikhailT - Friday, June 7, 2013 - link
The point I'm trying to make is that many folks actually want HiDPI screens because it's that much easier on their eyes, and it's more useful than having higher resolution with everything smaller.
Many of my Windows friends are waiting for these HiDPI screens.
Cold Fussion - Friday, June 7, 2013 - link
If you're not an expert, then why would you bother saying it is easy? Manufacturing defects in these devices are probabilistic: the larger the panel, the greater the chance of a defect. Clearly, manufacturing these high-density displays is difficult at the best of times, and judging by how much even tablet-size screens have lagged behind phone screens in pixel density, defect rates are a serious problem. Now imagine that same defect rate on a screen with ~3.5x the area (10" to 17") or 5.75x (10" to 24").
Death666Angel - Friday, June 7, 2013 - link
The higher the DPI, the better scaled resolutions look. On 1080p or 900p screens, running a game at 768p (because the iGPU can only manage good frame rates at that resolution) looks pretty bad. But by increasing the DPI, you make those lower resolutions look okay to good again. Also, in the case of 1440p/1600p, you can just run the game at 720p/800p and have perfect scaling. And this will force MS and other developers to put some thought into good scaling for their OS/programs.
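A quick way to see why some render resolutions scale cleanly and others don't (the resolutions here are examples, not an exhaustive list):
```python
# Why 720p "just works" on a 1440p panel: each rendered pixel maps to an
# exact 2x2 block of physical pixels, so no fractional filtering is needed.

def scale_factor(native: tuple[int, int], render: tuple[int, int]):
    fx, fy = native[0] / render[0], native[1] / render[1]
    exact = fx == fy and fx.is_integer()
    return fx, fy, exact

for render in [(1280, 720), (1366, 768)]:
    fx, fy, exact = scale_factor((2560, 1440), render)
    tag = "integer scaling" if exact else "fractional (blurry) scaling"
    print(f"{render[0]}x{render[1]} on 2560x1440: {fx:.2f}x/{fy:.2f}x, {tag}")
```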
MrSpadge - Friday, June 7, 2013 - link
If games rendered the UI at native resolution and everything else at whatever-the-GPU-can-handle, we wouldn't have such problems at all. Wouldn't sell new GPUs, though.
Torrijos - Friday, June 7, 2013 - link
First of all, there are physics in play here...
For someone with perfect eyesight (not everybody, but still a good benchmark), two pixels on a 13" retMBP should only be distinguishable under 38cm (15"), which is a good working distance for a laptop.
Higher densities would probably be useless; the deciding factor is how often you work closer to your display than that distance.
For old displays (1440x900 13.3"), that distance becomes 68cm (27").
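Those distances follow from the usual 1-arcminute acuity rule of thumb. A quick check; the 1' limit and the panel specs are the standard assumptions, not numbers from the comment:
```python
# Distance at which two adjacent pixels stop being distinguishable,
# assuming the classic 1-arcminute resolution limit for 20/20 vision.

import math

ARCMIN = math.radians(1 / 60)  # 1 arcminute in radians

def blend_distance_cm(w: int, h: int, diag_in: float) -> float:
    ppi = math.hypot(w, h) / diag_in
    pitch_mm = 25.4 / ppi                  # pixel pitch in mm
    return pitch_mm / math.tan(ARCMIN) / 10

print(f'13" rMBP (2560x1600): {blend_distance_cm(2560, 1600, 13.3):.0f} cm')  # ~38 cm
print(f'13.3" 1440x900:       {blend_distance_cm(1440, 900, 13.3):.0f} cm')   # ~68 cm
```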
JDG1980 - Friday, June 7, 2013 - link
Have you ever actually used an Apple "retina display"? Once you try an iPad 4, the text on PC screens looks fuzzy and indistinct in comparison. Text rendering at HiDPI resolutions is really beautiful.
And from a technical point of view, HiDPI will eventually let us phase out hacks like hinting and antialiasing on fonts. Neither is done on printers (300-1200 DPI); they exist purely to allow acceptable results at the shamefully low resolutions we currently suffer through on our monitors.
vegemeister - Monday, August 5, 2013 - link
We can already phase out hinting (as we should, because it looks terrible). To phase out antialiasing, you'd have to render at a resolution so high that the spatial bandwidth cutoff of the human eye becomes the AA filter for most of the populace (99th percentile would probably be good). That would look only marginally better than using a lower resolution (say, 60th percentile) with AA, and you'd have to shuffle much bigger buffers around and use significantly brighter backlights. This wastes power.
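Running the same acuity arithmetic in reverse gives a feel for the densities involved; the 0.5' figure for a high-percentile eye and the 40 cm viewing distance are illustrative assumptions:
```python
# PPI needed so the eye's own resolution limit acts as the AA filter,
# for a given visual acuity (in arcminutes) and viewing distance.

import math

def required_ppi(acuity_arcmin: float, distance_cm: float) -> float:
    pitch_mm = distance_cm * 10 * math.tan(math.radians(acuity_arcmin / 60))
    return 25.4 / pitch_mm

# 1.0' is roughly 20/20 vision; 0.5' is roughly 20/10, a high-percentile eye.
for acuity in (1.0, 0.5):
    print(f"{acuity}' at 40 cm: {required_ppi(acuity, 40):.0f} PPI")
```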
Meaker10 - Thursday, June 6, 2013 - link
Did you catch the MSI 3K Edition? 15.6-inch with 2880x1620, matched with a 780M and a 4930MX.
jaroche - Thursday, June 6, 2013 - link
> "Windows has traditionally done a terrible job of DPI scaling on the desktop"How exactly is this a Windows fault? Applications need to tell they're DPI aware, and propertly support it. Windows tries to upscale low resolution icons, but its application's fault if it looks bad at High DPI. It's like complaining that Chrome looks bad (and IE10 doesn't).
EnzoFX - Thursday, June 6, 2013 - link
What about even basic system UI elements? I don't think those scale well either.
Klimax - Friday, June 7, 2013 - link
They do. At minimum since XP...
Spoony - Friday, June 7, 2013 - link
It's really a mixed bag. I have a rMBP and I use Windows on it often. Windows 7 scales poorly. Even the base OS is full of ugly. It is usable, but it's a long way from working properly (Win7 x64 is also still clueless about its display expectations at boot time).
Windows 8 is significantly better in all respects. Metro is all just fine. When you go to the desktop layer, much has been fixed. There are glitches here and there, but overall I would say the improvement is significant. Very usable.
Programs are where things fall apart. Office 365 seems to do a pretty good job, as do some of the flagship Win8 Microsoft-bundled applications (IE). Outside of that, however, it's the wild west. Third-party programs are almost universally horrible, and all of Microsoft's unconverted software is equally lame. Unsurprisingly, games are commonly the most compatible, able to select and use the very high resolutions (although sometimes the UI assets are half size).
JDG1980 - Friday, June 7, 2013 - link
As these HiDPI screens become more popular, the application vendors will have to fix their products.
JDG1980 - Friday, June 7, 2013 - link
Since Vista, the way DPI scaling works in Windows is this: if the app declares "I'm DPI-aware" in its manifest, the OS trusts it and lets it display the way it wants. If that flag is absent, the window is drawn at standard DPI to an offscreen buffer and then upscaled to the correct DPI setting during rendering. It will look fuzzy, but it should not have any out-of-place or missing elements (as the XP scaling method sometimes did).
Unfortunately, there's nothing stopping an application from saying "I'm DPI-aware!" when it really isn't. This is where most of the errors come from.
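For reference, a Python/ctypes sketch of the opt-in described above; the manifest entry is the supported route, and this Vista-era API call is its programmatic cousin:
```python
# An app that calls this (or ships the equivalent <dpiAware>true</dpiAware>
# manifest entry) tells Windows: "don't bitmap-stretch me, I'll handle
# high DPI myself".
import ctypes

try:
    ctypes.windll.user32.SetProcessDPIAware()  # Vista+; same opt-in as the manifest flag
except (AttributeError, OSError):
    pass  # not on Windows, or the API is unavailable

# Apps that make this claim but then draw with hardcoded 96-DPI pixel sizes
# are exactly the "says DPI-aware, isn't" failure case described above.
```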
Currently, Adobe has fixed Lightroom to work with HiDPI settings properly, but Photoshop hasn't been done yet. They say it will be done soon, but who knows if that will be ported to CS6 or if you'll have to shell out the monthly "cloud" fee to get this basic fix. Acrobat hasn't been fixed to support HiDPI either.
Firefox 21 doesn't properly support HiDPI, but Firefox 22 will.
pixelstuff - Thursday, June 6, 2013 - link
Still doing the 16:9 ratio, I see. I still wish they would move back toward taller displays. If someone would offer the Chromebook Pixel hardware with Windows on it, I sure would like to try it. In my opinion, 16:10 is the minimum height; 4:3 might be taller than needed.
I would like to directly compare 16:10 with 3:2 to see which is better.
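The geometry for that comparison is easy to run; the 13.3" diagonal is just an illustrative size:
```python
# Usable width/height for the same diagonal across the aspect ratios
# being debated: taller ratios trade a little width for a lot of height.

import math

def dimensions(diag_in: float, rw: int, rh: int) -> tuple[float, float]:
    d = math.hypot(rw, rh)
    return diag_in * rw / d, diag_in * rh / d

for rw, rh in [(16, 9), (16, 10), (3, 2), (4, 3)]:
    w, h = dimensions(13.3, rw, rh)
    print(f'{rw}:{rh} at 13.3": {w:.1f}" x {h:.1f}"')
```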
mjz - Thursday, June 6, 2013 - link
That's why I get Apple laptops... taller displays than the typical PC laptops.
eallan - Thursday, June 6, 2013 - link
Google Pixel??
piroroadkill - Friday, June 7, 2013 - link
Good comment. Chromebook Pixel has the best screen of the lot. A refreshing taller ratio. 16:9 on PCs needs to be sidelined.
Vi0cT - Thursday, June 6, 2013 - link
Hey, is it just me, or is no one here going to talk about the Fujitsu Lifebook UH90? It's got a 3200x1800 14" display.
piroroadkill - Friday, June 7, 2013 - link
You're right, damn, that's one hell of a resolution.
Taracta - Friday, June 7, 2013 - link
Anand, do you ever look at these magnified images of the screen and say to yourself that they could make even higher PPI/DPI screens but are choosing not to? We know they can get up into the 400s, but I believe they have the capability to go much higher and are choosing not to. Just think about Moore's law and how it would have applied to LCDs. Over a decade ago we had cutting-edge 200 PPI 22" LCDs; now we are barely over 100! Ridiculous!
MrSpadge - Friday, June 7, 2013 - link
400+ PPI? Increasing cost and power consumption for no good reason other than "to have the higher pixels"? Yay, sign me up!
Taracta - Friday, June 7, 2013 - link
You do realize that those 400+ PPI screens are used on cell phones, where power consumption is a very important consideration, so why would it all of a sudden start consuming lots of power when scaled up to 20+"? As for cost, if they already have the technology to build 400+ PPI screens, most of the cost is already sunk, so only the scaling-up cost is left for the manufacturer.
The only "cost" is to us, the consumers, who are being suckered into believing that the exorbitant price they are going to try selling this technology at is justified because "they have to keep the runaway power consumption down and it's expensive to do." I hope there are more companies like Seiki to burst the bubble.
Azurael - Friday, June 7, 2013 - link
I can't see the difference between a 720p 4.7" screen and a 1080p one, to be honest. It's all marketing. Well, on LCDs anyway. On OLEDs, the shrunken subpixels mean they finally don't look jarringly horrid.
roltzje - Wednesday, June 12, 2013 - link
I plan on getting a 1080p 12-13" touchscreen for my next laptop... praying that 8.1 fixes the scaling issues!