
seniorchang
macrumors member · Original poster · Jun 14, 2012
With the rMBP released and Retina iMac rumors swirling, do you think the Mac Mini will also get Retina this year?
 
Don't you mean can we have an external retina display and a mac mini powerful enough to drive it?
 
I know, but isn't Retina just about the max resolution the graphics hardware is capable of driving?

Mostly. Basically, for something to be Retina, it has to be at a resolution where the human eye cannot tell the displayed text apart from print. You can have a high-resolution screen that is still low quality. Example: look at any Dell laptop. Those pixels are the size of Legos, and you can easily see the grid dividing them all up.
 
If I remember correctly, the Intel HD 4000 can drive a 4K display. Apple won't call it Retina, though, because if you have that ungodly high resolution on a crappy display, it won't mean jack.
 
Considering HiDPI is built into Lion, it's definitely capable; using suitable iPad apps, I can run my iPad 3 as a Retina monitor, though it's only 768x1024. It should certainly be capable of driving a screen similar in size to the rMBP's if it gets a spec bump this year.
 
First off, the Nvidia 650M struggles with the "retina" display in the new rMBP. At best, we can hope for an Nvidia 640E in the next discrete Mac Mini, which would be considerably faster than the HD 4000 in the base Mini. What I am saying is that while the GPUs can technically drive 4K screens, I don't think they will do it well. But then again, how many "retina" external displays are there right now?
 
Mostly. Basically, for something to be Retina, it has to be at a resolution where the human eye cannot tell the displayed text apart from print.

I am sure you know what Retina is. And I am sure you realise the sentence above is not what it is. (Although it is quite amusing, because running the new Retina MacBook Pro at native res would probably make the text too small for the human eye to see.)
 
I am sure you know what Retina is. And I am sure you realise the sentence above is not what it is. (Although it is quite amusing, because running the new Retina MacBook Pro at native res would probably make the text too small for the human eye to see.)

Please correct me. I was trying to regurgitate what I remember from Jobs presenting Retina at WWDC.
 
A "retina" (220dpi) display 21" or larger costs over $2000 just to make, not including retail mark-up. Are you really willing to buy a display worth three times your computer? :confused:
 
Please correct me. I was trying to regurgitate what I remember from Jobs presenting Retina at WWDC.

I believe you're looking for a definition like this: a Retina display has pixels dense enough (and a rendering engine in the OS sophisticated enough) that you can no longer discern the individual pixels that work together to create the images on the screen. The images themselves can be any arbitrary physical size; what matters isn't the overall size of the image, but the coarseness of the grid used to render it.

So, as far as an average-sighted person is concerned, the text (and every other visual element on the screen) appears as though it were drawn with solid lines rather than rasterized rows and columns of pixels.

The threshold for "retina" resolution varies depending on the distance you hold the image away from your face.

For a fixed viewing distance, that threshold works out to a minimum pixel density; the farther away you sit, the lower the density needs to be.
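
To put rough numbers on that distance dependence, here's a small sketch (my own illustration, not anything from Apple) using the common ~1-arcminute visual-acuity rule of thumb, under which a display "goes retina" once each pixel subtends less than about 1/60 of a degree at the viewing distance:

[CODE]
import math

def retina_distance_inches(ppi, arcmin=1.0):
    # Distance beyond which a single pixel subtends <= `arcmin`
    # arcminutes of visual angle, i.e. where the display passes
    # the ~1-arcminute acuity rule of thumb.
    pixel_pitch = 1.0 / ppi                  # inches per pixel
    angle = math.radians(arcmin / 60.0)      # radians per pixel
    return pixel_pitch / math.tan(angle)

print(retina_distance_inches(326))  # iPhone 4 (326 ppi) -> ~10.5 in
print(retina_distance_inches(220))  # rMBP (220 ppi)     -> ~15.6 in
[/CODE]

Which lines up with how Apple pitched it: a phone is held closer than a laptop screen, so the phone needs the higher pixel density to qualify.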
 
While they are very expensive today, remember that 50-inch plasma screens cost $20k in the '90s and now go for under $800.

Which 1990s video system are you using with that modern display?

Future prices don't matter. Using your "1990s plasma screen" argument, in 20 years the 2012 Mini will be obsolete and unable to run modern software adequately, let alone at "retina" resolution.
Try making a 1990s PowerMac play a 1080p video and you'll see what I mean. It can run the video player software and display the video, but its antique performance makes it unusable for the task.
 
Oh, really?
Then what should we call a display with a resolution so high that, viewed from a normal distance next to a display with twice the pixel density, the two look the same?

Think again about what Retina should mean: either not being able to notice pixelation, or going so far that doubling the resolution/density would make no sense because it would look the same.
 
I use my mini as an iTunes server. It's plugged into my TV, which at 32", 720p, and 3 metres away, IS retina! There's a 46" 1080p TV in my future, which will also qualify, and I have no doubt my HD 4000 graphics can drive it.
 
I use my mini as an iTunes server. It's plugged into my TV, which at 32", 720p, and 3 metres away, IS retina! There's a 46" 1080p TV in my future, which will also qualify, and I have no doubt my HD 4000 graphics can drive it.

Here are your current TV and your future TV:

[Chart: angular resolution (pixels per degree) vs. viewing distance for the two TVs]


Just because you can't see pixelation doesn't mean you wouldn't benefit from higher resolution. Just ask those who had a chance to see an 8K TV.

At 3 metres, the angular resolution of your 32'' 720p TV is 95 pixels per degree; the 46'' 1080p would be 99 pixels per degree.
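
For anyone who wants to check those figures, here's a quick sketch of the arithmetic behind them (assuming flat 16:9 panels with square pixels; the 95 and 99 fall straight out of the geometry):

[CODE]
import math

def pixels_per_degree(diag_in, h_px, v_px, distance_m):
    # Angular resolution at the centre of the screen: how many
    # pixels fit into one degree of visual angle at this distance.
    width_in = diag_in * h_px / math.hypot(h_px, v_px)
    pitch_in = width_in / h_px               # inches per pixel
    distance_in = distance_m / 0.0254        # metres -> inches
    return 1.0 / math.degrees(math.atan(pitch_in / distance_in))

print(round(pixels_per_degree(32, 1280, 720, 3.0)))   # -> 95
print(round(pixels_per_degree(46, 1920, 1080, 3.0)))  # -> 99
[/CODE]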
 
Just because you can't see pixelation doesn't mean you wouldn't benefit from higher resolution.

Um, how would I benefit further if I already can't see pixelation? Surely that's the definition of Retina? Not that it matters that much anyway; I don't have much 1080p content to show on it, and it'll be a long time before I get anything higher.

Thanks for the graph; it's surprising they're so close. It's funny to think how big that 32" screen looked five years ago when it replaced a 28" CRT, and how small it now seems when I can barely read the HUD in most current games. I was going to get a 40" with a slim bezel, which would actually have fitted into a smaller space, but I figured, what the hell.
 
Um, how would I benefit further if I already can't see pixelation? Surely that's the definition of Retina?

Not the same thing. At an angular resolution of 20-25 pixels per degree, you can't see pixelation or the pixel grid (not talking about aliasing).

However, the average person can perceive much higher quality than that. The sense of an on-screen object looking real keeps improving up to about 200 pixels per degree. It seems like you wouldn't benefit much from those extra pixels, but those who have had the opportunity to look at 8K displays say the content looked far more realistic than it would have on a 1080p display.

To achieve those 200 pixels per degree, you'd need to follow this chart.

[Chart: screen size vs. viewing distance required to reach 200 pixels per degree]
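
Inverting that same geometry gives the kind of curve the chart shows: the viewing distance at which a given panel reaches 200 pixels per degree. A sketch, under the same square-pixel assumptions as before:

[CODE]
import math

def distance_for_ppd(diag_in, h_px, v_px, target_ppd=200):
    # Viewing distance (metres) at which the panel reaches the
    # target angular resolution in pixels per degree.
    width_in = diag_in * h_px / math.hypot(h_px, v_px)
    pitch_in = width_in / h_px               # inches per pixel
    distance_in = pitch_in / math.tan(math.radians(1.0 / target_ppd))
    return distance_in * 0.0254              # inches -> metres

print(distance_for_ppd(46, 1920, 1080))   # ~6.1 m for the 46" 1080p set
print(distance_for_ppd(100, 1920, 1080))  # ~13.2 m (~43 ft) for a 100" 1080p
[/CODE]

That second figure also hints at why the chart's distance axis runs so high: very large 1080p panels only reach 200 ppd from tens of feet away.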
 
Fair enough; not having seen 8K, I can't claim it won't be better. Even after seeing an iPhone 4, I was surprised by how good the new iPad is. The point about content stands, though: it doesn't matter how good the TV is if I'm mostly using it to watch 525p or 720p material.

Is the Y axis on that chart correct? Who sits 60 feet away from their TV? If it is correct, then assuming the viewing distance and screen size stay the same, in 5 years or so I'll get a 4K TV.
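
For what it's worth, running a hypothetical 46" 3840x2160 panel at the same 3 metres through the pixels-per-degree sketch above lands almost exactly on that 200 ppd mark:

[CODE]
import math

# Same geometry as the pixels_per_degree sketch above, applied to a
# hypothetical 46" 3840x2160 panel viewed from 3 metres:
width_in = 46 * 3840 / math.hypot(3840, 2160)
pitch_in = width_in / 3840
ppd = 1.0 / math.degrees(math.atan(pitch_in / (3.0 / 0.0254)))
print(round(ppd))  # -> 197, essentially the 200 ppd target
[/CODE]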
 