The rMBP is using LPDDR3 and you can't get more than 16GB.
Going with a non-LP solution would increase power draw dramatically and require extra room.
About how much additional power are we talking?
> Wall mount that thing.. not that this is even an option for the majority of usages, but if it is an option for you, consider doing it that way, as it will probably look pretty OK then.. (assuming it's mountable in the first place)

I've looked at mounts. It isn't just the stand but the appalling plastic enclosure. Apple ditched acrylic plastics back in the early 2000s, and we've enjoyed a decade or more of sleek aluminum since. My point is that if Apple and LG really were working closely on this to essentially make an "Apple/Mac edition", it could have been a lot more tantalizing: a sleek aluminum enclosure with a matte screen, something the pros have been crying out for since the first LED Cinema Display in 2008.
As per the other thread, I am not sure there is a saving between LPDDR3 and DDR4, as they both run at 1.2 V. There is one between DDR3 and LPDDR3, though (obviously).
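To put the voltage point above in rough numbers, here is a minimal sketch. The 1.5 V / 1.35 V / 1.2 V rails are the usual JEDEC nominal values, not figures from the thread, and it only models first-order dynamic (V-squared) scaling; the standby/self-refresh savings where LPDDR3 really shines are left out.

```python
# Rough, first-order comparison of the DRAM rail voltages discussed above.
# Dynamic (switching) power scales roughly with V^2 at a fixed frequency, so
# this only shows relative scaling; real LPDDR3 savings come largely from
# much lower standby/self-refresh power, which is NOT modelled here.

VOLTS = {          # nominal rail voltages (JEDEC nominal values, assumed)
    "DDR3":   1.50,
    "DDR3L":  1.35,
    "LPDDR3": 1.20,
    "DDR4":   1.20,
}

def relative_dynamic_power(v: float, v_ref: float = 1.50) -> float:
    """Dynamic power relative to a 1.5 V DDR3 baseline, assuming P scales with V^2."""
    return (v / v_ref) ** 2

for name, volts in VOLTS.items():
    print(f"{name:7s} {volts:.2f} V  ->  {relative_dynamic_power(volts):.0%} of DDR3 dynamic power")

# LPDDR3 and DDR4 both land around 64% of DDR3's dynamic power, which matches
# the post above: no rail-voltage saving between LPDDR3 and DDR4, but a real
# one versus plain DDR3.
```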
The iMacs have had an ugly chin since the half-a-soccer-ball iMac. Now the monitors have a forehead. A wall mount does not help against the ugliest aspect of the monitor, the top bezel; that top bezel alone is the reason I could not have this monitor sitting in front of my face.
THIS. I've said it before, but I think it bears repeating:

> I've looked at mounts. It isn't just the stand but the appalling plastic enclosure. ...

Overall it seems half-baked, in my opinion. Furthermore, it seems thicker than the Thunderbolt Display (similar to how the iMac bulges out at the rear center). I'm personally holding off to see what other third parties will offer in the coming year.
> A wall mount does not help against the ugliest aspect of the monitor, the top bezel ...

Yeah, I don't think it will make it look amazing.. just better.
I am unclear why Apple wouldn't make its own standalone display, especially since the iMacs and the rMBPs are essentially devices built around a display. Their entire lineup, barring the Mac mini and the Mac Pro, involves a display.
The standalone TBD can also act as a powered USB/TB hub.
> Can the new Radeon Pro be an indication of a new Mac Pro release? They were shown off several months ago and the first estimate was an early 2017 release date. Now, though, they seem to be here: "Currently the 7100 and 4100 are expected after the 10th while the 5100 should show up on the 18th." https://www.pcper.com/news/Graphics-Cards/AMD-Releases-New-Generation-Radeon-Pro-Workstation-Cards
> Soooo, nothing should really stop Apple from releasing the Mac Pro in November; all the parts are here and readily available, especially for a late-November release. If the Mac Pro isn't buried yet, of course.

Just saw the news on AnandTech and immediately thought of it as well. It's high time, innit.
> Can the new Radeon Pro be an indication of a new Mac Pro release? ...

I wish I could believe that. Of course, anything is possible. What about the processor? What would they use?
> I know specs don't tell the whole story, but they look to be way down on TFLOPS performance vs. the Nvidia Pascal lineup. Color me underwhelmed, but I'll wait to see official benchmarks.

Yes, I agree, but the same thing was true when Apple introduced the original trashcan. At least now, if Apple updates the Mac Pro, they will use fairly new GPUs.
> I wish I could believe that. Of course, anything is possible. What about the processor? What would they use?

They would have to go with the E5 v4: not brand-new CPUs, of course, but at least they were released this year. Or they would have to wait for the v5, which will be released next year, early or middle/late 2017 depending on the platform; that would probably mean about a whole year more of waiting.
> Yes, I agree, but the same thing was true when Apple introduced the original trashcan. ...

> They would have to go with the E5 v4 ...

Sounds more like next year to me. I'm actually okay with the current CPU.
> So, when the 7,1 finally arrives, the new GPUs that it has will be slightly breathed-upon versions of the current D3/5/700 items.
> Serious question: what is it about these cards that makes them unsuitable for gaming, and would you really notice any difference if you went from 1080p @ 30 Hz to 1080p @ 60 Hz?

Try and run any modern game at maxed settings, then get back with me on what's wrong with AMD's cards. Never mind 4K. And never mind GPGPU. AMD have given up on competing at the high end and are instead trying to survive on price alone.
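On the 30 Hz vs 60 Hz part of the quoted question, the gap is easy to put in milliseconds; a trivial sketch (the 120 Hz row is added only for comparison):

```python
# Frame interval at common refresh rates: at 30 Hz a new frame arrives every
# ~33 ms, at 60 Hz every ~16.7 ms, which is why the step up is very noticeable
# for anything interactive (cursor movement, scrolling, games).
for hz in (30, 60, 120):
    print(f"{hz:3d} Hz  ->  {1000 / hz:5.1f} ms per frame")
```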
> So, when the 7,1 finally arrives, the new GPUs that it has will be slightly breathed-upon versions of the current D3/5/700 items. ...

Pricing is wrong all the way. Dual D700s are $1,000. Dual RX 470s, which will be faster and more powerful for compute, are $340.
> Try and run any modern game at maxed settings, then get back with me on what's wrong with AMD's cards. Never mind 4K. And never mind GPGPU. AMD have given up on competing at the high end and are instead trying to survive on price alone. Check out the VR comparisons at Hardocp.com for another eye opener. Or the hashcat.net forums.

Maybe, but they must be better at something. This is what I'm wondering: what are they missing?
> Pricing is wrong all the way. Dual D700s are $1,000. Dual RX 470s, which will be faster and more powerful for compute, are $340.

So the compute metric is what you need to look out for, normally?

$1,000 should buy you 4K-capable GPUs, not GPUs that are basically for 1080p gaming.
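A rough price-per-TFLOP calculation behind the "pricing is wrong" argument. The prices are the ones quoted above; the FP32 TFLOPS figures are the commonly cited peak numbers for each card and are my assumption, not something stated in the thread:

```python
# Rough price-per-TFLOP comparison. Prices come from the thread; the per-card
# peak FP32 TFLOPS values are assumed (commonly cited figures: ~3.5 for a
# FirePro D700, ~4.9 for an RX 470).

cards = {
    # name: (price for a pair in USD, approx FP32 TFLOPS per card)
    "2x D700":   (1000, 3.5),
    "2x RX 470": (340, 4.9),
}

for name, (pair_price, tflops_each) in cards.items():
    total_tflops = 2 * tflops_each
    print(f"{name:10s} ~{total_tflops:4.1f} TFLOPS total  ->  ${pair_price / total_tflops:5.0f} per TFLOP")
```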
The E5 v4s look sweet enough for an update now. They can go for another update later with the next die shrink, whenever it comes to pass.
Just give us some Pascal options, please.
> So the compute metric is what you need to look out for, normally?

In new games based on low-level APIs, that is the metric that reflects the performance of the GPUs. Vulkan, for example: the difference between the GTX 1070 and the RX 480 is about 10% in performance. The compute difference? About 10% as well: 5.8 TFLOPs vs 6.5 TFLOPs.
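For reference, the 5.8 vs 6.5 TFLOPs figures fall straight out of the peak-FP32 formula (shader count x 2 ops per FMA x boost clock). The shader counts and clocks below are the published specs for the two cards, not numbers taken from the thread:

```python
# Peak FP32 throughput = shaders * 2 ops (fused multiply-add) * clock in GHz.
# Shader counts and boost clocks are the published specs for these two cards.

def peak_fp32_tflops(shaders: int, boost_clock_ghz: float) -> float:
    return shaders * 2 * boost_clock_ghz / 1000.0

gtx_1070 = peak_fp32_tflops(1920, 1.683)   # ~6.5 TFLOPS
rx_480   = peak_fp32_tflops(2304, 1.266)   # ~5.8 TFLOPS

print(f"GTX 1070: {gtx_1070:.1f} TFLOPS")
print(f"RX 480:   {rx_480:.1f} TFLOPS")
print(f"Gap:      {100 * (gtx_1070 / rx_480 - 1):.0f}%")   # roughly the ~10% gap mentioned above
```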
> Try and run any modern game at maxed settings, then get back with me on what's wrong with AMD's cards. ... Check out the VR comparisons at Hardocp.com for another eye opener. Or the hashcat.net forums.

That sounds depressing.