wall mount that thing.. not that this is even an option for the majority of uses, but if it is an option for you, consider doing it that way, as it will probably look pretty OK then..
(assuming it's mountable in the first place)
I've looked at mounts. It isn't just the stand but the appalling plastic enclosure. Apple ditched acrylic plastics back in the early 2000s, and we've enjoyed a decade or more of sleek aluminum. My point is, if Apple and LG were really working closely on this to essentially be an "Apple/Mac edition," it could have been a lot more tantalizing: a sleek aluminum enclosure with a matte screen, something the pros have been crying out for since the release of the first LED Cinema Display in 2008.

Overall it seems half-baked, in my opinion. Furthermore, it seems thicker than the Thunderbolt Display (similar to how the iMac bulges out at the rear center). I'm personally holding off to see what other third parties will offer in the coming year.
 
As per the other thread, I am not sure there is a saving between LPDDR3 and DDR4, as they both run at 1.2 V. There is one between DDR3 and LPDDR3, though (obviously).
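The LPDDR3-vs-DDR4 point can be sanity-checked with the standard CMOS dynamic-power relation, P ≈ C·V²·f: at the same supply voltage, the V² term predicts no saving. A minimal sketch (the 1.5 V DDR3 and 1.2 V LPDDR3/DDR4 figures are the nominal JEDEC supply voltages; capacitance and frequency are held equal purely for illustration):

```python
# Rough CMOS dynamic-power scaling: P ~ C * V^2 * f.
# With C and f held equal, only the V^2 term differs between memory types.
def relative_dynamic_power(v1, v2):
    """Ratio of dynamic power at supply voltage v1 vs v2, same C and f."""
    return (v1 / v2) ** 2

# LPDDR3 vs DDR4: both 1.2 V, so the voltage term gives no saving either way.
print(relative_dynamic_power(1.2, 1.2))            # 1.0
# DDR3 (1.5 V) vs LPDDR3 (1.2 V): ~56% more dynamic power from voltage alone.
print(round(relative_dynamic_power(1.5, 1.2), 2))  # 1.56
```

This ignores standby/refresh power, where LPDDR parts do differ, so it is only the dynamic-power half of the story.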


That's what I thought as well, but I keep hearing people mention the power savings, and I haven't seen any actual test of it myself. I'd love to see some data.
 
A wall mount does not help against the ugliest aspect of the monitor: the top bezel. That top bezel alone is the reason I could not have the monitor sitting in front of my face ;)
The iMacs have had an ugly chin since the half-a-soccer-ball iMac. Now the monitors have a forehead. ;)
 
I've looked at mounts. It isn't just the stand but the appalling plastic enclosure. Apple ditched acrylic plastics back in the early 2000s, and we've enjoyed a decade or more of sleek aluminum. My point is, if Apple and LG were really working closely on this to essentially be an "Apple/Mac edition," it could have been a lot more tantalizing: a sleek aluminum enclosure with a matte screen, something the pros have been crying out for since the release of the first LED Cinema Display in 2008.

Overall it seems half-baked, in my opinion. Furthermore, it seems thicker than the Thunderbolt Display (similar to how the iMac bulges out at the rear center). I'm personally holding off to see what other third parties will offer in the coming year.
THIS. I've said it before but I think it bears repeating.

Apple's first Acrylic Cinema Displays were nothing short of breathtaking.

Then the Aluminum (Aluminium) versions... holy cow!!! Truly lustworthy, and people snatched them up left and right just to have a matching monitor. They also integrated nicely into the ecosystem:

- adjust brightness easily via keyboard or monitor
- sleep or wake your machine via a touch on your monitor
- FireWire ports
- matched the iSight nicely

This lazy-ass **** with LG is deplorable, and Apple ought to be ashamed of themselves. When they're making that much money, adding some nice (yet functional and thoughtful) touches for the sake of it goes a long, long way. Individuals and corporations alike will spend a bit more for one-stop shopping and slick integration.

What we're seeing now is crap.

I used to say that Bang & Olufsen should have taken a page or two from Apple's playbook, but now I think the pendulum's swinging the other way.
 
A wall mount does not help against the ugliest aspect of the monitor: the top bezel. That top bezel alone is the reason I could not have the monitor sitting in front of my face ;)
yeah, I don't think it will make it look amazing.. just better.

idk, I have a Samsung display that I use as a second monitor with an iMac.. it looks sort of like the LG:

[attached image: sung.png]

...albeit, this thing has a worse pedestal imo.. getting it onto the wall (with a wide range of movement) made it look a lot better as well as enhanced the usability..

just an idea was all.
 
I am unclear why Apple wouldn't make its own standalone display, especially since the iMacs and rMBPs are essentially devices with built-in displays. Their entire lineup, barring the Mac mini and Mac Pro, involves a display.

The standalone TBD can also act as a powered USB/TB hub.
 
I am unclear why Apple wouldn't make its own standalone display, especially since the iMacs and rMBPs are essentially devices with built-in displays. Their entire lineup, barring the Mac mini and Mac Pro, involves a display.

The standalone TBD can also act as a powered USB/TB hub.

Apple used to make printers, too. I think the display business has evolved to the point that there's so much competition, and very little to differentiate new Apple monitors from it. Weren't the old displays a response to a lack of high-quality monitors up to Apple's standard? Maybe now the landscape is different. They saved themselves the manufacturing cost of a bespoke aluminum chassis while filling the need for TB3 monitors that fit the same profile as their iMacs (Retina, P3). Personally, when looking for quality monitors I settle for the ugly but proficient kind, from EIZO or NEC.

I have read (from https://twitter.com/ATP_Tipster1 ) that the internals of these monitors are probably Apple-designed, and that they were meant as eGPU displays at first, before that was nixed. That's why there's eGPU support in the software, at the moment anyway. An eGPU monitor would have been a coup for Apple, I think, but since these displays are already so expensive, it might not have been financially feasible to ask people to buy 50% of an iMac to go along with their new 25%-more-expensive MacBook Pro.
 
Can the new Radeon Pros be an indication of a new Mac Pro release? They were shown off several months ago, and the first estimate was an early 2017 release date. Now, though, they seem to be here: "Currently the 7100 and 4100 are expected after the 10th while the 5100 should show up on the 18th." https://www.pcper.com/news/Graphics-Cards/AMD-Releases-New-Generation-Radeon-Pro-Workstation-Cards

Soooo, nothing should really stop Apple from releasing the Mac Pro in November; all the parts are here and readily available, especially for a late-November release. If the Mac Pro isn't buried yet, of course.
 
Can the new Radeon Pros be an indication of a new Mac Pro release? They were shown off several months ago, and the first estimate was an early 2017 release date. Now, though, they seem to be here: "Currently the 7100 and 4100 are expected after the 10th while the 5100 should show up on the 18th." https://www.pcper.com/news/Graphics-Cards/AMD-Releases-New-Generation-Radeon-Pro-Workstation-Cards

Soooo, nothing should really stop Apple from releasing the Mac Pro in November; all the parts are here and readily available, especially for a late-November release. If the Mac Pro isn't buried yet, of course.
Just saw the news on AnandTech and immediately thought of it as well. It's high time, innit.
 
I know specs don't tell the whole story, but they look to be way down on TFLOPS performance vs. Nvidia's Pascal lineup. Color me underwhelmed, but I'll wait to see official benchmarks.
 
Can the new Radeon Pros be an indication of a new Mac Pro release? They were shown off several months ago, and the first estimate was an early 2017 release date. Now, though, they seem to be here: "Currently the 7100 and 4100 are expected after the 10th while the 5100 should show up on the 18th." https://www.pcper.com/news/Graphics-Cards/AMD-Releases-New-Generation-Radeon-Pro-Workstation-Cards

Soooo, nothing should really stop Apple from releasing the Mac Pro in November; all the parts are here and readily available, especially for a late-November release. If the Mac Pro isn't buried yet, of course.
I wish I could believe that. Of course, anything is possible. What about the processor? What would they use?
 
I know specs don't tell the whole story, but they look to be way down on TFLOPS performance vs. Nvidia's Pascal lineup. Color me underwhelmed, but I'll wait to see official benchmarks.
Yes, I agree, but the same thing was true when Apple introduced the original trashcan. At least now, if Apple updates the Mac Pro, they will use fairly new GPUs.

I wish I could believe that. Of course, anything is possible. What about the processor? What would they use?
They would have to go with the E5 v4; not brand-new CPUs, of course, but at least they were released this year ;) Or they would have to wait for the v5, which will be released next year, early or mid-to-late 2017 depending on the platform, and that would probably mean another whole year of waiting.
 
Yes, I agree, but the same thing was true when Apple introduced the original trashcan. At least now, if Apple updates the Mac Pro, they will use fairly new GPUs.


They would have to go with the E5 v4; not brand-new CPUs, of course, but at least they were released this year ;) Or they would have to wait for the v5, which will be released next year, early or mid-to-late 2017 depending on the platform, and that would probably mean another whole year of waiting.
Sounds more like a next-year mark to me. I'm actually okay with the current CPU.
 
The E5 v4s look sweet enough for an update now. They can go for another update later with the next die shrink, whenever it comes to pass.

Just give us some Pascal options, please.
 
So, when the 7,1 finally arrives, the new GPUs it has will be slightly breathed-upon versions of the current D300/D500/D700 parts.

Serious question: what is it about these cards that makes them unsuitable for gaming, and would you really notice any difference if you went from 1080p @ 30 Hz to 1080p @ 60 Hz?
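For the refresh-rate half of the question, the arithmetic alone is telling: at 30 Hz each frame sits on screen twice as long as at 60 Hz, which is very noticeable for cursor movement, scrolling, and anything interactive. A quick sketch:

```python
# Frame time at a given refresh rate, in milliseconds.
def frame_time_ms(hz):
    return 1000.0 / hz

print(round(frame_time_ms(30), 1))  # 33.3 ms per frame at 30 Hz
print(round(frame_time_ms(60), 1))  # 16.7 ms per frame at 60 Hz
```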
 
So, when the 7,1 finally arrives, the new GPUs it has will be slightly breathed-upon versions of the current D300/D500/D700 parts.

Serious question: what is it about these cards that makes them unsuitable for gaming, and would you really notice any difference if you went from 1080p @ 30 Hz to 1080p @ 60 Hz?
Try running any modern game at maxed settings, then get back to me on what's wrong with AMD's cards. Never mind 4K. And never mind GPGPU. AMD has given up on competing at the high end and is instead trying to survive on price alone.

Check out the VR comparisons at HardOCP.com for another eye-opener. Or the hashcat.net forums.
 
So, when the 7,1 finally arrives, the new GPUs it has will be slightly breathed-upon versions of the current D300/D500/D700 parts.

Serious question: what is it about these cards that makes them unsuitable for gaming, and would you really notice any difference if you went from 1080p @ 30 Hz to 1080p @ 60 Hz?
The pricing is wrong all the way. Dual D700s are $1,000. Dual RX 470s, which will be faster and more powerful for compute, are $340.

For $1,000 you should get 4K GPUs, not what are basically 1080p gaming GPUs.
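As a rough check of that price/performance claim (the per-card TFLOPS figures below are approximate public FP32 numbers, not official specs, and the prices are the ones quoted above):

```python
# Hypothetical price/performance comparison; TFLOPS values are approximate
# FP32 figures (D700 ~3.5 TFLOPS each, RX 470 ~4.9 TFLOPS each).
def tflops_per_dollar(tflops, price):
    return tflops / price

dual_d700  = tflops_per_dollar(2 * 3.5, 1000)  # dual D700 pair at $1,000
dual_rx470 = tflops_per_dollar(2 * 4.9, 340)   # dual RX 470 pair at ~$340

print(round(dual_d700, 4))   # 0.007 TFLOPS per dollar
print(round(dual_rx470, 4))  # 0.0288 TFLOPS per dollar, roughly 4x better
```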
 
Try running any modern game at maxed settings, then get back to me on what's wrong with AMD's cards. Never mind 4K. And never mind GPGPU. AMD has given up on competing at the high end and is instead trying to survive on price alone.

Check out the VR comparisons at HardOCP.com for another eye-opener. Or the hashcat.net forums.
Maybe, but they must be better at something. That's what I'm wondering: what are they missing?
The pricing is wrong all the way. Dual D700s are $1,000. Dual RX 470s, which will be faster and more powerful for compute, are $340.

For $1,000 you should get 4K GPUs, not what are basically 1080p gaming GPUs.
So the compute metric is what you normally need to look out for?
 
The E5 v4s look sweet enough for an update now. They can go for another update later with the next die shrink, whenever it comes to pass.

In the E5 (and equivalent) series there is no die shrink coming before 2018, maybe 2019. The E5 v5 (Skylake baseline arch) should come in 2017. If Intel does a Kaby Lake- or Coffee Lake-like update for v6, it will be another 14nm optimization (maybe 2018, probably early 2019). You'd probably have to get to v7 for a die shrink, and that would probably come with yet another socket change. Intel's schedule isn't simply "tick/tock" anymore.


The Mac Pro 2013 jumped on the back end of the "shrink" cycle. This three-year wait means jumping once again onto the back end of a shrink cycle. If they wait for another one, it would likely be an even longer wait.


Just give us some Pascal options, please.

That probably has as much to do with what Nvidia is willing to do as with what Apple will do.
 
So the compute metric is what you normally need to look out for?
In new games based on low-level APIs, yes, compute is the metric that reflects GPU performance. In Vulkan, for example, the performance difference between the GTX 1070 and RX 480 is about 10%. The compute difference? About the same: 5.8 TFLOPS vs 6.5 TFLOPS.
Call of Duty: Infinite Warfare shows a 12-16% performance difference between the GPUs (and that is still a DX11 game).
https://www.computerbase.de/2016-11...rk/2/#diagramm-cod-infinite-warfare-1920-1080
What is also apparent, again, is that with each new game/driver release the AMD GPUs are getting better. The RX 470 is right now on the same level as the GTX 1060 6 GB and R9 390.

The improved geometry performance in AMD GPUs turns out to be a good direction for GCN, after all.
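The roughly 10% compute gap quoted above can be checked directly (using the commonly cited FP32 figures, ~6.5 TFLOPS for the GTX 1070 and ~5.8 TFLOPS for the RX 480; the exact gap comes out closer to 12%):

```python
# Percentage by which one throughput figure exceeds another.
def pct_diff(a, b):
    """Percent by which a exceeds b."""
    return (a - b) / b * 100

# GTX 1070 (~6.5 TFLOPS FP32) vs RX 480 (~5.8 TFLOPS FP32)
print(round(pct_diff(6.5, 5.8), 1))  # 12.1
```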
 
Try running any modern game at maxed settings, then get back to me on what's wrong with AMD's cards. Never mind 4K. And never mind GPGPU. AMD has given up on competing at the high end and is instead trying to survive on price alone.

Check out the VR comparisons at HardOCP.com for another eye-opener. Or the hashcat.net forums.
That sounds depressing.
 