So, finally unboxing! My Geekbench for the 10-core is ~38,000 as expected and OpenCL is 190,500; Cinebench R20 gives me 4,535. I don't know what that means since I'm new to R20 and can't find R15. I will be doing some 4K editing tests and posting back.
But WoooW!! This thing is extremely fast. I can't imagine what the Mac Pro could be like (I am coming from a 2016 MBP).
 

So that's roughly a 10% faster GPU for the Vega 64X vs. the Vega 64, a bit more than a GTX 1080, which is 2.5 years old, costs about half as much, and was made for 1080p. That means Apple needs 4-6 times the GPU power to drive the new 6K screen reasonably if they want to call the machine "Pro", meaning at least an RTX 2080 Ti in SLI.
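Just to put the resolution argument into rough numbers, here's a quick back-of-envelope pixel count comparison in Python. The 6K resolution (6016x3384) is only my assumption about the upcoming display, and GPU load doesn't necessarily scale linearly with pixel count, so treat this as a ballpark only:

# Rough pixels-per-frame comparison; the 6K figure (6016x3384) is an assumption.
resolutions = {
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "5K": (5120, 2880),
    "6K (assumed)": (6016, 3384),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:13s} {pixels / 1e6:5.1f} MP  (~{pixels / base:.1f}x 1080p)")

Under that assumption a 6K frame is close to ten times the pixels of 1080p, so native-resolution gaming at 6K really is a very different load from desktop use.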
 
Another update, with Cinebench R15: OpenGL 134 fps and a CPU score of 2050.

So that's roughly a 10% faster GPU for the Vega 64X vs. the Vega 64... Apple needs 4-6 times the GPU power to drive the new 6K screen reasonably... meaning at least an RTX 2080 Ti in SLI.
I guess Apple upgraded the GPU to the 64X so the iMac Pro can drive the new display; it doesn't make sense to have a display that works only with the Mac Pro, that's not Apple's style anyway.
 
So that's roughly a 10% faster GPU for the Vega 64X vs. the Vega 64... Apple needs 4-6 times the GPU power to drive the new 6K screen reasonably... meaning at least an RTX 2080 Ti in SLI.
If you think about gaming at full native resolution, then yes, a 2080 Ti SLI setup would suit the needs. And Apple's prices for GPU power are ridiculous, agreed.
But for all other purposes a Vega would be just fine. Remember that they are already driving the built-in 5K panel plus a second and third monitor at 4K/5K resolution.

Talking about pricing, I find the upgrade prices a bit strange (rough cost per percent below):
580X to Vega 48, about 10% performance gain: €540
Vega 56 to Vega 64, 10%: €660
Vega 64 to Vega 64X, 10%: €180 <- best buy :-D
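Quick cost-per-percent check on those steps, taking the quoted prices and the rough ~10% gains above at face value (the 10% figures are approximate, so this is only indicative):

# Euros per percent of claimed GPU performance gain for each upgrade step.
# Prices and the ~10% gain figures come from the list above and are approximate.
upgrades = [
    ("580X -> Vega 48", 540, 10),
    ("Vega 56 -> Vega 64", 660, 10),
    ("Vega 64 -> Vega 64X", 180, 10),
]
for step, price_eur, gain_pct in upgrades:
    print(f"{step:20s} ~{price_eur / gain_pct:3.0f} EUR per %")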
 
So that's roughly a 10% faster GPU for the Vega 64X vs. the Vega 64... Apple needs 4-6 times the GPU power to drive the new 6K screen reasonably... meaning at least an RTX 2080 Ti in SLI.

Any GPU with DP1.3 / DP1.4 / HDMI 2.0 can drive a 6K display.

Are you thinking about running games at 6K or something?
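For anyone curious about the link side of that, here's a rough uncompressed-bandwidth estimate. It ignores blanking overhead, DSC and chroma subsampling, and the 6K resolution is my assumption, so take it as a ballpark only:

# Raw uncompressed video data rate at 60 Hz with 8 bits per colour channel.
# Ignores blanking and Display Stream Compression; the 6K resolution is assumed.
def raw_gbit_per_s(width, height, refresh_hz=60, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

for name, (w, h) in [("4K UHD", (3840, 2160)), ("5K", (5120, 2880)), ("6K (assumed)", (6016, 3384))]:
    print(f"{name:13s} ~{raw_gbit_per_s(w, h):.1f} Gbit/s at 60 Hz")

For comparison, DP 1.3/1.4 HBR3 carries roughly 25.9 Gbit/s of payload and HDMI 2.0 roughly 14.4 Gbit/s, so how comfortably a given link handles 6K at 60 Hz depends on bit depth, compression and whether multiple streams are used.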
 
Any GPU with DP1.3 / DP1.4 / HDMI 2.0 can drive a 6K display... Are you thinking about running games at 6K or something?
Of course; there used to be a time when native resolution was on par with GPU performance, hehe.
 
If you ordered the X and didn’t get it, contact Apple within 14 days.

I like the AIO design of the iMac and won't need more than what the iMac can offer today and for at least 3-4 years.
I will be surprised if I outgrow mine in 3-4 years, but it could happen.

The LG 5K 27” monitor is nice but $100 more gets you this new 34” UltraWide 5K (huba huba!). I wonder if Apple will offer it through their store.
https://www.lg.com/us/monitors/lg-34WK95U-W-ultrawide-monitor
I went with a pair of 27” LG 4K monitors instead and didn’t spend the extra $200 each for TB3; I’m using TB3 to DisplayPort instead. $409 with free shipping and no CA sales tax from B&H.
https://www.lg.com/us/monitors/lg-27UK650-W-4k-uhd-led-monitor
Just placed an order for the below. I couldn't decide on the disk space and the processor; I have a NAS and fast external SSD storage, so I went with the 1TB, and the 10-core seems to be the sweet spot. Seeing as I was going for the Vega 64 anyway, I thought I'd spend the extra for the X, no idea if it will be worth it :)
  • 10-core Intel Xeon W processor,
  • 64GB 2666MHz DDR4 ECC memory
  • 1TB SSD storage
  • Radeon Pro Vega 64X with 16GB of HBM2 memory
I was figuring that setup with a 2TB SSD would hit my sweet spot. But when I found a used 14-core with 128GB RAM and a Vega 64 for a little less, I jumped on it instead. I think I’ll be OK.
 
...But when I found a used 14-core with 128GB RAM and a Vega 64 for a little less, I jumped on it instead...

Do you really get much more power out of the 14 cores? Is the 128GB of RAM worth it? In my opinion, if Apple were offering a better price I would have gone that route!
 
There are many AV apps that benefit, and More Cores = Faster = More Better. Although my need for speed vs. my budget dictated 10 cores, I will certainly enjoy the time savings and extra overhead afforded by the extra 4 cores.

Apple’s iMac Pro page has comparisons between the 10 core and 18 core running a few apps. I expect the 14 core to split the difference.

High end AV machines can come with 56 cores, 1TB RAM and 8TB SSD on board (if you want a $150K Win 10 machine). The new Mac Pro is going after that market — about the only thing we really do know about it. Speculation is running high that it will use the new Intel 28 core CPU. We’ll see...
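As a rough illustration of the "more cores = faster, with diminishing returns" point, here's a small Amdahl's-law sketch. The 90% parallel fraction is purely an assumed number for illustration, not a measurement of any particular AV app:

# Amdahl's law: speedup = 1 / ((1 - p) + p / n) for parallel fraction p and n cores.
# p = 0.90 is an illustrative assumption, not a measured value for any real app.
def amdahl_speedup(cores, parallel_fraction=0.90):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (8, 10, 14, 18):
    print(f"{cores:2d} cores -> ~{amdahl_speedup(cores):.1f}x over one core")

Under that assumption the 14-core lands roughly midway between the 10- and 18-core, which lines up with the "split the difference" expectation; renderers with a higher parallel fraction gain more from the extra cores.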
 
...Speculation is running high that it will use the new Intel 28 core CPU. We'll see...

I guess that too. The Mac Pro has to start where the iMac Pro ends, so I guess 14-core/18-core and a high-end 28-core are the likely options, with Radeon VII graphics (16GB HBM2) and maybe more memory.

Is the Xeon chip in the iMac Pro customized for Apple? I guess it has more L3 cache memory?
 
I am looking forward to the prices, and to the "how many PCs can you get for the same performance" comparison. It will be the last big upgrade anyway. In five years cloud computing will be serious enough that by 2025 Apple will just sell terminal screens and performance subscriptions, hehe. No reason to own individual hardware any more when 90% of the time it isn't fully used anyway.
 
Hunh... I happen to know more than a little about that.

In 1996, Microsoft announced that as the future, based on NT. The day after they bought Hotmail, it was ported to NT; three days later, it was ported back to a mainframe. Sun announced that so many times, based on NFS, that people stopped listening, and they released multiple rounds of hardware to support it; look what happened to them.

In the 1950s, IBM speculated that only 3 or 4 computers would ever be built. This is also the idea behind the AS400 (14,000 simultaneous connections! mic drop) and UNIX as envisioned by AT&T in 1968.

So yes, while the idea of dumb terminals for everybody and everything has been in the works for over 60 years, has it happened yet? No. I've worked on dumb terminals in a few industries (including AT&T) and wrote support docs for terminal emulation that are still online. Without going way OT, I had a sales job trying to sell that concept to large corporations (where it is still a very good idea). Heck, it is the underpinnings of all cloud based computing and services so we will see more of it. My day gig could almost be done on a terminal... but not really.

The advent of 5G internet opens up the real possibility again, just as UNIX did 50 years earlier. The reality is that not everyone uses their computers to do the same thing. Dumb terminals for all doesn't work when that's true.

Every time I fire up a DAW or render a video file, I know that dumb terminals for all is just as ridiculous now as it seemed in the early '90s when I was introduced to the concept.
 
So the exchange iMac Pro with the Vega 64X, replacing my wrongly delivered Vega 64 unit, is finally here. I ran the Heaven benchmark on the Extreme preset. GPU temperature went pretty high, between 80 and 90°C, and GPU clock speeds varied between 1.2 and 1.45GHz (maybe throttling? but don't take my word for it). The Vega 64 reached "only" 1.34GHz. I'm not too sure about the Vega 64's temperatures in comparison, but I THINK it stayed under 75°C in this benchmark? Can anybody confirm? So IIRC that's about 20% more heat for 8-10% more performance... but I guess this is the price to pay when you push the clock speeds without changing the thermals.
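Taking those rough observations at face value, a quick percentage check (everything here is based on the eyeballed numbers above, so treat it as back-of-envelope only):

# Back-of-envelope from the rough figures above; none of this is a precise measurement.
vega64x_peak_ghz = 1.45   # observed peak on the Vega 64X in Heaven
vega64_peak_ghz = 1.34    # observed peak on the plain Vega 64
clock_gain_pct = (vega64x_peak_ghz / vega64_peak_ghz - 1) * 100
print(f"Peak clock difference: ~{clock_gain_pct:.0f}%")  # roughly 8%

Dynamic GPU power scales roughly with frequency times voltage squared, so squeezing out that last ~8% of clock usually costs disproportionately more power and heat, which fits the "20% more heat for 8-10% more performance" impression.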
 
So the exchange iMac Pro with the Vega 64X is finally here. I ran the Heaven benchmark on the Extreme preset...
Any fps benchmarks please?
 
In 1996, Microsoft announced...

In the 1950s, IBM speculated...

...I know that dumb terminals for all is just as ridiculous now as it seemed in the early '90s...
I was going to say something similar while adding that I was told by a programming lecturer in the early eighties that programming (as a career) had no future because "by the end of the century computers will program themselves". So here I am, a retired programmer who finds that computers are still programmed in exactly the same way today as they were back then.
 
Another one... (8 core/32GB/1TB SSD/Vega 64X)

DaVinci Resolve Standard Candle Light Test

Vega 64X
66 Nodes - 12 fps
30 Nodes - 26 fps
18 Nodes - 42 fps
9 Nodes - 59 fps ( <- this one seems to be limited to display refresh rate, don't know how to prevent this in Resolve)

For comparison (from here: https://forum.blackmagicdesign.com/viewtopic.php?f=21&t=86127; quick ratio check below):

Radeon VII
66 Nodes - 16 fps
30 Nodes - 35 fps
18 Nodes - 57 fps
9 Nodes - 98 fps
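Putting the two lists side by side as ratios (the 9-node Vega 64X result looks capped at the display refresh rate, so its ratio understates the gap):

# Radeon VII vs Vega 64X fps ratio per node count, copied from the two lists above.
# The 9-node Vega 64X figure appears refresh-rate-capped, so that ratio is not representative.
results = {66: (12, 16), 30: (26, 35), 18: (42, 57), 9: (59, 98)}
for nodes in sorted(results, reverse=True):
    vega64x_fps, radeon_vii_fps = results[nodes]
    print(f"{nodes:2d} nodes: {radeon_vii_fps / vega64x_fps:.2f}x")

So roughly 33-36% faster where neither card hits a cap.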

Radeon VII will be a nice update for the upcoming Mac Pro ;-)

(An interesting finding in this benchmark compared to Heaven: the GPU clock is stable at 1.46GHz (in Heaven it varied), the temperature doesn't rise as fast, and the fans kick in later. Some high-frequency coil whine can be heard here, not too loud (I had much worse coil whine with two Vega 64s for a PC, which I returned), but it's there :-/ )
 
...I was told by a programming lecturer in the early eighties that programming (as a career) had no future because "by the end of the century computers will program themselves"...

he didn't say which century...
 
...I was told by a programming lecturer in the early eighties that programming (as a career) had no future because "by the end of the century computers will program themselves"...
Back in 2003 a lecturer told me that there was no future (or money) in programming, and I listened to her. Same BS, different decade. Never listen to lecturers.
 
I think this has been discussed elsewhere before, but as I read this forum topic and as 2019 passes... does it really make sense for Apple to update the iMac Pro?

Meaning, we have a Mac Pro that is about to drop. And, while it will not fit into a lot of people's workflows, there is no doubt that it can serve the high end (maybe very high end) well.

Assuming Apple will redesign/refresh the "regular" iMac in 2020, is there really a need for the iMac Pro to live on? Assuming Apple uses modern Intel hardware, they can put in four TB3 ports, GigE or 10GbE (just like the 2018 Mac mini option), 128GB of RAM, 4TB of SSD, and somewhere between an eight- and a 12-core CPU. And an updated screen with smaller bezels, of course. You would get Quick Sync and the T2 as well. You would get some version of the Radeon 5700 (so better than the VII).

Of course, the ability to easily upgrade memory would be no more.

Like the "trashcan", I think it should/will be a single release. But in this case, it seems like Apple did not design themselves into a corner--they just knew they needed a cross-over product for the time/transition because of Intel's roadmap woes and because they were not ready to redesign the iMac yet.
 
I read others saying this. What do you base this on, other than assumptions?
Leaks, talks by Apple, etc.?

Only assumptions. It is definitely possible that they will do something like they did with the 2018 Mac mini: not crazy simple, but possible for many people with the right tools and patience, and much easier with that model than with the iMac Pro, for example. I just don't believe it, given how they have proceeded with the bulk of their lineup. And I'm assuming they will want to create a smaller and thinner chassis, with even more adhesive and space limitations.

Only assumptions.
 