As full of it as ever.

What I pointed out was that while you are making fun of people using 10 year old computers,

say what? i absolutely wasn't making fun of people using 10 year old computers..
:rolleyes:


THE MOST POPULAR THREAD on the MP board is the one where people figured out how EASY it is to run Yosemite on a 2006 MP.
i scanned the thread.. it doesn't seem easy.. much of the thread is filled with "i can't get it to work etc".. either that or discussing what's broken after the hack.

the instructions are a 40 step process.. functionality is missing.. buggy performance.. unsupported environment.

you're arguing that apple used to be cool and design hobby friendly machines (cmp) but that's not so.. they turned the thing off with software.. just like they always do.. it's great that someone figured out a way to hack the thing but realize it's definitely a hack.. it's prone to error and it's not easy to do..

if apple were actually cool about it, they'd simply continue to support the older machines.. that's the easy way


You have done nothing but drop layer upon layer of gibberish after this.
i'm arguing to one person about it.. you
are you certain you're not bringing gibberish into the equation? are you one of the upgrade maniacs i mentioned? yes.. are you using a 10 year old computer? no

It IS the most popular thread. There is basically no chance whatsoever that a nMP will even be close to running a current OS in 2023 due to those pre-crippled GPUs.
you're not much of a forward thinker are you?
 
Wait wait wait. Wait. Can I/should I buy a 2013nMP 12GB for $1200 and upgrade the processor later, or will chips for the LGA 2011 Socket not even be in existence in a few years? :eek:
 
I beg to differ.

I'm also curious what you consider an insult. Just because someone takes a different viewpoint than you doesn't mean they have insulted you.

Your claims about Fiji were balderdash and you knew it. You posted links to articles that claimed underclocking didn't reduce power without killing performance, then claimed the articles said the opposite. You were deliberately misleading people who might not follow the link and read the actual conclusion reached by the people who ran the tests.
If you want to know what an insult is:
You were deliberately misleading people who might not follow the link and read the actual conclusion reached by the people who ran the tests.

Who are you to judge if I did something on purpose or not? Who are you to accuse me of being a liar?

Everything I have posted was backed up, even if you want to think otherwise. EVERYTHING. Only you want to see that reducing the power target makes AMD GPUs unusable. Only you want to view the Mac Pro as a ****** machine. Why? I will not go your route and accuse you of things you do or don't do. It is your own business why you do negative PR on this forum about the Mac Pro and AMD.

But.

To all reading this: I'm sorry for my rant, but this has gone too far.
 
Wait wait wait. Wait. Can I/should I buy a 2013nMP 12GB for $1200 and upgrade the processor later, or will chips for the LGA 2011 Socket not even be in existence in a few years? :eek:

how much memory do you have on your macmini?

I want to unload my Sandy Bridge (iMac 2011), but current iMacs do not represent much of an improvement. I work in Logic and would like a computer that will let me use tons of VSTs and tracks for the next 5 years.

hmmm
 
... or will chips for the LGA 2011 Socket not even be in existence in a few years? :eek:

Exist as 'new'? No. Used? There will be boneyard spare parts sold off of systems as they are decommissioned/recycled/etc. for a considerable length of time.

Intel doesn't sell new processors for what went into the 2009-2012 Mac Pro systems anymore. Even "server"/"enterprise" stuff gets end-of-sale/end-of-life after a while: roughly 4 years of sales for the single-processor set-ups, around 6 years for the rest, and several more years of support beyond that. But they are all finite; it isn't "decades".

When Intel completes the tick/tock cycle after 2011 (i.e., this v3/v4 Haswell/Broadwell cycle), the end is coming. By the time they start the next tick/tock cycle, even more so.

Apple coupling the Mac Pro to the tail end of tick/tock cycles doesn't add a lot of value if you are part of the "add a processor later" crowd. By coming in at the second half of the cycle, they have already negated half of that shared family as options; you end up with just the second-half set as even viable options. If you are more part of the "I buy everything used" crowd, then LGA 2011 has "value" because it is old: older means more used offerings.

Intel has "reused" the physical LGA 2011 pins but socket for v3/v4 is pragmatically new because it isn't electrically compatible (nor is it physically notch compatible ).
 
is the mac pro a good computer should i buy one when does the next mac pro come out is there new graphics chips and CPUs available at the moment that would be used in the next mac pro should i wait for the next mac pro to come out also is there going to be a new thunderbolt display
 
Nobody can really say with certainty. All we have are educated guesses.

In answer to your queries:

- The Mac Pro is a good computer, though it depends on what spec you buy for what you need it for. Not every model of Mac Pro will blow an iMac/MacBook Pro configuration out of the water.
- I suspect it will be updated in October, though 1000 other people will have 1000 other suspicions.

TL;DR: buy now if you urgently need it, wait if you can afford to wait.
 
I don't consider myself to be a pedant, but any punctuation at all would be extremely helpful. :D
Answers (as best I can tell), if my count is correct, to 6 questions:
1. Yes
2. Yes, if it fits your budget and your needs.
3. No one knows except Apple.
4. Probably new chips and CPUs
5. Get a computer when you are ready, and when you need it. No reason to wait.
6. I suspect there will be a new thunderbolt display soon, but only Apple knows.
 
is the mac pro a good computer should i buy one when does the next mac pro come out is there new graphics chips and CPUs available at the moment that would be used in the next mac pro should i wait for the next mac pro to come out also is there going to be a new thunderbolt display

Top, top grammar.

Anyway, no one knows.

/thread
 
Wait wait wait. Wait. Can I/should I buy a 2013nMP 12GB for $1200 and upgrade the processor later, or will chips for the LGA 2011 Socket not even be in existence in a few years? :eek:

Yes, the CPUs will still exist: mostly on the used market, but new ones as well.

There are still new-in-box Westmere CPUs available for the 2010/2012 Mac Pro, but they command a huge price premium over used CPUs.
 
EDIT: Whoops, this was a response to another thread. Don't know how I accidentally got it here.

As everyone has already said, only Apple knows at this point. I think there are three possibilities, listed in order of what I think is most likely to least likely:
  1. Apple waits a while longer for Skylake/TB3 and we see it next year Q1-Q2.
  2. Apple updates it late this year, but it's just a spec bump.
  3. Apple keeps the MP on the backburner and skips another Intel generation.
 
http://www.pcper.com/files/review/2015-08-16/ashes-5960x.png
http://www.extremetech.com/gaming/2...he-singularity-amd-and-nvidia-go-head-to-head
http://www.computerbase.de/2015-08/...diagramm-ashes-of-the-singularity-3840-x-2160

And some people did not believe me when I said that the gaming performance of AMD cards was down to DirectX 11...

One more thing. https://pbs.twimg.com/media/CBBu9COWwAAPzZB.jpg:large
There are entire blocks of API function calls and documentation copied verbatim into DX12 and Vulkan. And yes, DirectX 12 development started in 2011. But it was mostly influenced in the last stage by AMD and an EA engineer who came to AMD with the low-level idea.

What is staggeringly consistent is that at higher levels of detail, the performance of Nvidia GPUs erodes compared to even DirectX 11.

Asynchronous shader capabilities play a major role here. It is perfectly in line with what we have seen in the Ryse: Son of Rome benchmarks comparing DirectX 12 performance to DirectX 11. However, there was no regression in 980 Ti performance there.
How useful is the benchmark?

It should not be assumed that because the game is not yet publicly out, it's not a legitimate test. While there are still optimizations to be had, Ashes of the Singularity in its pre-beta stage is as optimized as – or more than – most released games. What's the point of optimizing code 6 months after a title is released, after all? Certainly, things will change a bit before release. But PC games with digital updates are always changing; we certainly won't hold back from making big changes post-launch if we feel it makes the game better!

DirectX 11 vs. DirectX 12 performance

There may also be some cases where D3D11 is faster than D3D12 (it should be a relatively small amount). This may happen under lower CPU load conditions and does not surprise us. First, D3D11 has 5 years of optimizations where D3D12 is brand new. Second, D3D11 has more opportunities for driver intervention. The problem with this driver intervention is that it comes at the cost of extra CPU overhead, and can only be done by the hardware vendor’s driver teams. On a closed system, this may not be the best choice if you’re burning more power on the CPU to make the GPU faster. It can also lead to instability or visual corruption if the hardware vendor does not keep their optimizations in sync with a game’s updates.

While Oxide is showing off D3D12 support, Oxide is also very proud of its DX11 engine. As a team, we were one of the first groups to use DX11, during Sid Meier's Civilization V, so we've been using it longer than almost anyone and know exactly how to get the most performance out of it. However, it took 3 engines and 6 years to get to this point. We believe that Nitrous is one of the fastest, if not the fastest, DX11 engines ever made.

It would have been easy to engineer a game or benchmark that showed D3D12 simply destroying D3D11 in terms of performance, but the truth is that not all players will have access to D3D12, and this benchmark is about yielding real data so that the industry as a whole can learn. We've worked tirelessly over the last years with the IHVs and quite literally seen D3D11 performance more than double in just a few years' time. If you happen to have an older driver lying around, you'll see just that. Still, despite these huge gains in recent years, we're just about out of runway.

Unfortunately, our data is telling us that we are near the absolute limit of what it can do. What we are finding is that if the total dispatch overhead can fit within a single thread, D3D11 performance is solid. But eventually, one core is not enough to handle the rendering. Once that core is saturated, we get no more performance. Unfortunately, the threading constructs in D3D11 turned out not to be viable. Thus, if we want to get beyond 4-core utilization, D3D12 is critical.
The bolded fragment pretty much says why we see a decrease in performance on DirectX 12.
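To make that single-core bottleneck concrete, here is a minimal C++/D3D12 sketch (my own illustration, not Oxide's code; RecordInParallel is a made-up name): each worker thread records its own command list, so draw-dispatch overhead spreads across cores instead of funnelling through D3D11's single immediate context.

```cpp
// Minimal sketch (not Oxide's code): why D3D12 can scale past one core.
// In D3D11, all draws funnel through a single immediate context, so one
// saturated core caps throughput. In D3D12, each worker thread records
// its own command list; only the final queue submission is serialized.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordInParallel(ID3D12Device* device, ID3D12CommandQueue* queue,
                      unsigned workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < workerCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Each thread records its draw calls independently: no shared
        // immediate context, so dispatch overhead spreads across cores.
        workers.emplace_back([list = lists[i].Get()] {
            // ... record state changes and draw calls here ...
            list->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submission is the only serialized step.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```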
 
The most surprising thing to me about the DirectX 12 benchmarks is how bad AMD's DirectX 11 drivers seem to be. Now it makes sense why AMD was pushing Mantle/DirectX 12 so hard. It's not surprising that Nvidia has the same or slightly worse performance, as DirectX 12 is brand new and could probably use some optimizing. Also keep in mind this is a single game that seems more like a tech demo for DirectX 12. I am guessing the difference in more mainstream games will be less pronounced.

I wonder how this translates over to OS X. What kind of speed increase will there be between OpenGL and Metal apps?
 
how much memory do you have on your macmini?

I want to unload my Sandy Bridge (iMac 2011), but current iMacs do not represent much of an improvement. I work in Logic and would like a computer that will let me use tons of VSTs and tracks for the next 5 years.

hmmm

Oh, my 2012 Mini i7 has 16GB RAM and 2 Angelbird 512 SSDs. I was asking about a 2013 Mac Pro quad-core, 12GB RAM, 256 flash SSD, ~$1200, allegedly new-in-box. Seems TGTBT. Whaddya think, folks?

Or do I need something here: (check it out) http://www.pro-tools-pc.com/
 
http://www.pcper.com/files/review/2015-08-16/ashes-5960x.png
http://www.extremetech.com/gaming/2...he-singularity-amd-and-nvidia-go-head-to-head
http://www.computerbase.de/2015-08/...diagramm-ashes-of-the-singularity-3840-x-2160

And some people did not believe me when I said that the gaming performance of AMD cards was down to DirectX 11...

One more thing. https://pbs.twimg.com/media/CBBu9COWwAAPzZB.jpg:large
There are entire blocks of API function calls and documentation copied verbatim into DX12 and Vulkan. And yes, DirectX 12 development started in 2011. But it was mostly influenced in the last stage by AMD and an EA engineer who came to AMD with the low-level idea.

What is staggeringly consistent is that at higher levels of detail, the performance of Nvidia GPUs erodes compared to even DirectX 11.

Asynchronous shader capabilities play a major role here. It is perfectly in line with what we have seen in the Ryse: Son of Rome benchmarks comparing DirectX 12 performance to DirectX 11. However, there was no regression in 980 Ti performance there.
The bolded fragment pretty much says why we see a decrease in performance on DirectX 12.

All these AMD posts just go to show how far behind and crappy AMD cards are. Why would people not spend the $100 more and get an Nvidia? That much power used for less performance. Apple needs to wake up, and Nvidia needs to not be aholes...
 
Oh, my 2012 Mini i7 has 16GB RAM and 2 Angelbird 512 SSDs. I was asking about a 2013 Mac Pro quad-core, 12GB RAM, 256 flash SSD, ~$1200, allegedly new-in-box. Seems TGTBT. Whaddya think, folks?

Or do I need something here: (check it out) http://www.pro-tools-pc.com/

That's undoubtedly TGTBT. On that link: I still refuse to run a Windows OS, but to each his own.

No doubt about it, though. If you do not need Thunderbolt for audio work, the cheese grater is absolutely the way to go. Get a used 6-core and upgrade to 8-core = win.
 
I wonder how this translates over to OS X. What kind of speed increase will there be between OpenGL and Metal apps?

You won't see a speed increase due to Metal in a lot of cases.

You'll be lucky if existing things run in Metal at all.
 
you get it that we're tucked away in a back corner of the internet talking about a bunch of crap that has no real impact on life (or no real impact on getting work accomplished on a freaking computer, for that matter)..
right?

is 'winning' an argument around here something really worth striving for?

Well said! They also don't take into account the history of the back-and-forth between you and MVC. There are many reasons to post on forums; not everyone is here to seek knowledge or answer questions. I will chime in on an occasional 3,1 post or post an anti-nMP picture in jest. For me it is a site purely for entertainment. Who said you can't play games on a Mac? :p
 
http://forums.anandtech.com/showpost.php?p=37637004&postcount=21

Pretty good understanding of those benchmarks posted here.

Edit:
ExtremeTech Conclusions said:
As things stand right now, AMD showcases the kind of performance that DirectX 12 can deliver over DirectX 11, and Nvidia offers more consistent performance between the two APIs. Nvidia’s strong performance in DX11, however, is overshadowed by negative scaling in DirectX 12 and the complete non-existence of any MSAA bug. Given this, it’s hard not to think that Nvidia’s strenuous objections to Ashes had more to do with its decision to focus on DX11 performance over DX12 or its hardware’s lackluster performance when running in that API.
Nvidia said:
Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC.
http://blogs.nvidia.com/blog/2014/03/20/directx-12/#sthash.Lg1CBec0.dpuf
 
That's undoubtedly TGTBT. On that link: I still refuse to run a Windows OS, but to each his own.

No doubt about it, though. If you do not need Thunderbolt for audio work, the cheese grater is absolutely the way to go. Get a used 6-core and upgrade to 8-core = win.

OMG how I wish I could afford a ProTools|HDX system with the Focusrite RedNet. The tech out there now is just insane if one has the dough. Total gear-porn. https://www.avid.com/US/products/Pro-Tools-HDX/features#Comparesystems http://us.focusrite.com/rednet-protoolshd

Is the 2010 okay? One reason I love the 6,2 Mini is that it's dead quiet, so it can live in my studio with its absurdly low noise floor. A 2013 MP could too, FWIU. A 2010 or 2012 cheese-grater deluxe would have to live in the basement/machine room and I'd have to run a KVM and a TRS snake up through a hole in the floor, but I could jam at least 3 SSDs into it and add an (HDMI? card) and a USB3 card. But every problem has a "compromise solution" when you can't just shovel tons of money at it. And yeah, I'm that guy who vacuums out his Mini about once a month: vacuum on one side, hurricane lens blower on the other, trying to eke out a little extra turbo time through temperature control.
 
http://www.pcper.com/files/review/2015-08-16/ashes-gtx980.png

One more benchmark. So at low settings we see an increase in performance on the GTX 980 in DirectX 12, and at high settings we see a decrease. It is not due to drivers in this case, but rather to how good your hardware is at asynchronous work, especially if the engine is designed specifically for that, which AoS is.

What makes me sad, on the other hand, is that current hardware is already outdated for DX12, even if we think about AMD GPUs. Maxwell was designed to be potentially the best for DX11; in DX12 it is not that efficient. AMD starts to fly here, but I have a feeling that the current gen will hit some sort of design wall soon. And that is where GCN 2.0 comes to life.
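For anyone wondering what "asynchronous capabilities" means at the API level, here is a hypothetical C++/D3D12 sketch (my illustration, not from the benchmarks; MakeAsyncComputeQueue is a made-up name): async compute is just work submitted on a separate COMPUTE queue that hardware schedulers like GCN's ACEs can overlap with the graphics queue.

```cpp
// Minimal sketch (illustrative, not from the AoS benchmark): in D3D12,
// "async shaders" means a dedicated COMPUTE queue whose work can overlap
// the graphics (DIRECT) queue. GCN's hardware schedulers (ACEs) execute
// both concurrently; hardware that time-slices them gains little.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> MakeAsyncComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type  = D3D12_COMMAND_LIST_TYPE_COMPUTE; // separate from DIRECT
    desc.Flags = D3D12_COMMAND_QUEUE_FLAG_NONE;

    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    // Compute lists submitted here may run concurrently with rendering;
    // when the graphics pass consumes the compute results, a fence
    // (ID3D12Fence + Signal/Wait on each queue) orders the two queues.
    return computeQueue;
}
```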
 
People need to understand that x86 CPUs are not the future; GPGPUs are the future, and that's what they're designing for. We're not quite there yet, but it really sucks that Intel is basically kneecapping them right when they need it least. I miss the days when Intel and AMD actually competed...


Newbie? I've been here for years; sure, I may not have logged on in a while.

A few months ago, talking to Apple support, I mentioned I'm an EE (electrical engineering, with an emphasis in micro-photolithography: making CPUs, components, and analog devices in Boston; they make the AUDIO CHIPS FOR APOLLO UAD). What I don't get is: where is all the outrage?

Pre-iPhone, Apple was a guy's store; now it's a freaking JC Payless: baby strollers, moms. Yeah, you can come in, just show me how Compressor works, how color correction works, and either Logic or FCP. Just open it and do one thing, lol, just kidding. But forgetting about the pros and leaving us behind?

Anyway, the delays were due to Apple trying to make the FPU faster through the GPU. It never happened, and that's why we have the 14/18-core parts: the ****** little box can't go over 135 watts, thus the more cores, the lower the speed. Sure, more cores are great, but for audio it's FPU: THE FASTER THE CHIP, THE BETTER. A-N-D a Mac Pro i7 12-core. Time to get rid of mobile parts in the iMac as well; otherwise, face it, it's a mobile, not a desktop, really. And why do you think they stopped with the iMac i7 4.0/4.5? Because in audio it kicked the Pro's ass. Dump the trashcan; it hinders the people who stuck by you while AMD and Intel beat the pants off us MEGAHERTZ-WISE. Remember, AMD was so good pre-Duo that Intel needed 1 GHz more just to keep up (1 GHz more in CPU). And lastly, when I was talking to the person at Apple: back then Apple and AMD were about the same, and Intel still needed 2.0 GHz more to keep up. Then Adobe got mad at Apple due to Logic, and they released the PC version for a few years. The woman at Apple I was on the phone with? She thought I worked there, as she asked, "How's the 32-core CPU coming along?" I also broke the news, either here or on Mac Insider, that FCP, the new one, looked like iMovie on steroids; no one believed me, they released two more, then it came. So a big CPU is coming. If so, you won't see it until they get Thunderbolt 3 figured out. Intel will be pissed.
 