AMD provides very good tools to the consumer for customizing power, cooling, and clocks.
So? Okay, do this: use that tool to "customise power" on a Vega 64 (up to 295W) down to the same level as a GTX 1080 (about 180W), and what will the end performance of the Vega 64 be? Yeah, that's right: compare efficiency between Vega and Pascal and Vega looks like the Stone Age. AMD has simply taken a ****** architecture, overclocked it to the breaking point (because consumers don't care about power draw as much), and they just about reach up to a 1080 in performance.

This Vega launch has not impressed me yet; it almost feels like a Bulldozer launch: overhype, overpromise, yeah, overall ****. AMD could have overclocked Polaris up to the same 295W as the Vega 64 and probably been better off.

AMD should most likely abandon GCN; it's fairly obvious that they have pushed the limit for way too long with their endless rebranding of an old dog.
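To put rough numbers on that efficiency gap, here's a quick Python sketch. The board powers (295W and ~180W) are from this post; treating the two cards' performance as roughly equal is just the assumption that the Vega 64 "just about" matches a 1080:

def perf_per_watt(relative_perf: float, board_power_w: float) -> float:
    """Performance per watt, in arbitrary performance units per watt."""
    return relative_perf / board_power_w

# Assumed relative performance of 1.0 for both cards (Vega 64 ~ GTX 1080).
vega64 = perf_per_watt(relative_perf=1.0, board_power_w=295.0)
gtx1080 = perf_per_watt(relative_perf=1.0, board_power_w=180.0)

print(f"Pascal perf/W advantage: {gtx1080 / vega64:.2f}x")  # ~1.64x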
 
I don't think that the Vega lineup is all that bad, really. If Vega and Pascal had been released at the same time, that would have been more interesting. As it is, they're competing with a product line that's been available for over a year, and even then, they're pushing it with a lot more power to achieve similar results.

I've no doubt that, in the near future, as is typical of AMD, performance metrics will steadily improve with new driver releases.

I'm very interested to see how the Vega chips planned for the iMac Pro perform; I'm assuming they're gonna get cherry-picked for power efficiency. I'm also very interested to see what the performance of the RX Vega Nano is when put within a more reasonable power bracket, and what the performance difference is between this and the previous Nano. That could make quite a decent eGPU perhaps.

You could even make a mock 2017 Mac Pro by pairing two RX Vega Nanos with a Xeon ;)
 
I care about power draw... my electric bill is already way too high in the summer :|
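For anyone curious what that extra draw actually costs, here's a back-of-the-envelope Python sketch. The ~115W delta comes from the Vega 64 (295W) vs. GTX 1080 (~180W) figures in this thread; the hours per day and the electricity rate are just assumptions:

extra_watts = 295 - 180      # ~115W more board power under load
hours_per_day = 3            # assumed gaming time per day
rate_per_kwh = 0.15          # assumed electricity rate in $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = extra_kwh_per_year * rate_per_kwh

print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year")  # ~126 kWh
print(f"Extra cost:   ${cost_per_year:.2f}/year")          # ~$18.89

And that's before counting the extra air conditioning load in summer.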
 
The Vega 56 is a better buy than the GTX 1070 at this point. I mean, it's $100 less!

What I'd like to see is benchmarks showing the actual TDP of the card when pushed to the limit. Well, what I'd really like to see is someone sticking an RX Vega 56 in their Mac Pro!

I'm interested in power consumption tests while gaming on a FreeSync monitor. I'd also like to see actual power usage during compute. While it'll be higher than team green's, I'll bet the difference is in the teens.
 
So? Okay, do this: use that tool to "customise power" on a Vega 64 (up to 295W) down to the same level as a GTX 1080 (about 180W), and what will the end performance of the Vega 64 be? Yeah, that's right: compare efficiency between Vega and Pascal and Vega looks like the Stone Age. AMD has simply taken a ****** architecture, overclocked it to the breaking point (because consumers don't care about power draw as much), and they just about reach up to a 1080 in performance.

This Vega launch has not impressed me yet; it almost feels like a Bulldozer launch: overhype, overpromise, yeah, overall ****. AMD could have overclocked Polaris up to the same 295W as the Vega 64 and probably been better off.

AMD should most likely abandon GCN; it's fairly obvious that they have pushed the limit for way too long with their endless rebranding of an old dog.

Vega uses the Polaris chip... the only difference is that Vega is optimized, so if Vega doesn't cut it, Polaris would be worse. I agree with you that AMD lost on this one, especially with nVidia releasing their optimized FinFET 1080 Ti.
 
DiRT 4 was performing better on Vega because it was tested with CMAA. In fact, Vega is really beaten only at 8x MSAA.
 
The Vega 56 is a better buy than the GTX 1070 at this point. I mean, it's $100 less!

What I'd like to see is benchmarks showing the actual TDP of the card when pushed to the limit. Well, what I'd really like to see is someone sticking an RX Vega 56 in their Mac Pro!

Vega 56 can totally run in a cMP.

But I can't seem to find one for sale anywhere? (Europe here)
 
No. For one thing, it has many more transistors.

I'd hate to disappoint you, but cards in the same series will have varying transistor counts from the low end to the high end. AMD intentionally never released a high-end Polaris. Think about it: nVidia just released a high-end video card that crushes AMD. AMD could release a slower high-end video card months later, when no interest is left... does that make sense? Their hope was likely to get Vega out sooner and hopefully beat nVidia at that same game... which they failed at.

That being said, I should have said that Vega uses the same FinFET process... not the Polaris chip... my bad.
 
I understand AMD and Microsoft locking the cards to protect people from buying used ones with modified VBIOSes, but there should be a way to customize VRAM timings.
 
You are just lazy if you don't undervolt.

But the software is still buggy.
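For what it's worth, the reason undervolting pays off is that dynamic power scales roughly with frequency times voltage squared (P ≈ C·V²·f). A minimal Python sketch with illustrative numbers; the voltages here are assumptions, not measured Vega values:

def dynamic_power(voltage: float, freq_mhz: float, scale: float = 1.0) -> float:
    """Simplified CMOS dynamic power model: P = scale * V^2 * f."""
    return scale * voltage ** 2 * freq_mhz

stock = dynamic_power(voltage=1.20, freq_mhz=1550)        # assumed stock settings
undervolted = dynamic_power(voltage=1.05, freq_mhz=1550)  # same clock, lower voltage

print(f"Power saved: {(1 - undervolted / stock) * 100:.0f}%")  # ~23%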
 
You are just lazy if you don't undervolt.

But the software is still buggy.

Then it's not really an option...
And it shouldn't be needed anyway, just like overclocking.

People should stop making excuses for a company that can't be bothered to release a finished and competitive product.
 
If AMD had bundled the cards with solar panels, that'd offset the extra power draw!
Your new Vega 64 is here, where would you like it put?

[Image: solar panels being lifted by a crane]
 
Then it's not really an option...
And it shouldn't be needed anyway, just like overclocking.

People should stop making excuses for a company that can't be bothered to release a finished and competitive product.
The cards are configured from the factory with generic parameters that should work across the board.
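To illustrate why those generic parameters leave undervolting headroom on most individual chips, here's a toy Python sketch (all numbers made up): the factory has to pick one voltage that keeps the worst chip off the line stable, so a better-than-average chip ends up with margin to spare.

import random

random.seed(42)

# Assume each chip needs a slightly different minimum stable voltage.
chip_vmin = [random.gauss(1.05, 0.03) for _ in range(1000)]

guardband = 0.05  # extra safety margin on top of the worst chip
factory_voltage = max(chip_vmin) + guardband

my_chip = random.choice(chip_vmin)
headroom = factory_voltage - my_chip

print(f"Factory voltage:      {factory_voltage:.3f} V")
print(f"My chip's stable min: {my_chip:.3f} V")
print(f"Undervolt headroom:   {headroom * 1000:.0f} mV")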
 
The cards are configured from the factory with generic parameters that should work across the board.

...but not well enough to be as efficient as the competition while also performing worse.
They don't even have the price going for them either.
 
...but not well enough to be as efficient as the competition while also performing worse.
They don't even have the price going for them either.
I don't know how the safety margins compare, or what tools NVIDIA provides for power management.
 