Same here, now that ETH is slowly but surely showing signs of tapering off in the coming months. I might buy an ex-mining card if the price is rock bottom. Not much to complain about if it lasts a couple of months.
The hardware is not necessarily different, but certain features might be enabled by drivers and firmware.
...and why there's an even more expensive Tesla line.

That's why there's a whole separate Quadro product line that NVIDIA can charge a hell of a lot more money for.
All of those differences are not on hardware level but software (partially). I am speechless.

It's not just drivers though; here's a comparison of GeForce vs Quadro hardware features from NVIDIA:
http://nvidia.custhelp.com/app/answers/detail/a_id/37/~/difference-between-geforce-and-quadro-gpus
For example:
- Antialiased points and lines.
- Logic operations.
- Clip regions/planes.
- Two-sided lighting.
AA lines and two-sided lighting are very common in workstation modelling apps like SolidWorks. So if AA lines and two-sided lighting are, say, 100x slower on a GeForce card, that would easily explain why a GP106-level Quadro card would beat a GP102-based Titan Xp (note that I don't know the exact delta, just that there is a real hardware difference). Quadro cards are absolutely not simply GeForce cards with a different driver.
I'm sure there are driver-level differences where the Quadro cards get certain optimizations targeted at the workstation applications while GeForce cards do not, but if you'd done any research on this you'd know that there are real hardware level differences as well. All of this is in line with my original statement that SPECviewperf is not a compute test that is limited by raw TFLOPs.
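To see how a large per-feature penalty can outweigh a raw-TFLOPs advantage, here's a back-of-the-envelope sketch. The 1.4x TFLOPs ratio, the 10% workload fraction, and the 100x penalty are illustrative assumptions, not measured numbers:

```python
# Hypothetical numbers: a GeForce card with a 1.4x raw-TFLOPs advantage,
# where 10% of the frame's work hits a path that runs 100x slower than
# on a Quadro. All figures here are illustrative assumptions.

def relative_frame_time(tflops_ratio, slow_fraction, penalty):
    """Frame time on the GeForce card, relative to the Quadro card at 1.0."""
    fast_part = (1.0 - slow_fraction) / tflops_ratio    # scales with raw TFLOPs
    slow_part = slow_fraction * penalty / tflops_ratio  # gated path, 100x cost
    return fast_part + slow_part

t = relative_frame_time(tflops_ratio=1.4, slow_fraction=0.10, penalty=100.0)
print(round(t, 2))  # ~7.79: the GeForce card ends up far slower overall
```

Even a small fraction of the workload hitting a heavily penalized path swamps a modest raw-compute lead, which is exactly the shape of result SPECviewperf shows.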
Links?

P.S. You can try to "hack" the Quadro drivers to run on consumer GPUs, and you will get those features enabled.
All of those differences are not on hardware level but software (partially). I am speechless.
Those are hardware features that are disabled on consumer-grade GPUs; the drivers, together with the GPU's different device name, enable them. The silicon in both GPUs is the same, but the drivers are completely different, and it is the drivers that activate those features.
Those are exactly the driver-level differences that separate consumer-grade drivers from signed professional drivers, which was my point from the beginning.
P.S. You can try to "hack" the Quadro drivers to run on consumer GPUs, and you will get those features enabled.
My point was that in the SPECviewperf test we see a difference in performance between GPUs because of the use of professional drivers. You looked only at the comparison of TFLOPs between GPUs and forgot about the rest of the post:

I'm confused -- you claim they're all software features, and then suggest that the hardware acceleration for two-sided lighting is only enabled by the Quadro drivers? Doesn't that imply that there is actually a difference on the hardware side of things? I don't really understand what you're arguing about at this point; I stand by my original assertion that SPECviewperf is not a compute test that is limited by raw TFLOPs.
I was writing from the start about drivers affecting performance; that is why the Quadro P5000 with 9.2 TFLOPs is faster in tests than the Titan Xp, which has 12.8 TFLOPs.

Vega does not have signed professional drivers like the Quadro/Radeon Pro Duo has.
How come, then, is the Quadro P5000 faster than a 12.78 TFLOPs GPU despite having only just over 9 TFLOPs of compute power?
DRIVERS. How can you not factor in something like this?
My point was that in the SPECviewperf test we see a difference in performance between GPUs because of the use of professional drivers. You looked only at the comparison of TFLOPs between GPUs and forgot about the rest of the post:
I was writing from the start about drivers affecting performance; that is why the Quadro P5000 with 9.2 TFLOPs is faster in tests than the Titan Xp, which has 12.8 TFLOPs.
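For reference, FP32 TFLOPs figures like these come from a simple formula: 2 FLOPs per shader per clock (one fused multiply-add) times shader count times clock speed. A quick sketch using commonly cited shader counts and boost clocks for these two cards (the spec values are assumptions; actual boost behavior varies, which is why published TFLOPs numbers differ slightly):

```python
# FP32 throughput: 2 FLOPs (one fused multiply-add) per shader per clock.
def fp32_tflops(shaders, boost_clock_ghz):
    return 2 * shaders * boost_clock_ghz / 1000.0

# Commonly cited specs (assumed values; actual boost clocks vary):
quadro_p5000 = fp32_tflops(2560, 1.733)  # ~8.87 TFLOPs
titan_xp     = fp32_tflops(3840, 1.582)  # ~12.15 TFLOPs
print(round(quadro_p5000, 2), round(titan_xp, 2))
```

Either way the point stands: the raw-compute gap between the two cards is only around 1.4x, far too small to explain the SPECviewperf results on its own.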
I am not sure Vega FE needs to improve performance that much.

But the reason why a Quadro P5000 is faster than a Titan Xp is different hardware that only Quadros have enabled. If you want to believe that this is because of signed drivers, okay, I can't convince you otherwise. A quick web search suggests that much of this actually comes from hardware differences at the board level, i.e. the Quadro cards enable (at least some/many of) these features at the hardware level. But, as usual, I'm tired of arguing in circles with you, so I'll leave it at that. The underlying point here is that I find it very hard to believe that AMD can fix their problems with a driver update.
I am not sure Vega FE needs to improve performance that much.
RX Vega could be quite a bit faster for consumer use, with execution paths that don't target engineering applications.
That is negative hype. Vega FE beats Titan Xp at engineering applications, so I can very well see the gaming performance being brought down because of that configuration.

Vega was supposed to beat 1080 Ti though, right? They are miles behind right now, and I'm not holding my breath for them to "fix" this with a driver update.
The design is the same. Unless you want to believe that Nvidia put $250,000,000 into the toilet developing two separate GPUs with the same basic specs.

But the reason why a Quadro P5000 is faster than a Titan Xp is different hardware that only Quadros have enabled. If you want to believe that this is because of signed drivers, okay, I can't convince you otherwise. A quick web search suggests that much of this actually comes from hardware differences at the board level, i.e. the Quadro cards enable (at least some/many of) these features at the hardware level. But, as usual, I'm tired of arguing in circles with you, so I'll leave it at that. The underlying point here is that I find it very hard to believe that AMD can fix their problems with a driver update.
In its current state Vega behaves just like Fiji, but has lower effective memory bandwidth (303 GB/s vs 373 GB/s), hence the per-clock decrease against Fiji in games.

That is negative hype. Vega FE beats Titan Xp at engineering applications, so I can very well see the gaming performance being brought down because of that configuration.
But the reason why a Quadro P5000 is faster than a Titan Xp is different hardware that only Quadros have enabled. If you want to believe that this is because of signed drivers, okay, I can't convince you otherwise. A quick web search suggests that much of this actually comes from hardware differences at the board level, i.e. the Quadro cards enable (at least some/many of) these features at the hardware level. But, as usual, I'm tired of arguing in circles with you, so I'll leave it at that. The underlying point here is that I find it very hard to believe that AMD can fix their problems with a driver update.
There's little reason to believe that there's an actual hardware difference (besides ECC VRAM on some GPUs). It wouldn't make much sense to develop custom silicon for GPUs with such low quantities.
In the past there have been different ways to use the Quadro drivers on GeForce cards, e.g. by modifying the driver or by altering the hardware straps on the GPU board (thus changing the device ID), which notably increases performance in professional CAD applications.
I know this has worked for many GPUs up to the Kepler generation, but I haven't found similar hacks for Maxwell/Pascal yet.
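On Linux you can inspect the PCI device ID that such strap mods alter by reading sysfs. A minimal sketch (the /sys/bus/pci/devices layout is standard; 0x10de is NVIDIA's PCI vendor ID, and the helper names here are just for illustration):

```python
from pathlib import Path

NVIDIA_VENDOR_ID = 0x10DE  # NVIDIA's PCI vendor ID

def parse_pci_id(text):
    """Parse a sysfs id file such as '0x10de\n' into an int."""
    return int(text.strip(), 16)

def nvidia_device_ids(pci_root="/sys/bus/pci/devices"):
    """Yield the PCI device ID of every NVIDIA device on this machine.
    The driver keys feature support off this ID, which is exactly what
    the board-level hard-strap mods change."""
    for dev in Path(pci_root).glob("*"):
        try:
            vendor = parse_pci_id((dev / "vendor").read_text())
        except OSError:
            continue
        if vendor == NVIDIA_VENDOR_ID:
            yield parse_pci_id((dev / "device").read_text())

print(list(nvidia_device_ids()))  # empty list on machines without an NVIDIA GPU
```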
A positive Vega FE review!
Vega was supposed to beat 1080 Ti though, right? They are miles behind right now, and I'm not holding my breath for them to "fix" this with a driver update.
No. RX Vega was supposed to beat the 1080. There was no 1080 Ti when the demos were making the rounds.
According to Don, the Radeon RX Vega performance compared to the likes of NVIDIA’s GeForce GTX 1080 Ti and the Titan Xp looks really nice.
You sure about that? Pretty sure koyoot linked this back in April:
http://wccftech.com/amd-radeon-rx-vega-performance-gtx-1080-ti-titan-xp/