So are the D300s bad, average, or good GPUs compared to other cards?
Looking to order your setup for photography/Aperture, but wondering whether 16GB of RAM and the D500 could make a difference in a few years...
It's fine for most things now, but it's not future-proof at all; in fact, it's already outdated. Get the D500, which is midrange, or if you can, the D700, which is pretty good (roughly a 7970).
Interesting thread. If I'm going to be editing HD video along with Logic and LR, are you guys saying the D500 isn't any better? Just when I thought I had my mind made up...
1. They both have very similar OpenGL performance
2. The D500s have better OpenCL performance (60% more?)
3. The D500s have 1GB more VRAM each
4. The D500s have 4x the double-precision performance
What gain does the D500 offer? More computational math capability, yet lower graphics benchmark scores so far?
I've yet to see any evidence that the D500 isn't underwhelming for the price, with no discernible difference from the D300 for anything short of hardcore rendering.
The D700 might be an option, but that's a $1,000 upgrade over the D300.
Scroll down to the bottom of http://www.barefeats.com/tube04.html and look at the comparisons. In each benchmark, a different card shines or falls short.
We need real world app testing.
Specs don't lie, the D300 is already basically outdated. It's just not a good card, period.
If nothing else, the extra VRAM is hugely important, especially if the OP moves toward higher resolution displays in the future.
Why do you say this? I'm not arguing, I'm asking. If the D300 and D500 are giving the same results, does that mean both of them are outdated?
Hmm, I missed that. Where did you see that difference? Bare Feats has something like a 4% difference, unless I misread something.
I agree, synthetic benchmarks alone are misleading without real-world tests. Meanwhile, specs don't lie: the D300 is already basically outdated. It's just not that great of a card, period. If nothing else, the extra VRAM can potentially be hugely important, especially if the OP moves toward higher-resolution displays in the future. The D500 upgrade is probably also not a great value compared to consumer cards in the decidedly midrange area, but I was just responding to the OP's query on the D300, and in my opinion, it's not very good.
How much VRAM is needed to run 4K displays? Would the D300s be able to run two of them?
A 4K screen buffer needs about 32 MB. Any card with fast enough outputs can drive it.
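A quick back-of-the-envelope check of that 32 MB figure, assuming a standard 32-bit (4 bytes per pixel) framebuffer at UHD 4K resolution; the helper function name is mine, and actual drivers may allocate more for double/triple buffering:

```python
# Rough framebuffer size estimate (hypothetical helper, not from any driver API).
# Assumes 4 bytes per pixel (32-bit color); real GPUs often keep multiple buffers.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20  # bytes -> MiB

print(round(framebuffer_mb(3840, 2160), 1))  # UHD 4K single buffer: ~31.6 MB
```

So two 4K displays would only need on the order of 64 MB for the screen buffers themselves, which is why VRAM capacity only becomes a concern once 3D workloads start filling it with textures and geometry.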
However, early tests of 3D gaming at 4K have shown that 2GB of VRAM creates issues and that 3-4GB is much better. This may be telling for how much VRAM professional 3D work at 4K would need as well. IMHO, it would seem like a no-brainer to go for the D500's added 1GB of VRAM if you plan to use 4K displays; it's a $400 option.