Well, it's a bit vague. Still, the QDR alone might require a new controller altogether, and the lower power and higher clocks at least some tweaking. And the number of pins has changed.
Again, I won't argue, but I very much doubt it's that easy.
 
The difference is in the GDDR5X chips, not in the controller. It's as if the new memory hardware gives the processor (and a memory controller is, inherently, the processor as far as the memory chips are concerned) the ability to access the stored data more frequently. The whole feature set is the same, so there is no difference in terms of the memory controller; what's required is a modification of the PCB.
 

I also don't want to argue, but I also don't believe it's that simple. Different voltages, different prefetch modes, different pin counts, etc. There may not be huge differences (many new technologies are designed to be as close to backward compatible as possible), and I am not an expert in this area, but the graphics chip's memory controller would very likely have had to be designed with X in mind to achieve that compatibility.

On the other hand, it wouldn't surprise me if almost any new-generation graphics chip included those modifications, even if their use has not been demonstrated yet (e.g., because the RX series currently inhabits a lower tier of products for which GDDR5X is perhaps still a little overpriced).
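To put rough numbers on the prefetch point, here is a back-of-the-envelope sketch (Python). Peak bandwidth is just per-pin data rate times bus width; the per-pin rates are the public figures for the RX 480 (GDDR5) and GTX 1080 (GDDR5X):

[CODE]
# GDDR5 signals DDR with an 8n prefetch; GDDR5X signals QDR with a 16n
# prefetch, doubling the minimum burst the controller has to schedule
# (64-byte accesses per 32-bit chip instead of 32-byte ones).

def bandwidth_gb_s(gbps_per_pin, bus_width_bits):
    # Peak bandwidth = per-pin rate * bus width / 8 bits per byte
    return gbps_per_pin * bus_width_bits / 8

gddr5  = bandwidth_gb_s(8.0, 256)   # e.g. RX 480: 8 Gbps/pin, 256-bit bus
gddr5x = bandwidth_gb_s(10.0, 256)  # e.g. GTX 1080: 10 Gbps/pin, 256-bit bus

print(f"GDDR5  @ 8 Gbps, 256-bit:  {gddr5:.0f} GB/s")   # 256 GB/s
print(f"GDDR5X @ 10 Gbps, 256-bit: {gddr5x:.0f} GB/s")  # 320 GB/s
[/CODE]

The extra bandwidth is transparent at the PCB level, but the doubled access granularity is exactly the kind of change a controller has to be designed around, which is the point about designing with X in mind.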
 
I also don't want to argue, but I also don't believe it's that simple.
Agree.

Koyoot took an article that talked about how a GDDR5X controller could easily be backwards compatible with a GDDR5 controller - and twisted it into an argument that any GDDR5 controller is forwards compatible with GDDR5X.

Sorry, but technology just doesn't work like that.
 
Well, first of all, I am not arguing, just discussing.

Secondly, maybe you are right and I misunderstood something. However, there is a point to discussing this, if the rumors about the RX 485 coming to life with GDDR5X memory are true.
 
Only if the ATI memory controller was designed for GDDR5X from day one - but initially shipped with GDDR5 memory to hit a lower price point.

Surely you can find an ATI press release that covers that point.
 
Nvidia has announced the GTX 1050 (Ti), based on GP107. The interesting bit of news as it relates to AMD's GPUs is that it's manufactured on Samsung's 14 nm process. The base clock speeds are much lower than those of the other Pascal parts manufactured at TSMC. It will be interesting to see the efficiency numbers once reviews are out. If there is a significant drop in efficiency for GP107, it may provide some optimism to AMD fans who were disappointed with the efficiency of Polaris, which is manufactured on a similar 14 nm process. The optimism would come in the form of Vega, which is rumored to be made at TSMC on what could be a much more efficient, higher-performance process.
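Once reviews land, the efficiency comparison is just average frame rate per watt. A minimal sketch of that metric; the FPS values below are placeholders, not measurements, and only the wattages are the announced board powers:

[CODE]
# Perf/W = average FPS / board power. The FPS numbers below are made-up
# placeholders until reviews are out; only the TDPs are announced figures
# (75 W GTX 1050, 120 W GTX 1060, 75 W slot power for the RX 460).

cards = {
    "GTX 1050 (Samsung 14 nm)": (60.0, 75.0),
    "GTX 1060 (TSMC 16 nm)":    (95.0, 120.0),
    "RX 460 (14 nm LPP)":       (50.0, 75.0),
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {fps / watts:.2f} FPS/W")
[/CODE]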
 
Since I'm the OP and have been planning to get a 480 since I started the thread, I thought I'd chime in and say I finally got one. I had to return a U2414 last month; the shop sent it to its HQ, then from there to the Dell repair center, who just replaced it. I got the email that it was ready to be picked up, so I went at lunch and asked the manager if I could just return it since it was unopened, and he said yes. So I took that money, added 40, and got an XFX RX 480 GTR. :)
 
http://www.theverge.com/circuitbrea...ia-geforce-gtx-1050-ti-price-release-date-amd

gg, no re
cya next year.
Team Green wins at all price points again.
pls no compute smoke
chart_01.jpg


There are Nvidia Pascal threads on this forum. Post Pascal news there.

One more thing, and it always baffles me: how come people who describe themselves as professional users, creative people, claim that one brand is better than the other based on gaming benchmarks, completely ignoring compute benchmarks? This board is unique in this regard.

I guess that's what mindshare does. I still see it even in myself: I do not believe AMD can compete with Intel on the CPU side.

Edit: In other news:
Sys-plus-1.jpg
 
How does it compare to the mobile GTX 960 and the desktop RX 460?
 
Well, currently the RX 460 with the latest drivers is on average about 10% faster than the GTX 950 in games at 1080p, and 5-10% slower than the desktop GTX 960. The mobile GTX 960 is the same GPU as the GTX 750 Ti.
perfrel_1920_1080.png

So the GTX 1050 should be on par with the RX 460, and the GTX 1050 Ti around the desktop GTX 960.

Bear in mind how Time Spy performance translates into gaming:
aTJ2UmZV.jpeg

According to the marketing materials, the GTX 950 has a higher Time Spy score than a real-world GTX 960: in Zotac's marketing material the GTX 950 scores around 2900 points in Time Spy, while here a GTX 960 at stock clocks scores 2261 points.
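To make the mismatch concrete, normalize both metrics to the GTX 960. The Time Spy scores are the ones quoted above; the gaming figure for the GTX 950 is an assumed ~85% of a GTX 960, roughly in line with the relative-performance chart earlier in the thread:

[CODE]
# Normalise the synthetic score and a gaming average to the GTX 960
# and compare the two rankings.
results = {
    #           (Time Spy pts, relative 1080p gaming perf, GTX 960 = 100)
    "GTX 950": (2900, 85),   # 2900 pts from Zotac's marketing; 85 is an assumption
    "GTX 960": (2261, 100),  # stock-clock Time Spy result quoted above
}

base_ts, base_game = results["GTX 960"]
for card, (ts, game) in results.items():
    print(f"{card}: Time Spy {ts / base_ts:.0%}, gaming {game / base_game:.0%}")
# GTX 950: Time Spy 128%, gaming 85% -- the synthetic ranking flips the gaming one.
[/CODE]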
 
What does that benchmark mean?

I am interested in OpenGL 4.5, Vulkan, OpenCL 2.2 and DX12.
 
That benchmark is the average performance of the GPUs across the 18 games in TechPowerUp's review suite.
I don't care about global averages. I want to see the performance comparison for each API.

And I see NVIDIA is stuck at OpenCL 1.2, so good call on my going AMD.
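For what it's worth, the per-API breakdown is easy to compute whenever a review publishes per-game numbers: group the results by API and take a geometric mean within each group instead of one global average. A sketch with made-up game names and values:

[CODE]
from collections import defaultdict
from math import prod  # Python 3.8+

# (game, API, FPS relative to a baseline card) -- all values hypothetical
results = [
    ("Game A", "DX12",   1.05),
    ("Game B", "DX12",   0.98),
    ("Game C", "Vulkan", 1.12),
    ("Game D", "Vulkan", 1.08),
    ("Game E", "OpenGL", 0.91),
]

by_api = defaultdict(list)
for _game, api, rel in results:
    by_api[api].append(rel)

for api, vals in sorted(by_api.items()):
    geomean = prod(vals) ** (1 / len(vals))
    print(f"{api}: {geomean:.2f}x baseline across {len(vals)} games")
[/CODE]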
 

None of those APIs are available on macOS though, right?
 
Thanks AMD

Based largely on the mindshare that AMD was winning back thanks to the RX 480, I decided to buy shares a couple of months ago. I sold today for a 52% profit.

I actually think that further gains are possible from future products in the pipeline, but I decided to lock in my gain rather than risk losing it to poor execution. (Bulldozer comes to mind.)

Although I no longer have stock, I continue to wish AMD well because competition benefits everyone, and as a consumer I like to have choices.
 