Possible memory configurations for GV consumer chips:
GV102 - 384 bit GDDR6, 24 and 48 GB versions
GV104 - 256 bit GDDR6 16 GB
GV106 - 256 bit GDDR5X 8 GB
GV107 - 192 bit GDDR5X 6 GB
Alternatively: GV102 - 512 bit GDDR6 32 GiB.
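For what it's worth, those capacities follow directly from bus width and per-chip density: GDDR5X/GDDR6 chips each present a 32-bit interface, so the chip count is bus width divided by 32, doubled in clamshell mode. A minimal sketch of the arithmetic; the helper name and parameters are illustrative, not from the thread:

def vram_gb(bus_width_bits, chip_density_gbit, clamshell=False):
    # One GDDR chip per 32-bit memory channel; clamshell pairs two chips per channel.
    chips = bus_width_bits // 32
    if clamshell:
        chips *= 2
    return chips * chip_density_gbit / 8  # Gbit per chip -> GB total

print(vram_gb(384, 16), vram_gb(384, 16, clamshell=True))  # 24.0 48.0 (GV102, 384 bit)
print(vram_gb(512, 16))                                    # 32.0 (GV102, 512 bit)
print(vram_gb(256, 16), vram_gb(256, 8), vram_gb(192, 8))  # 16.0 8.0 6.0 (GV104/106/107)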
"He".
That's what she said.
My point is simply that with a gender-free handle, it's wrong to assume "he".
But it's not wrong to assume "she"?
Good zing. The singular "they" would be the safest.
I'm having a hard time understanding how "koyoot" in any way suggests the gender of the user behind this nickname.
Are you a puppy or a kitten? Or should we just call you "fluffy"?
Isn't my nick the phonetic version of this:
I think that I'll go with "Fluffy".
Isn't my nick the phonetic version of this:
In Texas, where they are plentiful, 2 variations exist. Go here and click the American pronunciation.
Nvidia did an absolutely great job here.
It's a shame Apple will continue to ignore NVIDIA, despite how much better their GPUs are.
Nah. The only things Nvidia has better than AMD are Windows software and CUDA. Actual hardware IPC is lower than AMD's GCN; with Volta, Nvidia has only just drawn level with it. There is no point in looking for Nvidia under macOS: the software there is better for AMD, and you will get better results. Unless you focus on Windows and gaming; then yes, you will get better results with Nvidia.
Oh, so Vega can do more than 120 TFLOPs for Deep Learning? I must've missed that.
There is only one Nvidia GPU that can do 120 TFLOPs of DL operations, and it would NEVER come to the Mac, so it is a moot point.
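For context, the ~120 TFLOPs figure matches the Tesla V100's quoted tensor-core throughput for mixed-precision matrix math, and it falls out of simple arithmetic. A quick sanity check; the core count and clock below are V100 SXM2 specs, treat this as a sketch rather than a vendor-supplied formula:

tensor_cores = 640                       # Tesla V100
ops_per_core_per_clock = 4 * 4 * 4 * 2   # 4x4x4 matrix FMA = 64 MACs = 128 FLOPs per clock
boost_clock_hz = 1.53e9                  # SXM2 boost clock, ~1530 MHz

tflops = tensor_cores * ops_per_core_per_clock * boost_clock_hz / 1e12
print(round(tflops, 1))                  # ~125.3, the "120 TFLOPS"-class figure NVIDIA quotes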
NVIDIA on macOS is a chicken-and-egg problem right now. Given the lack of official hardware over the last 4+ years, their investment in macOS software has clearly declined. If Apple were to return to NVIDIA for an official product, the software quality would improve.
https://developer.nvidia.com/cuda-downloads...
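The downloads page did offer a macOS toolkit at the time, and whether or not that counts as "official support," a working install is easy to verify: enumerating devices takes a couple of driver calls. A minimal sketch using PyCUDA (PyCUDA is my choice for illustration, not something from the thread; it assumes the CUDA driver and toolkit are installed):

import pycuda.driver as cuda

cuda.init()  # load and initialize the CUDA driver
for i in range(cuda.Device.count()):
    dev = cuda.Device(i)
    print(i, dev.name(), dev.compute_capability())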
Remember: there is no CUDA on the Mac, and there never will be. So what is the technical reason for a switch? The only understandable switch would be from Polaris 11 and 10 to GV107, 106, and 104. But the iMac Pro and Mac Pro GPUs should stay on AMD.
Is it official support from Apple?
The rhetorical reply would be that there is no CUDA on Windows nor on Linux, nor will there ever be.
It's a rhetorical question...
Does Microsoft or Linux have official support for any API?