
Can anyone help me find out or understand why Photoshop CC 2015.5 is showing some of the things it is regarding my GeForce GTX TITAN X? A couple of things concern me, or that I'm curious about:

1. glgpu[0].GLVersion="2.1": OpenGL version listed as 2.1, while NVIDIA and EVGA specs show OpenGL 4.5 and OpenGL 4.4, respectively.
2. glgpu[0].glGetIntegerv[GL_MAX_...]: all 9 of these entries indicate lower values than weaker GPUs (I've included links showing weaker cards supporting higher integers).
3. CUDASupported=0: no CUDA support. Photoshop doesn't even support CUDA, but I'm curious why other NVIDIA cards show CUDASupported=1.

Hope someone has some info for me. Is this an OS X issue, an Adobe GPU Sniffer issue, an NVIDIA driver issue, or something with my GPU itself? I very much appreciate any insight anyone can offer.

What's involved:
  • OS X 10.11.6
  • Mac Pro 5,1
  • Adobe CC 2015.5
  • NVIDIA GeForce GTX TITAN X (Maxwell)
  • NVIDIA Drivers: 346.03.15f04
  • CUDA Version: CUDA 8.0.51 driver for Mac

Links:


Photoshop System Info:
(the items marked with "<—" below are the areas I'm interested in)

Operating System: Mac OS 10.11.6
System architecture: Intel CPU Family:6, Model:44, Stepping:2 with MMX, SSE Integer, SSE FP, SSE2, SSE3, SSE4.1, SSE4.2, HyperThreading
Physical processor count: 12
Logical processor count: 24
Processor speed: 2660 MHz
Built-in memory: 98304 MB
Free memory: 70571 MB
Memory available to Photoshop: 94587 MB
Memory used by Photoshop: 75 %
Alias Layers: ^0
Modifier Palette: Disabled.
Design Space: Disabled.
3D Multitone Printing: Disabled.
Highbeam: Disabled.
Image tile size: 1024K
Image cache levels: 2
Font Preview: Disabled
TextComposer: Latin
Display: 1
Main Display
Display Bounds: top=0, left=0, bottom=1600, right=2560
OpenGL Drawing: Enabled.
OpenGL Allow Old GPUs: Not Detected.
OpenGL Drawing Mode: Normal
OpenGL Allow Normal Mode: True.
OpenGL Allow Advanced Mode: True.
AIFCoreInitialized=1
AIFOGLInitialized=1
OGLContextCreated=1
NumGLGPUs=1
NumCLGPUs=1
glgpu[0].GLVersion="2.1" <—OpenGL Version
glgpu[0].IsIntegratedGLGPU=0
glgpu[0].GLMemoryMB=12288
glgpu[0].GLName="NVIDIA GeForce GTX TITAN X OpenGL Engine"
glgpu[0].GLVendor="NVIDIA Corporation"
glgpu[0].GLVendorID=4318
glgpu[0].GLRectTextureSize=16384
glgpu[0].GLRenderer="NVIDIA GeForce GTX TITAN X OpenGL Engine"
glgpu[0].GLRendererID=16918368
glgpu[0].HasGLNPOTSupport=1
glgpu[0].CanCompileProgramGLSL=1
glgpu[0].GLFrameBufferOK=1
glgpu[0].glGetString[GL_SHADING_LANGUAGE_VERSION]="1.20"
glgpu[0].glGetProgramivARB[GL_FRAGMENT_PROGRAM_ARB][GL_MAX_PROGRAM_INSTRUCTIONS_ARB]=[65536]
glgpu[0].glGetIntegerv[GL_MAX_TEXTURE_UNITS]=[8]
glgpu[0].glGetIntegerv[GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS]=[16]
glgpu[0].glGetIntegerv[GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS]=[16]
glgpu[0].glGetIntegerv[GL_MAX_TEXTURE_IMAGE_UNITS]=[16]
glgpu[0].glGetIntegerv[GL_MAX_DRAW_BUFFERS]=[8]
glgpu[0].glGetIntegerv[GL_MAX_VERTEX_UNIFORM_COMPONENTS]=[4096]
glgpu[0].glGetIntegerv[GL_MAX_FRAGMENT_UNIFORM_COMPONENTS]=[4096]
glgpu[0].glGetIntegerv[GL_MAX_VARYING_FLOATS]=[124]
glgpu[0].glGetIntegerv[GL_MAX_VERTEX_ATTRIBS]=[16]

glgpu[0].extension[AIF::OGL::GL_ARB_VERTEX_PROGRAM]=1
glgpu[0].extension[AIF::OGL::GL_ARB_FRAGMENT_PROGRAM]=1
glgpu[0].extension[AIF::OGL::GL_ARB_VERTEX_SHADER]=1
glgpu[0].extension[AIF::OGL::GL_ARB_FRAGMENT_SHADER]=1
glgpu[0].extension[AIF::OGL::GL_EXT_FRAMEBUFFER_OBJECT]=1
glgpu[0].extension[AIF::OGL::GL_ARB_TEXTURE_RECTANGLE]=1
glgpu[0].extension[AIF::OGL::GL_ARB_TEXTURE_FLOAT]=1
glgpu[0].extension[AIF::OGL::GL_ARB_OCCLUSION_QUERY]=1
glgpu[0].extension[AIF::OGL::GL_ARB_VERTEX_BUFFER_OBJECT]=1
glgpu[0].extension[AIF::OGL::GL_ARB_SHADER_TEXTURE_LOD]=1
clgpu[0].CLPlatformVersion="1.2 (Jun 30 2016 20:18:53)"
clgpu[0].CLDeviceVersion="1.2 "
clgpu[0].IsIntegratedCLGPU=0
clgpu[0].CLMemoryMB=12288
clgpu[0].CLName="GeForce GTX TITAN X"
clgpu[0].CLVendor="NVIDIA"
clgpu[0].CLVendorID=16918272
clgpu[0].CLDriverVersion="10.11.14 346.03.15f04"
clgpu[0].CUDASupported=0 <—CUDA SUPPORT
clgpu[0].CLBandwidth=2.46933e+11
clgpu[0].CLCompute=2660.03
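Side note: the sniffer output is plain key=value text, so it's easy to pull out the fields in question and sanity-check them programmatically. A minimal Python sketch (key names copied from the report above; I'm assuming `CLBandwidth` is reported in bytes per second, which would make the figure roughly 247 GB/s):

```python
# Parse a few lines of Adobe GPU Sniffer output (key=value pairs)
# and convert CLBandwidth (assumed to be bytes/second) to GB/s.

report = """\
glgpu[0].GLVersion="2.1"
clgpu[0].CUDASupported=0
clgpu[0].CLBandwidth=2.46933e+11
"""

def parse_sniffer(text):
    """Return a dict of key -> value, with surrounding quotes stripped."""
    fields = {}
    for line in text.splitlines():
        if "=" not in line:
            continue
        key, _, value = line.partition("=")
        fields[key.strip()] = value.strip().strip('"')
    return fields

fields = parse_sniffer(report)
gl_version = fields["glgpu[0].GLVersion"]                 # "2.1"
cuda_supported = fields["clgpu[0].CUDASupported"] == "1"  # False
bandwidth_gbs = float(fields["clgpu[0].CLBandwidth"]) / 1e9

print(gl_version, cuda_supported, round(bandwidth_gbs, 1))
```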
 
The app needs to create a core profile context to get GL 4.1; a legacy context will still be 2.1.
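To make that concrete: on OS X, the context version you get depends on which profile the app requests, not on what the GPU silicon supports. A toy Python sketch of that selection behaviour (version numbers are for OS X 10.11; this is an illustration, not Apple's actual implementation):

```python
# OS X 10.11 hands out GL versions by requested profile:
# a core-profile context can reach 4.1, while a legacy
# (compatibility) context stays at 2.1 regardless of GPU.

def osx_1011_context_version(requests_core_profile):
    if requests_core_profile:
        return "4.1"  # highest core profile Apple ships on 10.11
    return "2.1"      # legacy contexts are capped here

# Photoshop CC 2015.5 creates a legacy context, hence the sniffer's "2.1":
print(osx_1011_context_version(False))
```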
 
Ok, thanks for your input. So how do I prompt the app to do this, or is it something I'll be unable to do? Also, by "app" I assume you mean Photoshop?
 
Well, the GPU does not do much in Photoshop; if it works, why mess with it?
 
Well, the GPU does not do much in Photoshop; if it works, why mess with it?

Curiosity, and wanting to understand a little more deeply why a System Info report says one thing when the hardware is clearly another. Also, other NVIDIA cards (as posted in the links I included) are not showing these inconsistencies. My default thought is that something's wrong. That may or may not be the case; I have no idea. I tend to think anyone would be curious in a similar situation, i.e., hardware specs = A, but software reports that hardware specs = B. So I'm asking if someone may know why: what are the possible reasons behind the inaccuracies in my report?

I'm also wondering if this is a par-for-the-course trade-off of using a flashed, non-OEM GPU: that some things may just be a little "off" here and there.
 

No, the app is NOT reporting the hardware spec. It's more like reporting the overall (hardware + software) capability. If no CUDA is available to the software (e.g., no driver installed), then the software will report 0 there. It doesn't mean that your card doesn't have CUDA.

Also, the software is currently running with OpenGL 2.1; it doesn't mean that your hardware cannot do OpenGL 4.5. Anyway, I don't think OpenGL 4.5 even exists in OS X.
 
Adobe uses the Mercury Graphics Engine for drawing images in Creative Cloud. It leans a little on OpenGL and OpenCL, but not on CUDA (screenshot below).

Metal support has been arriving for the video apps, but so far users are not impressed.
 

Attachments

  • IMG_1579.PNG
Yep, you're confusing hardware with software (is OpenGL driver or software?).

The card supports up to OpenGL 4.5 ("supports" being the operative word).

OS X 10.11 supports up to OpenGL 4.1, kind of (OpenGL is made of lots of parts which are fairly independent, so OS X 10.11 supports some 4.1 and some 4.2, but not all of 4.2, I think).

Now, the software is only utilising OpenGL 2.1.

That's a simple explanation.

PS: CUDA is being axed >.<

Also: Adobe does provide software support; it's part of what you're paying for.
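Put another way, the layering above means the effective OpenGL version is just the lowest of the three limits: what the card can do, what the OS exposes, and what the app asks for. A throwaway sketch using the numbers from this thread (purely illustrative):

```python
# Effective OpenGL version = min(hardware limit, OS limit, app request).

def effective_gl(hardware, os_limit, app_request):
    as_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return min((hardware, os_limit, app_request), key=as_tuple)

# Titan X silicon (4.5), OS X 10.11 (4.1), Photoshop's legacy context (2.1):
print(effective_gl("4.5", "4.1", "2.1"))  # 2.1
```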
 
But what is limiting it to 2.1? This is all I'm trying to find out. Is it OS X, the NVIDIA drivers, or what?

Core profile = 4.1 (core.png)

Compatibility profile = 2.1 (compatibility.png)
 
You answered your own question. Nothing is limiting it to 2.1, there is no compatibility profile version greater than that. The latest core profile version is 4.5, but Apple only supports 4.1 and some 4.2 extensions.
 
You're paying a lot for Adobe CC apps, and that comes with some support, so go ask Adobe for help.

Creative COW or the RED user forums are also good places to ask for help.

I think the answer has been given; if your app works, then get down to editing.

If you need a faster computer, do like a lot of users and build a Hackintosh, or use Windows on a new Windows computer.
 



*********** ATTENTION ** ATTENTION ** ATTENTION ***********

The links you provided above and are trying to follow are by a COMPLETE SCAMMER. If you gave him any money and/or personal information, call your bank and request a chargeback ASAP!

Proof List:

#1 The site says to make an iMac with an unsupported video card without any proper guidance. (It requires workarounds, extra drivers, and patched kexts; other options are much more stable.)

#2 He lists very old SMBIOSes to use, like MacPro 5,1 (which is for an old Core 2 Duo and would never work because it has no GPU expansion slot).

#3 He FLAT OUT says you need El Capitan! Which is total BS as far as I know (you would need one of the Sierra releases to get those drivers), and it just so happens to be the last release before SIP (System Integrity Protection) was introduced. He does this so he can gain ROOT access via the NVIDIA drivers he provides on his website (tested myself on a test machine, and it launched my XQuartz [not normal]). He most likely plans to steal your banking info, etc. ALWAYS GET YOUR WEB DRIVERS DIRECTLY FROM NVIDIA.

#4 His address points to a shopping mall in the UK; yeah, openly pirating Apple goods right inside a mall...

#5 The LinkedIn page he links from the site says they have zero affiliation with this Mac site.

#6 He owns multiple Mac sites that all have the same theme, plus other themed scams in which he hid his name.

#7 Notice how his payment e-mail address has a different domain than all the other contact info?

#8 He slipped up, and his full name was registered to one of his older scam sites, so if he owes you money, here is his info:

Registrant:
Phil Goldsmith

Trading as:
scrumpymacs Ltd

Registrant type:
UK Individual

Registrant's address:
Unit 7 Farthing Rd Ind Est
Ipswich
Suffolk
IP1 5AO
United Kingdom

And I also found his real LinkedIn:
https://uk.linkedin.com/in/macrefreshphil

There are some posts on how he ripped other people off:
http://www.macvidcards.com/blog/encourage-create-pro-to-go-legit

You can also use a Whois service to see all the scam and clone sites he owns.


If you're looking to build a Hackintosh or upgrade your current one with a non-authorized part, send me a message and I can help; it's nowhere near as easy as this guy portrays (he didn't even go over BIOS settings or which parts aren't compatible). Your best place to start is the Hackintosh sub on Reddit (www.reddit.com/r/hackintosh), as the community is knowledgeable and even modifies kexts/drivers to work with unsupported aftermarket parts (if you want to drop in a GTX 970 for gaming, or to ask which parts are supported before purchasing that BT dongle or sound chip). Builds there are done manually with Clover, and no sketchy stuff like "Naresh's Pirated Copies" is discussed.
 