Can anyone help me understand why Photoshop CC 2015.5 is reporting what it does for my GeForce GTX TITAN X? A couple of things concern me, or at least have me curious:
1. glgpu[0].GLVersion="2.1" — the OpenGL version is listed as 2.1, while NVIDIA's and EVGA's specs show OpenGL 4.5 and OpenGL 4.4, respectively (see the context sketch right after this list).
2. glgpu[0].glGetIntegerv[GL_MAX_*] — all 9 of these entries show lower values than weaker GPUs (I've included links below showing weaker cards reporting higher integers).
3. clgpu[0].CUDASupported=0 — no CUDA support. Photoshop doesn't even use CUDA, but I'm curious why other NVIDIA cards report CUDASupported=1 (see the CUDA check sketch further down).
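To sanity-check items 1 and 2 outside of Photoshop: as far as I understand it, OS X only hands an application OpenGL 2.1 (and GLSL 1.20) when the app creates a legacy/compatibility context, and the 4.x path only shows up when a core profile is explicitly requested; most of the GL_MAX_* limits also depend on which kind of context was created. Here's a rough sketch (my own test code, not Adobe's) that compares the two context types on the same GPU using CGL:

```c
/* Rough sketch: compare what a legacy vs. core-profile GL context reports.
 * Build: clang glcheck.c -framework OpenGL -o glcheck
 */
#include <stdio.h>
#include <OpenGL/OpenGL.h>
#include <OpenGL/gl3.h>

static void report(CGLOpenGLProfile profile, const char *label) {
    CGLPixelFormatAttribute attrs[] = {
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)profile,
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    CGLContextObj ctx = NULL;

    if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError || pix == NULL) {
        printf("%s: no matching pixel format\n", label);
        return;
    }
    CGLCreateContext(pix, NULL, &ctx);
    CGLSetCurrentContext(ctx);

    GLint maxTexUnits = 0;
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &maxTexUnits);
    printf("%s:\n  GL_VERSION=%s\n  GLSL=%s\n  GL_MAX_TEXTURE_IMAGE_UNITS=%d\n",
           label,
           (const char *)glGetString(GL_VERSION),
           (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION),
           maxTexUnits);

    CGLSetCurrentContext(NULL);
    CGLDestroyContext(ctx);
    CGLDestroyPixelFormat(pix);
}

int main(void) {
    report(kCGLOGLPVersion_Legacy,   "legacy context (2.1-style)");
    report(kCGLOGLPVersion_3_2_Core, "core-profile context");
    return 0;
}
```

If the core-profile run shows 4.1 with much higher limits, then the "2.1" / GLSL 1.20 numbers in the sniffer log would just reflect the kind of context Photoshop creates, not a limitation of the TITAN X itself.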
Hope someone has some info for me. Is this an OS X issue, an Adobe GPU Sniffer issue, an NVIDIA driver issue, or something with the GPU itself? I'd very much appreciate any insight anyone can offer.
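On the CUDA flag (item 3), one way to check whether CUDA itself works on this machine, independently of whatever Adobe's sniffer looks for, is to enumerate devices through the CUDA driver API. A rough sketch, assuming the CUDA 8.0.51 Mac driver/toolkit is installed (the include and link flags in the comment are my guess for that setup):

```c
/* Rough sketch: list CUDA devices via the driver API.
 * Build guess with the CUDA Mac toolkit installed:
 *   clang cudacheck.c -I/usr/local/cuda/include -F/Library/Frameworks -framework CUDA -o cudacheck
 */
#include <stdio.h>
#include <cuda.h>

int main(void) {
    CUresult r = cuInit(0);
    if (r != CUDA_SUCCESS) {
        printf("cuInit failed (%d): the CUDA driver is not usable here\n", (int)r);
        return 1;
    }
    int count = 0;
    cuDeviceGetCount(&count);
    printf("CUDA devices: %d\n", count);
    for (int i = 0; i < count; ++i) {
        CUdevice dev;
        char name[256];
        int major = 0, minor = 0;
        cuDeviceGet(&dev, i);
        cuDeviceGetName(name, sizeof(name), dev);
        cuDeviceComputeCapability(&major, &minor, dev);  /* driver API call, present in CUDA 8 */
        printf("  [%d] %s (compute capability %d.%d)\n", i, name, major, minor);
    }
    return 0;
}
```

If this lists the TITAN X but the sniffer still says CUDASupported=0, that would point at the sniffer's own check (or the driver version it expects) rather than at the card.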
What's involved:
- OS X 10.11.6
- Mac Pro 5,1
- Adobe CC 2015.5
- NVIDIA GeForce GTX TITAN X (Maxwell)
- NVIDIA Drivers: 346.03.15f04
- CUDA Version: CUDA 8.0.51 driver for MAC
Links:
- Anyone else have issues with Focus Area Selection and the Nvidia GeForce 970 Card?
- Photoshop doesn't detect my nvidia gt750m
- Photoshop not detecting Nvidia GTX 860m
Photoshop System Info:
(the lines I'm asking about, originally highlighted in red, are the GLVersion, glGetIntegerv[GL_MAX_*], and CUDASupported entries below)
Operating System: Mac OS 10.11.6
System architecture: Intel CPU Family:6, Model:44, Stepping:2 with MMX, SSE Integer, SSE FP, SSE2, SSE3, SSE4.1, SSE4.2, HyperThreading
Physical processor count: 12
Logical processor count: 24
Processor speed: 2660 MHz
Built-in memory: 98304 MB
Free memory: 70571 MB
Memory available to Photoshop: 94587 MB
Memory used by Photoshop: 75 %
Alias Layers: ^0
Modifier Palette: Disabled.
Design Space: Disabled.
3D Multitone Printing: Disabled.
Highbeam: Disabled.
Image tile size: 1024K
Image cache levels: 2
Font Preview: Disabled
TextComposer: Latin
Display: 1
Main Display
Display Bounds: top=0, left=0, bottom=1600, right=2560
OpenGL Drawing: Enabled.
OpenGL Allow Old GPUs: Not Detected.
OpenGL Drawing Mode: Normal
OpenGL Allow Normal Mode: True.
OpenGL Allow Advanced Mode: True.
AIFCoreInitialized=1
AIFOGLInitialized=1
OGLContextCreated=1
NumGLGPUs=1
NumCLGPUs=1
glgpu[0].GLVersion="2.1" <—OpenGL Version
glgpu[0].IsIntegratedGLGPU=0
glgpu[0].GLMemoryMB=12288
glgpu[0].GLName="NVIDIA GeForce GTX TITAN X OpenGL Engine"
glgpu[0].GLVendor="NVIDIA Corporation"
glgpu[0].GLVendorID=4318
glgpu[0].GLRectTextureSize=16384
glgpu[0].GLRenderer="NVIDIA GeForce GTX TITAN X OpenGL Engine"
glgpu[0].GLRendererID=16918368
glgpu[0].HasGLNPOTSupport=1
glgpu[0].CanCompileProgramGLSL=1
glgpu[0].GLFrameBufferOK=1
glgpu[0].glGetString[GL_SHADING_LANGUAGE_VERSION]="1.20"
glgpu[0].glGetProgramivARB[GL_FRAGMENT_PROGRAM_ARB][GL_MAX_PROGRAM_INSTRUCTIONS_ARB]=[65536]
glgpu[0].glGetIntegerv[GL_MAX_TEXTURE_UNITS]=[8]
glgpu[0].glGetIntegerv[GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS]=[16]
glgpu[0].glGetIntegerv[GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS]=[16]
glgpu[0].glGetIntegerv[GL_MAX_TEXTURE_IMAGE_UNITS]=[16]
glgpu[0].glGetIntegerv[GL_MAX_DRAW_BUFFERS]=[8]
glgpu[0].glGetIntegerv[GL_MAX_VERTEX_UNIFORM_COMPONENTS]=[4096]
glgpu[0].glGetIntegerv[GL_MAX_FRAGMENT_UNIFORM_COMPONENTS]=[4096]
glgpu[0].glGetIntegerv[GL_MAX_VARYING_FLOATS]=[124]
glgpu[0].glGetIntegerv[GL_MAX_VERTEX_ATTRIBS]=[16]
glgpu[0].extension[AIF::OGL::GL_ARB_VERTEX_PROGRAM]=1
glgpu[0].extension[AIF::OGL::GL_ARB_FRAGMENT_PROGRAM]=1
glgpu[0].extension[AIF::OGL::GL_ARB_VERTEX_SHADER]=1
glgpu[0].extension[AIF::OGL::GL_ARB_FRAGMENT_SHADER]=1
glgpu[0].extension[AIF::OGL::GL_EXT_FRAMEBUFFER_OBJECT]=1
glgpu[0].extension[AIF::OGL::GL_ARB_TEXTURE_RECTANGLE]=1
glgpu[0].extension[AIF::OGL::GL_ARB_TEXTURE_FLOAT]=1
glgpu[0].extension[AIF::OGL::GL_ARB_OCCLUSION_QUERY]=1
glgpu[0].extension[AIF::OGL::GL_ARB_VERTEX_BUFFER_OBJECT]=1
glgpu[0].extension[AIF::OGL::GL_ARB_SHADER_TEXTURE_LOD]=1
clgpu[0].CLPlatformVersion="1.2 (Jun 30 2016 20:18:53)"
clgpu[0].CLDeviceVersion="1.2 "
clgpu[0].IsIntegratedCLGPU=0
clgpu[0].CLMemoryMB=12288
clgpu[0].CLName="GeForce GTX TITAN X"
clgpu[0].CLVendor="NVIDIA"
clgpu[0].CLVendorID=16918272
clgpu[0].CLDriverVersion="10.11.14 346.03.15f04"
clgpu[0].CUDASupported=0 <—CUDA SUPPORT
clgpu[0].CLBandwidth=2.46933e+11
clgpu[0].CLCompute=2660.03
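For completeness, the clgpu[0] values above (CLName, CLDeviceVersion, CLMemoryMB) can be cross-checked against what OS X itself reports through OpenCL, outside of Adobe's sniffer. A minimal sketch:

```c
/* Minimal sketch: query the first GPU OpenCL device directly.
 * Build: clang clcheck.c -framework OpenCL -o clcheck
 */
#include <stdio.h>
#include <OpenCL/opencl.h>

int main(void) {
    cl_platform_id plat;
    cl_device_id dev;
    char buf[256];
    cl_ulong mem = 0;

    if (clGetPlatformIDs(1, &plat, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS) {
        printf("No OpenCL GPU device found\n");
        return 1;
    }
    clGetDeviceInfo(dev, CL_DEVICE_NAME, sizeof(buf), buf, NULL);
    printf("Device:  %s\n", buf);
    clGetDeviceInfo(dev, CL_DEVICE_VERSION, sizeof(buf), buf, NULL);
    printf("Version: %s\n", buf);
    clGetDeviceInfo(dev, CL_DEVICE_GLOBAL_MEM_SIZE, sizeof(mem), &mem, NULL);
    printf("Memory:  %llu MB\n", (unsigned long long)(mem >> 20));
    return 0;
}
```

If those numbers match the log (GeForce GTX TITAN X, OpenCL 1.2, 12288 MB), then the OpenCL side is at least being detected correctly, and the open questions are really just the GL version/limits and the CUDASupported flag.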