A while ago I bought 2 new Samsung monitors to run on my 2008 Mac Pro 3,1 with a GTX770 graphics card. I unplugged the old monitors, plugged in the new ones (using Samsung's supplied HDMI cables, the only ones I've found that will actually run them at their full 144Hz speed). One of the screens came on, the other didn't. I replugged them, and neither came on. I rebooted, no chime, the graphics card fan came on full speed, and there was no video output. I figured I must have killed the graphics card, so I bought a new one (the same model). And that did exactly the same thing!
I then tried the original 8800GT that shipped with the Mac Pro, which I still had lying around. Its fans didn't run at full speed, but there was no video either.
If I remove the graphics card altogether, the machine boots, and I can access it over SSH locally, but of course the logs are filled with errors trying to start the graphics system. I tried one of the cards in a friend's PC, and it also failed to boot (not even a POST) and had the fans run full speed.
Eventually I gave up and bought a 2012 Mac Pro with an RX580, and that works fine with the same monitors. But I'm now left with this half-dead Mac Pro that I'd like to sell, along with 3 not-terrible graphics cards, but it's obviously not going anywhere without working graphics, and I can't tell what's really happened to them.
So I need to diagnose what's gone wrong – is it just the cards, the motherboard, or both? While I could test the apparently dead graphics cards in my new Mac Pro, I'm reluctant to do that for fear of killing it too! It seems bizarre that so much damage could have been done by simply replugging a monitor! It's tempting to buy a cheap PC just to diagnose things there, since there are more diagnostic tools available under DOS/Windows, firmware twiddling might be possible, and it would have built-in graphics that should always work.
Does anyone have a good idea what might have happened, and how I might diagnose what state things are really in?