
GalactiNaut

macrumors newbie
Original poster
Jun 19, 2018
I'm planning on getting a GTX 1080 and having it flashed... but I was thinking: is it possible to put the stock graphics card in the slot above and run a dual-monitor setup that way? That way I might be able to avoid flashing it. Would I get the boot screen on one monitor and then have the other come on after the system boots and the drivers load? What do you guys think?

Also, do you need to install the web drivers on Windows as well?
 
Yes! This is exactly what I do. I run two (non-flashed) 1080 Tis and an Apple GT-120, and it works very well. A small display is connected to the GT-120 and my main display is connected to a 1080 Ti.

I see the boot screen on my small display connected to the Apple GT-120, and when the Nvidia web drivers kick in (about halfway through the boot) my main display connected to a 1080 Ti goes active.

No web drivers are needed for Windows.

Below: Boot screen from the Apple GT-120 on my small display. Works every time.

[Attachment: 1.jpg]


Below: Both displays are active after boot.

[Attachment: 3.jpg]



It's actually a great setup for me. I watch a lot of YouTube and Vimeo tutorials for graphics programs (Cinema 4D lately), and I can put the tutorial on the small display and follow along on my large display. I actually love it. Plus it's very handy in Photoshop or any other program to put your palettes on the small display while working on the main display.

You are thinking in the right direction...

I'll add this: the small display and the Apple GT-120 cost less than having one of my 1080 Tis flashed by MVC. (And he can't flash my factory-overclocked EVGA 1080 Tis anyway.) It's far more versatile than just having a flashed card.
 
I also vote for the GT120 if you don't absolutely need the 1080 flashed.

The GT120 can essentially serve as the Mac EFI (boot screen) card for any graphics card. I used it that way with my 1080 Ti some time ago. It works fine.

On the Windows side, we always need to install the relevant driver before a piece of hardware can be used; it just isn't called a "web driver" there. But yes, you need it, and Windows should install it for you automatically (on the first boot with that card installed). However, it won't install the latest and best driver from Nvidia, just whatever is in Microsoft's own database.

So you're better off going to Nvidia and downloading the latest one.

Also, depending on the model of your "assistant card", running dual cards in Windows may be more complicated than you think.

In macOS, the GT120 is fine to work alongside another GPU. In Windows, however, you'd better manually switch the GT120's driver to the generic Microsoft one (Microsoft Basic Display Adapter), then disable it.

With this procedure, it will still display (because the EFI still works), but it won't cause any conflict with the newer Nvidia / AMD card.
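
If you want to double-check which driver each card ends up on, a quick script can list every display adapter with its driver version. Just a rough sketch (assumes Python 3 on the Windows side; it simply shells out to PowerShell's Get-CimInstance):

[CODE]
# Rough sketch: list each display adapter and the driver it is running, so you
# can confirm the GT120 sits on the Microsoft Basic Display Adapter while the
# Nvidia/AMD card keeps its vendor driver. Assumes Python 3 on Windows.
import subprocess

query = ("Get-CimInstance Win32_VideoController | "
         "Select-Object Name, DriverVersion, Status | Format-Table -AutoSize")
result = subprocess.run(
    ["powershell", "-NoProfile", "-Command", query],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
[/CODE]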
 
Yeah, when I boot with both cards it switches back to the GT 120. I'm going to try what you said there now.

It seems like they're running off the same driver, and it just updates (or downgrades) to whichever one it's prioritizing. It's behaving very strangely.

How do I set it to the generic driver?
 
Right-click the GT120 in Device Manager (it may call itself 9500 GT or something like that), select Update driver, and manually choose Microsoft Basic Display Adapter.
[Attachment: GT120+1080Ti Windows.JPG]


After the driver is selected, you can disable the device. Don't worry, it will still work and still display; it can even still provide a temperature reading.
[Attachment: GT120 disbaled + 70C.JPG]


Then you can install the latest Nvidia driver for the 1080.
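
If you'd rather script the disable step than click through Device Manager, something along these lines should work. Just a sketch, not tested on a Mac Pro: pnputil's /enum-devices and /disable-device switches need a fairly recent Windows 10 build, and the instance ID is a placeholder you'd swap for whatever /enum-devices reports for the GT120.

[CODE]
# Sketch of scripting the "disable the GT120" step with pnputil.
# Run from an elevated prompt; /enum-devices and /disable-device
# require a fairly recent build of Windows 10.
import subprocess

# 1. List display-class devices and note the GT120's instance ID.
subprocess.run(["pnputil", "/enum-devices", "/class", "Display"], check=True)

# 2. Disable it by instance ID (placeholder -- use the ID from step 1).
instance_id = "<GT120 instance ID from step 1>"
subprocess.run(["pnputil", "/disable-device", instance_id], check=True)
[/CODE]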
 
Any idea why my 1080p monitor running on the 120 can only go up to 1280x1024 in Windows? (I haven't tried the Mac side.) Is this just a limitation of the setup, or is there a setting I'm not finding?
 
The old monitor has HDMI and VGA... I'm thinking about running DVI-to-VGA from the 120 and HDMI from the 1080, with the goal of having the DVI/VGA monitor display the boot screen/menu and then switching to HDMI after booting and loading the drivers... do you guys think this will work?
 
I use a GT-120 with an RX-580 in Win 10. Apart from the above, I have blocked driver updates via group policy.
 
In my own testing, once the GT120 is set to run with the Microsoft Basic Display Adapter driver, there is no need to block driver updates.

It may sound like a good idea to avoid changes, but it also creates other problems (e.g. with RX580 driver updates).
 
You can select which driver to block based on the hardware ID.
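
For anyone without the Group Policy editor (e.g. on Windows Home), my understanding is that this policy ("Prevent installation of devices that match any of these device IDs") boils down to a couple of registry values. Here's a rough, untested sketch of setting them from Python; the hardware ID is a placeholder you'd copy from the GT120's Details tab in Device Manager:

[CODE]
# Rough sketch: registry values behind "Prevent installation of devices
# that match any of these device IDs". Needs an elevated Python on Windows.
import winreg

KEY = r"SOFTWARE\Policies\Microsoft\Windows\DeviceInstall\Restrictions"
hardware_id = r"PCI\VEN_XXXX&DEV_XXXX"  # placeholder: copy from Device Manager

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    winreg.SetValueEx(key, "DenyDeviceIDs", 0, winreg.REG_DWORD, 1)  # enable policy
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY + r"\DenyDeviceIDs") as key:
    winreg.SetValueEx(key, "1", 0, winreg.REG_SZ, hardware_id)  # blocked ID #1
[/CODE]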
 