Just because there are others using it doesn't prove that they won't do damage to your machines over time.

Oh Good Grief. You have made your point over and over and over. Enough already!

But, the real question is, can you make me a good doughnut:p

Lou
 

There are people who irresponsibly keep saying it works and recommending it, so I feel inclined to add that it may not be safe.

I can barely operate my microwave.
 
^^^^Fine, but your warning is not based on real world experiences; mine is. I have used an MVC modified GTX570 for a year now in two different Mac Pros, with absolutely no incidents. I am upgrading to an MVC modified GTX780 because of MVC's credibility, honesty, and the reliability and engineering he puts into the stuff he sells, but most of all because his customers, almost 5,000 of them, have given him an approval rating of over 99%. And those are just the ones from eBay. We know he also sells privately.

Do you really think he would take a chance on ruining his reputation by selling stuff he didn't test and believe in? IMHO, there is no irresponsibility involved here. MVC's testing, and that done by Bare Feats, bear (no pun intended) out that his cards are the real deal. Mac video card modifiers come and go on eBay. Only one has proven he has staying power, because he sells GOOD stuff that has been proven to WORK over the long haul.

I know I've said this before, but it's worth repeating:

The MVC/Netkas combination has produced some truly phenomenal results, and without their presence in the community the Apple graphics card situation would be in truly bad shape. It is my belief that without these two guys, neither EVGA nor Sapphire would have produced Mac versions of their cards. But that is just my opinion.

Now why would you choose a handle like pastrychef if you can't make doughnuts:confused:

Lou
 
My warnings are based on proof that more than 75W is being pushed through traces designed for 75W. A user with a video card that has a TDP of 250W but who only ever uses TextEdit will probably never have any issues.

There are also many sellers with tens of thousands of ratings who have managed to maintain 100% ratings. It proves nothing to me nor does it impress me.

I'm not saying he is intentionally trying to damage anyone's machine. I'm saying that these things are pushing more wattage through than intended and none of us know what the long term consequences are. Were an auxiliary power supply recommended along with the cards, I'd have no qualms. www.barefeats.com ran a few benchmarks and moved on to test another card. That in no way proves to me it is safe. Does it work? Yes. Is it safe? We don't know.
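
For anyone who wants to see the arithmetic behind that concern, here's a rough sketch. The 75W figures are the PCIe spec limits for the slot and for a 6-pin feed, and 250W is the TDP quoted above; nothing below is a measurement, it's just the on-paper budget.

```python
# Back-of-the-envelope budget, not a measurement. 75 W per slot and per 6-pin
# feed is the PCIe spec; 250 W is the TDP quoted above for a GTX 780 class card.
SLOT_SPEC_W = 75        # PCIe x16 slot, per spec
SIX_PIN_SPEC_W = 75     # each mini 6-pin feed on the Mac Pro backplane, per spec
NUM_FEEDS = 2           # the Mac Pro provides two of them
CARD_TDP_W = 250        # e.g. GTX 780 board power

in_spec_budget = SLOT_SPEC_W + NUM_FEEDS * SIX_PIN_SPEC_W   # 225 W
print(f"In-spec budget: {in_spec_budget} W")
print(f"Worst-case overdraw at full TDP: {CARD_TDP_W - in_spec_budget} W")
```

In other words, even if the load were spread perfectly evenly, a 250W card at full tilt has to pull roughly 25W more than the spec'd 225W from somewhere.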

I don't know if I'd go so far as to say their work is phenomenal but it is impressive. They do things that most of us are incapable of, yes. That doesn't make them infallible. Remember, there is a profit motive driving Macvideocards. He is not doing it out of charity or for a "greater good" and there's nothing wrong with that, it's the American way. But I certainly wouldn't worship him as the greatest thing to happen to Macs.

I highly doubt hacked video card firmwares had any influence on any manufacturers' decisions to produce or not to produce cards for Macs.

I came up with the handle decades ago as a kid while hanging out with friends having coffee and, you guessed it, pastries... Then, it just kinda stuck.
 

Lou,

Count me in the MVC fan club too; I have sung praises multiple times. I'm also using a 6+8 pin card.

That being said, I like to be an informed buyer. I'd much rather hear the whole story and make my own decision.

  • Some people here make a living off of their MP and perhaps have a much lower tolerance for risk than, for example, a gamer. They might not want an out of spec solution at all.
  • Some people might want the card but won't balk at $40 to add a second power supply just to be safe.
  • And others, like myself, might be completely comfortable simply running 6+8 cards directly.
The bottom line is, everybody has different preferences, priorities, and comfort levels. It is helpful, therefore, to hear all sides of the story and make an informed decision.
 


Honestly, I wouldn't mind running another PSU in the optical bay, but the problem for me is wiring it all in... that is something I am not comfortable with.

Hopefully macvideocards will reply in this thread.
 
Hey, nothing worth fighting about here.

I used to be on the "must get external power" bandwagon.

It was getting Hardware Monitor and testing actual current load that has softened my position.

And if you read the thread I started about this, justifying 100 watts on a single connector is based largely on the fact that a GTX570 will pull the same amount as a GTX780 running Furmark.

Read anywhere you want, there is not a single report of a 570 crapping out a Mac Pro, anywhere on interwebs.

I have even spent a dozen hours or more playing Metro Last Light in Boot Camp on a 4,1 using a Titan via dual 6 pins, never so much as a flicker. And that is a monster machine with dual 5680s sucking 70 extra watts for the CPUs at the same time.

I have a dying 4,1 here that I would be willing to do an extended test on. Only has 1 working 16x lane slot.

I could leave it running Furmark overnight or rendering a never ending movie, or whatever you want.

I recommend that people on the fence pay for a $10 license for Hardware Monitor. There may be a way to use iStat as well, I don't know.

See how much juice you can pull doing an AE render vs how much Furmark pulls. I don't think any "legit" software can use as much power as Furmark, but maybe someone can prove me wrong.
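
If you want to script that comparison, a rough sketch along these lines would do. The `read_gpu_watts` command is purely a placeholder for whatever sensor reader you have; Hardware Monitor and iStat are GUI apps, so nothing below is tied to their actual interfaces.

```python
#!/usr/bin/env python3
"""Log average and peak power draw while a workload (AE render, Furmark) runs.

Assumption: you have some command that prints the card's power draw in watts
as a single number. `./read_gpu_watts` below is a made-up placeholder for it.
"""
import subprocess
import time

SENSOR_CMD = ["./read_gpu_watts"]   # hypothetical sensor reader; replace with your own
INTERVAL_S = 1.0                    # poll once per second
DURATION_S = 300                    # watch for five minutes

def read_watts() -> float:
    """Run the sensor command and parse its single-number output."""
    out = subprocess.run(SENSOR_CMD, capture_output=True, text=True, check=True)
    return float(out.stdout.strip())

def main() -> None:
    peak = 0.0
    total = 0.0
    samples = 0
    end = time.time() + DURATION_S
    while time.time() < end:
        watts = read_watts()
        peak = max(peak, watts)
        total += watts
        samples += 1
        time.sleep(INTERVAL_S)
    print(f"samples={samples}  average={total / samples:.1f} W  peak={peak:.1f} W")

if __name__ == "__main__":
    main()
```

Run it once during your render and once during Furmark and compare the peaks.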

Even Tesselator agreed with my findings that 570 can pull 100 watts via a single connector using Furmark, and he would rather swallow his tongue than agree with me.

So if we have thousands of 570s out there pulling extra juice and not one of them has had a problem reported... well, I think that means the 780 is OK.
 
Fighting? There's no Stinkin' Fighting going on here. This is called a discussion:eek: And a darn good one at that:p

Lou
 
The END of the power squabbles.

Don't know why I never thought of this before.

For some background info, let me point out that the GTX285 was ORIGINALLY a GTX280. It had a 6 pin and an 8 pin then.

When Nvidia updated it to the GTX285, it went to Dual 6 pins.

But apparently the card still grabs juice like a 6 & 8 card.

Someone here expressed fear at a card pulling 110 Watts from a single source, well behold... the ol' standard for CUDA, the GTX285, sucks 110 Watts from a 6 pin, just like a 570, just like a 780.

Again, this is with Furmark running. So again, this is the WORST CASE SCENARIO.

But if the thousands of GTX285s out there, approved and sold via legit retail, are able to yank 110 Watts, why get worked into a lather about a GTX780 doing it?

Anyone with a GTX285 and Hardware Monitor app can likely verify this.
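
For perspective, here's how far those worst-case readings sit above the per-connector spec; the 100 and 110 Watt figures are the Furmark numbers quoted in this thread, and 75W is the PCIe spec for a 6-pin feed.

```python
CONNECTOR_SPEC_W = 75                  # PCIe spec for a 6-pin feed
for measured_w in (100, 110):          # worst-case Furmark draws quoted above
    over_w = measured_w - CONNECTOR_SPEC_W
    print(f"{measured_w} W is {over_w} W ({over_w / CONNECTOR_SPEC_W:.0%}) over the 75 W spec")
```

That works out to roughly 33% and 47% over spec, again only under Furmark.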
 

[Attachment: Screen Shot 2013-10-03 at 8.36.24 PM.png]
Hopefully, this puts everything to bed:p

Lou
 

[Attachment: Screen Shot 2013-08-27 at 4.28.14 PM.png]
The GTX 285 numbers have me convinced. I had no idea that card pulled so much power.

I decided to try the Furmark test and got the following screenshot from iStat Menu.
 

[Attachment: Screen Shot 2013-10-04 at 7.35.30 AM.png]

From what I've read, the GTX 770 is essentially a tweaked GTX 680. That's why they bench very similarly. There are a good number of people who have even managed to flash GTX 680s with GTX 770 firmware. What they ended up with were identically performing cards.

In my opinion, given a choice between a GTX 770 or a GTX 680 for a Mac Pro, I would go with a GTX 680 every time. Not having the power consumption issues in the back of my head is enough for me to sacrifice a few frames per second in a game.

As GTX 680 stocks are drying up, I'm wondering if it's actually possible to do the opposite... Shouldn't it be possible to flash a 770 into a 680 in order to get a fully Mac-compatible video card with a supported TDP?
It's always possible to "overclock" it later, if required...

Did anybody try this yet?
 
Guys, I will just warn you that flashing cards like this is tricky business. It is very easy to create a brick, especially if you don't have an actual PC. With GT200 cards the final byte of the device ID was set by soft straps in the ROM, which is why a GTX260 or 275 can be flashed into a 285.

This is no longer the case. I am on mobile devices now so I can't look at what was linked, but my guess is that they had to mod the device IDs to match the card. This gets much more difficult with EFI that needs to match.

When I get back to the lab I'll see what they did. Anyone at home can have a look. The first 1k of the ROM is an image; after that the real PC BIOS starts with "55AA". Somewhere in the next 1k is "de10"; the device ID is either right before or right after it and will be byte swapped. From memory, "PCIR" will appear directly above or below this enumeration of the device ID.

Then look for the UEFI section. It will be another place that starts with "55AA"; the device ID will be coupled with "de10" again in the first 100 bytes or so, IIRC.

See what device ID is in those two places. Compare to the Mac 680. The uncompressed part of the UEFI looks similar to Mac EFI, but the compressed part is 100% different.
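
If anyone wants to automate that eyeballing, here's a rough sketch of the kind of scan described above. It's plain Python and only reads the dump; it never writes or flashes anything, and the script/file names are just examples.

```python
#!/usr/bin/env python3
"""Scan a dumped video BIOS for the landmarks described above:

  * every offset where the bytes 55 AA occur (real image headers are among them),
  * every "PCIR" structure signature,
  * every occurrence of the byte-swapped NVIDIA vendor ID 0x10DE ("de 10"),
    printed with the two bytes on either side, one pair of which should be
    the byte-swapped device ID.
"""
import sys

IMAGE_HEADER = b"\x55\xaa"          # start of a PC BIOS / UEFI image
PCIR_MAGIC = b"PCIR"                # PCI data structure signature
NV_VENDOR_SWAPPED = b"\xde\x10"     # vendor ID 0x10DE, little-endian on disk

def find_all(blob: bytes, needle: bytes):
    """Yield every offset of `needle` inside `blob`."""
    off = blob.find(needle)
    while off != -1:
        yield off
        off = blob.find(needle, off + 1)

def main(path: str) -> None:
    rom = open(path, "rb").read()

    print("55 AA hits (candidate image headers):")
    for off in find_all(rom, IMAGE_HEADER):
        print(f"  0x{off:06x}")

    print("PCIR structures:")
    for off in find_all(rom, PCIR_MAGIC):
        print(f"  0x{off:06x}")

    print('"de 10" hits with neighbouring bytes (candidate device IDs):')
    for off in find_all(rom, NV_VENDOR_SWAPPED):
        before = rom[max(off - 2, 0):off].hex()
        after = rom[off + 2:off + 4].hex()
        print(f"  0x{off:06x}: before={before} after={after}")

if __name__ == "__main__":
    main(sys.argv[1])
```

Save it as something like rom_scan.py, run `python3 rom_scan.py your_dump.rom` against both the donor ROM and a Mac 680 ROM, and compare what shows up in those two places.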
 
I don't have the slightest intention of doing anything, I just enjoy reading about this stuff.

I appreciate the warning though. :)
 
Question to everyone who's upgraded:

Should I sell my Radeon HD 5870 or keep it around as a spare?

Mac editions go for roughly $200 on eBay now.
 
Having had video cards die on me in the past, I also keep a spare card around. I have a GT 640 that I got for about $60 from eBay. One of the cool things about this card is that it doesn't require auxiliary power, and I can use it alongside my GTX 680 if I ever run an app that can use a little additional GPGPU processing.
 