The 7120 has the same PLX situation as the 7101, but instead of M.2 slots it has U.2 ports. Because of this, the adapter is a half-height card, and shorter as well, so you won't suffocate your video card when placing it in front.

You would be able to run U.2 cables up to your drive sleds and then either a) purchase U.2 drives (I'm looking at Intel Optane, but I wonder what other options there are) or b) get a $30 U.2 to M.2 adapter the shape and size of a 2.5" drive.
The 7120 might also be a better choice for adding PCIe 3.0 slots to the Mac Pro (if you don't want to spend $2000+ on a PCIe expansion box). I had previously tried an Amfeltec with 45 cm R42SL cables (M.2 to PCIe x4); U.2 cables are probably much more flexible, and they are detachable at both ends.

The 7120 overview says it's for U.2 SFF-8639, but it appears to have SFF-8643 connectors. HighPoint does sell the "HighPoint 8643-8639-50", a 50 cm SFF-8643 to SFF-8639 cable.

The U.2 to PCIe x4 adapters I've seen are 45 mm wide, though, so you can't achieve the standard 20.32 mm spacing of PCIe slots, or even 40.64 mm for double-wide PCIe slots. The adapters appear to take power from the cable, so I'm not sure if the PCIe x4 slot will have enough power (the adapters say max 3.3 V, 4 A).
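As a rough sanity check on that power question, here's a minimal sketch comparing the adapter's stated 3.3 V / 4 A budget (13.2 W) against some assumed peak draws for M.2 NVMe drives. The per-drive wattages below are illustrative assumptions, not measured values:

```python
# Rough power-budget check for U.2 -> M.2 adapters that draw power
# from the cable (the adapters list a max of 3.3 V at 4 A).

ADAPTER_MAX_W = 3.3 * 4.0  # 13.2 W available from the cable

# Assumed peak power draws in watts (illustrative, not measured).
drives = {
    "typical M.2 NVMe drive (~6 W assumed)": 6.0,
    "hungrier M.2 drive (~8 W assumed)": 8.0,
}

for name, watts in drives.items():
    verdict = "within" if watts <= ADAPTER_MAX_W else "exceeds"
    print(f"{name}: {watts} W -> {verdict} the {ADAPTER_MAX_W:.1f} W budget")
```

So even a fairly power-hungry drive should fit inside the cable's stated budget, assuming those peak figures are in the right ballpark.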
 
Is it possible to install such a card under Windows 8.1, or does it need Windows 10?
The OS shouldn't matter. You just need drivers for whatever you connect to the card (NVMe or AHCI). Windows 7 has NVMe and AHCI drivers. Some computers might not have NVMe booting support.
 
Some computers might not have NVMe booting support.
Usually anything earlier than Broadwell can't boot NVMe. It's practically impossible to find a PC BIOS/UEFI with NVMe boot support before Z97/H97, and even after that, for years only high-end motherboards and workstations had it.

NVMe boot support became commonplace only a year ago.
 
Usually anything earlier than Broadwell can't boot NVMe. It's practically impossible to find a PC BIOS/UEFI with NVMe boot support before Z97/H97, and even after that, for years only high-end motherboards and workstations had it.

NVMe boot support became commonplace only a year ago.
But you can add boot support by adding a UEFI NVMe driver to the BIOS (like people have done with the 2008 Mac Pro?). I suppose it would require the computer to use UEFI to boot. UEFI was included in computers older than Z97.
 
But you can add boot support by adding a UEFI NVMe driver to the BIOS (like people have done with the 2008 Mac Pro?). I suppose it would require the computer to use UEFI to boot. UEFI was included in computers older than Z97.
Yes, you can. The Win-RAID forum has a topic showing how to insert an NVMe EFI module into the UEFI of X58 and newer motherboards.
 
I'm booting Windows 8.1 from an Intel NVMe DC P3700 400 GB on a Mac Pro 5,1 with boot ROM 140.0.0.0 without any issues, but I'm looking for something with x8 or x16 lanes.

The HighPoint website says it needs Windows 10 - that's why I'm asking.

I'm also considering Intel Optane SSD DC P4800X 375 GB as it has superb latency.
 
Interesting note: my first HP 7101A went completely dead after 30 days of use. Purchased through Amazon, got a replacement. Posted here: https://forums.macrumors.com/threads/highpoint-7101a-dead.2171611/

When I received the replacement, I mounted the two 512GB 970 Pros in the same slots on the HP card and everything worked as fast as hell, like before. Running this replacement card for the last few days, I have not heard the LOUD fan ramp up at all, even under heavy load - gaming, Photoshop, multiple 8K videos playing across 3 monitors. The previous card's fan would ramp up even under medium loads.

The HP card (Slot 2) sits on top of a modified Nvidia 980 Ti (Slot 1) that is cooled by a Thermaltake 3.0 AIO, with copper heatsinks on all VRAM and upgraded Noctua fans mounted to a Kraken adapter, so no added heat compared to a stock 980 Ti. Pre-AIO install, the bottom of the HP 7101A measured around 100°F (infrared gun); after the AIO, temps run around 90°F.

I will probably not continue using the HP 7101A as a boot drive, rather a Kryo or I/O Crest SI-PEX40129; I'm very concerned about the quality/longevity of the HighPoint cards and their lack of support for my issue!
 
Interesting note: my first HP 7101A went completely dead after 30 days of use. Purchased through Amazon, got a replacement. Posted here: https://forums.macrumors.com/threads/highpoint-7101a-dead.2171611/

When I received the replacement, I mounted the two 512GB 970 Pros in the same slots on the HP card and everything worked as fast as hell, like before. Running this replacement card for the last few days, I have not heard the LOUD fan ramp up at all, even under heavy load - gaming, Photoshop, multiple 8K videos playing across 3 monitors. The previous card's fan would ramp up even under medium loads.

The HP card (Slot 2) sits on top of a modified Nvidia 980 Ti (Slot 1) that is cooled by a Thermaltake 3.0 AIO, with copper heatsinks on all VRAM and upgraded Noctua fans mounted to a Kraken adapter, so no added heat compared to a stock 980 Ti. Pre-AIO install, the bottom of the HP 7101A measured around 100°F (infrared gun); after the AIO, temps run around 90°F.

I will probably not continue using the HP 7101A as a boot drive, rather a Kryo or I/O Crest SI-PEX40129; I'm very concerned about the quality/longevity of the HighPoint cards and their lack of support for my issue!
The Mac Pro firmware is what gives NVMe blades boot support, not the adapter. HPT has nothing to do with the Mac Pro booting from NVMe blades. Apple made an EFI module just for NVMe boot; all supported blades boot without any help from card firmware, and most adapters are totally dumb.

You had a bad experience with HPT, but it's a really solid card.
 
The Mac Pro firmware is what gives NVMe blades boot support, not the adapter. HPT has nothing to do with the Mac Pro booting from NVMe blades. Apple made an EFI module just for NVMe boot; all supported blades boot without any help from card firmware, and most adapters are totally dumb.

You had a bad experience with HPT, but it's a really solid card.
Thank you, didn't know that.
I have learned more from you on this forum than I have anywhere, thank you for your dedication sir!!
 
Interesting note: my first HP 7101A went completely dead after 30 days of use. Purchased through Amazon, got a replacement. Posted here: https://forums.macrumors.com/threads/highpoint-7101a-dead.2171611/

When I received the replacement, I mounted the two 512GB 970 Pros in the same slots on the HP card and everything worked as fast as hell, like before. Running this replacement card for the last few days, I have not heard the LOUD fan ramp up at all, even under heavy load - gaming, Photoshop, multiple 8K videos playing across 3 monitors. The previous card's fan would ramp up even under medium loads.

The HP card (Slot 2) sits on top of a modified Nvidia 980 Ti (Slot 1) that is cooled by a Thermaltake 3.0 AIO, with copper heatsinks on all VRAM and upgraded Noctua fans mounted to a Kraken adapter, so no added heat compared to a stock 980 Ti. Pre-AIO install, the bottom of the HP 7101A measured around 100°F (infrared gun); after the AIO, temps run around 90°F.

I will probably not continue using the HP 7101A as a boot drive, rather a Kryo or I/O Crest SI-PEX40129; I'm very concerned about the quality/longevity of the HighPoint cards and their lack of support for my issue!

Place your HP card in slot 1 and your video card in slot 2. That should take care of any real-estate and heat issues.
 
Very insightful thread! Thanks, everyone, for sharing your experience, test results, and tips!
Though it still seems unclear to me: what's the practical difference between the SSD7101A and the SSD7102?
Also, did anyone try the SSD7110? Seems to be an interesting device.

I'm interested in the SSD7110, especially with all my PCIe slots taken up. This looks like an interesting solution for Adobe CC, where you can have a separate SSD for NVMe boot, Media Cache & Render Scratch, and then use the SAS connection to break out to a 4-SSD RAID 0 internally. This would free up a slot that I can definitely use.

I have some big 4K projects this year, so I may try this out. Curious if anyone else has experience with this particular card in their NVMe controller series.
 
I'm interested in the SSD7110, especially with all my PCIe slots taken up. This looks like an interesting solution for Adobe CC, where you can have a separate SSD for NVMe boot, Media Cache & Render Scratch, and then use the SAS connection to break out to a 4-SSD RAID 0 internally. This would free up a slot that I can definitely use.

I have some big 4K projects this year, so I may try this out. Curious if anyone else has experience with this particular card in their NVMe controller series.

It looks interesting for sure. If you get one, please be sure to let us know your impression.
 
I'm interested in the SSD7110, especially with all my PCIe slots taken up. This looks like an interesting solution for Adobe CC, where you can have a separate SSD for NVMe boot, Media Cache & Render Scratch, and then use the SAS connection to break out to a 4-SSD RAID 0 internally. This would free up a slot that I can definitely use.

I have some big 4K projects this year, so I may try this out. Curious if anyone else has experience with this particular card in their NVMe controller series.
The max number of NVMe drives you can use without impacting top storage speeds is 2, as the PCIe 2.0 x16 slot is limited to about 8 GB/s. So if you connect SAS drives and/or a third NVMe drive to the 7110, expect top speeds to drop, as the bandwidth will be shared among all drives.
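A back-of-the-envelope sketch, using theoretical per-lane maxima, shows why two fast NVMe drives are about all a PCIe 2.0 x16 upstream link can feed at full speed:

```python
# Theoretical per-lane usable throughput, in MB/s:
PCIE2_LANE_MBPS = 500   # PCIe 2.0: 5 GT/s with 8b/10b encoding
PCIE3_LANE_MBPS = 985   # PCIe 3.0: 8 GT/s with 128b/130b encoding

host_slot = 16 * PCIE2_LANE_MBPS   # 8000 MB/s upstream from the card
drive = 4 * PCIE3_LANE_MBPS        # ~3940 MB/s per x4 NVMe drive

for n in (1, 2, 3):
    demand = n * drive
    verdict = "fits" if demand <= host_slot else "shared/limited"
    print(f"{n} drive(s): {demand} MB/s demand vs {host_slot} MB/s upstream -> {verdict}")
```

Two x4 PCIe 3.0 drives come in just under the 8 GB/s upstream ceiling; a third pushes total demand well past it, so everything shares.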
 
The max number of NVMe drives you can use without impacting top storage speeds is 2, as the PCIe 2.0 x16 slot is limited to about 8 GB/s. So if you connect SAS drives and/or a third NVMe drive to the 7110, expect top speeds to drop, as the bandwidth will be shared among all drives.

Thanks. Yes, but I wouldn't be looking to use all the NVMe drives for sequential reads/writes. I'm more interested in their faster IOPS for the media cache. I'd probably allocate one NVMe to the scratch drive, where sequential speed matters, while using the SAS port for a 4x4TB SATA SSD RAID 0, which would be about 2,000 MB/s for source media playback. So it seems the x16 PCIe 2.0 bandwidth is fine for this. Mainly editing 4K ProRes HQ/4444 media.
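A quick sketch of that RAID 0 estimate, assuming roughly 530 MB/s sequential per SATA III SSD (a typical figure, not a measurement of any specific drive):

```python
# Estimated sequential throughput of a 4-drive SATA SSD RAID 0,
# compared against the PCIe 2.0 x16 upstream link of the card.
SATA_SSD_MBPS = 530        # assumed per-drive sequential rate
raid0 = 4 * SATA_SSD_MBPS  # ideal RAID 0 scaling, no overhead

pcie2_x16 = 16 * 500       # ~8000 MB/s theoretical upstream

verdict = "fits under" if raid0 < pcie2_x16 else "exceeds"
print(f"RAID 0 estimate: {raid0} MB/s; {verdict} the {pcie2_x16} MB/s slot")
```

So the ~2,000 MB/s figure is consistent with ideal four-drive scaling, and leaves plenty of upstream headroom for the NVMe scratch drive.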
 
Thanks. Yes, but I wouldn't be looking to use all the NVMe drives for sequential reads/writes. I'm more interested in their faster IOPS for the media cache. I'd probably allocate one NVMe to the scratch drive, where sequential speed matters, while using the SAS port for a 4x4TB SATA SSD RAID 0, which would be about 2,000 MB/s for source media playback. So it seems the x16 PCIe 2.0 bandwidth is fine for this. Mainly editing 4K ProRes HQ/4444 media.

I'm very happy with my Areca 1880 and the NetApp DS4246 disk shelf for media, and 4 SM951s for boot/cache.

It plays RED Weapon 8K 9:1 flawlessly.

This setup only shows its limits with HEVC...
 
Just taken delivery of an SSD7101A... FINALLY - so happy :)

I've read up on the fan noise because the seller warned me that the noise was the reason he was selling. Great stuff contributed by @W1SS back in late December - it's made me consider something I'd never thought was important/relevant...

So... is the reason the cheap/noisy fan should NOT be disconnected/removed mainly for the sake of cooling/protecting the PLX switch rather than the NVMe SSDs?

I was going to just disconnect the fan plug and let the metal cover's passive thermal dissipation do all the heavy lifting for NVMe SSD temperature control.

I disconnected the fan on the IOCREST PCIe 3.0 x16 switch I bought 3 months ago and haven't had any problems (though I DID set up an 'always on' medium fan-speed profile in iStat Menus to compensate).

Any thoughts appreciated - it's not a trivial issue for me unfortunately, because I've set up some of my existing PCIe SSDs in a way that may require a serious rethink :/
 
So... is the reason the cheap/noisy fan should NOT be disconnected/removed mainly for the sake of cooling/protecting the PLX switch rather than the NVMe SSDs?

Both, actually, but more importantly the PLX switch, as it can easily overheat and you do not want it to fail on you early.

I was going to just disconnect the fan plug and let the metal cover's passive thermal dissipation do all the heavy lifting for NVMe SSD temperature control.

Bad idea! Order the fan I referenced in my December post. The pure copper heatsinks are optional once you have replaced the fan, especially for the PLX switch. As for your NVMe drives, it all depends on the brand(s) you are using and how heavy the writes are. Personally, I slapped the copper heatsinks on the switch and drives, as it gets hot in the summer here. I know Perth's summers are just as hot.

I disconnected the fan on the IOCREST PCIe 3.0 x16 switch I bought 3 months ago and haven't had any problems (though I DID set up an 'always on' medium fan-speed profile in iStat Menus to compensate).

Different cards, manufacturers, and PCIe switches, so while your current IOCREST results are satisfactory, you could end up permanently damaging your 7101A and/or drives.
 
Both, actually, but more importantly the PLX switch, as it can easily overheat and you do not want it to fail on you early.

Bad idea! Order the fan I referenced in my December post. The pure copper heatsinks are optional once you have replaced the fan, especially for the PLX switch. As for your NVMe drives, it all depends on the brand(s) you are using and how heavy the writes are. Personally, I slapped the copper heatsinks on the switch and drives, as it gets hot in the summer here. I know Perth's summers are just as hot.

Awesome feedback - thanks. I'd be crushed if it failed.
 
@roobarb! You ARE a legend!! I had come across your original post many months ago and couldn't quite remember where I'd seen it. Kudos on the innovation, and thanks for bringing it back into the 'foreground' for me :)
 
Not sure if my card is getting too hot, but it works just fine as long as the system is running; as soon as I restart the computer, my BIOS struggles to find the card.

When I boot the UEFI BIOS from a USB key, it can detect the card on the PCIe bus, but the card is not responding. Any idea whether the card is getting too hot, or is it a bad unit?
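One way to narrow this down (on Linux, e.g. from a rescue USB) is to check whether the card's PLX switch enumerates at all after a warm restart. The sketch below greps lspci-style output for a PLX/PEX device; the sample text is an illustrative assumption, not captured from real hardware:

```python
# Check captured `lspci` output for the PLX PCIe switch the HighPoint
# NVMe cards are built around. If the switch doesn't appear after a
# warm restart but does after a cold boot, that points at the card.
import re

# Illustrative sample output (assumption, not from real hardware):
sample_lspci = """\
02:00.0 PCI bridge: PLX Technology, Inc. PEX 8747 48-Lane PCIe Switch
05:00.0 Non-Volatile memory controller: Samsung Electronics Co Ltd NVMe SSD
"""

def card_visible(lspci_text: str) -> bool:
    # The card shows up as a PLX/Broadcom PCIe switch plus one NVMe
    # controller per populated port.
    return bool(re.search(r"PLX|PEX 87\d\d", lspci_text))

print("switch enumerated:", card_visible(sample_lspci))
```

If the switch never enumerates after a warm restart but always does after the machine has been powered off for a few minutes, that behavior is consistent with a failing card rather than a host issue.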
 
Not sure if my card is getting too hot, but it works just fine as long as the system is running; as soon as I restart the computer, my BIOS struggles to find the card.

When I boot the UEFI BIOS from a USB key, it can detect the card on the PCIe bus, but the card is not responding. Any idea whether the card is getting too hot, or is it a bad unit?

That's not going to get any better. If it's still within the retailer's return window, return it for a replacement. If not, RMA it to the manufacturer. Do not mention Mac or Apple. Just say it doesn't work reliably in any of several PCs it was tried in.
 
That's not going to get any better. If it's still within the retailer's return window, return it for a replacement. If not, RMA it to the manufacturer. Do not mention Mac or Apple. Just say it doesn't work reliably in any of several PCs it was tried in.
Thank you. Not being in the US really sucks when it comes to RMAs. I paid 60 USD for shipping + tax, and now I have to pay again for the RMA.

One question, though: when I shut down my system (totally disconnect power) for 5-10 minutes, the system can recognize the card. For now it is working just fine, but I am sure that once I restart my system it will say nothing found. Do you think it could be a heat-related issue?

I mean this screen - if I restart my system now and it doesn't show this screen, then the RAID controller won't appear in the BIOS; but if I let it cool down for a while (could be 5 to 20 minutes), it may randomly appear in the BIOS and the system will boot normally.
[screenshot of the RAID controller's boot screen]
And true enough, I did a restart and now the RAID controller is nowhere to be found (boot priority is empty; please forgive the bad picture, as I am sitting near a window). I have shut down my system completely to see if it appears after a few minutes.
[photo of the BIOS boot-priority screen, showing no RAID controller]
Update: after shutting down for 5 minutes, the system was able to detect the RAID controller. (I am very puzzled and concerned about this problem, as I want to use this as my primary system for work.)
 