Thanks very much. I don't really understand this! I have my 4TB Sabrent in the Orico with 3.2GB of data, and it seems to be behaving. What would I look for if this were a problem?
I'm just being paranoid. I wouldn't worry about it.

Run First Aid from Disk Utility.app to make sure the disk and all its partitions/containers and volumes are OK (select Show All Devices from the View menu).

You can try sudo gpt -r show -l /dev/disk0 (change the disk number to match the Orico). It will show where the partitions are located on your disk, and it should show the existence of the primary and secondary GPT table and header.
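If you're curious what that command is actually verifying, here's a rough Python sketch of the idea. The disk image, sector count, and layout below are made up purely for illustration: a valid GPT disk carries a header starting with the ASCII signature "EFI PART" at LBA 1, plus a backup copy at the last LBA.

```python
# Toy illustration of what `gpt show` looks for: the primary GPT header
# lives at LBA 1 and a backup copy lives at the last LBA, each starting
# with the ASCII signature "EFI PART". The "disk" below is a fake
# in-memory image, not a real device.
SECTOR = 512
GPT_SIGNATURE = b"EFI PART"

def has_both_gpt_headers(disk: bytes) -> bool:
    """True if both the primary and backup GPT headers are present."""
    primary = disk[SECTOR:SECTOR + len(GPT_SIGNATURE)]    # LBA 1
    backup = disk[-SECTOR:-SECTOR + len(GPT_SIGNATURE)]   # last LBA
    return primary == GPT_SIGNATURE and backup == GPT_SIGNATURE

# Build a tiny fake disk with both headers in place.
num_sectors = 2048
disk = bytearray(num_sectors * SECTOR)
disk[SECTOR:SECTOR + len(GPT_SIGNATURE)] = GPT_SIGNATURE
last = (num_sectors - 1) * SECTOR
disk[last:last + len(GPT_SIGNATURE)] = GPT_SIGNATURE

print(has_both_gpt_headers(bytes(disk)))  # True
```

If either header were missing or corrupted, the check would fail, which is roughly the situation First Aid or gpt would flag.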
 
Awesome, this is indeed what I want, and what I thought I was getting with the 4M2, but individual drives max out at about 700 MB/s; to get their rated speed you must install 4 matching NVMe drives and use software RAID. Admittedly not a bad option, considering the premium larger NVMe drives cost.

So now I'm choosing between the StarTech and the OWC unit. Does anyone have any reason one is better than the other? As far as I can tell, both accommodate x16 cards at x4 speeds and have the same ports. I'm wary of the OWC unit after being misled by the specs of the 4M2.

This seems like the ultimate external NVMe solution with a 7101A or similar inside, IMO, but I wonder if the power adapters draw high power only when in use. I think most modern FET-driven power supplies draw higher power only as needed. I'm inclined to go for the StarTech because of its simple power supply. I am unable to find photos of the OWC power supply, and it seems to have an odd power connector similar to that of a video card; perhaps they are unloading some GPU power-adapter parts.



Any thoughts on which of these is better?

I hope this is appropriate for this thread; I wish I had seen it here myself, as it would have saved some trouble. I hope it helps someone else looking for a 4-blade NVMe solution that should run as fast as the internal drive on an M1 without having to run RAID on any of them.

I will report back since it may not work as planned, as often happens.
 
I'm just being paranoid. I wouldn't worry about it.

Run First Aid from Disk Utility.app to make sure the disk and all its partitions/containers and volumes are OK (select Show All Devices from the View menu).

You can try sudo gpt -r show -l /dev/disk0 (change the disk number to match the Orico). It will show where the partitions are located on your disk, and it should show the existence of the primary and secondary GPT table and header.


Thank you very much for the input!
That gives:

Screenshot 2021-12-29 at 21.22.13.png



The 4TB Orico/Sabrent is external, physical disk6, and the only Container is disk7. Just one volume on the Container: "OFFLOAD"


Screenshot 2021-12-29 at 21.22.41.png

Seems to look OK?

Thanks very much
 
So now I'm choosing between the StarTech and the OWC unit. Does anyone have any reason one is better than the other? As far as I can tell, both accommodate x16 cards at x4 speeds and have the same ports. I'm wary of the OWC unit after being misled by the specs of the 4M2.

This seems like the ultimate external NVMe solution with a 7101A or similar inside, IMO, but I wonder if the power adapters draw high power only when in use. I think most modern FET-driven power supplies draw higher power only as needed. I'm inclined to go for the StarTech because of its simple power supply. I am unable to find photos of the OWC power supply, and it seems to have an odd power connector similar to that of a video card; perhaps they are unloading some GPU power-adapter parts.



Any thoughts on which of these is better?

I hope this is appropriate for this thread; I wish I had seen it here myself, as it would have saved some trouble. I hope it helps someone else looking for a 4-blade NVMe solution that should run as fast as the internal drive on an M1 without having to run RAID on any of them.

I will report back since it may not work as planned, as often happens.
The OWC is newer - it supports DisplayPort 1.4 from its DisplayPort port. However, if you connected a 4K120 display, the NVMe drives would be limited to 14 Gbps. If you run the display at 4K60, you'll get up to 24 Gbps from the NVMe drives, which is basically full speed (there are some rare situations where 25 Gbps would be possible).
The OWC has more power - it supports up to 85W charging, which is good for laptops. That's why they don't use a barrel connector: more current can be drawn without melting something. The power supply is slightly larger, but that difference doesn't really matter - there's a power supply in either case.

The Sonnet is limited to DisplayPort 1.2 and only provides 15W for charging.

The way electricity works, high power is drawn only when it's needed. The USB-C power source tells the device what voltage and current levels are allowed; the device tells the power source what voltage it wants. The device (if it's well behaved) will not attempt to draw more current than is allowed for that voltage.
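To make that handshake concrete, here's a toy model of the negotiation in Python. The voltages, currents, and the 85W figure are illustrative, not taken from any specific charger:

```python
# Toy model of USB-C Power Delivery: the source advertises a list of
# (voltage, max current) options, the sink requests one voltage, and a
# well-behaved sink never draws more current than the granted maximum.

def negotiate(source_pdos, wanted_voltage):
    """Return the (voltage, max_current) contract for the requested voltage."""
    for voltage, max_current in source_pdos:
        if voltage == wanted_voltage:
            return voltage, max_current
    raise ValueError("source does not offer that voltage")

# An 85W-class supply might offer something like this:
pdos = [(5.0, 3.0), (9.0, 3.0), (20.0, 4.25)]
volts, amps = negotiate(pdos, 20.0)
print(volts * amps)  # 85.0 watts available; an idle device simply draws less
```

The point is that the 85W is a ceiling negotiated up front, not a constant draw - the device only pulls what it actually needs at any moment.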

Thank you very much for the input!
That gives:
View attachment 1935922

The 4TB Orico/Sabrent is external, physical disk6, and the only Container is disk7. Just one volume on the Container: "OFFLOAD"
View attachment 1935924
Seems to look OK?
Yup.
 
Hi all, sorry if I missed this somewhere. I'm going from an old 5,1 to an M1 mini, and I have a PCI card - an HPT 7101 - full of NVMe drives. Can I purchase an external Thunderbolt 3 PCIe card enclosure, get 2,500 MB/s+ speeds per individual drive, and just use this as my external 4-port NVMe solution? I purchased, and am sending back, the OWC Express 4M2 because it limits the speed of each drive to about 1/3. While I realize this may be an expensive way to go, I already have the 7101A and would like to use several NVMe drives with my M1 mini at full speed. Here are some examples of the enclosures I would consider: https://appleinsider.com/articles/21/11/08/how-to-add-pci-e-expansion-to-your-new-macbook-pro.

I'd get the HighPoint RocketStor 6661A Thunderbolt 3 to PCIe 3.0 x16 Expansion Chassis; it's built for that card and is the cheapest solution. I'm thinking of doing the same - either that or the Sonnet https://www.sonnettech.com/product/echo-3-desktop/overview.html, because I plan on getting some other PCIe cards (10GbE, FireWire, and a Blackmagic 10-bit HDMI out).
 
I'd get the HighPoint RocketStor 6661A Thunderbolt 3 to PCI-E 3.0 x16 Expansion Chassis, it's built for that card and the cheapest solution, I'm thinking of doing the same, either that or the Sonnet https://www.sonnettech.com/product/echo-3-
I saw a reviewer of that unit get much slower speeds. I did look at the HighPoint - maybe he was doing something wrong? I would really like that 3-card Sonnet since I have a SAS RAID card, but it's expensive, and I may opt for a USB enclosure for my mirror RAID and a dedicated PCIe enclosure for the 7101A, where the speed matters. Also, the OWC looks just as cheap, and after joevt's comments I am leaning towards their product.
 
This seems like the ultimate external NVMe solution with a 7101A or similar inside, IMO
Why is this the ultimate solution? If you're putting the card into a Thunderbolt 3 enclosure, you'll be limited to TB3 speeds anyway (2,600-2,800 MB/s), no matter what card you use.

joevt did a good job explaining the differences between the enclosures. I've used a few of these enclosures with the Accelsior 4M2 NVMe card; these were my observations:

HighPoint RocketStor 6661A Thunderbolt 3 - Pros: smallest footprint, thumbscrews so installing/removing a card is easy, relatively small power supply, and cheapest price. Cons: build quality felt a bit cheap. An annoying rattling noise eventually started emanating from the enclosure (it could have been an issue with just that unit and not the entire line), but ultimately, due to that noise, I returned it.

Sonnet Echo SE-1 - Pros: build quality feels a little more robust than the HighPoint, relatively small power supply. Cons: requires you to unscrew/rescrew 4 screws to install/remove a card; the wider base takes up more desk space (though it is also one of the reasons it feels more solid than the HighPoint). It is the priciest of the 3 enclosures, and I'm really not sure why.

OWC Helios S3 - Pros: build quality feels really solid, thumbscrews so installing/removing a card is easy, DisplayPort, and - one of the main reasons I went with it - it can provide 85W of power to a laptop. Cons: huge power supply.

The footprints of the Echo SE-1 and Helios S3 are very similar; they adopt a wider-base/shorter-height style, where the HighPoint is a narrow-base/tall-height style.
 
I saw a reviewer of that unit get much slower speeds. I did look at the Highpoint, maybe he was doing something wrong? I would really like that 3 card sonnet since I have a SAS raid card, but it's expensive and I may opt for a USB enclosure instead for my mirror raid and dedicated pci enclosure for the 7101A where the speed matters. Also the OWC looks just as cheap and after joevt's comments I am leaning towards their product.

Why is this the ultimate solution? If you're putting the card into a Thunderbolt 3 enclosure, you'll be limited to TB3 speeds anyway (2,600-2,800 MB/s), no matter what card you use.

joevt did a good job explaining the differences between the enclosures. I've used a few of these enclosures with the Accelsior 4M2 NVMe card; these were my observations:

HighPoint RocketStor 6661A Thunderbolt 3 - Pros: smallest footprint, thumbscrews so installing/removing a card is easy, relatively small power supply, and cheapest price. Cons: build quality felt a bit cheap. An annoying rattling noise eventually started emanating from the enclosure (it could have been an issue with just that unit and not the entire line), but ultimately, due to that noise, I returned it.

Sonnet Echo SE-1 - Pros: build quality feels a little more robust than the HighPoint, relatively small power supply. Cons: requires you to unscrew/rescrew 4 screws to install/remove a card; the wider base takes up more desk space (though it is also one of the reasons it feels more solid than the HighPoint). It is the priciest of the 3 enclosures, and I'm really not sure why.

OWC Helios S3 - Pros: build quality feels really solid, thumbscrews so installing/removing a card is easy, DisplayPort, and - one of the main reasons I went with it - it can provide 85W of power to a laptop. Cons: huge power supply.

The footprints of the Echo SE-1 and Helios S3 are very similar; they adopt a wider-base/shorter-height style, where the HighPoint is a narrow-base/tall-height style.

Were the speeds the same across all three devices?
 
Why is this the ultimate solution? If you're putting the card into a Thunderbolt 3 enclosure, you'll be limited to TB3 speeds anyway (2,600-2,800 MB/s), no matter what card you use.
To clarify why: I am moving to an M1 Mac mini from an old 2010 Mac Pro. I am as yet unaware of any way to add a faster drive to an M1 mini, and I'm unsure the difference would even be perceivable. This looks like the ultimate solution for an M1 mini, but I suppose everyone has different needs. My goal was to achieve the speed of the internal drive without using RAID 0 and to connect several NVMe drives which I already own from my old Mac. What better or faster methods are there to add storage to a mini? By the way, thanks for the reviews of each one. I had a feeling the OWC Helios power supply would be huge - how can I see it? That is the only reason I might not go for that one; I have no plans to connect a laptop here.

One reviewer of the HighPoint 6661A noted using an NVMe card and only getting speeds in the low 1,000s - how did yours do?
 
OWC Helios S3 - Pros: build quality feels really solid, thumbscrews so installing/removing a card is easy, DisplayPort, and - one of the main reasons I went with it - it can provide 85W of power to a laptop. Cons: huge power supply.

The footprints of the Echo SE-1 and Helios S3 are very similar; they adopt a wider-base/shorter-height style, where the HighPoint is a narrow-base/tall-height style.
I think thumbscrews are the best reason to get the OWC. The DisplayPort feature is nice for some situations. The screws on all the Sonnet stuff are annoying.

Were the speeds the same across all three devices?
Thunderbolt is Thunderbolt.

I had a feeling the OWC Helios power supply would be huge - how can I see it? That is the only reason I might not go for that one; I have no plans to connect a laptop here.
I wouldn't worry about the size. A brick is still a brick. You'll need a place to put it no matter what the size.

Here's the original Helios power supply (with a green tape label; "helios" is vertical) on top of an Acer display power supply, next to the Helios 3S power supply. The Helios 3S is currently being used for a W5700 GPU, powered by another PSU you can't see, next to a Mac mini 2018.
OWC Helios power supplies.JPG

Here are some other power supplies, with a Sonnet SE-I on the left and a Sonnet SE-III turned on its side.
more Thunderbolt power supplies.png

One reviewer of the HighPoint 6661A noted using an NVMe card and only getting speeds in the low 1,000s - how did yours do?
I think it depends on the NVMe drives. Some behave better than others in a Thunderbolt enclosure.
 
Were the speeds the same across all three devices?
One reviewer of the HighPoint 6661A noted using an NVMe card and only getting speeds in the low 1,000s - how did yours do?

I don't recall the exact numbers, but they were close enough that it didn't make a difference which one I chose.
Keep in mind I'm using 4 NVMe drives in a RAID 0 configuration. I haven't tried installing 4 drives and using them independently, but you should be able to do that without a speed hit with either the HighPoint or OWC NVMe PCIe cards.
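A back-of-envelope sketch of why the speeds come out the same either way - the link, not the card, is the ceiling. The figures below are ballpark numbers from this thread, not measurements:

```python
# Rough model: aggregate NVMe throughput as seen by the host is the sum
# of the simultaneously active drives, capped by the usable Thunderbolt 3
# bandwidth. 2800 MB/s and 3000 MB/s are illustrative figures.

def effective_speed(per_drive_mbs, drives_active, link_cap_mbs=2800):
    """Host-visible MB/s for a given number of simultaneously active drives."""
    return min(per_drive_mbs * drives_active, link_cap_mbs)

print(effective_speed(3000, 1))  # 2800 - one fast blade alone can saturate the link
print(effective_speed(3000, 4))  # 2800 - RAID 0 across four can't exceed it either
```

This is also why independent drives don't take a speed hit versus RAID 0: any single drive accessed on its own can already fill the Thunderbolt pipe.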

I think thumbscrews are the best reason to get the OWC. The DisplayPort feature is nice for some situations. The screws on all the Sonnet stuff are annoying.
Agreed - I was just changing a card in a Sonnet enclosure yesterday and kept dropping screws; it was annoying.


the Helios 3S is currently being used for a W5700 GPU powered by another PSU
Oh wow, a GPU card is compatible with the S3? That's awesome - I didn't know it could do that. But you're using a different power supply? Can you walk me through this? Does the other power supply fit the S3 connector?

What cards do you have in the Sonnet SE-III? I wish Sonnet or OWC would make a 2-slot enclosure that is 3.5 inches or under in height; the Sonnet SE-III is too tall, and the Echo III desktop and rackmount are too long for my space.
 
One reviewer of the HighPoint 6661A noted using an NVMe card and only getting speeds in the low 1,000s - how did yours do?

I'm guessing it depends on the NVMe card. Unfortunately, different controllers work with different enclosures, speeds are affected, and I'm not even sure everything will work the same when things are put in RAID arrays.

There's not much point in putting 4 NVMe drives in RAID 0 with Thunderbolt 3 (I'm thinking about this for my own future as well), but I'm also unsure - just let them all run as single drives, or as one big drive for neatness' sake.
 
There's not much point in putting 4 NVMe drives in RAID 0 with Thunderbolt 3

Depends on your workflow and what you do, but for me there are two reasons:

1. NVMe drives have a fast cache that gets used up rather quickly, so this extends that fast cache, as it's utilizing all 4 drives.
2. I can have an 8TB-32TB RAID. I need at least 8TB, and it's nice that I can have 8TB of super-fast storage without breaking the bank.

One can purchase an 8TB NVMe drive, but they are very expensive and usually QLC.
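The two points above can be sketched with some quick arithmetic. The per-drive capacity and transfer size here are made-up figures for illustration:

```python
# RAID 0 writes round-robin across the blades: capacities add up, and each
# drive's SLC cache only has to absorb its share of a large write.
drives = 4
per_drive_tb = 2          # illustrative blade size
write_gb = 200            # illustrative large transfer

capacity_tb = drives * per_drive_tb
per_drive_share_gb = write_gb / drives

print(capacity_tb)         # 8 - four 2TB blades appear as one 8TB volume
print(per_drive_share_gb)  # 50.0 - each drive's cache absorbs a quarter of the write
```

So the stripe not only aggregates capacity, it also makes the combined fast-cache window roughly four times deeper before any single drive's cache fills.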
 
Oh wow a GPU card is compatible with the S3? That's awesome, I didn't know it could do that, but you're using a different power supply? Can you walk me through this? The other power supply fits the S3 connector?
Basically, you can connect a GPU to anything that does PCIe (mini PCIe, NVMe, ExpressCard, etc.). A riser is used because the GPU doesn't fit inside the Helios. A computer power supply is used for the PCIe power connectors of the GPU because the Helios can't supply enough power for it.

Here's a different example:
https://egpu.io/forums/which-gear-should-i-buy/thunderbolt-3-external-drive-slot-m-2-ngff-adapter/

What cards do you have in the Sonnet SE-III? I wish Sonnet or OWC would make a 2 slot enclosure that is 3.5 inches or under in height, the Sonnet SE-III is too tall and the ECHO-III desktop and rackmount are too long for my space.
The Sonnet SE-III is unused. One plan is to make it into the world's first 8-port Thunderbolt hub... I suppose someone could make a 10-port Thunderbolt hub with 4 Thunderbolt 4 hubs, but it wouldn't have 6 DisplayPort inputs to support three or four LG UltraFine 5K displays.
I haven't seen anyone try chaining multiple Thunderbolt 4 hubs together yet.

1. NVMe drives have a fast cache that gets used up rather quickly, so this extends that fast cache, as it's utilizing all 4 drives.
That's something I haven't thought about. Usually when I test an NVMe, I'm just looking at the top number (max bandwidth) for a couple of seconds. I don't do extended tests to see how long it takes to use up the cache, or how low the write speed drops once the cache is used up.
 
The cache and random speeds are what interest me as well. I'm a video editor and music producer, so this is some good info. I'd love to test everything out and see what differences it would make in the real world.
 
Mostly space and heat.

Depends on your workflow and what you do, but for me there are two reasons:

1. NVMe drives have a fast cache that gets used up rather quickly, so this extends that fast cache, as it's utilizing all 4 drives.
2. I can have an 8TB-32TB RAID. I need at least 8TB, and it's nice that I can have 8TB of super-fast storage without breaking the bank.

One can purchase an 8TB NVMe drive, but they are very expensive and usually QLC.
 
Does anyone know if the Sonnet Echo Express SE3e can work, at least electrically, with the 7101A? Granted, it would need an x8-to-x16 riser or a modification to the slot (the riser looks like it may fit; modification not recommended). If this would work, would there be any speed difference between it and the SE3? I don't need power for a laptop or GPU. It would be great to have extra slots for my SAS RAID card and any future cards I may need. It seems like the SE3e is all I need.

 
Does anyone know if the Sonnet Echo Express SE3e can work, at least electrically, with the 7101A? Granted, it would need an x8-to-x16 riser or a modification to the slot (the riser looks like it may fit; modification not recommended). If this would work, would there be any speed difference between it and the SE3? I don't need power for a laptop or GPU. It would be great to have extra slots for my SAS RAID card and any future cards I may need. It seems like the SE3e is all I need.
You could try using a Dremel to turn one of the slots into an open-ended x8 to allow an x16 card to fit.
It should connect as an x4 slot, and the speed should be the same as the SE3 or any other Thunderbolt enclosure.

I tried the following successfully:
The riser at https://www.amazon.com/dp/B07D46WW1V will raise the card by 19mm.
The Sonnet SE IIIe has about 25mm of clearance (from the underside of the PCIe bracket).
The HighPoint SSD7505 rises about 6mm above the underside of the PCIe bracket.
6mm + 19mm = 25mm, so it's a perfect fit.
If the SSD7101A is of a similar size, then it should work. Get a 19mm standoff to make it more secure. I'm not sure what size of screw Sonnet uses for the PCIe brackets - bigger than M3, maybe #6-32?
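The fit check above is just addition, but as a sanity-check sketch (the measurements are the ones quoted in the post):

```python
# Clearance check: riser height plus how far the card rises above the
# bracket underside must not exceed the enclosure's internal clearance.
RISER_MM = 19        # riser height
CARD_RISE_MM = 6     # SSD7505 rise above the bracket underside
CLEARANCE_MM = 25    # Sonnet SE IIIe clearance

fits = RISER_MM + CARD_RISE_MM <= CLEARANCE_MM
print(fits)  # True - exactly 25mm, a snug fit with zero margin
```

Note there is no margin at all here, which is why a standoff to secure the card is a good idea.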
 
As an Amazon Associate, MacRumors earns a commission from qualifying purchases made through links in this post.
You could try using a Dremel to turn one of the slots into an open-ended x16.
It should connect as an x4 slot, and the speed should be the same as the SE3 or any other Thunderbolt enclosure.
I found an x8 riser: https://www.amazon.com/gp/product/B08NX9D72S

I was also considering the idea of cutting this one down to x8, since I like the straight traces as opposed to the others that use vias, but I'm sure it would perform the same anyway: https://www.amazon.com/gp/product/B07RWRK2L6

I am perfectly capable of Dremeling it without damage; however, I am sure that would void my warranty, so a riser it is, at least for a while.

I wonder if I will regret not spending the extra money for the Echo III - for example, if they start to support external GPUs on the M1.

On my old 5,1 with the 7101A card, I was using the driver HighPoint provided (I disabled SIP to install it). Should I use that driver on the M1 mini, or is it not supported?

I just pulled the trigger on the Express SE3e from Newegg for $500. I plan to use the riser for a while, at least to maintain the warranty. I hope to use the 7101A and a HighPoint SAS card with some SATA drives; now to find a good SAS enclosure. I like the older HighPoint SAS cards since they were compatible with my 5,1 and allowed SATA drives, in RAID or single, to be connected at OK speeds. I will also have the ability to add a third card. I will report back later. Thanks for the tips, joevt!
 
On my old 5,1 with the 7101A card, I was using the driver HighPoint provided (I disabled SIP to install it). Should I use that driver on the M1 mini, or is it not supported?
I would avoid the driver. It's mostly for RAID, which isn't helpful in a Thunderbolt enclosure. I don't know if it works on M1 Macs.
 
I have used several Yottamaster and Orico external Thunderbolt 3 enclosures and one Thunderbolt 4 (plus USB-C) enclosure - on an Asus ExpertBook B9450 (Windows 10 up to and including 21H2), on a MacBook Pro 13 (2018, four Thunderbolt ports, Intel), and on a Mac mini M1 (16GB, 1TB, 10Gbit, 2021) - the Macs running the latest Big Sur.

All environments deliver results in the same region: APFS or ExFAT on Macs, and NTFS or ExFAT on Windows. Differences are within very small margins, and within a few percent across operating systems. The M1 is a good representative (drives directly connected):

Here is the Yottamaster with a Samsung 970 Pro 1TB:
1641394037211.png

And my Orico Thunderbolt 3 (only) enclosure with a similar Samsung 970 Pro 1TB:

1641394074703.png

The Samsung 970 Pro 1TB delivers similar performance in all the Thunderbolt enclosures I have, and since it's an MLC-based SSD, the performance is constant irrespective of transfer size (no SLC cache used - full performance for a complete copy at all transfer sizes).

Things become interesting with the CalDigit Elements Hub: no other traffic, no monitors connected, same Orico hardware and Samsung SSD:

1641394100463.png


Write speed falls roughly 30%, while read speed remains similar to a direct connection (without any other activity on hub devices).

The CalDigit hub has a severe problem handling NVMe USB drives (like the SanDisk Extreme Pro 1TB, old version) when connected via USB-A (with a 10Gbit cable). It's probably a power problem, because the problem vanishes when connected via USB-C (with a 10Gbit cable).

All enclosures exhibit the same problems with the Samsung 970 EVO Plus 2TB, irrespective of format and platform/OS. Read speeds are always "up there", and write speeds hover in the region of 1,100-1,300 MByte/sec depending on OS (formatting makes no real difference). It is essential that Windows 10 caching is enabled (it does not always happen by default) if you don't want to end up around 800-900 MByte/sec write.

Special note on Big Sur on BOTH the M1 and Intel Macs described above:

The Asus ExpertBook B9450 (two easily accessible and active internal SSD sockets) was delivered with a Samsung MZVLB1T0HALR-00000 1TB SSD (often described as an OEM PM981, etc.) that delivers values in the region of the Samsung 970 EVO or better. Sustained transfers especially are high, in the region of the Samsung 970 EVO Plus.

I decided to mount my Samsung 970 EVO Plus 2TB inside instead. It just screams "performance". Not quite 970 Pro on writes, but alas... the 970 Pro is not available as a 2TB NVMe stick.

I mounted the Samsung MZVLB1T0HALR-00000 into the Orico Thunderbolt 3/USB4 enclosure that had held the Samsung 970 EVO Plus 2TB and reformatted the drive on Windows 10 (ExFAT); performance was roughly identical to the Samsung 970 EVO Plus in the same enclosure. Write speed was far lower than the drive is capable of, but in the same region as the EVO Plus; that's life.

The big surprise came when I moved the drive to my MacBook (Intel). It did not work at all. Blackmagic Disk Speed Test froze after displaying a 70MByte/sec read speed. A repeat on the M1 showed similar, but even lower, speed before freezing up. Obviously something was horribly wrong. Maybe reformatting to APFS would help? No!

Big Sur showed the capacity to be 1.02 TB, which was certainly not the case in real life, so the actual SSD was not to Big Sur's taste (it worked in Windows 10 in the same enclosure with the same cable).

There's no wrong capacity handling in Windows 10, and both ExFAT and NTFS work as expected at the same speeds.

Conclusion

The short conclusion is that enclosures behave similarly when the same type of SSD is used. Formatting has next to no influence on performance. Differences between macOS and Windows are very small, and likewise between M1 and Intel-based Macs (if anything, Intel is less "problematic"). When USB ports are involved, the M1 is generally around 30% slower than the Intel platform for the same drives and the same transfers/contents. The slowdown is more or less equal whether 5Gbit or 10Gbit USB-C/A gear is used.

Hope this will be of help to those experiencing problems.
 
I ended up getting a Fledging Thunderbolt enclosure and a 2TB WD Black NVMe. Results:

Formatted APFS:
Diskmark.png
Fledging_WD_Black_750_2TB.png


Internal MBP 16" Max

Internal MBP_16.png
 
It would be interesting to have a poll to know which enclosures people own.

After reading all the latest pages, I still don't know what to buy to go with my M1 Mac Mini :(
 
It would be interesting to have a poll to know which enclosures people own.

After reading all the latest pages, I still don't know what to buy to go with my M1 Mac Mini :(
In most cases it's not the enclosure that matters, but nearly always the SSD stick you put into the enclosure. A good place to start is here:


Note that when you plan on writing large amounts to an SSD, it is really important to look into write performance. If you use BlackMagic and other test programs, you only get "sprint info" (a few megabytes written). What you need is to look at "marathon distances" for writes - say, several hundred gigabytes in one go - if you edit video and other large projects.
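The sprint-versus-marathon effect can be sketched numerically. The cache size and speeds below are invented for illustration, not measurements of any particular drive:

```python
# A drive with an SLC cache writes fast until the cache fills, then falls
# back to its much slower native rate. Short benchmarks only ever see the
# fast phase; long transfers average the two.

def avg_write_speed(total_gb, cache_gb=40, fast_mbs=2500, slow_mbs=500):
    """Average MB/s over a single sustained write of total_gb gigabytes."""
    fast_part = min(total_gb, cache_gb)
    slow_part = max(total_gb - cache_gb, 0)
    seconds = fast_part * 1000 / fast_mbs + slow_part * 1000 / slow_mbs
    return total_gb * 1000 / seconds

print(round(avg_write_speed(10)))   # 2500 - a short "sprint" sees only the cache
print(round(avg_write_speed(400)))  # 543 - a long "marathon" tells the real story
```

This is why a benchmark's headline number can be almost meaningless for a several-hundred-gigabyte video copy: past the cache, the drive's native rate dominates the average.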

If you look at this table, you clearly see that some SSDs are, in many respects, "write once, read many" types, where copying/saving/writing large amounts of data can drop DOWN to the same level as rotating rust (HDDs):


Compare the performance of the Samsung 970 Pro (which I use) and the Corsair MP600, and it's easy to see the real-life differences.

Note that these measurements are performed internally in desktop computers. If you mount the drives in an external Thunderbolt 3 enclosure, you'll typically reach read limits of 2,700-2,800 MByte/sec and around 2,200-2,300 MByte/sec write for the absolute best SSDs (give or take a bit) - and only when you move really large files (hundreds or thousands of megabytes apiece). If you move lots of small to tiny files, SSDs also slow down, just like HDDs.

These figures are ONLY realistic if you connect the enclosure directly to the machine via a good cable (usually included with the enclosure) and with sufficient cooling (cooling pads are not always included, or are too thin, but a few spare/extra pads bought at the same time don't cost much).

If connected via an external dock, write speed especially will suffer. The dock has a maximum of 40Gbit/sec, and if you also have a hi-res monitor or two connected, as well as networking and other paraphernalia, throughput will suffer - just as a six-lane freeway gets congested when lots of cars are trying to get through at the same time, and speed drops.

Notebooks with four Thunderbolt connections are obviously better suited than notebooks with only two - or, especially limiting, one - particularly if power via USB-C is also being delivered. In addition, it pays to look at typical usage. Example:

If I copy a huge file set from one Thunderbolt enclosure to another, both directly connected to the same notebook, I get around 1 GByte per second on my Asus ExpertBook B9450 (Intel; two ports, so during the copy the power supply cannot be connected) - compared to 2 GByte per second copying the same files from/to the same SSDs/enclosures using the same cables on my Apple MacBook Pro 13 (2018, Intel, four ports). The first has two ports, the latter has four, but that's NOT the limiting factor here. The Asus cannot sustain writing and reading on different Thunderbolt 3 ports at the same time; the Apple can, and it does not matter whether the ports are on different sides or the same side. It's the actual motherboard hardware design - both use Intel chips - that makes the difference. Whether NTFS, ExFAT, or APFS is used has next to zero influence on performance.

Both machines deliver roughly the same high read and write results when using standard disk speed test programs. That's the shocker in this real-life example.

In the big picture, the enclosure and cable seldom have a big influence (most use near-identical chipsets and electronics anyway). What matters is the quality of the actual computer, the environment planned for use (direct connection or via dock), and most of all the actual quality of the SSD - ranging from superb and expensive to downright lousy but cheap.

That's my view. It's not popular amongst resellers of low-cost SSDs ;-)

Regards
 