
altaic

macrumors 6502a
Jan 26, 2004
711
484
For some reason the M2 seems to have problems: Gatekeeper does not respond, and the Mac does not allow kernel access on the external drive, for example when trying to add known developers with reduced security in order to run FOSS or my own applications.



The moment you run everything back on the internal drive, everything just works!
That is unbelievable and otherwise bizarre. Just to start 😒
 

Kr0n05K!ngR

macrumors member
Sep 13, 2023
53
13
LOL. You do not need to lecture me on the basics, I'm an engineer who's been in the industry for decades.

The reason I asked you to clarify is because you started by claiming that Apple is doing a terrible job of supporting people who want to experiment with ML, then began babbling about unrelated topics like "kernel access" in a manner which suggested you were introducing them in support of the original complaint.

Lies. Simply stated, I could not get kernel-mode access on an external Thunderbolt 4 NVMe, and I then explained why I needed it. I was not "claiming that Apple is doing a terrible job of supporting people who want to experiment with ML"...

Please keep your argument factual, and stop the tu quoque fallacies.

Gatekeeper does not manage memory spaces or sandboxing; its function is to check code signatures to determine whether a binary was supplied by a trusted source without alteration.

Great, that's a little more helpful; as stated, I am learning. So why, even when it is disabled, does it still not allow known developers' code to run on macOS booted off a Thunderbolt 4 NVMe SSD? Please provide your input, considering your "years of experience."

Again, I cannot use bro logic, and I cannot say "some guy on a forum told me"...

Nonsense. You don't need VMs to train a neural network, and outside of VMs, you don't need "bare metal" access to the Apple Neural Engine (ANE). You don't even want that, unless your actual project is to reverse engineer the ANE - Apple does not document its instruction set, or any of the other low level details you'd need to use it without letting macOS run the show.

Another tu quoque fallacy, or perhaps a misunderstanding; are you just reacting?

The point of the VM statement related to the cloud and the use of ML in the cloud; you're driving a point that has nothing to do with the argument I was making.

The actual problem:

Running ML and other FOSS within macOS on an external NVMe drive. Please stick to that rather than making other recommendations which, when I try to address them, send you off on a tangent about something unrelated...

My opinion is that if you're sincere, you really need to pay more attention in class.

You do not attend my classes, and you only offer insults... so why should I take you seriously, again?

You say you're an engineer of many years, yet you do not provide me with any recommendations beyond "change your school, change your lecturer, you're in over your head"?

None of this is helpful. If anything, the more I communicate with you here, the more my respect for you drops, and I think I am wasting my time...

No, because you're ignoring one of the most important questions, one which makes it completely unclear why you're complaining about this:

I'm not complaining. Actually read my posts properly; considering all the conclusions you're jumping to, I wonder if you're understanding me at all.

I am asking for others' experiences with booting macOS off an external NVMe drive and running FOSS while also having kernel access (reduced protection for known developers), so that drivers can be run in kernel space.

It does not have to be related to ML, but at least running FOSS on a Mac mini M2 Pro. The FOSS includes VeraCrypt and KeePassXC. Please stick to that, again providing your many years of "engineering experience"...

I was assuming you were being honest about needing to write 2500 TB per month. That's why I thought you were talking about your manager, not your student adviser.

Toy undergrad projects don't need 2500 TBW/month; they should be designed to teach you how to use the tools rather than require expensive hardware. I can believe there's graduate level projects which might legitimately need that, but they'd come with enough grant money to rent or buy appropriate hardware.

An overarching statement that cannot be proven, as you do not know all undergrad projects in all places at all times. Neither do I, but I know my project and what I am being told to achieve...
The statement "Toy undergrad projects don't need 2500 TBW/month" is ridiculous; you're projecting your understanding onto the realities of the world.

Also "they should be designed to teach you how to use the tools rather than require expensive hardware. I can believe there's graduate level projects which might legitimately need that, but they'd come with enough grant money to rent or buy appropriate hardware.”

Equally ridiculous: undergraduate projects, or work at any degree or PhD level (I am doing a PhD...), require the student to learn and research on their own; the university or academic body will only teach them research methodologies, how to use reliable sources, etc.
Also, the Mac mini M2 Pro is cheaper than the other recommendations from you and the small group of friends I have chatted to here, certainly cheaper than the £7k Mac Pro you all seem to be lobbying for me to buy.

What you seem to be referring to are the basics, such as the case of an assignment where you are asked to go away and complete it in relation to the topic being taught.

Final-year projects and PhD work are different, and often require the learner to do their own research in order to draw conclusions from their findings and act upon them. Research and personal learning are important steps.

I'm not sure what university you went to, but this is not how things are done, and it does nothing for your credibility when you make recommendations to me such as "change your uni, change your professor, and you're in over your head..."

I'm guessing that even if you're not actually a troll, whatever support person has the misfortune of talking to you is as confused as we are.

They understand me quite well, actually; from experience I can tell you they are level-headed and willing to help. I have given them examples, screen recordings, and system reports. I am now just waiting to hear back.

You, on the other hand, come across as very unhelpful, rude, and actually more troll-like than me; you bait for arguments, and I doubt you help with anything in your day, the way you're acting. What about my post is so bad that it triggers you to this level?

You want to come at me about not making sense when you say things like this?!

AWS rents you a computer (or a slice of a computer), not an "AI". If you use your rented compute power to train a neural network, you are not training the "VPS provider's AI". You're simply generating data stored in disk space that you're also renting from AWS. AWS doesn't know or care what that data is (they're not supposed to look at it), nor do they own any IP rights to it. If Amazon tried to write their contracts such that they owned all data generated on or uploaded to AWS, nobody would use their services.

Go and read your contracts in detail. Your data is not private, as it's in a public cloud that the company has full control over, and they give you very little view of what goes on inside beyond what you have access to...

Microsoft is one example of a company that outright tells you they are looking at your files and documents to train their AI; what makes Amazon any different?

How do you prove they are not doing it? Scout's honour? What makes them different? A company has a fiduciary responsibility to make money, and digital transformation and the digital twin are a thing, so I ask again: how can you prove Amazon is not doing it?

This is the new world we live in, and personal data protection beyond the cloud is ever-increasing in importance if users and companies are to play on a level field. There continue to be problems with data breaches, and cases where companies held on to data when they should not have (GDPR says one year)...

People usually don't want to use the Apple Neural Engine for training, as it's too specialized for inference. Apple's own documentation suggests training on the GPU.

That said, if you really want to try to target the ANE for training, it's right there, you can use it. Go wild! But please don't pretend it's impossible just because of countless made-up issues which don't really matter.
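For anyone who does want to try the GPU route mentioned above, here is a minimal sketch of a training step on Apple Silicon using PyTorch's Metal (MPS) backend. It assumes a recent PyTorch build with MPS support, and the model and data are made-up placeholders, not anything from this project:

Code:
# Minimal sketch: training on the Apple GPU via PyTorch's MPS backend.
# Assumes PyTorch >= 1.12 on Apple Silicon; model and data are placeholders.
import torch
import torch.nn as nn

device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Dummy batch, just to show the train step running on the GPU.
x = torch.randn(32, 128, device=device)
y = torch.randint(0, 10, (32,), device=device)

for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()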

It does not matter to you; this is what I have and what I wish to do, so if I really do think I should "go wild", then just leave it at that and stop trying to bait me into an argument...

Also, I didn't say you rent the AI; I said VPS. Again, either not understanding or simply lying.


I say again to you and your group of bullies: if I am so far in over my head, offer actual advice beyond ad hominems and attacks that generally have nothing to do with the problem, and stop trying to use tu quoque fallacies to justify your arguments.

I am starting to think you and others who I have argued with may be trolls or simply lobbyists trying to protect a reputation that does not need it.

I love Apple and want to do my work on Apple for the reasons I have stated, and Apple's support engineers are helping me to achieve this.

Aim: run FOSS and other apps on macOS on an external NVMe drive when reduced protection has been configured.

Why is this triggering you?

Please stop posting unless you have actual advice and help to offer beyond your usual rants and insults.

Good day!
 

Kr0n05K!ngR

macrumors member
Sep 13, 2023
53
13
That is unbelievable and otherwise bizarre. Just to start 😒

Try it and get back to me; if you have a solution, I would be happy to hear your findings.

This works on all Macs up to 2018, but does not work on a Mac mini M2 Pro.

Thanks for your input.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,138
1,899
Anchorage, AK
Try it and get back to me; if you have a solution, I would be happy to hear your findings.

This works on all Macs up to 2018, but does not work on a Mac mini M2 Pro.

Thanks for your input.

The more you double down on your inane claims, the less credibility you have. Now you're claiming the university is requiring you to write 2500 TB of data per month, even though I clearly showed, using math (which doesn't lie, mislead, or bend reality), how hard and expensive that would be to achieve even running a machine 24/7/365. You also continuously misuse the term "tu quoque fallacy" in your recent posts.

Tu quoque is a discussion technique that intends to discredit the opponent's argument by attacking the opponent's own personal behavior and actions as being inconsistent with their argument, therefore accusing hypocrisy. This specious reasoning is a special type of ad hominem attack.

What has been happening in this thread is that people have questioned your alleged needs (not your personal behavior) because they defy logic and reason. That is neither a tu quoque fallacy nor an ad hominem attack. That is disputing claims, which is what all civilized debate is based upon. In fact, you may be the one engaging in ad hominem and tu quoque fallacies through your continual belittlement of anyone who dares to question your claims. Here are a couple of examples of your own attacks:

"I say again to you and your group of bullies,..."

"You on the other hand come across as very unhelpful, rude, and actually more troll like then me" (at least this time you admitted to being trollish)

If your school is truly telling you that you'll need to write 2500 TB of data per month, then they're expecting you to perform an entire datacenter's workload on a Mac, which is neither physically nor logistically possible. If they're not and you're estimating the usage you'll need, you're severely overestimating the capabilities of any consumer Mac or PC on the market today.
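To put a number on it, here is the back-of-the-envelope arithmetic (using decimal terabytes and a 30-day month):

Code:
# Rough check: what 2500 TB of writes per month means as a sustained rate.
TB = 10**12                        # decimal terabytes, as SSD endurance is rated
bytes_per_month = 2500 * TB
seconds_per_month = 30 * 24 * 3600

rate_mb_s = bytes_per_month / seconds_per_month / 10**6
print(f"{rate_mb_s:.0f} MB/s sustained, 24/7")   # ~965 MB/s, about 1 GB/s nonstop

That is roughly 1 GB/s of writes around the clock for the entire month, which is the scale of the claim being disputed here.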
 

picpicmac

macrumors 65816
Aug 10, 2023
1,239
1,833
FWIW, I quickly looked up:
1) How much data can be downloaded from the JWST every day. The answer seems to be about 57 GB; times 30, that is about 1.7 TB per month.
2) How large the full read files of a human genome coming off an Illumina machine are. The answer seems to be about 30 GB. If a grad student is studying a population of, say, 500 humans, that's 15 TB, and if said grad student does it all in a month (unlikely but possible), that's 15 TB/month.

Those strike me as the most data-intensive things a student will be doing.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
FWIW, I quickly looked up:
1) How much data can be downloaded from the JWST every day. The answer seems to be about 57 GB; times 30, that is about 1.7 TB per month.
2) How large the full read files of a human genome coming off an Illumina machine are. The answer seems to be about 30 GB. If a grad student is studying a population of, say, 500 humans, that's 15 TB, and if said grad student does it all in a month (unlikely but possible), that's 15 TB/month.

Those strike me as the most data-intensive things a student will be doing.
It doesn't work like that. When you are analyzing data, you rarely have the software that does the thing you want, because everyone's needs are different. Instead, you build an analysis pipeline from primitive tools that typically read the data, transform it in some way, and write it out in a new form.

Some of the tools may be trivial. For example, you may need some information in the file header, but the tool that wrote the file did not have the option to include that information. So you run a new tool that reads a file, rewrites the header, and writes a new file.

Combine a few of those, and a pipeline that downloads 30 GB of data may end up writing 500 GB. And you may have to rerun it a few times, because something went wrong or you want to try different computational parameters. Given the speed of today's computers, writing a few terabytes/day is pretty common in this kind of work.
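As a toy illustration of that write amplification (file names, sizes, and stages below are entirely made up), each stage rewrites the full dataset to a new file, so the total bytes written grow with every stage and every rerun:

Code:
# Hypothetical three-stage pipeline: every stage reads the previous output
# and writes a complete new copy, so writes multiply quickly.
from pathlib import Path

def stage(src: Path, dst: Path, transform):
    with src.open("rb") as fin, dst.open("wb") as fout:
        for chunk in iter(lambda: fin.read(1 << 20), b""):
            fout.write(transform(chunk))

raw = Path("sample.raw")
raw.write_bytes(b"\x00" * (8 << 20))        # small stand-in for a 30 GB download

stage(raw, Path("sample.fixed"), lambda b: b)                      # e.g. rewrite a header
stage(Path("sample.fixed"), Path("sample.filtered"), lambda b: b)  # e.g. filter records

# The one input has now been written to disk three times (raw + two stages);
# rerun the pipeline with different parameters and the writes pile up again.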
 

ahurst

macrumors 6502
Oct 12, 2021
410
815
Aim: run FOSS and other apps on macOS on an external NVMe drive when reduced protection has been configured.
I think this thread might be more helpful to you if you gave a *specific* example of software you’re trying to run and the error you’re getting when trying to run it: if we know exactly what the problem you’re running into looks like, we’ll have a lot more information to go on when offering advice!

Also, as a current graduate student (albeit in a different field) who’s worked in and around enough research labs to know what’s normal and fair to ask of an undergraduate student, my honest assessment based on your posts is that your supervisor’s expectations are highly atypical and unfair to you as a student. If you’re a student in a lab that’s doing research that needs specialized and/or expensive equipment, it’s the lab’s responsibility to provide that for you, full stop. If your supervisor is telling you that you need to pay out of pocket for expensive storage that your project will burn through, they are taking advantage of you: it would be unheard of anywhere I’ve worked for a professor to expect a student to pay out-of-pocket for *any* research expense, let alone large ones like that. For your own sake and protection, I would recommend you talk to someone in the administration for your department and make sure what your supervisor’s asking of you is fully in line with your institution’s rules and regulations.

Your ability to learn and be mentored shouldn’t be limited by your ability or willingness to spend money on things that are your supervisor’s responsibility to provide for you.


EDIT: Wait, I think I finally pieced together what you’re trying to do with the kernel access stuff, let me know if I’ve got it right: you want to install Veracrypt on your Mac for some purpose, but that requires you to install a MacFUSE driver, which requires you to disable SIP to install it. Thus, you’re not trying to use custom kernel drivers for training/running machine learning models, but rather some unrelated software (I think there’s been a lot of confusion in the thread about this).

First off: have you looked into alternatives to Veracrypt, or do you need it for compatibility with something else? As a general rule, software in modern macOS that requires you to disable SIP is almost certainly not software that’s well-designed for macOS (and comes with a range of security risks). I haven’t used MacFUSE in years, but last time I did I remember it making my system far less stable so I ended up uninstalling it and giving up on native ext4 read support.

Alternatively, as others have suggested, you should just be able to boot from your SIP-disabled internal drive and run FUSE that way: as long as your model is set to read/write from external storage it shouldn’t affect the life of the internal SSD. This used to be a common setup for video editors: they’d have a boot drive (for the OS + apps), a media drive (for all the raw video content), and a “scratch drive” for actually creating and editing projects. Is there a reason you can’t use the internal SSD as a boot drive and use your external for your “scratch”, so to speak?
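For concreteness, that split looks something like this; the volume name and paths are hypothetical, just to show where the reads and writes would land:

Code:
# Sketch of the "internal boot drive, external scratch drive" layout.
# "/Volumes/Scratch" is a hypothetical name for the external NVMe volume.
from pathlib import Path

scratch = Path("/Volumes/Scratch")
data_dir = scratch / "datasets"        # heavy reads land on the external drive
ckpt_dir = scratch / "checkpoints"     # heavy writes land on the external drive
ckpt_dir.mkdir(parents=True, exist_ok=True)

# Training/analysis code would read from data_dir and write its results and
# checkpoints into ckpt_dir, so the internal SSD only holds macOS and apps.
(ckpt_dir / "README.txt").write_text("scratch output goes here\n")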
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
That is unbelievable and otherwise bizarre. Just to start 😒
Actually, that's one of the only things he's said which is true-ish (after decoding the weird language and dismissing overblown claims). I tried setting up an external boot disk on a spare M1 computer, just because I hadn't done it before. What I found is that you can quite easily configure reduced security settings, but you cannot load third party kernel extensions, such as the macFUSE KEXT required to make Veracrypt work.

I suspect the problem relates to Apple Silicon's new boot process. The early bootloader software (everything that runs before the macOS kernel starts up) has no drivers for any storage media other than a NOR firmware flash chip on the motherboard and the internal NAND flash SSD. The first piece of software in the boot chain which knows how to talk to external disks is the macOS kernel. The reason for this is attack surface - Apple's trying to keep the early boot firmware as simple as possible in the name of security.

This has some implications about what goes where. The kernel binary and associated files (such as KEXTs) must be located in a special APFS volume on the internal SSD, even when booting from an external drive. When you use Apple's startup disk utility to tell the system to boot from an external, behind the scenes, it's creating a hidden APFS preboot volume on the internal SSD, then copying the kernel (and KEXT collections) from the external macOS installation to the new internal preboot volume. (Note: these preboot volumes aren't exclusive to booting from an external, the kernel and KEXT collections for an internal SSD installation are also stored in a special preboot APFS volume. Apple has infrastructure allowing multiple of these to exist in parallel on the internal drive without interfering with each other.)

Whenever you install a third party KEXT, a special utility has to run to update the KEXT collection, then the system needs to reboot before the KEXT can actually be loaded (since the kernel only scans KEXT collections once, at boot time). My best guess is that they've got a bug in the KEXT collection updater in which it fails to do the right thing and update the internal preboot volume when the boot volume is an external. It might even be writing everything to a (useless) preboot volume on the external. If so it might be possible to fix this manually, but I didn't really want to spend a lot more time on it, so that's where I stopped.
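If anyone wants to check where their own setup stands, here's a rough diagnostic sketch. It assumes only the stock csrutil and kmutil command-line tools, and that the macFUSE bundle identifier contains the string "macfuse" (an assumption, not something verified here):

Code:
# Rough diagnostic: is SIP reduced, and did the macFUSE KEXT actually load?
# Uses only the stock csrutil and kmutil tools shipped with macOS 11+.
import subprocess

def run(cmd):
    return subprocess.run(cmd, capture_output=True, text=True).stdout

print(run(["csrutil", "status"]))           # System Integrity Protection state

loaded = run(["kmutil", "showloaded"])      # kexts present in the booted kernel
print("macFUSE loaded" if "macfuse" in loaded.lower()
      else "macFUSE not loaded (consistent with the behavior described above)")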
 