
crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Unlike WSL 2, WSL 1 is an API layer over the Windows NT kernel; it is not running in a VM, so I guess technically it has "near bare metal performance". It's Cygwin on steroids.

So WSL is basically an inverted WINE? Interesting.
 

BigPotatoLobbyist

macrumors 6502
Dec 25, 2020
301
155
I wouldn't use Windows for school if it weren't for WSL. I'm willing to bet the rate of Windows use has increased and will keep increasing over the next decade among students, if not even more so among nominally *nix-exclusive developers.

IMO Homebrew is meh (seriously, Apple, just make peace with the GPL). I also prefer window management on Windows (by which I mainly mean minimization with previews integrated into the taskbar's icons, though I do enjoy the consistent app menus in macOS). IME Apple updates have also broken more work-related bs than I've experienced in Windows 10.

Sure sucks that the Windows (laptop) hardware gets smoked rn lmao
 

Wizec

macrumors 6502a
Jun 30, 2019
680
778
So, in PowerShell can you install virtually any open-source package, like gcc for example, and string commands together? No.

Now, if I want to write .NET cmdlets and do Windows things, great. It's still not a robust software development environment for anything other than the Windows ecosystem.

The capability is literally baked into the OS:

PowerShell puts bash to shame. String commands together, yup. Now, I hate PowerShell's syntax, but feature-wise, it has no competition.
 
Last edited:

Gerdi

macrumors 6502
Apr 25, 2020
449
301
The capability is literally baked into the OS:

PowerShell puts bash to shame. String commands together, yup. Now, I hate PowerShell's syntax, but feature-wise, it has no competition.

Agreed. PowerShell is my most-used shell on both my Mac and my Windows machine. The really nice thing is that you can pipe objects instead of only strings like in bash.
Take, for instance, "ps | where { $_.CPU -gt 100 } | sort -property CPU"*. Both pipe operators forward objects, so you can directly access object properties in both the where and the sort stages.
Now compare this to bash, where you need to pipe the ps output through something like grep and parse the text.

*This is just for illustrative purposes. Don't try this on a Mac.
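
To spell it out a little more, here is a purely illustrative sketch using the full cmdlet names (which should also work in PowerShell on macOS and Linux, where the ps/sort aliases aren't defined); note that the CPU property is cumulative processor time, not a percentage:

Code:
# Get-Process emits Process objects, so each later stage reads typed
# properties directly instead of parsing columns of text.
Get-Process |
    Where-Object { $_.CPU -gt 100 } |
    Sort-Object -Property CPU -Descending |
    Select-Object -First 5 Name, Id, CPU

# A bash pipeline has to treat everything as text, roughly:
#   ps aux | sort -k3 -nr | head -5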
 
Last edited:

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
For the sake of argument, you think that even 10% of Computer Science students are doing development on Windows machines? My daughter goes to Santa Clara, huge CS department, and it’s 0%, my son is at Colorado State and it’s 0%, as it was for me at CU.
You started off talking about "software professionals". Now it's CS students. Yes, we know, Macs are popular in US Education, always have been - it's one of Apple's niches. That's not "software professionals" or even developers as a whole.

Unlike WSL 2, WSL 1 is an API layer over the Windows NT kernel; it is not running in a VM, so I guess technically it has "near bare metal performance". It's Cygwin on steroids.
It's curious why Microsoft started off down the "API layer" route and then switched to VM. I guess that the compatibility of running a kosher Linux kernel outweighed any performance advantage of an API layer - or maybe there wasn't any advantage? Thing is, I think VMs are part of the furniture - now most serious CPUs have hardware virtualisation support - and not something to be stigmatised...

The M1 Mac Mini seems more credible to me than the Intel Mac Mini which lacks a real GPU.

Yes. Personally, I don't need a bleeding-edge GPU, but do need something better than the lowest-spec Intel integrated graphics on the Intel Mini. The deal-breaker for the M1 Mini, for me, is that it only supports one DisplayPort and one HDMI display, plus the ridiculous cost of upgrading the RAM and SSD to something reasonable. Also - and this is where the M1 hype comes in a bit - while it is incredible that the M1 performance in an ultra-miniature low-power setting is even comparable to an Intel Mac (and the GPU blows the Intel iGPU out of the water), that doesn't mean that it is a good upgrade for an Intel iMac with a half-decent discrete GPU. At the end of the day, the M1 is really designed for the MacBook Air.

I'm hoping that the M1x/M2/whatever will offer a more substantial CPU and GPU performance boost over the existing Intel Mac desktops - and that it will show up in a desktop Mac with more substantial base specs, without Mac Pro-style "luxury car" pricing. If it does I'm prepared to be open-minded on the whole PCIe slots/expandable internal storage thing - but Apple still have to prove that they can and will deliver on that. (For other customers, they're going to have to show how Apple Silicon scales to Mac Pro-like 512G+ ECC RAM and multiple GPU levels...)

I think it is often cheaper to use your hardware than rent it from a cloud provider.
...for hobby work, maybe, but for professional use the cost of a few Linux instances is trivial - cloud providers make their money when production stuff goes to scale... and, as I said, for web/server development it makes sense for your development machine to have the fat pipe to the internet and proper IP.

In the past, it has always been an advantage to have everything local, and I never went to a demo without a local server set up on my laptop (lest someone had forgotten to book WiFi for the conference room) - but these days, if your network goes down then you're basically humped 'cos that's all of your documentation and software repositories offline, hot-and-cold-running WiFi and 4G is de rigueur for meetings (assuming that we're going to have face-to-face meetings again), so times are changing. I suspect even the "Oh Noes!!! Security!!! If I put stuff in the cloud I'll be fired!!!" line is going to change to "Oh Noes!!! If I take stuff off the cloud and store it locally I'll be fired!!!" as companies sign off all their compliance and security duties to cloud providers.

However, Linux is the standard platform for containerized work loads and you don't have to pay Microsoft a licensing fee. I don't see much value in using anything else.
My reason for mentioning MacOS containerisation was that it would provide some benefit for having your development machine running Unix - a lot of web development work wouldn't care if it was running in a Darwin sandbox or a Linux sandbox, and you save the overhead of a VM. Once you end up using a Linux VM anyway that takes another bite out of the advantages of MacOS over Windows. But, as you said, only a minority of web/server developers are ultimately targeting MacOS, using a Linux VM lets you go that extra mile in simulating your target environment and - as I said - efficient VM support is a must-have for any development system these days.

That is true but Apple Silicon is great for running ARM 64 Linux containers and VMs.

Sure, but it sucks at running x86 containers, and while x86 is still dominant, you're likely to have to support both architectures for the foreseeable future. Whichever you choose for your development hardware, you're going to need access to the other - be it physical, cloud-based or (slowly) emulated. If you're developing for ARM64 servers you presumably have access to ARM64 server hardware... I'd love an ARM home server with a bit more clout than a Raspberry Pi, but whether it's worth paying the price of an M1 Mini I don't know.
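
For what it's worth, Docker's buildx tooling will already cross-build and emulate the "other" architecture from either kind of machine - a rough sketch, assuming Docker Desktop's bundled QEMU emulation is set up, and with made-up image names:

Code:
# run an x86-64 image on an ARM64 host (or vice versa) under emulation - slow but workable
docker run --rm --platform linux/amd64 alpine uname -m

# build and push a single manifest that covers both architectures
docker buildx build --platform linux/amd64,linux/arm64 -t example/my-service:dev --push .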

Don't get me wrong - I'm enthusiastic about Apple Silicon, but it's important to get the advantages in perspective.
 
  • Like
Reactions: jeremiah256

ADGrant

macrumors 68000
Mar 26, 2018
1,689
1,059
It's curious why Microsoft started off down the "API layer" route and then switched to VM. I guess that the compatibility of running a kosher Linux kernel outweighed any performance advantage of an API layer - or maybe there wasn't any advantage? Thing is, I think VMs are part of the furniture - now most serious CPUs have hardware virtualisation support - and not something to be stigmatised...
Back when the first versions of NT were released, they shipped with a POSIX compatibility layer. I guess they thought they could just dust that off. One advantage of that approach is RAM: WSL can run on any Windows 10 machine or VM, whereas for WSL 2 you need at least 8GB of RAM and the ability to run the hypervisor (not always possible inside VMs).
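
If you want to check which version a given distro is running under, wsl.exe will tell you - a quick sketch from PowerShell, where "Ubuntu" is just an example distro name and --set-version needs a Windows 10 build that actually ships WSL 2:

Code:
# list installed distros and whether each one runs under WSL 1 or WSL 2
wsl.exe --list --verbose

# convert a distro to the lighter WSL 1 translation layer (use 2 for the VM-based kernel)
wsl.exe --set-version Ubuntu 1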

...for hobby work, maybe, but for professional use the cost of a few Linux instances is trivial - cloud providers make their money when production stuff goes to scale... and, as I said, for web/server development it makes sense for your development machine to have the fat pipe to the internet and proper IP.

Once you need a public IP address that other people can connect to, then yes, you should be running your workload on a public cloud (for security if nothing else). But for initial development, I would rather just run Docker and Kubernetes on my local workstation.
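
The inner loop doesn't need much more than this - a rough sketch, assuming Docker Desktop with its bundled single-node Kubernetes enabled, and with a made-up image name and manifest file:

Code:
# build the service image from the local source tree
docker build -t my-service:dev .

# quick smoke test outside the cluster
docker run --rm -p 8080:8080 my-service:dev

# then deploy the same image to the local Kubernetes cluster
kubectl apply -f deployment.yaml
kubectl get pods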

In the past, it has always been an advantage to have everything local, and I never went to a demo without a local server set up on my laptop (lest someone had forgotten to book WiFi for the conference room) - but these days, if your network goes down then you're basically humped 'cos that's all of your documentation and software repositories offline, hot-and-cold-running WiFi and 4G is de rigueur for meetings (assuming that we're going to have face-to-face meetings again), so times are changing. I suspect even the "Oh Noes!!! Security!!! If I put stuff in the cloud I'll be fired!!!" line is going to change to "Oh Noes!!! If I take stuff off the cloud and store it locally I'll be fired!!!" as companies sign off all their compliance and security duties to cloud providers.

There is no doubt that the cloud providers are better at InfoSec than most internal IT depts.

My reason for mentioning MacOS containerisation was that it would provide some benefit for having your development machine running Unix - a lot of web development work wouldn't care if it was running in a Darwin sandbox or a Linux sandbox, and you save the overhead of a VM. Once you end up using a Linux VM anyway that takes another bite out of the advantages of MacOS over Windows. But, as you said, only a minority of web/server developers are ultimately targeting MacOS, using a Linux VM lets you go that extra mile in simulating your target environment and - as I said - efficient VM support is a must-have for any development system these days.

I believe that currently the optimal development target is a Linux Docker container running on Kubernetes. I can do that on a Mac just as easily as I can on Linux and still use software like MS Office. Even .NET Core development is well supported on macOS.
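
For the .NET Core case, the whole loop runs from a Mac - a sketch only, with invented project and image names, and assuming you add a Dockerfile based on the official mcr.microsoft.com/dotnet images to produce the Linux container:

Code:
# scaffold and run a web API locally with the cross-platform SDK
dotnet new webapi -o MyApi
dotnet run --project MyApi

# with a Dockerfile in place, build the Linux image that the cluster will actually run
docker build -t myapi:dev ./MyApi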

Sure, but it sucks at running x86 containers, and while x86 is still dominant, you're likely to have to support both architectures for the foreseeable future. Whichever you choose for your development hardware, you're going to need access to the other - be it physical, cloud-based or (slowly) emulated. If you're developing for ARM64 servers you presumably have access to ARM64 server hardware... I'd love an ARM home server with a bit more clout than a Raspberry Pi, but whether it's worth paying the price of an M1 Mini I don't know.

Don't get me wrong - I'm enthusiastic about Apple Silicon, but it's important to get the advantages in perspective.

I am not developing for ARM servers currently, but AWS already offers ARM64 VMs and Azure and GCP will be following. For now, though, x86 Linux is still the main cloud platform, which is why I bought a 2020 27" iMac last year.
 

fwilers

macrumors member
Feb 1, 2017
53
50
Washington
Because of Powershell? Ok. You’re right. <sarcasm>Linux/Mac/Unix users are pining for that fantastic utility</sarcasm>.

somebody’s credibility is shot though

So, in PowerShell can you install virtually any open-source package, like gcc for example, and string commands together? No.

You obviously don't know the functionality of PowerShell, nor its capabilities. Perhaps you should take some classes and you will change your mind. All devs I know use it extensively. I use it extensively because it's so flexible. If you don't like it, that's fine, but if you don't know what it can do and say you don't like it, then that's just plain stupidity.


"Oh Noes!!! Security!!! If I put stuff in the cloud I'll be fired!!!" line is going to change to "Oh Noes!!! if I take stuff off the cloud and store it locally I'll be fired!!!"

This is so true and made me laugh. It is already true in our business for end users and devs. The hardest part is tracking in case something does happen.
All for naught, as anyone can take hundreds of screenshots or video with their phone, or use the built-in screen capture on the computer. Sure it's slow, but it doesn't stop the bad ones.
Security-wise, no one solution is absolute, which has been proven, so it doesn't make any difference where it lives.
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
Security-wise, no one solution is absolute, which has been proven, so it doesn't make any difference where it lives.

In the corporate world, I cynically suspect that it's not so much about preventing actual security breaches as making sure that, when the auditors arrive, all of the appropriate compliance logs and paper trails are in place to prove that all of your security theatre is present and correct... I'm sure cloud providers can provide really impressive, 100% tick-list compliant, reports & lists of their security accreditations... and if AWS or Azure get hacked then that's their (insurers') problem.

If someone leaves a USB stick on a train or gets their laptop stolen, then chances are it's sitting safely in a lost property office, or smashed up and deep-sixed in the nearest river, but you don't know and what can you do...? whereas you can change their cloud password in a jiffy, tick box, issue resolved as per item 134b on the Risk Register, between "office struck by asteroid" (action: procure office space on alternative continent/planet) and "uniquely talented chief software engineer falls under a train" (action: recruit new uniquely talented engineer). That may not be the real risk (vs. someone bribing an employee to take some screenshots) but it's what managers have nightmares about.
 

fwilers

macrumors member
Feb 1, 2017
53
50
Washington
In the corporate world, I cynically suspect that it's not so much about preventing actual security breaches as making sure that, when the auditors arrive, all of the appropriate compliance logs and paper trails are in place to prove that all of your security theatre is present and correct... I'm sure cloud providers can provide really impressive, 100% tick-list compliant, reports & lists of their security accreditations... and if AWS or Azure get hacked then that's their (insurers') problem.

If someone leaves a USB stick on a train or gets their laptop stolen, then chances are it's sitting safely in a lost property office, or smashed up and deep-sixed in the nearest river, but you don't know and what can you do...? whereas you can change their cloud password in a jiffy, tick box, issue resolved as per item 134b on the Risk Register, between "office struck by asteroid" (action: procure office space on alternative continent/planet) and "uniquely talented chief software engineer falls under a train" (action: recruit new uniquely talented engineer). That may not be the real risk (vs. someone bribing an employee to take some screenshots) but it's what managers have nightmares about.
It is about preventing security breaches. The largest threat is theft of IP. Security-wise, that's why USB is disabled and business laptops have multiple passwords for UEFI, TPM, Anti-Theft, Security Chip, Encryption, 2FA, etc. No IP can be downloaded to a laptop, only viewed for a predetermined amount of time. All other documents are watermarked and password protected. All connections are recorded by SIM.
 
  • Like
Reactions: Unregistered 4U

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Even though Alyssa is extremely talented, I remain skeptical. M1 is not Bifrost... and unless Apple GPU is autonomous (with very little driver control), there will be a lot of things to reverse engineer and discover.

At the moment Alyssa is poking around in the command buffers containing the shaders. As far as I see it, there is not even a complete disassembler for the shaders yet, and only a vague idea of how these shaders are scheduled. That's very early work - in particular compared to how long it took for Freedreno and Panfrost to become even somewhat stable.

But hey, nothing is impossible per se :)

Really good progress on the graphics driver:


Still early days but that’s big!
 

CMMChris

macrumors 6502a
Oct 28, 2019
850
794
Germany (Bavaria)
There has never been no hope. It's not the first time Linux devs have written drivers for devices that are not documented. Sure, it will take time. But once the foundation is done, things will get easier, even for future Apple Silicon.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,679
There has never been no hope. It's not the first time Linux devs have written drivers for devices that are not documented. Sure, it will take time. But once the foundation is done, things will get easier, even for future Apple Silicon.

Well, I mean "more hope". The thing is, I never doubted that this stuff could be reverse engineered, only whether it could be made into something more than an academic/hacking exercise. But it indeed seems that these GPUs are straightforward enough that a driver does not have to take care of too many corner cases (which was my main concern).
 

CMMChris

macrumors 6502a
Oct 28, 2019
850
794
Germany (Bavaria)
Yeah, but that worry was and is unjustified looking at the past track record of GPU development on Linux. In addition, Apple GPUs being derived from PowerVR GPUs (yes, they are very different though) makes things a tiny bit easier. Looking at how far things have come already, I guess we could see something usable by the end of the year. From then on, it will only get better.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,679
In addition, Apple GPUs being derived from PowerVR GPUs (yes, they are very different though) makes things a tiny bit easier.

Why would it make it easier? Besides, the only things Apple took from PowerVR are some fixed-function hardware (TBDR stuff, compression, etc.). Their shader pipeline is 100% custom and couldn't be more different from PowerVR's.

Compare the PowerVR 6 documentation (the shading unit overview starts at page 9) and the reverse-engineered Apple GPU shader architecture (https://dougallj.github.io/applegpu/docs.html).
 

Lowhangers

macrumors regular
Nov 26, 2017
195
305
What is SO great about the M1 macs?
They offer less than their Windows counterparts: no real gaming support, no native support for other OSes, no touch, and VERY VERY limited app compatibility. Sure, it's faster than an 11th-gen i7, but AMD processors offer greater performance and around the same battery life as the M1.

The AMD Ryzen 7 4800U offers faster performance than the M1 Air/Pro, and there are laptops with that processor that are cheaper than the M1 Air and come with upgradable SSD and RAM.

Now, the SSD swap issue that Apple is quiet on is very serious IMO. I have an Intel 16" MBP that I got around January 2020 and use very heavily every day, and I have written about 7 TBW. The fact that I see people writing over 15 TBW on M1 Macs they got 5-6 months ago is very concerning.

All I am saying is: look beyond the M1 hype and see that you are getting a computer with fewer features, no upgradeability and limited third-party software. I say this because I see some people say the M1 Air is the best deal for an ultrabook, and I strongly disagree with that claim.
The reason the M1 Macs seem so good is that the previous Macs were utter garbage in terms of specs and price-to-performance ratio.
Ever wonder why Rosetta 2 runs Intel software better on M1 Macs than it ran on Intel Macs? It's because the Intel chips Apple replaced were not at all performant.
The Air the M1 replaced had a quad-core i7 (a weak one at that); the Pro had an 8th-gen i5/i7.

For $920 on the Windows side you can get an HP ENVY x360 with an FHD screen (1080p), a Ryzen 7 4700U, 16GB RAM, a 256GB SSD (user-upgradable) and a 1000-nit display with touch. Click here to see the HP Envy configure page. Yes, it comes with Windows, but Windows can do a LOT more than macOS ever can.
The argument that macOS is better than Windows is no longer true, as Windows vastly outperforms macOS in almost every way. It's now even more obvious with the M1 Macs.

I know I can't tell people what to buy or not, but people have been making extraordinary claims on YouTube, Twitter and other social media forums that M1 Macs are the future, outperform most laptops and are the best value out there, and I just wanted to clarify some points.

EDIT:
Ok, I've been researching the M1 more. It only consumes 15 watts max for the CPU alone. That's very impressive.
The Ryzen 7 4700U actually consumes up to 40 watts, not the 15 watts its spec sheet states, although after a while it does settle down to around 15 watts.

Whereas the M1 tops out at around 15 watts for the CPU alone. NOW that is impressive. Can't wait for future Apple Silicon now.

source for watt info: https://www.anandtech.com/show/16084/intel-tiger-lake-review-deep-dive-core-11th-gen/7
All I can tell you is that my M1 Mac mini is the best Mac I have ever owned. I love it, it does everything I need and it never seems to get hot. I don’t think I’ve ever actually heard the fan come on, even when playing World of Warcraft for an hour.

I don't regard it as hype at all; these machines are fantastic. I would never go back to an Intel Mac now. Your mileage may vary, but I think you're underestimating the M1, and I think you are going to be shocked as the years go by and we move from one generation to the next. We're only at the beginning of this.
 
  • Like
Reactions: thedocbwarren

thedocbwarren

macrumors 6502
Nov 10, 2017
430
378
San Francisco, CA
All I can tell you is that my M1 Mac mini is the best Mac I have ever owned. I love it, it does everything I need and it never seems to get hot. I don’t think I’ve ever actually heard the fan come on, even when playing World of Warcraft for an hour.

I don't regard it as hype at all; these machines are fantastic. I would never go back to an Intel Mac now. Your mileage may vary, but I think you're underestimating the M1, and I think you are going to be shocked as the years go by and we move from one generation to the next. We're only at the beginning of this.
Same for my MacBook Pro 13. These are amazing and I now never touch the 16. The one recent time I did, I was shocked at how hot, slow, and cumbersome it is.
 