Unlike WSL 2, WSL is an API layer over the Windows NT kernel; it is not running in a VM, so I guess technically it has "near bare metal performance". It's Cygwin on steroids.
WSL is basically an inverted WINE? Interesting?
So, in PowerShell can you install virtually any open-source package, like gcc for example, and string commands together? No.
Now if I want to write .NET cmdlets and do Windows things, great. It's still not a robust software development environment for anything other than the Windows ecosystem.
The capability is literally baked into the OS:
How to install GCC and GDB for WSL (Windows Subsystem for Linux)?
I need gcc to compile a C program and am unable to install gcc on WSL. I tried sudo apt-get install gcc but it doesn't work. This is the error: hack@DESKTOP-VMQA3JB:~$ sudo apt-get install gcc Reading p...
stackoverflow.com
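For anyone hitting that error: the usual cause is simply that the apt package lists haven't been downloaded yet on a fresh install. A minimal sketch of the usual fix, assuming a stock Ubuntu distro under WSL (package names are the standard Ubuntu ones):

    # refresh the package lists first - on a fresh WSL Ubuntu image they are empty,
    # which is why "sudo apt-get install gcc" can't find the package
    sudo apt update

    # gcc, g++, make and the C library headers, plus gdb for debugging
    sudo apt install -y build-essential gdb

    # sanity check
    gcc --version && gdb --version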
PowerShell puts bash to shame. String commands together, yup. Now I hate PowerShell's syntax, but feature-wise it has no competition.
You started off talking about "software professionals". Now it's CS students. Yes, we know, Macs are popular in US education, always have been - it's one of Apple's niches. That's not "software professionals" or even developers as a whole.
For the sake of argument, do you think that even 10% of Computer Science students are doing development on Windows machines? My daughter goes to Santa Clara, huge CS department, and it's 0%; my son is at Colorado State and it's 0%, as it was for me at CU.
It's curious why Microsoft started off down the "API layer" route and then switched to VM. I guess that the compatibility of running a kosher Linux kernel outweighed any performance advantage of an API layer - or maybe there wasn't any advantage? Thing is, I think VMs are part of the furniture - now most serious CPUs have hardware virtualisation support - and not something to be stigmatised...
The M1 Mac Mini seems more credible to me than the Intel Mac Mini which lacks a real GPU.
I think it is often cheaper to use your hardware than rent it from a cloud provider.
However, Linux is the standard platform for containerized workloads and you don't have to pay Microsoft a licensing fee. I don't see much value in using anything else.
That is true, but Apple Silicon is great for running ARM64 Linux containers and VMs.
Back when the first versions of NT were released, they shipped with a POSIX compatibility layer. I guess they thought they could just dust that off. One advantage to that approach is RAM: WSL can run on any Windows 10 machine or VM, while for WSL 2 you need at least 8GB of RAM and the ability to run the hypervisor (not always possible in VMs).
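For what it's worth, the two coexist and a distro can be switched between them. A rough sketch of the relevant commands, run from PowerShell or cmd on the Windows side and assuming a distro registered under the name "Ubuntu":

    # show installed distros and which WSL version each one uses
    wsl --list --verbose

    # convert a distro to WSL 1 (syscall-translation layer) or WSL 2 (lightweight VM)
    wsl --set-version Ubuntu 1
    wsl --set-version Ubuntu 2

    # make newly installed distros default to WSL 2
    wsl --set-default-version 2

Converting back and forth is handy for exactly the RAM and hypervisor constraints mentioned above.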
...for hobby work, maybe, but for professional use the cost of a few Linux instances is trivial - cloud providers make their money when production stuff goes to scale... and, as I said, for web/server development it makes sense for your development machine to have the fat pipe to the internet and proper IP.
In the past, it has always been an advantage to have everything local, and I never went to a demo without a local server set up on my laptop (lest someone had forgotten to book WiFi for the conference room) - but these days, if your network goes down then you're basically humped 'cos that's all of your documentation and software repositories offline, hot-and-cold-running WiFi and 4G is de rigueur for meetings (assuming that we're going to have face-to-face meetings again), so times are changing. I suspect even the "Oh Noes!!! Security!!! If I put stuff in the cloud I'll be fired!!!" line is going to change to "Oh Noes!!! if I take stuff off the cloud and store it locally I'll be fired!!!" as companies sign off all their compliance and security duties to cloud providers.
My reason for mentioning MacOS containerisation was that it would provide some benefit for having your development machine running Unix - a lot of web development work wouldn't care if it was running in a Darwin sandbox or a Linux sandbox, and you save the overhead of a VM. Once you end up using a Linux VM anyway that takes another bite out of the advantages of MacOS over Windows. But, as you said, only a minority of web/server developers are ultimately targeting MacOS, using a Linux VM lets you go that extra mile in simulating your target environment and - as I said - efficient VM support is a must-have for any development system these days.
Sure, but it sucks at running x86 containers, and while x86 is still dominant, you're likely to have to support both architectures for the foreseeable future. Whichever you choose for your development hardware you're going to need access to the other - be it physical, cloud-based or (slowly) emulated; see the sketch below. If you're developing for ARM64 servers you presumably have access to ARM64 server hardware... I'd love an ARM home server with a bit more clout than a Raspberry Pi, but whether it's worth paying the price of an M1 Mini I don't know.
Don't get me wrong - I'm enthusiastic about Apple Silicon, but it's important to get the advantages in perspective.
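To make the both-architectures point concrete, here's a minimal sketch using Docker Desktop on an Apple Silicon machine (the image and registry names are just placeholders):

    # native arm64 container - runs at full speed on Apple Silicon
    docker run --rm --platform linux/arm64 ubuntu uname -m   # prints aarch64

    # x86-64 container - still works, but through emulation, so noticeably slower
    docker run --rm --platform linux/amd64 ubuntu uname -m   # prints x86_64

    # build and publish one image for both architectures
    docker buildx build --platform linux/amd64,linux/arm64 -t registry.example.com/myapp:latest --push .

Whichever way round you go (x86 development box targeting ARM64 servers, or the reverse), the emulated architecture is the slow path, so you still want real hardware or a cloud instance of the other kind for anything performance-sensitive.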
Because of PowerShell? OK. You're right. <sarcasm>Linux/Mac/Unix users are pining for that fantastic utility</sarcasm>.
somebody’s credibility is shot though
"Oh Noes!!! Security!!! If I put stuff in the cloud I'll be fired!!!" line is going to change to "Oh Noes!!! if I take stuff off the cloud and store it locally I'll be fired!!!"
Security-wise, no one solution is absolute, as has been proven, so it doesn't make any difference where it lives.
It is about preventing security breaches. The largest threat is theft of IP. Security-wise, that's why USB is disabled and business laptops have multiple passwords for UEFI, TPM, Anti-Theft, Security Chip, Encryption, 2FA, etc. No IP can be downloaded to the laptop, only viewed for a predetermined amount of time. All other documents are watermarked and password protected. All connections are recorded by SIM.
In the corporate world, I cynically suspect that it's not so much about preventing actual security breaches as making sure that, when the auditors arrive, all of the appropriate compliance logs and paper trails are in place to prove that all of your security theatre is present and correct... I'm sure cloud providers can provide really impressive, 100% tick-list compliant reports & lists of their security accreditations... and if AWS or Azure get hacked then that's their (insurers') problem.
If someone leaves a USB stick on a train or gets their laptop stolen, then chances are it's sitting safely in a lost property office, or smashed up and deep-sixed in the nearest river, but you don't know and what can you do? Whereas you can change their cloud password in a jiffy: tick box, issue resolved as per item 134b on the Risk Register, between "office struck by asteroid" (action: procure office space on alternative continent/planet) and "uniquely talented chief software engineer falls under a train" (action: recruit new uniquely talented engineer). That may not be the real risk (vs. someone bribing an employee to take some screenshots) but it's what managers have nightmares about.
Even though Alyssa is extremely talented, I remain skeptical. M1 is not Bifrost... and unless the Apple GPU is autonomous (with very little driver control), there will be a lot of things to reverse engineer and discover.
At the moment Alyssa is poking around in the command buffers containing the shaders. As far as I can see, there is not even a complete disassembler for the shaders yet, and only a vague idea of how these shaders are scheduled. That's very early work - in particular compared to how long it took for Freedreno and Panfrost to become even somewhat stable.
But hey, nothing is impossible per se
Really good progress on the graphics driver:
Dissecting the Apple M1 GPU, part III
rosenzweig.io
Still early days but that’s big!
It was never hopeless. It's not the first time Linux devs have written drivers for devices that are not documented. Sure, it will take time. But once the foundation is done, things will get easier, even for future Apple Silicon.
In addition, Apple GPUs being derived from PowerVR GPUs (yes, they are very different though) makes things a tiny bit easier.
All I can tell you is that my M1 Mac mini is the best Mac I have ever owned. I love it, it does everything I need and it never seems to get hot. I don't think I've ever actually heard the fan come on, even when playing World of Warcraft for an hour.
What is SO great about the M1 Macs?
They offer less than their Windows counterparts: no real gaming support, no native support for other OSes, no touch, and VERY VERY limited app compatibility. Sure, it's faster than an 11th-gen i7, but AMD processors offer greater performance and around the same battery life as the M1.
The AMD Ryzen 7 4800U offers faster performance than an M1 Air/Pro, and there are laptops with that processor that are cheaper than the M1 Air and have upgradable SSD and RAM.
Now the SSD swap issue, which Apple is quiet on, is very serious IMO. I have an Intel 16" MBP that I got around January 2020 and use very heavily every day, and I have written about 7 TBW. The fact that I see people writing over 15 TBW on their M1 Macs that they got 5-6 months ago is very concerning.
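For anyone who wants to check their own machine rather than take screenshots on faith, the TBW figures being passed around come from the drive's NVMe SMART data. A rough sketch for macOS, assuming smartmontools installed via Homebrew (the disk identifier may differ on your machine):

    # install smartmontools, which provides the smartctl utility
    brew install smartmontools

    # dump SMART/health data for the internal NVMe drive;
    # "Data Units Written" is in units of 512,000 bytes per the NVMe spec,
    # and smartctl also shows the converted TB figure in brackets
    sudo smartctl -a /dev/disk0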
All I am saying is: look beyond the M1 hype and see that you are getting a computer with fewer features, no upgradeability and limited third-party software. I say this because I see some people say the M1 Air is the best deal for an Ultrabook, and I strongly disagree with that claim.
The reason the M1 Macs seem so good is that the previous Macs were utter garbage in terms of specs and price-to-performance ratio.
Ever wonder why Rosetta 2 runs Intel software better on M1 Macs than on Intel Macs? It's because the Intel chips Apple replaced were not at all performant.
The Air the M1 replaced had a quad-core i7, and a weak one at that; the Pro the M1 replaced had an 8th-gen i5/i7.
For $920 on the Windows side you can get an HP ENVY x360 with an FHD (1080p) 1000-nit touch display, a Ryzen 7 4700U, 16GB RAM and a 256GB SSD (user-upgradable). Click here to see the HP Envy configure page. Yes, it comes with Windows, but Windows can do a LOT more than macOS ever can.
The argument that macOS is better than Windows is no longer true, as Windows vastly outperforms macOS in almost every way. It's now even more obvious with the M1 Macs.
I know I can't tell people what to buy or not, but people have been making extraordinary claims on YouTube, Twitter and other social media forums that M1 Macs are the future, outperform most laptops and are the best value out there, and I just wanted to clarify some points.
EDIT:
OK, I've been researching the M1 more. It only consumes 15 watts max for the CPU alone. That's very impressive.
The Ryzen 4700U consumes up to 40 watts max, not the 15 watts the spec sheet states; it only comes down to 15 watts after a while.
Whereas the M1 tops out at 15 watts for the CPU alone. NOW that is impressive. Can't wait for future Apple Silicon now.
source for watt info: https://www.anandtech.com/show/16084/intel-tiger-lake-review-deep-dive-core-11th-gen/7
Same for my MacBook Pro 13. These are amazing and I now never touch the 16. The one recent time I did, I was shocked at how hot, slow and cumbersome it is.
I don't regard it as hype at all; these machines are fantastic. I would never go back to an Intel Mac now. Your mileage may vary, but I think you're underestimating the M1, and I think you are going to be shocked as the years go by and we move from one generation to the next. We're only at the beginning of this.