It's really pretty evident on the Apple side.
1. Cost - Apple products are exceedingly expensive, and given depreciation rules it makes zero sense. Using the straight-line method, companies write off products over 5 years. So do they spend $2,500 on a MBP for X employees, or $1,000 on a Dell Latitude for X employees (every 5 years)? Giving 100 employees a new laptop will cost a company $150,000 more if they go with Apple (see the sketch after this list). Because of depreciation rules, carrying fully depreciated assets actually impacts the bottom line, so companies tend to replace assets once their useful life has been expended.
2. Hardware - What servers does Apple currently sell? It's hard for a company to "replace their servers and everything with Apple products" when Apple doesn't sell servers.
3. Software - What software does Apple offer to manage networks, group policies, and packaging updates? To my knowledge they don't have much, if any. I know they still have an enterprise page, but I don't know how many products are available to help manage the infrastructure.
4. Compatibility - Much, if not most, enterprise software (that is client-based) runs on Windows and only Windows. It makes even less sense to buy every employee a Mac when it fails to run the required software. Oh, install Windows, you say? Sure, but that $2,500 per-person cost will now need at least another $200 for a Windows license.
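For what it's worth, here's the back-of-the-envelope math from points 1 and 4 as a quick Python sketch - the $2,500 / $1,000 / $200 figures are the round numbers assumed above, not actual quotes:

```python
# Fleet cost comparison using the round numbers above (illustrative, not quotes).
EMPLOYEES = 100
MACBOOK_PRO = 2_500      # assumed per-unit price
DELL_LATITUDE = 1_000    # assumed per-unit price
WINDOWS_LICENSE = 200    # assumed cost if the Mac still needs Windows (point 4)

apple_fleet = EMPLOYEES * MACBOOK_PRO
dell_fleet = EMPLOYEES * DELL_LATITUDE

print(f"Apple fleet: ${apple_fleet:,}")               # $250,000
print(f"Dell fleet:  ${dell_fleet:,}")                # $100,000
print(f"Premium:     ${apple_fleet - dell_fleet:,}")  # $150,000 per refresh cycle

# If the Mac still needs Windows for client software, each seat costs more yet:
print(f"Mac + Windows license, per seat: ${MACBOOK_PRO + WINDOWS_LICENSE:,}")  # $2,700
```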
As for Ubuntu, let's drop that specific distro and talk about Linux in general: the smallest bit of googling shows that Linux servers are by far more common and preferred than Windows.
Reason #1 doesn't fly. Companies do not use straight-line, 5-year depreciation for desktop/laptop machines. Section 179 of the IRS Code allows the vast majority of computing equipment to be fully deducted in the year of purchase, so that's what "everyone" does. This was even true in the 1990s, when I was an IT manager. Since then, the Section 179 limit has been bumped higher and higher. Back then it was something like $10,000 per item; I think the limit is now closer to $1 million.
In theory, factory machinery/office equipment is an investment of capital. You're using money you have (or borrow) in order to make more money, and the proceeds (profits) of that investment are taxed.
The justification behind the depreciation deduction is that equipment and buildings wear out/become obsolete. Therefore, your initial investment declines in value, rather than holds steady or grows. Depreciation allows businesses to deduct that loss of value as an expense.
Classic depreciation methods like straight-line assume a business will deduct each year's (estimated) loss of value in the year it occurs, until the full value has been deducted. It's expected that the depreciation method used (and there are many) accurately predicts when the equipment becomes totally useless (is no longer an asset).
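To make straight-line concrete, here's a minimal sketch using the $2,500 laptop and 5-year life from above (assuming zero salvage value, which is my simplification):

```python
def straight_line(cost: float, salvage: float, useful_life_years: int) -> list[float]:
    """Equal depreciation expense each year: (cost - salvage) / useful life."""
    annual = (cost - salvage) / useful_life_years
    return [annual] * useful_life_years

# A $2,500 laptop, assumed zero salvage value, 5-year useful life:
schedule = straight_line(cost=2_500, salvage=0, useful_life_years=5)

book_value = 2_500
for year, expense in enumerate(schedule, start=1):
    book_value -= expense
    print(f"Year {year}: expense ${expense:,.0f}, book value ${book_value:,.0f}")
# Year 1: expense $500, book value $2,000 ... Year 5: book value $0
```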
The only reason there's a "cost" to carrying fully-depreciated items on a balance sheet is that there's no depreciation deduction left and the item can no longer be listed as an asset (it becomes a hidden asset). A new bit of equipment adds value to the balance sheet and triggers new depreciation deductions, and sometimes a company needs one or both.
As long as the cost of using old equipment is reasonable, keeping it is profitable - you're getting more life out of your initial investment than expected. If/when that older equipment imposes higher costs in lost productivity or maintenance/repair requirements... it's time to retire it.
Due to Section 179, computers are "expensed" - deducted when purchased. Everyone knows they'll be obsolete soon, so why pretend otherwise? They're closer to an ordinary and necessary cost of doing business than a long-term capital investment. As such, plain old profit-and-loss accounting applies: How much does it cost to buy? How much does it cost to maintain/support? How does it impact productivity?
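The timing difference is the whole point, so here's a sketch comparing the two schedules side by side (same $2,500 laptop; grossly simplified, not tax advice):

```python
# Deduction timing: 5-year straight-line vs. Section 179 expensing.
COST, LIFE = 2_500, 5

straight_line = [COST / LIFE] * LIFE       # $500/year for 5 years
section_179 = [COST] + [0] * (LIFE - 1)    # full $2,500 in year 1

for year in range(LIFE):
    print(f"Year {year + 1}: straight-line ${straight_line[year]:>5,.0f} | "
          f"Section 179 ${section_179[year]:>5,.0f}")
# Total deduction is identical ($2,500 either way); Section 179 just takes
# it all up front, which matches how fast computers actually lose value.
```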
The basic sales pitch from Apple (and IBM, which has been deploying and supporting Macs at scale for a fair number of years) is total cost of ownership - the substantially lower cost of IT support for Macs (and iPads and iPhones), which more than offsets the higher purchase price of the hardware. If you're willing to pay more up front, you'll reap the benefits in subsequent years.
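That pitch boils down to a simple comparison of purchase price plus support cost over the refresh cycle. A sketch with made-up numbers - the $200/$600 per year support figures and 4-year cycle below are pure placeholders, not anyone's published data:

```python
# TCO = purchase price + support cost over the refresh cycle.
# Support figures and cycle length are placeholders, not published data.

def tco(purchase: float, support_per_year: float, years: int) -> float:
    return purchase + support_per_year * years

YEARS = 4  # assumed refresh cycle

mac = tco(purchase=2_500, support_per_year=200, years=YEARS)  # hypothetical support cost
pc = tco(purchase=1_000, support_per_year=600, years=YEARS)   # hypothetical support cost

print(f"Mac TCO over {YEARS} years: ${mac:,.0f}")  # $3,300
print(f"PC TCO over {YEARS} years:  ${pc:,.0f}")   # $3,400
# With these assumed numbers, the Mac's higher sticker price is offset by
# lower support costs - the pitch stands or falls on the real numbers.
```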
Now, when it comes to servers... Apple is most definitely a seller of end-user hardware. Everything about Apple is the end-user experience. There's little or nothing the presence of an Apple-branded server in a data center can do to affect end-user experience. Since Apple justifies its higher prices (and profit margins) on end-user experience... server farm hardware is a no-win proposition for Apple.
Arguably, the M-series SoCs could change that. However, a data center server does not need all the capabilities Apple builds into the M-series (and A-series). The performance advantage of Apple Silicon is in end-user devices - crafting the full hardware and software experience. Any chip-maker can design a server chipset with as many CISC and/or RISC cores as required, any hardware maker can package those chipsets into server farm-capable systems that meet whatever durability/serviceability demands the server farm operators make. That is not Apple's business at all.