
iHorseHead

macrumors 68000
Original poster
Jan 1, 2021
1,580
1,999
Hey,
for real, as someone who works in IT support for a company that provides IT support to many other companies (which is why they're basically swimming in money while paying their staff average wages), I've noticed only a few companies using Macs. Most are using Windows and Windows Server.
Why don't companies replace their servers and everything with Apple products or Ubuntu server? I don't get it.

This might be different in the US and North America, but generally speaking everyone I know is stuck with Microsoft products, including Microsoft Teams, which is horrible.

I see a healthy mix between iOS and Android, but the majority of companies are still using Windows / Windows Server. I don't think I've ever had the honour of using macOS Server at work.
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
Why don't companies replace their servers and everything with Apple products or Ubuntu server? I don't get it.
It's pretty evident on the Apple side.
1. Cost - Apple products are exceedingly expensive, and given depreciation rules it makes little sense. Using the straight-line method, companies will only write off products over 5 years. So do they spend $2,500 on a MBP for X employees, or $1,000 on a Dell Latitude for X employees (every 5 years)? Giving 100 employees a new laptop will cost a company $150,000 more if they go with Apple (rough math sketched after this list). Because of depreciation rules, carrying fully depreciated assets actually impacts the bottom line, so companies tend to replace assets once their useful life has been expended.

2. Hardware - What servers does Apple currently sell? Hard to have a company "replace their servers and everything with Apple products" when Apple doesn't sell servers.

3. Software - What software does Apple provide to manage networks, group policies, and package updates? To my knowledge they don't have much, if any. I know they still have an enterprise page, but I don't know how many products are available to help manage the infrastructure.

4. Compatibility - Much, if not most, enterprise software (that is client-based) runs on Windows and only Windows. It makes little sense to buy every employee a Mac when it can't run the required software. Oh, install Windows, you say? Sure, but then that $2,500 per-person cost will need at least another $200 for a Windows license.
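
For the cost point above, here's a rough back-of-the-envelope sketch. Every figure (the $2,500 MacBook Pro, the $1,000 Latitude, 100 employees, 5-year straight-line write-off) is just the hypothetical number from the post, not a real quote:

```python
# Back-of-the-envelope fleet cost comparison. All figures are the
# hypothetical numbers from the post above, not real pricing.
MAC_PRICE = 2_500       # assumed price per MacBook Pro
PC_PRICE = 1_000        # assumed price per Dell Latitude
EMPLOYEES = 100
WRITE_OFF_YEARS = 5     # straight-line useful life assumed in the post

mac_fleet = MAC_PRICE * EMPLOYEES
pc_fleet = PC_PRICE * EMPLOYEES

print(f"Mac fleet:  ${mac_fleet:,} (${mac_fleet / WRITE_OFF_YEARS:,.0f}/yr straight-line)")
print(f"PC fleet:   ${pc_fleet:,} (${pc_fleet / WRITE_OFF_YEARS:,.0f}/yr straight-line)")
print(f"Difference: ${mac_fleet - pc_fleet:,} per refresh cycle")
# -> Difference: $150,000 per refresh cycle
```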

As for Ubuntu, well, let's drop that specific distro and talk about Linux in general. With the smallest bit of googling we can see that Linux servers are by far more common and preferred than Windows.
 

ApfelKuchen

macrumors 601
Aug 28, 2012
4,335
3,012
Between the coasts
Reason #1 doesn't fly. Companies do not use straight-line, 5-year depreciation for desktop/laptop machines. Section 179 of the Internal Revenue Code allows the vast majority of computing equipment to be fully deducted in the year of purchase, so that's what "everyone" does. This was even true in the 1990s, when I was an IT manager. Since then, the Section 179 per-item limit has been bumped higher and higher. Back then it was something like $10,000 per item; I think it's now closer to $1 million for each item of equipment.

In theory, factory machinery/office equipment is an investment of capital. You're using money you have (or borrow) in order to make more money, and the proceeds (profits) of that investment are taxed.

The justification behind the depreciation deduction is that equipment and buildings wear out/become obsolete. Therefore, your initial investment declines in value, rather than holds steady or grows. Depreciation allows businesses to deduct that loss of value as an expense.

Classic depreciation methods like straight-line assume a business will deduct each year's (estimated) loss of value in the year it occurs, until the full value has been deducted. It's expected that the depreciation method used (and there are many) accurately predicts when the equipment becomes totally useless (is no longer an asset).
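
To make that concrete, here's a minimal sketch of a straight-line schedule next to the first-year expensing described above (the $1,000 laptop price and zero salvage value are assumptions picked purely for illustration):

```python
# Illustrative only: a 5-year straight-line schedule vs. full first-year
# expensing (the Section 179 treatment described above). The $1,000 cost
# and zero salvage value are assumptions; total deduction is the same,
# only the timing differs.
cost = 1_000
useful_life = 5           # years
salvage = 0

annual = (cost - salvage) / useful_life
straight_line = [annual] * useful_life          # [200, 200, 200, 200, 200]
section_179 = [cost] + [0] * (useful_life - 1)  # [1000, 0, 0, 0, 0]

for year, (sl, s179) in enumerate(zip(straight_line, section_179), start=1):
    print(f"Year {year}: straight-line ${sl:>6,.0f} | expensed up front ${s179:>6,.0f}")
```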

The only "cost" of carrying fully-depreciated items is that they generate no further depreciation deductions and can no longer be listed as assets on the balance sheet (they become hidden assets). A new piece of equipment adds value to the balance sheet and triggers new depreciation deductions, and sometimes a company needs one or both.

As long as the cost of using old equipment is reasonable, using old equipment is profitable - you're getting more life out of your initial investment than expected. If/when that older equipment imposes a higher cost in lost productivity, higher maintenance/repair requirements... it's time to retire that equipment.

Due to Section 179, computers are "expensed" - deducted when purchased. Everyone knows they'll be obsolete soon, so why pretend otherwise? They're closer to an ordinary and necessary cost of doing business than a long-term capital investment. As such, plain old profit-and-loss accounting is applied. How much does it cost to buy? How much does it cost to maintain/support? How does it impact productivity?

The basic sales pitch from Apple (and IBM, which has been selling Macs for a fair number of years) is total cost of ownership - the substantially lower cost of IT support for Macs (and iPads and iPhones), which more than offsets the higher purchase price of the hardware. If you are willing to pay more up front you'll reap the benefits in subsequent years.
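
A minimal sketch of that pitch, with every number invented purely for illustration (these are not IBM's or Apple's actual figures):

```python
# Hypothetical total-cost-of-ownership comparison over a 4-year lifespan.
# Every number here is a made-up assumption to show how lower support
# costs and higher resale value can offset a higher purchase price.
YEARS = 4

def tco(purchase, support_per_year, residual=0):
    """Purchase price plus support costs, minus resale value at retirement."""
    return purchase + support_per_year * YEARS - residual

mac = tco(purchase=2_500, support_per_year=150, residual=600)
pc  = tco(purchase=1_000, support_per_year=500, residual=100)

print(f"Mac TCO over {YEARS} years: ${mac:,}")   # $2,500
print(f"PC  TCO over {YEARS} years: ${pc:,}")    # $2,900
```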

Now, when it comes to servers... Apple is most definitely a seller of end-user hardware. Everything about Apple is the end-user experience. There's little or nothing the presence of an Apple-branded server in a data center can do to affect end-user experience. Since Apple justifies its higher prices (and profit margins) on end-user experience... server farm hardware is a no-win proposition for Apple.

Arguably, the M-series SoCs could change that. However, a data center server does not need all the capabilities Apple builds into the M-series (and A-series). The performance advantage of Apple Silicon is in end-user devices - crafting the full hardware and software experience. Any chip-maker can design a server chipset with as many CISC and/or RISC cores as required, any hardware maker can package those chipsets into server farm-capable systems that meet whatever durability/serviceability demands the server farm operators make. That is not Apple's business at all.
 

KaliYoni

macrumors 68000
Feb 19, 2016
1,785
3,928
The underlying principle in accounting for depreciation is that an expense should be matched to the time period in which it is incurred. The time period for many expenses is obvious. Monthly rent, for example, is clearly assignable to the specific month the building was occupied. But what about an expense, such as a desk, that is paid for in one month but is used over many months? Accounting's answer is to spread the expense over a certain period of time; in other words, a depreciation schedule.
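
As a tiny worked example of that matching idea (the desk price and useful life below are made up):

```python
# Matching principle in miniature: an assumed $600 desk spread over an
# assumed 36-month useful life instead of being expensed in the month
# of purchase.
desk_cost = 600
useful_life_months = 36

monthly_expense = desk_cost / useful_life_months
print(f"Expense recognized each month: ${monthly_expense:.2f}")  # $16.67
```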

This seemingly simple principle, however, can become complex, because choosing an appropriate time period is often more art than science. Further, depreciation time periods and allocation schedules are frequently manipulated by tax authorities and elected officials to influence businesses' behavior.
 

MacsRgr8

macrumors G3
Sep 8, 2002
8,316
1,832
The Netherlands
Most companies acquire / lease IT equipment based on cost and compatibility.

Even though Apple hardware holds more of its value after 3-4 years than PC hardware, if you lease Apple computers the price per month is still higher than for comparable PC hardware.
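
A rough sketch of that lease math (all figures are assumptions, and real lease pricing also adds interest and fees):

```python
# Rough sketch of why the monthly lease rate can still be higher even if
# the Mac holds more residual value. Prices, residuals, and the 36-month
# term are all assumptions; real leases also price in a money factor.
TERM_MONTHS = 36

def monthly_lease(price, residual):
    # Simplified: depreciation spread over the term, ignoring interest.
    return (price - residual) / TERM_MONTHS

print(f"Mac: ${monthly_lease(2_500, 1_000):.2f}/month")  # higher price, higher residual
print(f"PC:  ${monthly_lease(1_000, 200):.2f}/month")
```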

Almost all software runs natively on Windows and Intel. It's a "safe bet" for companies. Microsoft is the IT backbone of most companies. AD, Azure, Office 365, Windows... have been in widespread use for ages (since NT 4), and that in itself delivers a sense of "cannot go wrong".
Don't forget Apple was almost bankrupt in the mid-90s, which was when Windows 95 / NT 4 / MS Office became so popular in offices. Hardly any company would buy Apple hardware back then, as it was seen as a huge risk.

Nowadays, Apple products are very, very consumer-focused. As "BYOD" and employee-chosen devices become more popular, you do see a lot more Apple hardware in companies.
These devices do need to be managed if the company wants to enforce security.
Apple introduced Xserve and Mac OS X Server back in the day. Great stuff for IT departments that solely managed Macs. But that was a tiny market, and Apple had to pull the plug on Xserve as it didn't sell well and didn't really help Macs expand in offices.
It was considered better to let Apple devices connect to existing IT environments, i.e. Microsoft AD.
Now MDMs are used to manage Apple devices, and one of the most used is Intune... by Microsoft. Microsoft is the "big player" in companies.

But as Apple devices grow in companies, we see Apple trying again to help manage and secure them with its own software and services: Apple Business Essentials.
 

Ruggy

macrumors 65816
Jan 11, 2017
1,021
665
Let me link you to this article from 2016 about what happened when IBM decided to let their employees choose Mac or PC for the first time.
In case the significance didn't sink in straight away: this is IBM!!!
 

Spock

macrumors 68040
Jan 6, 2002
3,522
7,567
Vulcan
Getting rid of surplus Macs is a pain, I can tell you that. Maybe companies are finding that out the hard way...
 

dtm84

macrumors member
Oct 10, 2021
79
167
In my field, Macs are everywhere the end user has a choice of computer. When they don't have the choice, they end up with a Dell box loaded up with trash software like McAfee. Windows computers are preferred by IT because they are more easily remotely administered via Windows Server. The same level of remote administration and lockdown is not as easily deployed on Macs, so what happens is that the salaried professionals get to use mactops while the hourly workers get Windows workstations.
 

InuNacho

macrumors 68010
Apr 24, 2008
2,001
1,262
In that one place
Getting rid of surplus Macs is a pain, I can tell you that. Maybe companies are finding that out the hard way...
I've worked in a few ITAD facilities before. Some were fly-by-night, others reputable and R2 certified.

Apple stuff is a complete PITA for the IT guys in the secondary market to work on. I left the industry before Macs could be completely locked down, but whenever we got a pallet of iPads or iPhones from a client there was a big collective groan from everyone. If it was locked, junk it.
In one of the fly-by-night facilities I was in, we had this guy who used to come in and buy every locked iDevice we had; he would flash them and ship them to the Middle East and Africa to be used again.

Apple stuff is also extremely expensive for relatively simple tasks. In said facilities I needed dual monitors for the sales room and a laptop that could take a bit of a beating when I was called out to the warehouse to look at incoming products. A nice Lenovo ThinkPad with a dock solved that, since the best MacBook Pro at the time could only drive one external monitor.
 