
Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
My point is that premium speakers, screens, metal enclosures, touchpads, etc. should be more expensive than adding some inexpensive RAM and SSD. Every MacBook has these qualities. In order to subsidize these premium parts, Apple makes the 16/512 upgrade expensive.
No, they make the upgrades expensive so that everyone, irrespective of income, pays the maximum amount they have available. The need for vastly different price points comes first; only then do you consider how the expensive models should differ from the cheaper ones. Good speakers and a metal enclosure aren't expensive. Aluminium is one of the most abundant metals in the earth's crust, and once the design is done, Apple goes to the cheapest Chinese supplier to mass-produce its speakers. Furthermore, non-Pro devices like the MacBook Air do indeed get worse screen technology.
For these PC makers, adding $15 extra to get to 16/512 is a very easy way to compete in value.
PC makers compete on the basis of spreadsheet comparisons, where the larger number in the same row is supposed to signal the better product. Quantifiable numbers are easier for bean counters to compare, especially when there are far too many options to evaluate. That's how we got the gigahertz race, the megapixel race, and so on. The PC market has trapped itself in a cycle where efficiency and elegance don't count, just more of something for less money. Apple is almost trolling them with its pricing scheme.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,142
1,899
Anchorage, AK
So tell us, what did you pay for your custom PC build? If it’s more expensive than a Mac, I’ll declare the end of DIY PCs. The big vendors are going to die anyway.

Somewhere between $2000 and $2500 once you factor in the CPU, motherboard, RAM, video card, storage (2x M.2, 2x 2.5" SSD, 2x 8TB Seagate Barracudas for games), two USB PCI-E cards (to add both USB-A and USB-C ports for accessories such as my webcam, audio mixer, microphone, and Stream Deck), and a liquid cooler for the CPU.

The bulk of the PC-buying market does NOT want to worry about building a computer or making sure everything works together, so the big vendors aren't going anywhere. For those who want the ability to customize their builds without doing the assembly labor themselves, there are vendors such as iBuyPower, Doghouse Systems, and Digital Storm (among others). Even companies such as NZXT, which traditionally just sold cases and accessories such as fans, are now building custom configs for customers, so it's not hard to buy PCs that aren't the cookie-cutter variety sold by HP, Dell, Lenovo, etc.
 

mi7chy

macrumors G4
Oct 24, 2014
10,620
11,294
I don't think the OP has ever used models above 7B or 13B. 65B is where local models start becoming somewhat usable relative to cloud Bing Chat and ChatGPT, but it requires >32GB of RAM. Even 30B requires >16GB of RAM, and it becomes painfully slow without a discrete GPU. So it's not just about RAM but RAM plus compute performance.
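For a rough sense of where those RAM figures come from, here is a back-of-envelope sketch; the 4-bit quantization and ~25% runtime overhead are illustrative assumptions, not measurements:

```python
# Back-of-envelope RAM estimate for a quantized LLaMA-style model:
# weights at 4 bits per parameter, plus an assumed ~25% overhead for the
# KV cache, activations, and runtime buffers.

def approx_ram_gb(params_billion: float, bits_per_weight: int = 4,
                  overhead: float = 1.25) -> float:
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for size in (7, 13, 30, 65):
    print(f"{size}B: ~{approx_ram_gb(size):.1f} GB")

# Roughly 7B ~4.4 GB, 13B ~8.1 GB, 30B ~18.8 GB, 65B ~40.6 GB,
# which lines up with the ">16GB for 30B" and ">32GB for 65B" figures above.
```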
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
Somewhere between $2000 and $2500 once you factor in the CPU, motherboard, RAM, video card, storage (2x M.2, 2x 2.5" SSD, 2x 8TB Seagate Barracudas for games), two USB PCI-E cards (to add both USB-A and USB-C ports for accessories such as my webcam, audio mixer, microphone, and Stream Deck), and a liquid cooler for the CPU.
You know you could buy an iMac and a PS5 for $1,500 combined, right? Right now a generation of young adults who grew up without smartphones can afford, and is inclined, to spend $2,500 on a gaming PC. But I don't see the next generation ever starting with PC gaming at these prices.
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,605
4,112
I don't think the OP has ever used models above 7B or 13B. 65B is where local models start becoming somewhat usable relative to cloud Bing Chat and ChatGPT, but it requires >32GB of RAM. Even 30B requires >16GB of RAM, and it becomes painfully slow without a discrete GPU. So it's not just about RAM but RAM plus compute performance.
The problem is that affordable discrete GPUs (~$2K USD) don't offer more than 24 GB. Slow is better than out of memory.
 

ifxf

macrumors 6502a
Jun 7, 2011
606
1,006
GPT models are always learning; they need high-bandwidth access to the web. This is a major reason the best models will always be cloud-based.
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,605
4,112
llama.cpp added hybrid GPU acceleration + system RAM so cloud > local dGPU > local CPU. Someone else also added support for AMD dGPUs via ROCm.

https://github.com/ggerganov/llama.cpp/pull/1375
Yikes. The baseline used to show the speed increase was a GPU running twice as slow as a CPU. Most of the people in that link who weren't using a 1070 barely reported any performance improvement; in fact, the one using an A100 reported slower performance.
I have tried low-memory settings on a 4090/3090. I can barely run a 7B, let alone a 13B or 65B. Even the hacks to run at lower memory are unusable: it takes around 30-45 minutes of predict time at 4 bits, and these workarounds seem to generate fewer than 100 tokens of prediction. Forget running 30B or 65B on a single 3090/4090. My 64 GB M1 Max starts running out of memory at 700-800 tokens.
Right now the best and cheapest Nvidia GPU available to run these models is the RTX 6000 (48 GB), which costs more than a MacBook Pro with an M1 Max 64 GB or an M2 Max.
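For what it's worth, the hybrid acceleration in that PR amounts to offloading only as many transformer layers as fit in VRAM and running the rest on the CPU. A minimal sketch of that split, assuming the llama-cpp-python bindings and a placeholder model file:

```python
# Minimal sketch of hybrid CPU+GPU inference via llama-cpp-python.
# Model path and layer count are placeholders; reduce n_gpu_layers
# until the offloaded layers fit in the card's VRAM.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-13b-q4_0.bin",  # hypothetical 4-bit quantized model
    n_gpu_layers=20,  # layers offloaded to the GPU; the remainder run on the CPU
    n_ctx=2048,       # context window
)

result = llm("Briefly explain unified memory.", max_tokens=64)
print(result["choices"][0]["text"])
```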
 

mi7chy

macrumors G4
Oct 24, 2014
10,620
11,294
It's early development that's continually being improved, so I don't know what the whining is about. Pros have an 80GB A100 or H100. The smart and thrifty use slightly older NVLinked GPUs or the cloud. Nobody on r/LocalLLaMA, where the competent discussions take place, cares about waiting on an M-whatever.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
Yikes. The baseline used to show the speed increase was a GPU running twice as slow as a CPU. Most of the people in that link who weren't using a 1070 barely reported any performance improvement; in fact, the one using an A100 reported slower performance.
I have tried low-memory settings on a 4090/3090. I can barely run a 7B, let alone a 13B or 65B. Even the hacks to run at lower memory are unusable: it takes around 30-45 minutes of predict time at 4 bits, and these workarounds seem to generate fewer than 100 tokens of prediction. Forget running 30B or 65B on a single 3090/4090. My 64 GB M1 Max starts running out of memory at 700-800 tokens.
Right now the best and cheapest Nvidia GPU available to run these models is the RTX 6000 (48 GB), which costs more than a MacBook Pro with an M1 Max 64 GB or an M2 Max.

Alpaca 7B does run, albeit slowly, on a Core i7 with 8 GB of RAM, and okay-ish with 32 GB.
But I'm talking about using the CPU instead of the GPU.
Maybe a hybrid approach (CPU + GPU) will prove promising.

Google has also made some breakthroughs on Stable Diffusion that make RAM management much more efficient, but I don't know if that work has been applied to LLaMA.
 

applepotato666

macrumors 6502a
Jun 25, 2016
515
1,080
I see Apple sticking with 8/256 for a while. Until hardware costs drop, they won't make a major software shift that leaves that hardware behind, raises prices significantly, or reduces their margins. After all, they have full control over both software and hardware. It's more likely they'll find another way to implement this and monetize it further.

There is also the issue of competition. Here in Brazil, Apple sells an 8 GB/256 GB MacBook Air for about the same price Dell sells a 32 GB/1 TB 13-inch XPS Plus. One may say that a Mac has no competitors and blah blah blah, but this is starting to look ridiculous.
Outside of the US, yes. I'm a huge critic of Apple's pricing, but the Dell simply can't provide the same great design, battery life, silent and cool CPU performance, or build quality. Unlike many of Apple's other product lines, the Mac currently has a very clear value proposition (at least in the US; everyone everywhere else got the middle finger from Apple this year). There, even the 8/256 version is worth the price considering everything else it provides as a whole package. M-series Macs are great at what they are.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,142
1,899
Anchorage, AK
You know you could buy an iMac and a PS5 for $1,500 combined, right? Right now a generation of young adults who grew up without smartphones can afford, and is inclined, to spend $2,500 on a gaming PC. But I don't see the next generation ever starting with PC gaming at these prices.

You know, the iMac doesn't meet my needs in the least, but "thanks" for trying to tell me what I should and shouldn't buy. /s

As far as the "next generation" goes, you'd be surprised at just how many of them spend the big bucks on gaming in general, let alone tricking out gaming PCs.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
You know you could buy an iMac and a PS5 for $1,500 combined, right? Right now a generation of young adults who grew up without smartphones can afford, and is inclined, to spend $2,500 on a gaming PC. But I don't see the next generation ever starting with PC gaming at these prices.
PC gaming first became popular in the 90s. Adjusted for inflation, the typical gaming PC from that time was about as expensive as a Mac Studio with the M1 Ultra today. Adjusted for income, the Mac Studio looks rather cheap.
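To put the inflation point in rough numbers (the CPI figures and the $2,000 mid-90s build price are approximate assumptions for illustration):

```python
# Rough inflation adjustment; CPI values and the 1995 PC price are
# approximate assumptions, not exact figures.
CPI_1995 = 152.4   # approx. US CPI-U annual average, 1995
CPI_2023 = 304.7   # approx. US CPI-U annual average, 2023

pc_1995 = 2000     # a well-equipped mid-90s gaming PC
pc_today = pc_1995 * CPI_2023 / CPI_1995
print(f"${pc_1995:,} in 1995 is roughly ${pc_today:,.0f} today")

# -> about $4,000, in the same ballpark as the $3,999 Mac Studio with M1 Ultra.
```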

And even back then, PC gaming was not a particularly expensive hobby. People were spending far more money on other activities.
 