I'm going to try to respond to everything, so please excuse me if I miss something - not intentionally cherry-picking here.

Web 2.0 - the evolution of products introduced to a market. People bit. So while we didn't necessarily ask for it (we didn't ask for horseless carriages either), we certainly liked it, and a market, technology & product were born. Nothing at all wrong with this IMO. We can stop using it at any time should we choose, the same way we could stop driving cars, but we're not going to, because they undeniably add value & productivity to our lives (and they're fun).

Kiddos - kiddos have no immediate need for the internet, and then only limited access for education, controlled communication, etc. When they turn 18 they can do whatever they like as legal adults. Ideally I as a parent have taught & provided the tools & experiences for relatively safe navigation of the internet with minimal mistakes (there will be a few). Until then, the parent controls the child's usage however they see fit. If tech giants choose to keep their own children off these services, that is their choice & their legal obligation to enforce. I agree that it would be irresponsible of any parent not to control access. It is a tool.

Environment - I don't disagree with the environmental impact of e-waste. I do, however, look towards innovation & bright administrative minds to solve this admitted problem - from gov't tax incentives, to trade caps tied to recycling efforts (not my first choice, mind you), to reuse & recycle trade-market initiatives in economically developing nations, to better access & substantial buyback & repurposing programs benefitting both consumer (savings) & business (marketable angles, serving EC demographics etc., growing adoption & market share), to driving the need to reduce component toxicity. There's a lot that gov't can do to both regulate & incentivize responsible e-waste handling.

Gov't - regulation. Regulation choices (what to regulate & what not to) are actually key to a number of opportunities - firstly the sustainability & effect of e-waste as spoken to above. Without it, I agree that this is not a sustainable venture. Aside from that control, stepping out of the way and allowing innovation to happen freely, & entrepreneurs & large businesses alike to bring products to market, is key to driving wealth creation for everyone - from the developer to the factory to retail to support. All of these spaces hire people, supporting & reinvesting into our local economies. If we choose to significantly limit or stop innovation & product development through heavy-handed controls on the market or on business, we destroy that. Currency very much is how we move forward through life & how we fund our communities & societies; destroying that with overreaching, fear-based regulation serves no one at any economic stratum. That's where we go wrong.

Gov't - surveillance. We know it happens in the name of nat'l security. The best any one of us can do, AFAIK, is to advocate for strong personal liberty & limited gov't under founding-document protections (wherever you're from), and to realize that one person's surveillance tracking is another parent's ability to track their son or daughter on their first school trip or Saturday outing with friends into a big world of twisted, bad people - meaning that there are limitations to, exceptions of, & malleability towards the value, use & interpretive nature of law that people & the representatives of said people within gov't hold.

On a work break - may elaborate more later. Interesting food for thought nonetheless.
 
Heck, Apple considers the Intel Core Duo processors (32-bit, before Core 2 Duo) obsolete, let alone the G4s!

Apple considers a 2011 Core i7 obsolete.

https://support.apple.com/en-us/HT201624

They also consider a 2009 Core 2 Duo (MacBook) more capable than an octo-core Xeon (Mac Pro) from the same year, as evidenced by the Sierra requirements.

I am not joking.

https://support.apple.com/kb/SP742?locale=en_US

Seriously, you can compare the two. (https://support.apple.com/kb/SP579?locale=en_US), (https://support.apple.com/kb/SP506?locale=en_US)
 
While I understand that it can be quicker and easier to throw hardware at a problem, I also feel that developers should be taught to write compact and efficient code.

I don't disagree; I was just offering some context as to why it happens. It's not just laziness, it's an entire culture/situation thing: since it's never been (and likely won't be) that necessary from a commercial POV, efficiency is less of a driver than it could be.

I'm not advocating for a return to those days, but it might be a good idea for developers to learn / develop on less capable systems. Perhaps doing so would cause developers to think more about writing compact and efficient code. Instead developers tend to have very capable systems utilizing the fastest processors, disk systems, memory, GPU, etc. I doubt this will ever happen as the way things are is too ingrained to change.

Devices like the Raspberry Pi have actually helped here, as a lot of younger people are using them in schools and such, and although they are very capable compared to older hardware, they are less capable than modern desktops, so some thought is going into the code written for them.

Your last sentence is the key bit: in a world driven by commercial pressures and 'making stuff work now', where hardware is cheap, there is little incentive to try to be more efficient. That doesn't make it right, but it is how it is.

If a company is offered the choice of paying £X for beefy hardware vs. paying 5×£X in dev time to optimise or attempt to make things more efficient, the hardware wins every time, especially when next year's hardware will be quicker again and may actually allow further £ savings purely by getting equal performance out of less electricity. I don't necessarily like it, but I do understand it.
 
I don't disagree; I was just offering some context as to why it happens. It's not just laziness, it's an entire culture/situation thing: since it's never been (and likely won't be) that necessary from a commercial POV, efficiency is less of a driver than it could be.
Apologies if my response came across as disagreement with what you had written; it was not. It was intended as a general statement.

Devices like the Raspberry Pi have actually helped here, as a lot of younger people are using them in schools and such, and although they are very capable compared to older hardware, they are less capable than modern desktops, so some thought is going into the code written for them.

Your last sentence is the key bit: in a world driven by commercial pressures and 'making stuff work now', where hardware is cheap, there is little incentive to try to be more efficient. That doesn't make it right, but it is how it is.

If a company is offered the choice of paying £X for beefy hardware vs. paying 5×£X in dev time to optimise or attempt to make things more efficient, the hardware wins every time, especially when next year's hardware will be quicker again and may actually allow further £ savings purely by getting equal performance out of less electricity. I don't necessarily like it, but I do understand it.
I think it would be an interesting experiment to see just how usable old computers could be if developers optimized instead of just coded. It'll never happen, but I like to think about it. I am still amazed that a 1.33GHz G4 system struggles with the modern Internet when it used to work just fine back in the day.
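
Just to make the idea concrete with a toy sketch (the scenario and function names here are made up purely for illustration, not taken from any real codebase): the same small job written carelessly vs. tidily is often the difference between O(n^2) and O(n) work, and on fast modern hardware both feel instant for small inputs, which is exactly why nobody notices.

Code:
/* Toy illustration of "optimized vs. just coded": counting vowels in a string.
 * The careless version calls strlen() on every loop iteration, turning an
 * O(n) job into O(n^2); the tidy version scans once. Function names are
 * made up purely for this example. */
#include <stdio.h>
#include <string.h>

static int count_vowels_careless(const char *s)
{
    int count = 0;
    for (size_t i = 0; i < strlen(s); i++)   /* strlen re-scans s every pass */
        if (strchr("aeiouAEIOU", s[i]))
            count++;
    return count;
}

static int count_vowels_tidy(const char *s)
{
    int count = 0;
    size_t len = strlen(s);                  /* scan once, remember the length */
    for (size_t i = 0; i < len; i++)
        if (strchr("aeiouAEIOU", s[i]))
            count++;
    return count;
}

int main(void)
{
    const char *text = "Write compact and efficient code";
    printf("careless: %d vowels\n", count_vowels_careless(text));
    printf("tidy:     %d vowels\n", count_vowels_tidy(text));
    return 0;
}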
 
While I understand that it can be quicker and easier to throw hardware at a problem, I also feel that developers should be taught to write compact and efficient code.

Recently I've been playing with my Apple II systems and I am amazed at what developers were able to do with a 1MHz system with 128KB of RAM - that usable programs could be squeezed into such constraints. Granted, they're not as full-featured as today's programs, but they did quite a lot given the limitations.

I'm not advocating for a return to those days, but it might be a good idea for developers to learn / develop on less capable systems. Perhaps doing so would cause developers to think more about writing compact and efficient code. Instead developers tend to have very capable systems utilizing the fastest processors, disk systems, memory, GPU, etc. I doubt this will ever happen as the way things are is too ingrained to change.

Growing up with everything from a ColecoVision ADAM to 8086 through Pentium II, then having bought a (black) MacBook to use during my networking studies, then reverting to older and slower machines, arriving at PowerPC - I feel you.

I'm going back to my school roots, when I studied to be a programmer and decided on the less popular 8-bit platform, the Atari (most people would pick a C64), and back to assembler.
The Polish Atari scene and others are squeezing things out of this 8-bit machine that I deemed impossible.


So older hardware is still more than capable.
 
@Lastic That video reminds me of the BB demo, which was created to showcase how far you could push ASCII to render things onscreen. It's got killer music to boot.


I think both demos are very effective in what they try to do.
 
@Lastic That video reminds me of the BB demo, which was created to showcase how far you could push ASCII to render things onscreen. It's got killer music to boot.


I think both demos are very effective in what they try to do.

Amazing - I have always been, and still am, a huge follower of the demoscene.
Have a look at tAAt and TMDC; at a certain point people made 3D ASCII engines and just started releasing ASCII-only demos: http://www.pouet.net/party.php?which=167&when=2017
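
For anyone wondering how those ASCII demos work at their core, here's a minimal sketch (the character ramp, sizes and lighting maths are purely illustrative, not lifted from any actual demo): shade a ball by mapping brightness onto a ramp of characters running from dark to light.

Code:
/* Minimal sketch of ASCII rendering: draw a shaded sphere by mapping
 * brightness to a character ramp. Ramp, sizes and light direction are
 * illustrative only. Build with: cc ascii_ball.c -lm */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const char *ramp = " .:-=+*#%@";       /* darkest -> brightest */
    const int levels = 10;                 /* length of the ramp */
    const int width = 60, height = 30;

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            /* map the character cell into [-1, 1] x [-1, 1] */
            double nx = (x - width / 2.0) / (width / 2.0);
            double ny = (y - height / 2.0) / (height / 2.0);
            double r2 = nx * nx + ny * ny;

            if (r2 > 1.0) {                /* outside the ball */
                putchar(' ');
                continue;
            }
            /* crude diffuse shading: surface normal dotted with a light dir */
            double nz = sqrt(1.0 - r2);
            double light = nx * -0.5 + ny * -0.5 + nz * 0.707;
            if (light < 0.0)
                light = 0.0;

            putchar(ramp[(int)(light * (levels - 1))]);
        }
        putchar('\n');
    }
    return 0;
}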

And now I will stop derailing the original topic :)
 
Apologies if my response came across as disagreement with what you had written; it was not. It was intended as a general statement.


I think it would be an interesting experiment to see just how usable old computers could be if developers optimized instead of just coded. It'll never happen, but I like to think about it. I am still amazed that a 1.33GHz G4 system struggles with the modern Internet when it used to work just fine back in the day.

I think they'd be an order of magnitude more useful; just look at CorePlayer compared to every other option for a good example!

It really is a shame what has happened to PPC support and dev; it could so easily have been different...
 
Gov’t - surveillance.

I'll cherry-pick: I was addressing private/corporate surveillance. I wasn't even thinking about public surveillance.

That's all I need to say.
I think they'd be an order of magnitude more useful; just look at CorePlayer compared to every other option for a good example!

It really is a shame what has happened to PPC support and dev; it could so easily have been different...

It still can be different, but that difference must come without the backing of Apple.
 
I'll cherry-pick: I was addressing private/corporate surveillance. I wasn't even thinking about public surveillance.

That's all I need to say.

Honestly, I doubt there is much difference between the two, but mass public surveillance is the greater threat & carries the more nefarious intent, IMO. Private business wants your data to more efficiently understand its market & how to sell to you. They've already been doing that to society for the last 100 years at least - it's just that digital is new & different in many ways. Anyhow, that's not to say said data could not or would not be provided should gov't come knocking, but the original intent, if anything, is to study & understand how to engage numerous digital trade areas & the demographics that populate them, not outright surveillance & control à la 1984.
 
"Is the G4 dead?" That's subjective, really. For my use cases, it is still very much 'alive.'

I had a somewhat adjacent thought about this a while ago and was reminded of it again this day, so enjoy this hecking hot take:

“Abandonware, including (and especially) operating systems no longer supported by the company which made them, must have its source code released to the FOSS community so that developers can continue community-driven development for it on legacy hardware — not only to patch up security holes and to give companies pause when rushing new OS builds without cultivating them to their fullest, but also as a way to divert e-waste by bringing legacy hardware back into use.

“Which means Apple has to publicly release all source code for every Mac OS build from OS X 10.0 Cheetah, through ::checking notes:: macOS 10.11 El Capitan. Sorry but I'm making these rules.”

:)
 
I had a somewhat adjacent thought about this a while ago and was reminded of it again this day, so enjoy this hecking hot take:

“Abandonware, including (and especially) operating systems no longer supported by the company which made them, must have its source code released to the FOSS community so that developers can continue community-driven development for it on legacy hardware — not only to patch up security holes and to give companies pause when rushing new OS builds without cultivating them to their fullest, but also as a way to divert e-waste by bringing legacy hardware back into use.

“Which means Apple has to publicly release all source code for every Mac OS build from OS X 10.0 Cheetah, through ::checking notes:: macOS 10.11 El Capitan. Sorry but I'm making these rules.”

:)
Stallman was unironically right about everything, you know.
 
Stallman was unironically right about everything, you know.

Most of his ideas are spot on, though the man has absolutely no idea how to present himself to the public (you can look up the proof...).

He did a whole lot, though, and deserves recognition for what he brought to the table.



Personally, I see him as a funnyman (from the '60s) to be heeded. At least as far as technology goes.
 
It still can be different, but that difference must come without the backing of Apple.

Oh yes it could, but it'll need a massive commercial incentive to be anything more than dedicated hobbyists producing occasional gems. I remain hopeful ;-)
 
Another fun topic, full of rowdy moral & ethical argument on the il/legality of abandonware.
 
For me, the answer is "more or less". The only PPC Mac I have in any kind of service is my 12" PowerBook, and it serves as an alarm clock and nothing else - and the only reason for that is I really like the app I use on it for that purpose, of which there is no Intel version. I still browse the web on it from time to time, but not for anything serious. Unfortunately my beloved eMac bit the dust a few weeks back, and quite honestly, as sad as I am about that, I simply don't have the motivation to fix it and keep it going at this point when it was already semi-retired anyway.

I guess I'm changing - I used to really enjoy spending hours finding workarounds to make these machines still useful in the modern world, but the fun of that has dropped off considerably in the last year or two, replaced only with frustration. Even if it were still as fun as it once was, I doubt I could justify spending the requisite amount of time needed to pursue it; I'm a lot busier than I once was. Kudos to y'all still fighting the good fight here, but I suppose, as much as I hate to let it go, my PowerPC days ended in 2019.
 
The biggest factor in my opinion is having more than one CPU.

My 1GHz Dual Quicksilver seems to run faster for most tasks (definitely web browsing) than my 1.67GHz PBG4 DLSD.

My Quicksilver running Tiger might be the zippiest non-SSD desktop computer I've ever used in my life.
 
The biggest factor in my opinion is having more than one CPU.

My 1GHz Dual Quicksilver seems to run faster for most tasks (definitely web browsing) than my 1.67GHz PBG4 DLSD.

My Quicksilver running Tiger might be the zippiest non-SSD desktop computer I've ever used in my life.

Dual processors aren't the only thing you're feeling. It's also the 1MB L3 cache.

Later and consumer G4s, and all G5s, did not have an L3 cache, which I think was a stupid decision on their part. The excuse was that it was "fast enough" without one, but then why is a 2002 QS outrunning a 2005 top-of-the-line professional notebook if that is indeed the case? Given that, I'd like to see a Leopard comparison between a 1GHz Ti and a DLSD. Not only that, 1.42 MDDs usually outpace 1.6 G5s, not just in use but in the numbers as well.

Unfortunately, Apple ended up unnecessarily crippling the PowerPC machines / lineup in more small ways than one - which may or may not have been related to that recently uncovered AudioIPCDriver issue...
 
L3 can certainly help supplement a G4's slow bus, but wasn't really needed on the G5 architecture because of the much higher bus speed. Even the single 1.6GHz G5 has an 800MHz bus. The fastest G4 bus Apple used was 167MHz.

In my experience L3 doesn't help a lot with web browsing or common lighter tasks. It comes into play with heavy crunching.
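
A rough way to see that for yourself, if you're curious (buffer sizes, pass counts and the timing method below are illustrative only, not tuned for any particular machine): walk a working set that fits in cache and one that has to stream from main memory over the bus, doing the same total number of accesses in each case.

Code:
/* Rough sketch of the cache/bus effect: repeatedly walk a small working set
 * (fits in a typical L2/L3) vs. a large one that spills to main memory and
 * has to come over the bus. Sizes and pass counts are purely illustrative. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static double walk(size_t bytes, int passes)
{
    size_t n = bytes / sizeof(long);
    long *buf = malloc(n * sizeof(long));
    volatile long sum = 0;          /* volatile so the loop isn't optimised away */

    if (!buf)
        return -1.0;
    for (size_t i = 0; i < n; i++)
        buf[i] = (long)i;

    clock_t start = clock();
    for (int p = 0; p < passes; p++)
        for (size_t i = 0; i < n; i++)
            sum += buf[i];
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;

    free(buf);
    (void)sum;
    return secs;
}

int main(void)
{
    /* Both calls touch the same total number of longs, so any difference
     * in time is down to where the data has to come from. */
    printf("cache-resident walk: %.2f s\n", walk(512 * 1024, 2048));
    printf("memory-bound walk:   %.2f s\n", walk(64 * 1024 * 1024, 16));
    return 0;
}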
 
L3 can certainly help supplement a G4's slow bus, but wasn't really needed on the G5 architecture because of the much higher bus speed.

Even so, L3 was still not present on later G4s and consumer G4s (the eMac, iMac, aluminum PowerBook, iBook, etc.), which is suspicious given that it was used in the earlier professional lineups and was demonstrated to increase a machine's "zip".

Actually, L3 cache was not present on most G4s save for the Power Mac and titanium PowerBook. Were there any others I'm missing?

At the very least, the G5s should have had a larger L2 cache. It took them until the final revision to add 1 MB to L2.
 
Software comes and goes. Hardware is forever.
Halt and Catch Fire, S1 E3, "High Plains Hardware"

Given this forum, the G4 is very much alive.

  • I switched back from Thunderbird (Intel Linux) to Apple Mail (PowerBook G4).
  • I use Terminal on my Macs, just as much as I do on the Linux machines.
  • I use TenFourFox, and it is good enough. The so-called "modern" web sucks, regardless of what I use. I like the uMatrix add-on.
    "Block pop-up windows" has been enabled by default for the past 20 years or so. It's taken 10 years of Web 2.0 (bloated EcmaScript) to create work-arounds -- those dialog boxes that cover web content. If the Web 2.0 page comes up blank, so be it.​
Moore's Law is over. Intel is lame:

Code:
cat /proc/cpuinfo
...
model name: Intel(R) Celeron(R) CPU N2830 @ 2.16GHz
...
bugs: cpu_meltdown spectre_v1 spectre_v2
...
address sizes: 36 bits physical, 48 bits virtual
...

I'll mention the Intel Management Engine (ME) again: it has been in there for the past 10 years, an undocumented system with access to everything.
 